Intuitive Surgical Inc

9 Projects
  • Funder: UK Research and Innovation
    Project Code: EP/Z534754/1
    Funder Contribution: 1,103,370 GBP

    Robotic-assisted surgery (RAS) is carried out in almost 2 million cases worldwide, delivering advantages to both patients and surgeons. Its adoption in the UK is limited by the availability of training: RAS requires high levels of skill and rigorous training, which is not widely available. One critical aspect of training is the performance evaluation of trainees, which is currently done through manual review by senior surgeons; this is inefficient and difficult to standardise. To address this issue, the multidisciplinary HuMIRoS project will pioneer AI technology to automatically evaluate the performance of robotic surgeon trainees and provide them with meaningful feedback, leading to an improved training experience and an accelerated learning curve.

    Following a user-centred approach, the project will develop the first AI-based system for automated action quality assessment (AQA), which automatically measures the quality of actions carried out with the surgical robot. The core idea for AQA is to fuse information from heterogeneous data sources (video, robot joint kinematics, semantic information) and pioneer multimodal artificial intelligence (MMAI) methods that link performance assessment to surgical actions (e.g., instrument movements) and consequences (e.g., errors); a minimal sketch of this fusion idea follows the summary. AQA outcomes will be presented to RAS trainees as constructive feedback through a user interface (UI), co-designed with users and stakeholders to deliver optimum training experience and outcomes. Closing the loop, machine understanding will be rationalised and made accountable on the UI using eXplainable AI (XAI), to facilitate the formulation of causal interpretations of the observed surgeons' actions and the derived performance estimates.

    HuMIRoS has the following key objectives:
      • To architect new MMAI technology for estimating and classifying RAS performance by representing core skill attributes (tool manipulation, camera (endoscope) navigation, respect for tissue, surgical outcome) and detecting instances of surgical errors (or suboptimal execution).
      • To develop novel human-computer interactions (HCI) around the MMAI, including novel XAI methods, to communicate AQA predictions to end-users and make them accountable, thereby optimising RAS training.
      • To validate the HuMIRoS approach with experiments in RAS training courses and real cases, demonstrate and quantify the benefits introduced, and release datasets to promote AQA research.

    The research will leverage datasets collected from the largest UK RAS training hub (project partner The Griffin Institute, Northwick Park Institute for Medical Research), annotated with clinically validated performance metrics and error description tools. Experimentation in structured training sessions, end-user workshops, and real RAS cases will evaluate the face validity, construct validity, and generalisability of the developed solutions. The project will regularly engage patient advisory groups to inform AI development and purpose.

    Direct beneficiaries of project outcomes include:
      • Patients and healthcare systems: new surgeons are trained faster and more efficiently, accelerating their learning curve, and the uptake and democratisation of RAS increase as it becomes accessible in more surgical sites.
      • RAS trainees: the training experience is optimised, with trainees able to receive expert-level feedback readily and develop skills more efficiently; faculty time and resources are streamlined; and integration of new technologies (e.g., telementoring) into RAS training is facilitated.
      • RAS surgeons: intraoperative complications decrease by identifying moments of high risk for error.
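
    As a purely illustrative sketch of the multimodal late-fusion idea mentioned above (fusing a video stream with robot joint kinematics into a single quality score), the following PyTorch snippet shows one minimal form such a model could take. All class names, dimensions, and architecture choices here are assumptions for illustration; the project's actual MMAI design is not specified at this level of detail.

```python
# Illustrative late-fusion model for action quality assessment (AQA).
# Hypothetical sketch: names, dimensions, and architecture are assumed,
# not taken from the HuMIRoS project.
import torch
import torch.nn as nn

class LateFusionAQA(nn.Module):
    def __init__(self, video_dim=512, kin_dim=14, hidden=128):
        super().__init__()
        # Per-modality encoders: pre-extracted per-frame video features
        # and per-frame robot joint kinematics (positions/velocities).
        self.video_enc = nn.GRU(video_dim, hidden, batch_first=True)
        self.kin_enc = nn.GRU(kin_dim, hidden, batch_first=True)
        # Fused head regresses a scalar quality score per clip.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, video_feats, kinematics):
        _, hv = self.video_enc(video_feats)   # final state: (1, B, hidden)
        _, hk = self.kin_enc(kinematics)      # final state: (1, B, hidden)
        fused = torch.cat([hv[-1], hk[-1]], dim=-1)
        return self.head(fused).squeeze(-1)   # (B,) quality scores

model = LateFusionAQA()
video = torch.randn(4, 100, 512)   # batch of 4 clips, 100 frames each
kin = torch.randn(4, 100, 14)      # time-aligned kinematics streams
scores = model(video, kin)         # one predicted score per clip
```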

  • Funder: UK Research and Innovation
    Project Code: EP/X013898/1
    Funder Contribution: 717,032 GBP

    This proposal is centred on the development of a novel technology platform comprising interventional imaging probes to guide endoscopic lung biopsies. These probes contain novel, highly miniaturised pulse-echo optical ultrasound (US) sensors developed for optically transmitting and receiving US. Integrated within a robotic bronchoscopy system, the imaging probes will provide real-time B-mode (2D) images to accurately identify deep lung nodules and give real-time guidance of biopsy needles. This Healthcare Technologies Investigator-Led project sets out an ambitious plan to significantly advance all-optical ultrasound along the clinical translational path by exploring its use in robotic-guided endobronchial ultrasound imaging.

    All-optical pulse-echo ultrasound imaging is an emerging technology platform for guiding minimally invasive procedures. Fibre-optic optical ultrasound (OpUS) transducers use distinct mechanisms for generating and receiving ultrasound. Optical generation of US is performed via the photoacoustic effect, in which modulated incident light is provided to a highly absorbing material and the resulting thermal energy deposition leads to a transmitted US wave (the standard initial-pressure relation is given below). The advantages of OpUS and the promising proof-of-concept data obtained with tissue imaging in recent studies offer a major opportunity to develop novel probes for clinical use in robotic bronchoscopy. The high level of miniaturisation needed to reach distal regions of the lung, and the novelty of integrating OpUS with needle biopsies and robotic navigation, offer the prospect of significant improvements to clinical practice and of high performance with low-cost components, enabling rapid clinical uptake and patient benefit.
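
    For reference, the photoacoustic generation mechanism described above is commonly summarised by the standard initial-pressure relation; the form below is the textbook expression, not a formula quoted from the proposal.

```latex
% Standard photoacoustic initial-pressure relation (textbook form):
p_0 = \Gamma \, \mu_a \, F
% \Gamma : Grueneisen parameter of the absorbing coating
% \mu_a  : optical absorption coefficient of the coating
% F      : local optical fluence of the modulated excitation light
```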

  • Funder: UK Research and Innovation
    Project Code: EP/P012841/1
    Funder Contribution: 1,239,250 GBP

    The paradigm of modern surgical treatment is to reduce the invasive trauma of procedures by using small keyhole ports to enter the body. Robotic assistant systems provide tele-manipulated instruments that facilitate minimally invasive surgery by improving the ergonomics, dexterity and precision of controlling manual keyhole surgery instruments. Robotic surgery is now common for minimally invasive prostate and renal cancer procedures, but imaging inside the body is currently restricted by the access port and only provides information at visible organ surfaces, which is often insufficient for easy localisation within the anatomy and for avoiding inadvertent damage to healthy tissues.

    This project will develop robotic-assisted imaging, which will exploit the autonomy and actuation capabilities provided by robotic platforms to optimise the images that can be acquired by current surgical imaging modalities. In the context of robotic-assisted surgery, now an established surgical discipline, advanced imaging can help the surgeon to operate more safely and efficiently by allowing the identification of structures that need to be preserved while guiding the surgeon to anatomical targets that need to be removed. Providing better imaging and integration with the robotic system will result in multiple patient benefits by ensuring safe, accurate surgical actions that lead to improved outcomes.

    To enable this functionality, new theory, computing, control algorithms and real-time implementations are needed to underpin the integration of imaging and robotic systems within dynamic environments. Information observed by the imaging sensor needs to feed back into the robotic control loop to guide automatic sensor positioning and movement that maintains the alignment of the sensor to moving organs and structures (a minimal control-loop sketch follows this summary). This level of automation is largely unexplored in robotic-assisted surgery at present because it involves multiple challenges: visual inference, reconstruction and tracking; calibration and re-calibration of sensors and various robot kinematic strategies; and integration with surgical workflow and user studies. Combined with the use of pre-procedural planning, robotic-assisted imaging can lead to pre-planned imaging choices that are motivated by different clinical needs.

    As well as having direct applications in surgery, the robotic-assisted imaging paradigm will be applicable to many other sectors transformed by robotics, for example manufacturing or inspection, especially when working within non-rigid environments. For this cross-sector impact to be achieved, the project will build the deep theoretical and robust software platforms that are ideally suited to foundational fellowship support.
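
    To make the sensor-in-the-loop idea concrete, the sketch below shows one proportional image-based visual-servoing (IBVS) step using the standard point-feature interaction matrix. This is a minimal textbook illustration under assumed conditions (a single tracked point at known depth), not a component of the project.

```python
# Minimal image-based visual-servoing (IBVS) step, illustrative only.
# Uses the standard point-feature interaction matrix; the surrounding
# tracking and robot-command pipeline is assumed, not shown.
import numpy as np

GAIN = 0.5  # proportional gain on the image-space error

def interaction_matrix(x, y, Z):
    """Standard 2x6 interaction matrix for a point feature at
    normalised image coordinates (x, y) and depth Z (metres)."""
    return np.array([
        [-1 / Z, 0, x / Z, x * y, -(1 + x**2), y],
        [0, -1 / Z, y / Z, 1 + y**2, -x * y, -x],
    ])

def servo_step(feature, target, depth):
    """One proportional IBVS step: returns the 6-DoF camera twist
    (vx, vy, vz, wx, wy, wz) that drives the feature towards target."""
    error = np.asarray(feature) - np.asarray(target)
    L = interaction_matrix(*feature, depth)
    # v = -gain * pinv(L) @ e drives the error exponentially to zero.
    return -GAIN * np.linalg.pinv(L) @ error

# Example: feature sits up-right of the image centre at 8 cm depth;
# the commanded twist re-centres the camera on the moving structure.
twist = servo_step(feature=(0.10, 0.05), target=(0.0, 0.0), depth=0.08)
print(twist)
```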

  • Funder: UK Research and Innovation
    Project Code: EP/V047914/1
    Funder Contribution: 8,000,770 GBP

    There is a pressing need to improve the precision, control and selectivity of surgical procedures addressing several high-incidence cancers. For example, in the UK the incidence of basal cell carcinoma (BCC) has increased by approximately 250% since the 1990s, with 137,000 new cases of BCC each year. Bowel cancer is the fourth most common cancer and the second most common cause of cancer death. Some 15% of new bowel cancer cases are early stage and amenable to potential endoluminal surgery; this proportion is increasing with national screening programmes. Delayed diagnosis and incomplete excision of tumours are key drivers of patient morbidity and squander limited surgical resources. Streamlining screening and early diagnosis is now even more important given the patient backlog caused by Covid-19.

    The default surgical practice is to remove cancers wherever possible, along with a margin of healthy tissue. Leaving cancer cells behind leads to recurrence, but removing too much healthy tissue increases both the risk of complications and the loss of normal function. Trying to optimise this balance is a global challenge. For example, BCCs often spread out beneath the surface of the skin such that their entirety cannot be detected until surgery. Mohs micrographic surgery is the gold standard for treating BCCs: the tumour is removed section by section and examined under the microscope until no further tumour can be seen. This is both time-consuming and traumatic for the patient, typically resulting in larger skin grafts than expected. If the extent of the tumour could be accurately determined using terahertz (THz) imaging prior to surgery, the procedure would be faster and grafts better planned. Similarly, if a diagnostic THz imaging capability could be added to a flexible endoscope, more colorectal tumours could be identified in situ and resected without waiting for histology results (typically 2 weeks) and a follow-up procedure.

    In this programme, a highly interdisciplinary team consisting of investigators at the Universities of Warwick, Exeter and Leeds in Physics, Engineering and Medicine, and at the University Hospital of Coventry and Warwickshire and the Leeds Teaching Hospitals NHS Trust, joins forces to optimise patient diagnosis and treatment. The team is supported by industry partners including TeraView Ltd, Intuitive Surgical, Kuka (a world leader in industrial robots), QinetiQ, the National Physical Laboratory and Lubrizol (an international cosmetics company).

    THz light is non-ionising and uses power levels low enough that thermal effects are insignificant, making it safe for in vivo imaging of humans. It is very sensitive to intermolecular interactions such as hydrogen bonds, and probes processes that occur on picosecond timescales (see the frequency relation below). Owing to the high sensitivity of THz light to tissue hydration and composition, THz spectroscopic imaging can help locate and diagnose lesions that cannot be seen by other imaging modalities.

    In Terabotics, we will integrate THz technology into robotic probes to develop improved platforms for cancer detection and surgical removal. We will develop probes that can be used on the skin as well as in the abdominal cavity and, by miniaturising the technology, we will also develop a new flexible probe for robotic colonoscopy. In this way the project will lead to more efficient cancer diagnosis and surgery, saving surgeons' operating time and reducing the number of surgeries needed: accurately determining the extent of cancers prior to surgery will enable better surgical planning and reduce the need for a second operation, and being able to diagnose cancers in situ will shorten the time from diagnosis to treatment. These factors will reduce trauma, costs, patient backlog and waiting lists, and improve patient outcomes. In short, our breakthrough in developing in situ diagnosis will bring step changes in the detection and treatment of cancer for many years to come.
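
    To make the quoted timescale concrete: a process with a characteristic period of one picosecond corresponds to a frequency of one terahertz, which is why picosecond molecular dynamics fall squarely within the THz band.

```latex
% Picosecond dynamics correspond to terahertz frequencies:
f = \frac{1}{T}, \qquad
T = 1\,\mathrm{ps} = 10^{-12}\,\mathrm{s}
\;\Longrightarrow\;
f = 10^{12}\,\mathrm{Hz} = 1\,\mathrm{THz}
```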

  • Funder: UK Research and Innovation
    Project Code: EP/P012779/1
    Funder Contribution: 6,236,360 GBP

    As minimally invasive surgery is adopted across a wide range of surgical specialties, there is a growing trend towards precision surgery, focussing on early malignancies with minimally invasive intervention and greater consideration of patient recovery and quality of life. This requires the development of sophisticated micro-instruments integrated with imaging, sensing, and robotic assistance for micro-surgical tasks, facilitating the management of increasingly small lesions in more remote locations with complex anatomical surroundings. The proposed programme grant seeks to harness different strands of engineering and clinical developments in micro-robotics for precision surgery to establish platform technologies in:
      1) micro-fabrication and actuation;
      2) micro-manipulation and cooperative robotic control;
      3) in vivo microscopic imaging and sensing;
      4) intra-operative vision and navigation; and
      5) endoluminal platform development.
    By using endoluminal micro-surgical intervention for gastrointestinal, cardiovascular, lung and breast cancer as the exemplars, we aim to establish a strong technological platform with extensive industrial and wider academic collaboration to support seamless translational research and surgical innovation that are unique internationally.
