Doctors at Addenbrooke’s Hospital in Cambridge aim to drastically cut cancer waiting times by using artificial intelligence (AI) to automate lengthy radiotherapy preparations.
The AI technology, known as InnerEye, is the result of an eight-year collaboration between researchers at Cambridge-based Microsoft Research, Addenbrooke’s Hospital and the University of Cambridge.
InnerEye aims to save clinicians the many hours they spend laboriously marking up patient scans prior to radiotherapy. The team has demonstrated how machine learning (ML) models built using the InnerEye open-source technology can cut this preparation time by up to 90% – meaning that waiting times for starting potentially life-saving radiotherapy treatment can be dramatically reduced.
Health and Social Care Secretary Matt Hancock said: “New innovations like this can make all the difference to patients and I am proud to see we are once again leading the way in new cancer treatments.
“Helping people receive treatment faster is incredibly important and will not only improve recovery rates but will save clinicians precious time so they can focus on caring for patients.
“Embracing new technologies will help save lives and is vital for the sustainability of the NHS, and our NHS Long Term Plan will continue to deliver the best possible care for patients so that we can offer faster, more personalized and effective cancer treatment for all.”
Dr. Raj Jena, an oncologist at Addenbrooke’s and a researcher in the Department of Oncology at the University of Cambridge who co-leads InnerEye, said: “These results are a game-changer. To be diagnosed with a tumor of any kind is an incredibly traumatic experience for patients. So as clinicians we want to start radiotherapy promptly to improve survival rates and reduce anxiety. Using machine learning tools can save time for busy clinicians and help get our patients onto treatment as quickly as possible.”
Dr. Yvonne Rimmer, also an oncologist at Addenbrooke’s, said: “There is no doubt that InnerEye is saving me time. It’s very good at understanding where tumors and healthy organs are. It’s speeding up the process so I can concentrate on looking at a patient’s diagnostic images and tailoring treatment to them.
Video: https://www.youtube.com/embed/AHchuDX0nn4 (Credit: University of Cambridge)
“But it’s important for patients to know that the AI is helping me do my job; it’s not replacing me in the process. I double check everything the AI does and can change it if I need to. The key thing is that most of the time, I don’t need to change anything.”
Up to half of the population in the UK will be diagnosed with cancer at some point in their lives. Of those, half will be treated with radiotherapy, often in combination with other treatments such as surgery, chemotherapy, and increasingly immunotherapy.
Radiotherapy involves focusing high-intensity radiation beams on solid cancerous tumors to damage their DNA while avoiding the surrounding healthy organs. It is a critical treatment tool, with around 40% of successfully treated patients undergoing some form of radiotherapy.
Planning radiotherapy treatment can be a lengthy process. It starts with a 3-D CT (Computed Tomography) imaging scan of the part of the body to be targeted. These CT images come in the form of stacks of 2-D images, dozens of images deep, each of which must be examined and marked up by a radiation oncologist or specialist technician. This process is called contouring. In each image, an expert must manually draw a contour line around the tumors and the key healthy organs at risk in the target area using dedicated computer software. For complex cases, contouring alone can take several hours for a single patient’s treatment plan.
This image segmentation task is a rate-limiting factor in the radiotherapy treatment pathway, adding to clinicians’ workload and to hospitals’ costs. Because the task is subjective, there can be significant variability between experts and between institutions whose acquisition protocols and patient demographics differ. This limits the use of imaging in clinical trials and can introduce variability in patient care.
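To make the segmentation step concrete, the sketch below shows how a per-voxel organ mask over a CT stack corresponds to the contour lines a clinician would otherwise draw slice by slice. This is a minimal, illustrative Python example only; it is not part of InnerEye, and the file names, array shapes and label values are assumptions.

```python
# Illustrative sketch only: assumes a CT volume and a matching per-voxel label
# mask are already available as NumPy arrays (shapes, file names and label
# values are assumptions, not InnerEye's actual data format).
import numpy as np
from skimage import measure

# A CT scan is a stack of 2-D slices: (num_slices, height, width).
ct_volume = np.load("ct_volume.npy")     # hypothetical file
organ_mask = np.load("organ_mask.npy")   # 1 where the organ/tumor is, 0 elsewhere

assert ct_volume.shape == organ_mask.shape

# Manual contouring means drawing an outline on every slice; given a mask,
# the equivalent outlines can be recovered slice by slice.
contours_per_slice = []
for z in range(organ_mask.shape[0]):
    # find_contours returns (row, col) polylines at the 0.5 iso-level,
    # i.e. the boundary between organ and background on this slice.
    contours = measure.find_contours(organ_mask[z].astype(float), 0.5)
    contours_per_slice.append(contours)

print(f"{organ_mask.shape[0]} slices, "
      f"{sum(len(c) for c in contours_per_slice)} contour polylines extracted")
```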
Research published by the team in JAMA Network Open confirms that the InnerEye ML models can carry out this otherwise lengthy image segmentation accurately and rapidly, saving hours of expert clinicians’ time.
Aditya Nori, Head of Health Intelligence at Microsoft Research, said: “This is the first time, we believe, that an NHS Trust has implemented its own deep learning solution trained on its own data, so it can be used on its patients. It paves the way for more NHS Trusts to take advantage of open-source AI tools to help reduce cancer treatment times.”
The InnerEye Deep Learning Toolkit has been made freely available as open-source software by Microsoft.
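The toolkit defines its own model configurations and training pipeline; as a rough, hypothetical illustration of the underlying idea – a 3-D convolutional network that assigns a label to every voxel of a CT volume – the following sketch is written in plain PyTorch. The architecture, class names and tensor shapes are illustrative assumptions and are not the InnerEye Deep Learning Toolkit’s actual API.

```python
# Hypothetical sketch of voxel-wise segmentation with a small 3-D CNN in PyTorch.
# NOT the InnerEye Deep Learning Toolkit's API; architecture, class names and
# shapes here are illustrative assumptions only.
import torch
import torch.nn as nn

class TinySegNet3D(nn.Module):
    """Toy 3-D network: maps a CT volume to per-voxel class scores."""
    def __init__(self, num_classes: int = 3):  # e.g. background, organ, tumor
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(16, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.classifier = nn.Conv3d(16, num_classes, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Inference on one toy-sized volume: batch=1, channel=1, depth x height x width.
model = TinySegNet3D().eval()
ct = torch.randn(1, 1, 32, 128, 128)  # stand-in for a real, preprocessed scan
with torch.no_grad():
    scores = model(ct)                # (1, num_classes, 32, 128, 128)
    mask = scores.argmax(dim=1)       # per-voxel predicted label
print(mask.shape)                     # torch.Size([1, 32, 128, 128])
```

In practice such a predicted mask would still be reviewed, and corrected where necessary, by a clinician before being used for treatment planning, as Dr. Rimmer describes above.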
While ML models developed using the tool need to be tested and validated in each individual healthcare setting, doctors at Cambridge University Hospitals (CUH) have demonstrated how the technology can be applied in clinical practice.
Source: University of Cambridge