Medical image analysis using AI has developed rapidly in recent years. Now, one of the largest studies to date has been carried out using AI-assisted image analysis of lymphoma, cancer of the lymphatic system. Researchers at Chalmers University of Technology in Sweden have developed a computer model that successfully finds signs of lymph node cancer in 90 percent of cases.
New computer-aided methods for interpreting medical images are being developed for various medical conditions. They can reduce the workload for radiologists by giving a second opinion or by ranking which patients need treatment most urgently.
"An AI-based computer system for interpreting medical images also contributes to increased equality in healthcare by giving patients access to the same expertise and being able to have their images reviewed within a reasonable time, regardless of which hospital they are in. Since an AI system has access to much more information, it also makes it easier in rare diseases where radiologists rarely see images," says Ida Häggström, Associate Professor at the Department of Electrical Engineering at Chalmers.
In close collaboration with the Sahlgrenska Academy at the University of Gothenburg and Sahlgrenska University Hospital, she has participated in the development of medical image analysis in the field of cancer, as well as for a number of other medical conditions, such as cardiovascular disease, stroke and osteoporosis.
Large study to track cancer in the lymphatic system
Together with clinically active researchers at, among others, Memorial Sloan Kettering Cancer Center in New York, Ida Häggström has developed a computer model that was recently presented in The Lancet Digital Health.
"Based on more than 17,000 images from more than 5,000 lymphoma patients, we have created a learning system in which computers have been trained to find visual signs of cancer in the lymphatic system," says Häggström.
In the study, the researchers examined image archives that stretched back more than ten years. They compared the patients’ final diagnosis with scans from positron emission tomography (PET) and computed tomography (CT) taken before and after treatment. This information was then used to help train the AI computer model to detect signs of lymph node cancer in an image.
Supervised training
The computer model that Ida Häggström has developed is called LARS, the Lymphoma Artificial Reader System, and is a deep learning system based on artificial intelligence. An image from positron emission tomography (PET) is fed in and analysed by the AI model, which has been trained to find patterns and features in the image in order to make the best possible prediction of whether the image is positive or negative, i.e. whether it contains lymphoma or not.
"I have used what is known as supervised training, where images are shown to the computer model, which then assesses whether the patient has lymphoma or not. The model also gets to see the true diagnosis, so if the assessment is wrong, the computer model is adjusted so that it gradually gets better and better at determining the diagnosis,” says Häggström.
In practice, what does it actually mean that the computer model uses artificial intelligence and deep learning to make a diagnosis?
"It's about the fact that we haven't programmed predetermined instructions in the model about what information in the image it should look at, but let it teach itself which image patterns are important in order to get the best predictions possible.
Support for radiologists
Ida Häggström describes the process of teaching the computer to detect, in this case, cancer in the images as time-consuming, and says that it has taken several years to complete the study. One challenge has been to compile such a large amount of image material. It has also been challenging to adapt the computer model so that it can distinguish between cancer and the temporary treatment-specific changes that can be seen in the images after radiotherapy and chemotherapy.
"In the study, we estimated the accuracy of the computer model to be about ninety per cent, and especially in the case of images that are difficult to interpret, it could support radiologists in their assessments.”
However, there is still a great deal of work to be done to validate the computer model if it is to be used in clinical practice.
"We have made the computer code available now so that other researchers can continue to work on the basis of our computer model, but the clinical tests that need to be done are extensive," says Häggström.
More about the research
• The scientific article Deep learning for [¹⁸F]fluorodeoxyglucose-PET-CT classification in patients with lymphoma: a dual-centre retrospective analysis has been published in The Lancet Digital Health.
• The authors of the study are Ida Häggström, Doris Leithner, Jennifer Alvén, Gabriele Campanella, Murad Abusamra, Honglei Zhang, Shalini Chhabra, Lucian Beer, Alexander Haug, Gilles Salles, Markus Raderer, Philipp B. Staber, Anton Becker, Hedvig Hricak, Thomas J. Fuchs, Heiko Schöder and Marius E. Mayerhoefer.
• The researchers are active at Chalmers University of Technology, Memorial Sloan Kettering Cancer Center in New York, Medical University in Vienna, Icahn School of Medicine at Mount Sinai in New York and NYU Langone Health in New York.
For more information, please contact:
Ida Häggström, Associate Professor, Division of Signal Processing and Biomedical Engineering, Department of Electrical Engineering, Chalmers University of Technology
+46 31 772 22 19, idah@chalmers.se
The contact person Ida Häggström speaks English and Swedish and is available for live and pre-recorded interviews. At Chalmers, we have podcast studios and broadcast filming equipment on site and can assist with requests for television, radio or podcast interviews.
Caption to the figure showing the principle of how the AI model is trained:
An image from positron emission tomography (PET) is entered and analysed by the AI model. It is trained to find patterns and features in the image in order to make the best possible prediction of whether the image is positive or negative, i.e. whether it contains lymphoma or not. Illustration: Ida Häggström
NOTE TO THE EDITOR:
Images provided in Chalmers University of Technology press releases are, unless specified otherwise, free for download and publication as long as credit is given to the University and the individual creator. Cropping and rescaling of the images is permitted when required for adaptation to the publication’s format, but modifications that would influence the message and content of the original are not. The material is primarily intended for journalistic and informative use, to assist in communication and coverage of Chalmers’ research and education. Commercial usage, for example the marketing of goods and services, is not permitted.
We kindly request credit to be given in the following format where possible:
Image/Graphic/Illustration: Chalmers University of Technology | Name Surname
Journal
The Lancet Digital Health
Method of Research
Computational simulation/modeling
Article Title
Deep learning for [18F]fluorodeoxyglucose-PET-CT classification in patients with lymphoma: a dual-centre retrospective analysis
Article Publication Date
21-Dec-2023
COI Statement
MEM received honoraria for lectures from Siemens, General Electric, and Bristol Myers Squibb. GS received consulting fees from AbbVie, Beigene, Bristol-Myers Squibb Celgene, Epizyme, Genentech Roche, Genmab, Incyte, Janssen, Kite Gilead, Loxo, Miltenyi, Molecular Partners, Morphosys, Nordic Nanovector, Novartis, Rapt, Takeda, and Ipsen; honoraria from AbbVie, Bayer, Incyte, Kite Gilead, Morphosys, Novartis, and Regeneron; participated on a Data Safety Monitoring Board or Advisory Board of Beigene; and has stock or stock options in Owkin. HH participated on Advisory Boards of Sidney Kimmel Comprehensive Cancer Center at Johns Hopkins University, the Medical University of Vienna, and the Scientific Committee and Board of Trustees of the German Cancer Research Center; and is on the Board of Directors of Ion Beam Applications and Paige. All other authors declare no competing interests.