28 Feb 2019
Research
4 minute read

Biophysics-Inspired AI Uses Photons to Help Surgeons Identify Cancer

Cancerous tissue differs from healthy tissue in many ways, the most obvious being that medical intervention seeks to eradicate every last bit of one while preserving as much as possible of the other. Our collaborative team, including quantitative scientists and colorectal surgeons, seeks to improve medical interventions by exploiting another difference: cancerous tissue grows its own blood supply, which is typically chaotic and leaky; this process is called angiogenesis. The resulting difference in blood flow patterns can be used to detect and potentially delineate cancer.

Figure 1. Perfusion dynamics: the fluorescent dye ICG is taken up more quickly by healthy tissue, but it lingers for much longer in cancerous tissue, due to leakage from capillaries into the surrounding tissue.

A fluorescent dye called ICG and an infrared camera can be used to quantify the differences in blood perfusion, which is the passage of blood through the vascular system to tissues. In our preliminary work, we observed that uptake and release of ICG are faster in healthy tissue than in cancerous tissue, potentially due to the chaotic and leaky capillaries (see Figure 1 and Video 1). However, current clinical use of ICG to guide decision-making is limited to human observation of the long, almost stationary phase during which the dye persists in the cancer but has been washed out from the healthy tissue (corresponding to Stage 3 in Figure 1). This is because it is challenging, even for very experienced surgeons, to tell which regions of tissue were perfused early and well, which were not, and which retained the dye longer.
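To make "uptake and release" concrete, the wash-in/wash-out behaviour of a tissue region can be summarized by fitting a simple kinetic model to its time-intensity curve. The sketch below fits a generic bi-exponential model with SciPy; the model, function names, and parameter values are illustrative assumptions, not the biophysical perfusion model used in the project.

```python
# Minimal sketch (not the project's model): fit a simple wash-in/wash-out
# curve to the mean fluorescence intensity of one tissue region over time.
import numpy as np
from scipy.optimize import curve_fit

def biexponential(t, a, k_in, k_out, t0):
    """Illustrative two-rate model: rapid inflow, slower washout after onset t0."""
    dt = np.clip(t - t0, 0.0, None)
    return a * (np.exp(-k_out * dt) - np.exp(-k_in * dt))

def fit_perfusion_curve(t, intensity):
    """Estimate inflow/washout rates from a region's time-intensity curve."""
    p0 = [intensity.max(), 1.0, 0.05, t[np.argmax(intensity)] / 2]
    params, _ = curve_fit(biexponential, t, intensity, p0=p0, maxfev=10000)
    a, k_in, k_out, t0 = params
    return {"amplitude": a, "k_in": k_in, "k_out": k_out, "onset": t0}

# Example with synthetic data: healthy tissue tends to show larger k_in and
# k_out (fast uptake and washout) than cancerous tissue, which retains the dye.
np.random.seed(0)
t = np.linspace(0, 300, 301)  # seconds after injection
healthy = biexponential(t, 1.0, 0.8, 0.04, 10) + 0.01 * np.random.randn(t.size)
print(fit_perfusion_curve(t, healthy))
```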

Video 1. Tracking ICG fluorescence in cancerous and healthy tissue. Note the slower uptake in regions 1-3 (shown in white), which are located on the tumor, compared with the uptake in regions 4-6, which are placed on healthy tissue. The green color is a false-color overlay; its intensity is proportional to the intensity of the image captured by the near-infrared (NIR) camera.
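Time-intensity curves like the ones fitted above come from averaging the NIR signal over each tracked region, frame by frame. A minimal OpenCV sketch, assuming fixed circular regions and ignoring the motion compensation discussed further below, could look like this; the file name and region coordinates are placeholders.

```python
# Minimal sketch (illustrative only): track mean NIR intensity in fixed circular
# regions across video frames. Real use would need motion compensation.
import cv2
import numpy as np

def region_intensity_curves(video_path, centers, radius=15):
    """Return one time-intensity curve per (x, y) region center."""
    cap = cv2.VideoCapture(video_path)
    curves = [[] for _ in centers]
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # treat NIR feed as grayscale
        for i, (x, y) in enumerate(centers):
            mask = np.zeros(gray.shape, dtype=np.uint8)
            cv2.circle(mask, (x, y), radius, 255, thickness=-1)
            curves[i].append(cv2.mean(gray, mask=mask)[0])
    cap.release()
    return np.array(curves)

# Hypothetical usage: some regions on the tumor, others on healthy tissue.
# curves = region_intensity_curves("nir_feed.mp4", [(120, 80), (400, 260)])
```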

In this context, we at IBM Research - Ireland, together with colorectal surgeons and pathologists at University College Dublin/Mater Misericordiae Hospital, biochemists from the Royal College of Surgeons in Ireland, and the digital pathology-focused SME Deciphex, were among the first awardees of the Irish Disruptive Technologies Innovation Fund, for a project titled “The Future of Colorectal Cancer Diagnosis and Treatment: Combining Tissue Responsive Fluorescent Probes, AI, and Machine Learning to Transform Medical Care”. The ultimate goal of the project is to provide novel computational tools that extract the information encoded in the dynamic behavior of the dye from real-time feeds collected during surgery from Clinical Imaging Systems (CIS), using biophysical models of perfusion and photon diffusion in biological tissue, and to feed this information into biophysics-inspired AI tools that support surgeons’ decisions. This information could be made available to the surgeon during surgery through an Augmented Reality (AR) view overlaid on the real-time CIS feed. Such Augmented Reality for Surgeons (ARS) decision-support systems could augment human judgement by combining features that a skilled human can see and interpret (e.g., the shape, color, and mechanical properties of the tissue) with information that only computer analysis can reveal (e.g., subtle changes and differences in texture, and perfusion properties estimated from dye inflow, uptake, release, and outflow).

Figure 2. Biophysics-inspired AI: extracting the ICG concentration from the brightness data requires taking into account the complex biophysical interactions of light, ICG, and tissue. The obtained data can then be used to derive features for the AI (a simple SVM is shown as an illustration). The insights gained are made available in the operating theatre.
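To illustrate why recovering ICG concentration from brightness is an inverse problem, consider the steady-state diffusion approximation for light in an infinite, homogeneous scattering medium, which has a well-known closed-form solution. The toy forward model below is only a sketch under those simplifying assumptions; the geometry, optical parameters, and concentration-to-absorption link used in the project are not specified in this post.

```python
# Toy forward model (illustrative assumptions only): steady-state diffusion
# approximation for light in an infinite homogeneous scattering medium.
import numpy as np

def fluence(r_mm, mu_a, mu_s_prime, source_power=1.0):
    """Photon fluence at distance r_mm from an isotropic point source.

    mu_a       -- absorption coefficient [1/mm], grows with local ICG concentration
    mu_s_prime -- reduced scattering coefficient [1/mm]
    """
    D = 1.0 / (3.0 * (mu_a + mu_s_prime))               # diffusion coefficient
    mu_eff = np.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))  # effective attenuation
    return source_power * np.exp(-mu_eff * r_mm) / (4.0 * np.pi * D * r_mm)

# Assumed, simplified link between dye concentration and absorption:
# extra absorption proportional to concentration (Beer-Lambert-like).
def mu_a_with_icg(mu_a_tissue, extinction, concentration):
    return mu_a_tissue + extinction * concentration

# The inverse problem runs the other way: from the measured brightness, infer
# the optical properties (and hence the ICG concentration) that explain it.
print(fluence(5.0, mu_a_with_icg(0.01, 0.2, 0.05), mu_s_prime=1.0))
```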

The project is funded for three years, and the challenges are many. Unlike in many other imaging settings, the body part being imaged cannot be immobilized, so the video processing needs to compensate for patient and camera movement as well as for occasional occlusions. Next, the concentration of fluorescent dye in tissue influences its optical properties in a complex way, and the optical properties, in turn, have to be extracted from the images captured by the camera sensors by solving the inverse problem for the photon diffusion equation. Once the spatio-temporal evolution of the dye concentration has been estimated, all measured and estimated parameters will be used to define features for a classification algorithm that scores each area of tissue along one or more dimensions, such as probability of malignancy, quality of blood supply, and homogeneity of dye uptake. We call this combination of a biophysical inverse problem and data-driven methods biophysics-inspired AI (see Figure 2 for an illustration of the flow we envision).
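As a flavour of that final classification step, the sketch below feeds hand-picked perfusion features for a few regions into a simple SVM, echoing the illustration in Figure 2. The features, data, and labels are placeholders, not results from the project.

```python
# Sketch of the classification step suggested in Figure 2: per-region perfusion
# features feed a simple SVM. Feature values and labels here are placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical features per tissue region: [time-to-peak (s), peak intensity,
# washout rate k_out, uptake homogeneity]; label 1 = suspected malignant.
X = np.array([
    [12.0, 0.9, 0.050, 0.85],   # fast uptake, fast washout  -> healthy
    [14.0, 0.8, 0.045, 0.80],
    [35.0, 0.7, 0.010, 0.40],   # slow uptake, dye retention -> suspicious
    [40.0, 0.6, 0.008, 0.35],
])
y = np.array([0, 0, 1, 1])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Score a new region; such a score could colour one tile of a surgical heat map.
new_region = np.array([[30.0, 0.75, 0.012, 0.5]])
print(clf.decision_function(new_region))  # > 0 leans towards the malignant class
print(clf.predict(new_region))
```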

Such biophysics-inspired AI techniques would yield much richer information and, ultimately, enable the building of a 3D surgical heat map highlighting areas of suspected malignant growth. ARS systems could then potentially be used to support the intraoperative decisions of surgeons, including those with less experience, by giving them direct access to relevant collective expert knowledge.
