AI model shows promise for 'virtual staining' of cancerous tissue


Researchers have created an artificial intelligence (AI) model that generates virtually stained images of cancerous tissue.

According to the research team, which includes scientists from the Universities of Lausanne and Bern in Switzerland, the generative tool is designed to accurately depict what the staining for a given cellular marker would look like in its synthesized images, reducing the need for resource-intensive laboratory analyses.

In an article published in Nature Machine Intelligence, the research team detailed the development and evaluation of its model, dubbed the VirtualMultiplexer. They initially trained the model on images from 210 patients in a prostate cancer cohort from the European Multicenter Prostate Cancer Clinical and Translational Research Group.

They used unpaired hematoxylin and eosin (H&E)-stained and immunohistochemistry (IHC) images of four tissue cores per patient, covering six clinically relevant markers. Once the tool had learned the defining parameters of the stained images, the VirtualMultiplexer could apply those rules to a tissue image and generate a virtual version with the stain in question.
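The article does not detail the VirtualMultiplexer's training objective, but unpaired stain-to-stain translation is typically learned with a cycle-consistency constraint of the kind popularized by CycleGAN, one of the baselines mentioned below. The following is a minimal, hypothetical PyTorch sketch of that idea; the generator architecture, function names, and loss setup are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Two generators: H&E -> IHC and IHC -> H&E. A tiny convolutional
# stand-in; real models use much deeper encoder-decoder networks.
def make_generator():
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
    )

G_he2ihc = make_generator()   # translates H&E patches to virtual IHC
G_ihc2he = make_generator()   # translates IHC patches back to H&E
l1 = nn.L1Loss()

def cycle_consistency_loss(he_batch, ihc_batch):
    """Unpaired training signal: translating a patch to the other
    stain and back should reconstruct the original patch."""
    he_cycled = G_ihc2he(G_he2ihc(he_batch))
    ihc_cycled = G_he2ihc(G_ihc2he(ihc_batch))
    return l1(he_cycled, he_batch) + l1(ihc_cycled, ihc_batch)

# Toy usage with random "patches"; real training also adds adversarial
# discriminators so generated patches match the target stain's style.
he = torch.rand(4, 3, 64, 64) * 2 - 1
ihc = torch.rand(4, 3, 64, 64) * 2 - 1
loss = cycle_consistency_loss(he, ihc)
loss.backward()
print(f"cycle loss: {loss.item():.3f}")
```

The cycle constraint is what makes unpaired data usable: because no H&E patch has a pixel-aligned IHC counterpart, the model is instead penalized whenever a round trip through both generators fails to return the original image.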

The team validated the results to ensure that the output was accurate and clinically relevant, comparing the VirtualMultiplexer's generated images against those of other unpaired stain-to-stain (S2S) translation models, such as CycleGAN. Additionally, they confirmed how well the generated images predicted outcomes -- e.g., disease progression or patient survival -- by comparison with real-world data from stained tissue.

In both instances, the virtual stains were highly accurate and clinically useful. Moreover, compared with the S2S models, the authors noted that the "results indicated that virtual images generated by the VirtualMultiplexer were closer to the real ones in terms of distribution than any of the competing methods."
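The article does not name the distribution metric behind that comparison; a common choice in image synthesis is the Fréchet distance between Gaussians fitted to feature embeddings of real and generated images (the basis of the FID score). The sketch below is a hypothetical illustration of that computation; the feature arrays and their dimensions are made-up assumptions.

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(feats_real, feats_fake):
    """Fréchet distance between Gaussians fitted to two sets of
    image feature embeddings (rows = images, cols = features)."""
    mu_r, mu_f = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    cov_r = np.cov(feats_real, rowvar=False)
    cov_f = np.cov(feats_fake, rowvar=False)
    covmean = sqrtm(cov_r @ cov_f)
    if np.iscomplexobj(covmean):  # discard tiny numerical artifacts
        covmean = covmean.real
    diff = mu_r - mu_f
    return float(diff @ diff + np.trace(cov_r + cov_f - 2 * covmean))

# Toy usage: 200 real and 200 generated images, 64-d embeddings each.
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(200, 64))
fake = rng.normal(0.1, 1.0, size=(200, 64))
print(f"Frechet distance: {frechet_distance(real, fake):.3f}")
```

A lower distance means the generated images' feature statistics sit closer to those of the real images, which is the sense in which the quoted claim compares the methods.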

The researchers further assessed the quality of the virtual images by conducting a visual Turing test. Three experts in prostate histopathology and one board-certified pathologist were shown randomly selected patches from 50 real and 50 generated images and were asked to classify each patch as virtual or real.

The virtual model was able to "trick the experts," the researchers wrote, with a close-to-random average sensitivity of 52.1% and specificity of 54.1% across all markers. In a separate staining quality assessment, the pathologists qualitatively evaluated a set of 50 real and 50 virtual images against set criteria (e.g., background, staining pattern, cell type specificity); on average, 70.7% of the model-generated images reached acceptable staining quality, compared with 78.3% of the real images.
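For readers unfamiliar with the metrics: treating "real" as the positive class, sensitivity is the fraction of real patches correctly identified and specificity the fraction of virtual patches correctly identified, so values near 50% mean the raters performed at chance. A minimal sketch of that calculation, using made-up ratings rather than the study's data, follows.

```python
def turing_test_metrics(is_real, rated_real):
    """Sensitivity and specificity for a visual Turing test, with
    'real' as the positive class. Values near 0.5 indicate raters
    cannot distinguish real from virtual patches."""
    tp = sum(t and r for t, r in zip(is_real, rated_real))
    fn = sum(t and not r for t, r in zip(is_real, rated_real))
    tn = sum(not t and not r for t, r in zip(is_real, rated_real))
    fp = sum(not t and r for t, r in zip(is_real, rated_real))
    return tp / (tp + fn), tn / (tn + fp)

# Made-up example: 10 patches (5 real, 5 virtual) and one rater's calls.
truth  = [True, True, True, True, True, False, False, False, False, False]
rating = [True, False, True, False, True, False, True, False, True, False]
sens, spec = turing_test_metrics(truth, rating)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")  # 60.0%, 60.0%
```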

While the researchers noted that the VirtualMultiplexer has limitations and that more thorough evaluations are still needed, it provides a foundation for future work. And because the approach is stain-agnostic, future adaptations across imaging technologies could substantially lower costs.

"Our vision is that future extensions of our work could lead to an ever-growing and readily available dictionary of virtual stainers for IHC and beyond, surpassing in multiplexing even the most cutting-edge technologies and accelerating spatial biology," the authors said in conclusion.
