A team of engineers and physicians has developed a camera that uses nanoscale optics and photonic crystals inspired by a butterfly’s eye to distinguish between tumors and healthy tissue in breast cancer patients. The system could help surgeons detect and remove cancerous tissue more effectively and efficiently than current technologies.
“Breast cancer is one of the more common cancers, and up to 25 percent of patients can have incomplete cancer resection,” or removal of the tumor, says Viktor Gruev, University of Illinois professor of electrical and computer engineering. “Anything we can do to improve this will also improve quality of life for patients.”
He and his colleagues at Washington University in St. Louis first tested the camera in mice and then in clinical trials with 11 breast cancer patients. They reported their findings in the journal Optica.
Many surgeons still rely on sight and touch to distinguish between cancerous and healthy tissue. This sometimes results in cancerous material being left in the body. While technology exists to help them detect cancerous tissue, those systems start at $20,000 and are difficult to use in the operating theater. Gruev says his battery-sized system is easier to use and could cost only $200.
Existing FDA-approved technologies rely on near-infrared (NIR) light at wavelengths greater than 780 nm. Human tissue has very low absorption coefficients at those wavelengths, which lets the camera see through skin and organs to detect light emitted by NIR fluorescent markers. Surgeons can then make better-informed decisions about where to make incisions prior to surgery, and identify cancerous material even when it is hidden under a flap of skin or behind an organ.
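The penetration advantage can be sketched with the Beer-Lambert law, which relates transmitted light to an absorption coefficient and path length. The coefficients below are illustrative placeholders, not measured tissue values:

```python
import math

def transmitted_fraction(mu_per_cm: float, depth_cm: float) -> float:
    """Beer-Lambert attenuation: fraction of light surviving a path of
    depth_cm through tissue with absorption coefficient mu_per_cm (1/cm)."""
    return math.exp(-mu_per_cm * depth_cm)

# Hypothetical coefficients chosen only to show the trend:
# visible light is absorbed far more strongly by tissue than NIR.
visible = transmitted_fraction(mu_per_cm=2.0, depth_cm=1.0)  # exp(-2) ~ 0.135
nir = transmitted_fraction(mu_per_cm=0.1, depth_cm=1.0)      # exp(-0.1) ~ 0.905

print(f"visible survives: {visible:.3f}, NIR survives: {nir:.3f}")
```

With these toy numbers, roughly 90 percent of NIR light survives a centimeter of tissue versus about 14 percent of visible light, which is why NIR fluorescent markers remain detectable under skin or behind an organ.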
Yet existing NIR instruments have issues. While they collect both color and fluorescence data at the same time, they use separate cameras. “This means you are relying on one instrument to superimpose its image on the other instrument,” Gruev says.
This can lead to inaccuracies. Existing FDA-approved instruments use multiple optical elements, such as beam splitters and relay lenses, to send visible and infrared light to separate detectors. Even small temperature fluctuations, caused by a nurse walking briskly past or several surgeons trapping heat as they lean over a patient, can alter optical performance. As a result, surgeons may make inaccurate cuts or leave some tumor tissue behind.
Gruev’s new butterfly-eye system avoids superimposition by collecting visible and near-infrared light from light-detecting nanostructures integrated on a single chip. He says it has seven times the spatial co-registration accuracy of existing clinical imaging systems.
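A minimal sketch of why single-chip capture helps: when the color and fluorescence frames come off the same sensor, pixel (i, j) in one frame corresponds directly to pixel (i, j) in the other, so a fluorescent region can be overlaid on the color image with no cross-camera alignment step. The arrays and threshold below are hypothetical:

```python
import numpy as np

# Toy frames: a 4x4 RGB image and a co-registered NIR fluorescence frame.
rgb = np.zeros((4, 4, 3), dtype=np.float32)
nir = np.zeros((4, 4), dtype=np.float32)
nir[1:3, 1:3] = 1.0  # hypothetical fluorescent "tumor" region

def highlight(rgb_frame: np.ndarray, nir_frame: np.ndarray,
              threshold: float = 0.5) -> np.ndarray:
    """Tint pixels whose NIR fluorescence exceeds the threshold pure green.
    No registration needed: both frames share the same pixel grid."""
    out = rgb_frame.copy()
    mask = nir_frame > threshold
    out[mask] = [0.0, 1.0, 0.0]
    return out

overlay = highlight(rgb, nir)
```

With two separate cameras, the same overlay would first require estimating and applying a geometric transform between the sensors, and any drift in that transform misplaces the highlighted region.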
It also does a better job of detecting fluorescent markers injected into patients to bind to cancerous tissue. With current technologies, surgeons must stop their work and dim the lights to “see” the fluorescence, then return to brightness to perform the surgery. This prolongs the operation and keeps the patient under anesthesia longer.
Gruev says his system is 1,000 times more sensitive than existing technologies. This enables it to detect fluorescence under surgical lights, so physicians do not need to interrupt surgical workflows.
The system itself is small enough to fit into any surgical suite, and relies on integration to keep costs down. All imaging takes place using sensors built into a single, integrated chip. Instead of watching a large display, surgeons see healthy and cancerous tissue in real time through gaming goggles.
“We provide them with essential information right when they need it,” he says.
The camera’s optics grew out of a lecture Gruev attended while working on cancer imaging with colleagues in the Carle Illinois College of Medicine. The presentation covered the nanostructured optical system of the morpho butterfly.
At the time, the engineers and physicians were taking a “brute-force” approach to improving existing cameras. After the lecture, Gruev thought: “There’s a more optimal solution if we take inspiration from nature.”
His team set out to develop nanostructures that mimic the morpho butterfly’s eye, which uses nanostructured photonic crystals and materials to transmit some light spectra while reflecting others. His design consists of three filters to collect visible light—red, blue, and green (RGB)—and a fourth to capture NIR photons with wavelengths greater than 780 nm. All four are built on a single chip that also processes the images.
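One way to picture a four-filter chip is as a repeating 2x2 mosaic, analogous to a Bayer color filter array with one position given over to NIR. The layout below is an assumption for illustration only, not the actual pattern of Gruev's sensor:

```python
import numpy as np

# Hypothetical 2x2 filter mosaic tiled across the sensor:
#   R    G
#   B    NIR
# A raw 4x4 frame, numbered so each pixel's position is traceable.
frame = np.arange(16, dtype=np.float32).reshape(4, 4)

def split_channels(raw: np.ndarray) -> dict:
    """Pull each filter's pixels out of the mosaic with strided slicing.
    Each channel is a quarter-resolution image from the same chip."""
    return {
        "R": raw[0::2, 0::2],
        "G": raw[0::2, 1::2],
        "B": raw[1::2, 0::2],
        "NIR": raw[1::2, 1::2],
    }

channels = split_channels(frame)
```

Because color and fluorescence pixels share one optical path and one readout, the four channels stay registered by construction, which is what makes the single-chip design immune to the drift that plagues two-camera systems.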
The researchers are forming a startup company and hope to receive FDA approval for the system. They are working on new technologies that are more sensitive and can address other clinical imaging problems and also looking for ways to further miniaturize their camera. Reducing its size to 5-10 millimeters, for example, could allow them to use it for imaging procedures such as endoscopy, Gruev says.
“Then we are talking about a different set of problems that we can solve,” he says.
Melissa Lutz Blouin is an independent technology writer.