29 August 2017. A computer science lab designed a smartphone app that detects tell-tale indicators of liver disease and pancreatic cancer by analyzing self-taken photos of a patient's eyes. A team in the Ubiquitous Computing Lab at the University of Washington in Seattle describes the system in a paper scheduled for presentation on 13 September at the International Joint Conference on Pervasive and Ubiquitous Computing.
The researchers are seeking reliable ways to screen for liver disorders and pancreatic cancer in their early stages. Pancreatic cancer is particularly difficult to diagnose early, because few unique symptoms are associated with the disease and because the pancreas is hidden among other organs in the body. As a result, it is often diagnosed at later, more advanced stages, when the prognosis for survival is generally poor: the 5-year survival rate is only 9 percent. The American Cancer Society estimates more than 53,000 people in the U.S. will be diagnosed with pancreatic cancer this year, leading to some 43,000 deaths.
The solution, called BiliScreen and developed by doctoral candidate Alex Mariakakis, looks for signs of jaundice, particularly in the sclera, the white part of the eye. Jaundice is an indicator of a number of liver-related diseases as well as pancreatic cancer, and results from high levels of bilirubin, a substance in bile that can build up in the blood and cause yellowing of the skin and eyes if it is not processed adequately by the liver. In pancreatic cancer, the tumor often blocks the common bile duct, causing bilirubin to accumulate.
Jaundice, however, is detectable by the naked eye only in advanced stages, and must be confirmed with multiple blood tests. As a result, the researchers designed BiliScreen to be simple and non-invasive, yet still able to find signs of yellowing sclera much earlier. Their system uses a smartphone camera: individuals are asked to aim the camera at their own eyes and take a selfie of sorts, focusing on the sclera.
For medical screening, however, a selfie alone is not enough. The researchers first needed to account for ambient light, which can subtly alter the apparent color of the sclera; this led to two techniques for capturing the image. The first asks the individual to wear a pair of paper glasses with patches of known colors printed on the frames, which are used to calibrate the color of the sclera. The second, more complex method fits the phone into a 3-D-printed plastic box worn on the head, like a virtual-reality headset, to block out ambient light.
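To give a concrete sense of how printed reference colors can be used this way, the sketch below fits a simple affine color-correction transform from the observed patch colors to their known printed values and then applies it to sclera pixels. This is a minimal illustration, assuming the observed and reference patch colors are available; the function names, reference values, and the exact calibration model are illustrative assumptions, not details taken from the BiliScreen paper.

```python
# Hypothetical sketch of color calibration against printed reference patches.
import numpy as np

def fit_color_correction(observed_rgb, reference_rgb):
    """Fit an affine transform (4x3 matrix) that maps patch colors observed
    under unknown ambient light to their known printed colors, via least squares."""
    observed = np.asarray(observed_rgb, dtype=float)    # shape (n, 3)
    reference = np.asarray(reference_rgb, dtype=float)  # shape (n, 3)
    ones = np.ones((observed.shape[0], 1))
    A = np.hstack([observed, ones])                     # add offset column
    M, *_ = np.linalg.lstsq(A, reference, rcond=None)   # solve A @ M ~= reference
    return M

def correct_colors(pixels_rgb, M):
    """Apply the fitted transform to arbitrary pixels (e.g. sclera pixels)."""
    pixels = np.asarray(pixels_rgb, dtype=float)
    ones = np.ones((pixels.shape[0], 1))
    return np.hstack([pixels, ones]) @ M

# Example: four patches whose printed colors are known (illustrative values).
reference = [[255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 255]]
observed = [[231, 20, 18], [12, 228, 25], [10, 15, 222], [238, 235, 210]]
M = fit_color_correction(observed, reference)
sclera_pixels = [[235, 232, 205], [240, 238, 211]]
print(correct_colors(sclera_pixels, M))
```

The idea is simply that known colors photographed under the same lighting as the eye reveal how that lighting has shifted the image, so the shift can be undone before the sclera's color is measured.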
The captured images are then analyzed for early signs of yellowing of the sclera. The BiliScreen software first uses computer vision to recognize the sclera in the image, then applies color-analysis routines to measure color values in the pixels, looking for departures from the bright white of a healthy sclera. These measurements are aggregated and analyzed with machine-learning algorithms to flag indicators of jaundice. Because BiliScreen is easy to use and non-invasive, individuals can be asked to take further images of their eyes over time to detect changes in sclera coloring.
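As a rough illustration of this kind of pipeline, the sketch below isolates candidate sclera pixels with a crude brightness/saturation rule, summarizes their color with a single "yellowness" feature, and fits a simple linear model from that feature to measured bilirubin. Every step here is a simplified stand-in: the segmentation rule, the feature, the model, and the synthetic data are assumptions for illustration, not the algorithms or data used by BiliScreen.

```python
# Simplified sketch: segment sclera, summarize its color, map to bilirubin.
import numpy as np

def sclera_mask(image_rgb):
    """Crude sclera segmentation: keep bright, low-saturation pixels.
    (BiliScreen uses proper computer-vision segmentation of the eye.)"""
    img = image_rgb.astype(float)
    brightness = img.mean(axis=-1)
    saturation = img.max(axis=-1) - img.min(axis=-1)
    return (brightness > 150) & (saturation < 60)

def yellowness_feature(image_rgb, mask):
    """Average 'yellowness' of sclera pixels: how far the blue channel falls
    below the red/green average. A healthy white sclera scores near zero."""
    px = image_rgb[mask].astype(float)
    return float((px[:, :2].mean(axis=1) - px[:, 2]).mean())

def fit_bilirubin_model(features, bilirubin_mg_dl):
    """Fit a simple linear model from the color feature to measured bilirubin
    (a stand-in for the paper's machine-learning regression)."""
    slope, intercept = np.polyfit(features, bilirubin_mg_dl, deg=1)
    return lambda f: slope * f + intercept

# Toy usage with synthetic data; real training would pair eye images
# with blood-test ground truth.
rng = np.random.default_rng(0)
train_features = rng.uniform(0, 40, size=30)
train_bilirubin = 0.3 * train_features + rng.normal(0, 1, size=30)
predict = fit_bilirubin_model(train_features, train_bilirubin)

image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
mask = sclera_mask(image)
if mask.any():
    print(predict(yellowness_feature(image, mask)))
```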
The researchers tested BiliScreen with 70 volunteers recruited from the University of Washington and its associated medical center, whose bilirubin levels ranged from normal to elevated. The participants from the medical center had been admitted for some form of liver disease. The team asked participants to capture images of their eyes with BiliScreen on an iPhone, using either the color-calibration glasses or the plastic box, and compared the results to blood tests.
The results show a high correlation between bilirubin levels calculated by BiliScreen and those measured in blood tests. The findings also show that images taken with the box accessory correlated more strongly with blood tests than images taken with the color-calibration glasses, returning a true-positive rate of 97 percent and a true-negative rate of 90 percent.
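For readers who want to see how such figures relate to paired measurements, the sketch below computes a Pearson correlation between estimated and blood-test bilirubin values, along with true-positive and true-negative rates after thresholding both at a clinical cutoff. The cutoff and the toy numbers are illustrative assumptions, not values from the study.

```python
# Illustration of the reported metrics: correlation plus screening rates.
import numpy as np

def screening_metrics(estimates, blood_tests, threshold):
    est = np.asarray(estimates, dtype=float)
    ref = np.asarray(blood_tests, dtype=float)
    correlation = np.corrcoef(est, ref)[0, 1]
    pred_pos, actual_pos = est >= threshold, ref >= threshold
    tpr = (pred_pos & actual_pos).sum() / max(actual_pos.sum(), 1)
    tnr = (~pred_pos & ~actual_pos).sum() / max((~actual_pos).sum(), 1)
    return correlation, tpr, tnr

# Toy example with a hypothetical cutoff of 1.3 mg/dL.
est = [0.6, 0.9, 1.8, 2.4, 0.7, 3.1]
ref = [0.5, 1.1, 1.6, 2.2, 0.8, 2.9]
print(screening_metrics(est, ref, threshold=1.3))
```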
The researchers next want to expand the tests to people with a wider range of jaundice-related disorders, and to improve the usability of the device. BiliScreen is demonstrated in the following video.