Baylor Connections
Catch Early Signs of Eye Disease in a Flash
Baylor professors collaborate to create a smartphone app to detect abnormalities in the eye
Dr. Bryan Shaw, associate professor of chemistry, and Dr. Greg Hamerly, associate professor of computer science, developed an app to help families detect early signs of various eye diseases, including the pediatric eye cancer retinoblastoma.
Shaw’s son Noah developed retinoblastoma as an infant. Noah, now 11 and healthy, lost an eye to the disease. The experience led Shaw to partner with Hamerly to help other families detect signs of eye disease more quickly.
Their app, called CRADLE, searches family photos for signs of white eye that can signal the presence of disease.
An October 2019 article in the journal Science Advances (Vol. 5, No. 10) presented a study finding CRADLE to be an effective tool to augment clinical screenings. Additionally, the app was featured in a CBS This Morning story in February. CRADLE is available via the Apple App Store and Google Play.
The Baylor professors also discussed their work on an episode of Baylor Connections, a weekly radio show and podcast highlighting University people and programs.
WHAT IS CRADLE?
SHAW: CRADLE is Computer Assisted Detector of Leukocoria. Apple kind of wanted us to change the name to something a bit more descriptive, so it’s also called the White Eye Detector. If you see a white pupil in a picture of your child, it can mean a lot of things. It can mean that you’re just reflecting light off the optic disc, the little 1.8- to 2-millimeter white disc in the back of your eye, and that is not always a problem. It could mean you have refractive error, or lazy eye, or something like that. But you can also have white pupils because light’s reflecting off a tumor, or off a cataract in your lens, or a cholesterol deposit in the back of your eye. Coats disease, a damaged retina, or retinopathy of prematurity (ROP) can also cause leukocoria. It’s a symptom of a lot of different disorders. We were quite surprised to find how many of these disorders the app is able to catch.
HAMERLY: And to be clear, people are used to seeing red pupils in photos, and that’s not an issue. That’s the kind of thing that we’re looking for, except white. And if the reflection is off the surface of the eye, you get what we call a specular reflection, and that’s not a problem either. But if the pupil is filled with white light, that’s the type of symptom that we’re looking for.
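To make the distinction Hamerly draws concrete, here is an illustrative sketch only: CRADLE’s actual classifier is a trained machine-learning model, not a hand-written rule, and the threshold values below are made-up numbers chosen to show how a uniformly bright pupil differs from an ordinary red-eye reflection.

```python
def mean_rgb(pixels):
    """Average the (R, G, B) values of a pupil crop given as a list of tuples."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return r, g, b

def classify_pupil(pixels, bright=170, balance=40):
    """Label a pupil crop as 'white', 'red', or 'normal' (hypothetical thresholds)."""
    r, g, b = mean_rgb(pixels)
    # Uniformly bright across all channels: possible leukocoria.
    if min(r, g, b) >= bright and max(r, g, b) - min(r, g, b) <= balance:
        return "white"
    # Red channel dominates: ordinary flash red-eye, not a concern.
    if r >= bright and r - max(g, b) > balance:
        return "red"
    return "normal"

# A bright, color-balanced pupil is flagged; a red-dominant one is not.
print(classify_pupil([(230, 225, 220)] * 9))  # white
print(classify_pupil([(220, 60, 50)] * 9))    # red
print(classify_pupil([(30, 25, 20)] * 9))     # normal
```

A real detector also has to reject specular glints off the eye’s surface, which is part of what makes the trained model much harder to build than this toy rule.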
WHAT WAS THE GENESIS OF THIS APP?
SHAW: For me, the idea that a parent could be helped by technology that searches through their photos popped into my brain. The light bulb went off when I saw a picture of my son with white eye at 12 days old, months before his cancer was diagnosed. And after his doctor told me what the prognosis would have been if we had brought him in earlier, I thought, “oh boy, we need some sort of technology.” Then I moved to Baylor, and I met Greg.
HAMERLY: My research area is in machine learning, and it seemed like a natural fit. If you take this problem of taking a natural photo — and when I say natural, I mean not clinical — and try to identify symptoms of disease, you need to break that down into parts. Those parts are like, “where are the faces in this photo? Where are the eyes in this photo? Do those eyes look diseased?” When I heard Bryan’s story and he described where he wanted to go with this, I immediately thought, well, you can break that problem down into these parts, and a lot of those, if not all of those, can be answered with machine learning approaches. And the immediate idea was what the app became. Actually getting from the idea to the app itself took a while. At the time when we first met, smartphones existed, but they were not nearly what they are today. We had a few false starts at the beginning, but the path ahead seemed clear. It was just getting there and finding the right people — like Ryan Henning, who was my former graduate student — to work on this project with us.
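The decomposition Hamerly describes — faces, then eyes, then a per-eye decision — can be sketched as a simple pipeline. The detector functions below are hypothetical stand-ins; in the real app each stage is a trained machine-learning model, and the names and toy example here are assumptions made purely for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Region:
    """A bounding box within the photo (hypothetical representation)."""
    x: int
    y: int
    w: int
    h: int

def screen_photo(photo,
                 find_faces: Callable[[object], List[Region]],
                 find_eyes: Callable[[object, Region], List[Region]],
                 looks_diseased: Callable[[object, Region], bool]) -> List[Region]:
    """Run the three-stage cascade and return eye regions flagged as possible white eye."""
    flagged = []
    for face in find_faces(photo):
        for eye in find_eyes(photo, face):
            if looks_diseased(photo, eye):
                flagged.append(eye)
    return flagged

# Toy example wiring trivial stand-in detectors through the pipeline.
photo = "family_photo.jpg"  # placeholder; a real pipeline would load pixel data
faces = lambda p: [Region(10, 10, 100, 100)]
eyes = lambda p, f: [Region(f.x + 20, f.y + 30, 12, 12),
                     Region(f.x + 60, f.y + 30, 12, 12)]
suspicious = lambda p, e: e.x < 40  # pretend only the left eye triggers

print(screen_photo(photo, faces, eyes, suspicious))
```

Structuring the problem this way means each stage can be trained, tested, and improved independently, which is part of why the decomposition made the project tractable.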
WHAT WERE SOME OF THE PROBLEMS THAT NEEDED TO BE SOLVED ALONG THE WAY?
SHAW: There’s a lot of things that happened to me in relation to this project that I would have never anticipated. Having a 4-month-old diagnosed with tumors in both of his eyes was the beginning. After that, all the other stuff wasn’t too surprising.
HAMERLY: For any machine learning project, you have to have examples of what you’re looking for. We needed examples of what looked like diseased eyes and what looked like normal eyes. Finding pictures of normal eyes is easy. Finding examples of the diseased eyes that we were looking for is quite a bit more difficult. In order to collect enough information, you need a lot of images. We initially collected images from Noah. Then as the story got around, we were able to collect many more images. So, collecting data was one of the first problems. Another major problem is the robustness of the detector itself. You want to make sure that it’s actually detecting faces and not hubcaps, which we’ve seen. We want to make sure that it’s actually triggering on the right kind of disease presentation and is not giving you false positives. We don’t want the app to scare parents. We want it to trigger only in cases where there really is a white eye presentation. And getting it to be robust with respect to different lighting conditions, different face orientations, people being far away in the picture, people being close in the picture — all these things are variables that you don’t control when you don’t control who’s taking the picture and under what conditions. So, the other major problem was just getting the app to behave well in all different situations.
WHAT DOES THE STUDY PUBLISHED IN SCIENCE ADVANCES MEAN FOR THE APP?
SHAW: It proves for the first time that casual photographs that parents take can be useful for this app. It shows that this app is capable of detecting leukocoria in casual sets of photographs, and that it can detect leukocoria, or white eye, in pictures long before the child is eventually diagnosed by a doctor.
HAMERLY: Until this point we had our beliefs that this worked well, but this is a validation of the approach.
SHAW: It’s the first time anyone’s looked at the frequency of white eye pictures. Nobody had looked at how often white eye shows up in pictures that parents take of their child from birth through diagnosis, treatment and remission, in the case of cancer. One more thing: we always need more pictures. If your child has amblyopia, strabismus, refractive error, cataract — even the rare serious disorders like retinoblastoma or Coats disease — and you want to donate your images to our research team, that will help us out. You can, of course, remain confidential. You can email me at bryan_shaw@baylor.edu.
WHAT DOES IT MEAN TO YOU TO NOW BE HELPING OTHER FAMILIES CATCH DISEASES EARLY?
SHAW: It’s good. It warms your heart. As a father, having Noah as a child, the benefits are for him to enjoy, too. He’s getting older and he knows what’s going on, and now I can show him the other children that have been helped because of the ordeal he went through. That’s more important now than it’s ever been for him. So that’s something new.
HAMERLY: As an academic, you always hope to make an impact in the real world, but it doesn’t always happen, right? You publish research, and it goes into the ether. This has a real-world impact, which has been really satisfying.
To hear this complete conversation and to find links to other Baylor Connections episodes, visit baylor.edu/connections.