Artificial Intelligence for Radiology that’s Impressive and Free!

Throughout my medical career, I’ve encountered programs or devices that offered computer assistance in making various diagnoses. From the computer interpretation of EKGs to computer-aided detection (CAD) for breast cancers on mammograms, I was never really impressed by these systems and found that they often could not be relied upon.

Until recently, I had yet to find a computer that performed as well as a human in my profession as a pediatric radiologist. Then the Radiological Society of North America (RSNA) held a global machine learning competition to develop the best algorithm for interpreting bone age exams in pediatric patients, and I finally found an “Artificial Intelligence” that was useful.

What is a bone age?

Bone age exams evaluate a child’s skeletal age so that it can be compared with their chronological age. They are mainly performed on children who are either too short or too tall for their age or who have an endocrine abnormality. Bone ages play a part in the diagnosis of growth disorders and can be used to predict the final height of patients with short stature.

Bone age can also be used to monitor children on growth hormone therapy or to evaluate children with delayed or advanced puberty who may need treatment. The exam is most commonly ordered by pediatricians, pediatric endocrinologists, and family practitioners.

How do you interpret a bone age?

For those of you who have not interpreted a bone age, you are missing out. During residency and fellowship, interpreting a bone age required that the most junior trainee in the room be given the task of finding the Greulich and Pyle book, the Radiographic Atlas of Skeletal Development of the Hand and Wrist. The medical student or resident would then stumble around the dark reading room searching for the ancient book, bound in a blue denim-like material, which must have been all the rage back in 1959 when it was published.

Once the book was found, we would turn pages that looked like they came from the same cavern as the Dead Sea Scrolls. Tinged yellow with age and splotched with coffee stains older than my mother, the atlas would eventually yield the right section so we could compare the standardized bone ages to the study we were interpreting. The joy of this tedious process would come from the radiographs of the children wearing rings. You would think that for an atlas of standardized images, the technologist could at least have asked the kids to take off their rings.

That’s right, interpreting bone ages is an art that has not changed since the 1959 publication of Greulich and Pyle’s standard work. The main atlas that radiologists use for interpreting bone ages is the same atlas that was used by radiologists at a time when the United States had only 48 states and Dwight Eisenhower was president!

What can Artificial Intelligence do for you?

Almost 60 years and 12 United States presidents later, we see the next big advancement in bone age interpretation. Call it machine learning or artificial intelligence, this project by Doctors Alexander Bilbily and Mark Cicero aimed to modernize bone age interpretation, which had been unchanged for over half a century. Their company, 16bit, won the RSNA pediatric bone age competition, and now their program is free for anyone to test.

Having a computer program interpret a bone age is as easy as opening a web browser to their site, www.16bit.ai, where you will find the upload window shown above. Next, drag your bone age DICOM image into the “Drop files here” box and let the program work its magic.

Alternatively, you can pull up the website on your phone’s web browser, tap on “If on a mobile device you can take a picture with your camera” and marvel at the future of bone age interpretation.
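If your PACS hands you a raw DICOM export and you would rather upload an ordinary image file (or just look at the pixels yourself before sending them off), a few lines of Python can convert the study to a standard PNG. This is only a minimal sketch using the open-source pydicom, NumPy, and Pillow libraries; the file name bone_age.dcm is a hypothetical placeholder, and the snippet is not part of the 16bit site, which accepts DICOM files directly.

    # Minimal sketch: convert a bone age DICOM to an 8-bit PNG for upload.
    # Assumes pydicom, numpy, and Pillow are installed; "bone_age.dcm" is a
    # hypothetical file name standing in for your exported study.
    import numpy as np
    import pydicom
    from PIL import Image

    ds = pydicom.dcmread("bone_age.dcm")          # read the DICOM dataset
    pixels = ds.pixel_array.astype(np.float32)    # raw grayscale values

    # Rescale the pixel values into the 0-255 range expected by a PNG.
    pixels -= pixels.min()
    if pixels.max() > 0:
        pixels /= pixels.max()
    eight_bit = (pixels * 255).astype(np.uint8)

    Image.fromarray(eight_bit).save("bone_age.png")
    print("Saved bone_age.png - ready to drop into the browser.")

Either way, the radiograph ends up in the browser, and the program returns its estimate in a few seconds.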

Is the program any good at interpreting bone ages?

So how does the program do compared to a fellowship-trained radiologist? I am pretty impressed by the results. I admit that I started using the website mainly out of curiosity, and now I use it to see if I can catch it making a mistake. My pattern is to open the bone age study, drag and drop the image for Dr. Artificial Intelligence to interpret, and, while its gears are turning, open up my bone age atlas to formulate my own interpretation. I’m excited (and a little scared) to see that we are almost always in agreement.

So why don’t I just have the program interpret the bone age and save myself the time and effort of interpreting it myself? Call me old-fashioned, but I am still in the habit of doing it myself. While I find the program impressive, I still need to gain confidence in it before handing it the reins to bone age interpretation.

Should I be worried about my job?

Am I worried about being replaced by artificial intelligence? I think you would be a fool not to be concerned. It’s like asking a room full of Uber drivers at a convention if they are concerned about automated cars. But instead of losing sleep over how I may someday be replaced by a computer, I choose to focus on how I can use computers and artificial intelligence to make my life better. I embrace my microchip-powered companions.

I joke with my colleagues that reading bone age exams is a money loser. Between pulling up the atlas and flipping to the right page to compare with the study, this relatively low-paying exam takes me just as much time as interpreting a head CT. If I can simply drag and drop the image onto a website and have a program spit out a bone age for me, that leaves me more time to read the more “high tech” studies such as ultrasound, CT, and MRI.

I welcome change and advancement. I think there is a future where radiologists and computers can work together for the betterment of our patients and I am excited for that future. If you read bone ages, give the 16bit bone age analyzer a try. I would love to know what you think.
