Reading a chest X-ray is tough. So much so that even radiologists get it right only around 70-80% of the time. Enter qXR, an artificial intelligence (AI) tool to interpret chest X-rays, developed by Qure.ai, a Mumbai-based health-care technology company. It was trained on over 1.5 million X-rays to detect 15 chest abnormalities, ranging from tuberculosis to potentially cancerous lung nodules.
To test the product, Qure.ai, in December 2017, began collaborating with radiologists at the Bengaluru hospitals of the Columbia Asia health-care group. The company received a set of 2,000 chest X-rays from Columbia Asia’s digital database which qXR then interpreted. Next, the AI’s findings were compared with the interpretations by three expert radiologists.
The results of the trial were very promising, says Shalini Govil, senior adviser and quality controller at Columbia Asia Radiology Group; the AI was calling the X-rays correctly around 90% of the time. But there was another feature that particularly pleased her — qXR’s ability to explain, in the way human radiologists do, why it interpreted the X-ray the way it did. Even the most sophisticated AIs frequently cannot do this — a problem known as AI’s “black box”.
The “black box” is inherent in advanced AI techniques such as deep learning. This is how deep learning works: to teach a computer to think like humans, researchers use a network of mathematical functions (called an artificial neural network) which mimics the biological brain. Next, they feed data into this network. In qXR’s case, these were chest X-rays and radiologists’ interpretations of them. When the network is exposed to millions of such X-rays and interpretations, it builds its own rules for translating the images into interpretations.
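The principle can be sketched in miniature. The toy below trains a single artificial "neuron" to separate two classes of points by repeatedly nudging its internal weights; qXR's real network is vastly larger and learns from X-ray images, so this is only an illustration of rules emerging from data, not Qure.ai's method.

```python
import math
import random

def sigmoid(z):
    # Squashes any number into the range (0, 1), read as a probability.
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=0.5):
    """data: list of ((x1, x2), label) pairs, label 0 (normal) or 1 (abnormal)."""
    random.seed(0)
    w1, w2, b = random.random(), random.random(), 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w1 * x1 + w2 * x2 + b)
            err = p - y                 # gradient of the log-loss
            w1 -= lr * err * x1         # nudge each weight to reduce the error
            w2 -= lr * err * x2
            b -= lr * err
    return w1, w2, b

def predict(params, x1, x2):
    w1, w2, b = params
    return sigmoid(w1 * x1 + w2 * x2 + b)

# Hypothetical training set: "normal" cases cluster near (0, 0),
# "abnormal" cases near (1, 1).
data = [((0.1, 0.2), 0), ((0.2, 0.1), 0),
        ((0.9, 0.8), 1), ((0.8, 0.9), 1)]
params = train(data)
```

After training, `predict(params, 0.1, 0.1)` comes out low and `predict(params, 0.9, 0.9)` high: the weights encode a decision rule the programmer never wrote down, which is exactly why explaining such rules later is hard.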
The resulting AI can now read new X-rays and spot abnormalities accurately. But even though such AIs can be highly competent, they are often unable to communicate the rules they used to arrive at the interpretation. This ‘black box’ in AI functioning affects several products today. For example, in early 2017, Stanford University researchers in the U.S. trained a neural network to identify cancerous skin lesions as accurately as dermatologists could. Yet, when they asked the AI what part of a mole looked cancerous, it couldn’t answer.
qXR is designed to avoid this problem. For example, if it detects a pleural effusion (a build-up of fluid around the lungs), it tells the radiologist where in the X-ray the abnormality lies. It does so by circling the part of the X-ray indicating the effusion, such as a blunting of the angle between the diaphragm and ribs (the costophrenic angle).
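Qure.ai has not published exactly how qXR localises its findings, but one common family of explainability techniques is occlusion sensitivity: mask each region of the image in turn and measure how much the model's abnormality score drops. Regions whose masking changes the score most are the ones the model "looked at". The sketch below uses a toy 4x4 "X-ray" and a hypothetical stand-in model, not qXR itself.

```python
def score(image):
    # Toy stand-in "model": abnormality score = mean pixel brightness.
    flat = [px for row in image for px in row]
    return sum(flat) / len(flat)

def occlusion_map(image, patch=2):
    """Slide a patch-sized black mask over the image; record the score drop."""
    h, w = len(image), len(image[0])
    base = score(image)
    heat = [[0.0] * w for _ in range(h)]
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            masked = [row[:] for row in image]      # copy, then black out one patch
            for di in range(i, min(i + patch, h)):
                for dj in range(j, min(j + patch, w)):
                    masked[di][dj] = 0.0
            drop = base - score(masked)             # how much the patch mattered
            for di in range(i, min(i + patch, h)):
                for dj in range(j, min(j + patch, w)):
                    heat[di][dj] = drop
    return heat

# A 4x4 'X-ray' with a bright 'opacity' in the top-left corner.
image = [[1.0, 1.0, 0.0, 0.0],
         [1.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 0.0, 0.0],
         [0.0, 0.0, 0.0, 0.0]]
heat = occlusion_map(image)
```

The heat map peaks over the bright corner (`heat[0][0]` is the largest value), so the region the model relied on can be circled for the radiologist, much as the article describes.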
“This was the ground-breaking thing for us,” says Ms. Govil. “The AI looks at the X-ray just like my post-graduate radiologist would.”
Given its potential, NITI Aayog recently began talks with Qure.ai for a pilot tuberculosis (TB) screening project as part of its Aspirational Districts programme, under which the government will work to raise living standards in 112 Indian districts. TB screening projects are critical for India, which has an estimated 2.7 million TB cases every year. Many of these patients do not get a diagnosis early enough because they lack access to health care. And when they do consult doctors, the doctor may fail to spot the disease in a chest X-ray if she isn’t trained to read one.
So, an AI that can distinguish likely TB cases from normal X-rays and send the likely patients for further testing can save radiologists a lot of time.
Only some 50 of every 1,000 healthy people screened are likely to need further testing, says Prashant Warier, the CEO and co-founder of Qure.ai.
But there will be substantial challenges in deploying qXR across rural India, given the lack of digitisation in hospitals. Two weeks ago, Mr. Warier and his team installed qXR at the district hospital in Baran, Rajasthan. TB incidence here is around 170 per 100,000 people, said Dr. Sampat Raj Nagar, Baran’s district chief medical and health officer. Yet, the 100 chest X-rays ordered by the hospital every day are not screened for TB by default. With qXR, Dr. Nagar hopes to start doing this.
To cover all of Baran though, 15 other diagnostic centres will need to use qXR. The problem is that none of these centres digitises X-rays today, without which the AI can’t function. Qure.ai is working to fix this. It is a big challenge but not one that cannot be tackled, according to Mr. Warier. “We see a substantial opportunity to change how radiology is delivered in Indian hospitals.”