The new AI knows when it can make a diagnosis on its own and when it needs to hand the case to a human specialist

A new model has emerged that can tell when it is able to make a diagnosis on its own and when the case should go to a specialist. This should reduce the number of errors in diagnosing diseases.

The developers noted that AI can already detect cancers of the lung, breast, brain, skin, and cervix. But experts face another problem: in difficult cases, the model may be more likely to make a mistake than a doctor. To address this, researchers at the Massachusetts Institute of Technology (MIT) Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a machine learning system that evaluates each case and then either makes a decision on its own or consults an expert.

The system can also adapt to how often a specialist is available and take the doctor's experience in the relevant area into account. In a busy hospital, for example, it asks a person for help only when that help is really needed.
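To make that decision rule concrete, here is a minimal sketch of how a per-case "predict or defer" policy might look. The thresholds, signals, and function names below are illustrative assumptions, not the MIT CSAIL system's actual implementation: the model answers on its own when its confidence clears a bar, and that bar rises when the specialist is more available and more experienced on the case type.

```python
# Illustrative sketch of a per-case "predict or defer to the expert" rule.
# Thresholds, signals, and names are assumptions, not the CSAIL system's code.

def decide(class_probs, expert_availability, expert_skill, base_threshold=0.85):
    """Decide whether the model answers itself or defers to a human expert.

    class_probs        : dict mapping diagnosis -> model probability
    expert_availability: 0..1, how freely the specialist can be interrupted
    expert_skill       : 0..1, estimated accuracy of this expert on this case type
    """
    diagnosis, confidence = max(class_probs.items(), key=lambda kv: kv[1])

    # The more available and more skilled the expert, the higher the bar
    # the model must clear before it is allowed to answer on its own.
    required_confidence = base_threshold * (0.5 + 0.5 * expert_availability * expert_skill)

    if confidence >= required_confidence:
        return ("model", diagnosis)
    return ("defer", None)


# Example: in a busy hospital (low availability) more cases stay with the model.
print(decide({"benign": 0.78, "malignant": 0.22},
             expert_availability=0.2, expert_skill=0.9))   # -> ('model', 'benign')
print(decide({"benign": 0.78, "malignant": 0.22},
             expert_availability=1.0, expert_skill=0.9))   # -> ('defer', None)
```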

The researchers explained that many open questions about medical models keep them from being fully integrated into clinical workflows and used in a real hospital setting. That is why they decided to add another layer to the decision-making process. They hope this extra model will help specialists place greater confidence in machine learning and adopt AI more readily in their practice.

Next, the researchers will test a version of the system that works with several doctors at once: for example, the AI could collaborate with different radiologists who each have more experience with particular patient groups, as sketched below.
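As an illustration of that multi-expert setting, the following sketch extends the rule above: the system picks the expert with the highest estimated accuracy for the given patient group and defers only if that expert is expected to beat the model. The expert names and accuracy figures are invented for illustration and are not from the researchers.

```python
# Hypothetical extension of the earlier sketch to several experts.
# Expert names and accuracy estimates are invented for illustration.

def defer_target(model_confidence, patient_group, experts):
    """Pick the best expert for this patient group, or keep the case with the model.

    experts: dict mapping expert name -> {patient_group: estimated accuracy}
    """
    best_expert, best_accuracy = None, 0.0
    for name, accuracy_by_group in experts.items():
        accuracy = accuracy_by_group.get(patient_group, 0.0)
        if accuracy > best_accuracy:
            best_expert, best_accuracy = name, accuracy

    # Defer only when the chosen expert is expected to outperform the model.
    if best_expert is not None and best_accuracy > model_confidence:
        return best_expert
    return "model"


experts = {
    "radiologist_A": {"pediatric": 0.95, "adult": 0.80},
    "radiologist_B": {"pediatric": 0.70, "adult": 0.92},
}
print(defer_target(0.85, "pediatric", experts))  # -> 'radiologist_A'
print(defer_target(0.85, "adult", experts))      # -> 'radiologist_B'
```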

The team also believes the system could be used for content moderation, including text and images. Such a tool could reduce the load on human moderators without fully automating the process.
