AI has been a contentious topic as of late. Not the AI that powers NPC movement in games, but the stuff behind the LLMs driving chatbots, generating imagery, and building websites. Though concerns remain, a team of researchers at the University of Cambridge has shown that machine learning can diagnose heart murmurs in dogs, and I think that's just neat.
As shared on the Nvidia developer blog, the study used models that "were trained using PyTorch and Nvidia CUDA on Nvidia GeForce 10 Series GPUs, enabling efficient data processing." The machine-learning algorithm 'listened' to digital heartbeat recordings and picked out signs of heart murmurs and heart disease.
Across all grades of murmur (which is to say, all intensities), the algorithm detected murmurs at a rate of 87.9%, and its grading exactly matched a cardiologist's in 57% of recordings.
As the study states, “the model is a promising tool to enable accurate, low-cost screening in primary care.”
This is a great potential use case for AI: not only could it be cheaper for a pet owner to have their pet scanned, it could also be more efficient and more easily accessible. Instead of booking time with a vet specifically, you could have someone trained on the machine simply scan your dog and send you on your way. That could free up qualified vets to work on other procedures.
However, there is one concern worth examining here, and one the researchers were likely aware of going into this study. Just a few months ago, a study claimed up to 20% of local doctors in the UK could be using generative AI tools.
There's no inherent problem with AI being used to build calendars or even draft letters, but it's important to note that AI can't reason as humans do. It can approximate reasoning by synthesising information, but a human being is needed to clear that final hurdle: understanding.
It doesn’t genuinely see or hear concerns or feelings from patients and doesn’t legitimately diagnose problems. It can compile potential symptoms, put them together with potential causes, and that’s mostly it.
AI of the kind that might be used to diagnose patients needs to ingest vast amounts of information, both for its scientific knowledge and to understand the language patients use. But with so much data going in, LLMs are subject to 'hallucinations', where bad or false information is presented as if it were real.
Though these models can become more accurate, a human being is still needed to spot a hallucination in the first place. Just last month, a fake AI-generated website tricked thousands of Halloween celebrants into showing up to an event that didn't exist.
For this reason, AI shouldn't be used to diagnose patients on its own; it can merely act as a tool to aid the diagnostic process. In this case, listening for potential heart murmurs in dogs, it shows a great degree of accuracy, which could be partly because the tool is built for one very specific diagnosis, but a human is still needed to verify the result and suggest treatment.
If used responsibly, this could be a great tool for both vets and dogs, and one of the best uses I’ve seen for AI so far.