Google is training AI to ‘hear’ when you’re sick. Here’s how it works.

September 1, 2024

Google’s AI arm is reportedly tapping into “bioacoustics,” a field that combines biology and acoustics and, in part, helps researchers understand how the presence of pathogens changes the sounds the human body makes. As it turns out, our sounds convey tell-tale information about our well-being.

According to a Bloomberg report, the search-engine giant built an AI model that uses sound signals to “predict early signs of disease.” In places where quality healthcare is hard to access, the technology could step in as an alternative, requiring nothing more than a smartphone’s microphone.

How does Google’s bioacoustics AI work?

Google’s bioacoustics-based AI model is called HeAR (Health Acoustic Representations). It was trained on 300 million two-second audio samples that include coughs, sniffles, sneezes, and breathing patterns. These audio clips were pulled from non-copyrighted, publicly available content on platforms like YouTube.
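
Google hasn’t published HeAR’s preprocessing pipeline, but a minimal sketch of what training on two-second clips implies might look like the following, where the 16 kHz sample rate and non-overlapping windows are assumptions rather than confirmed details:

```python
# Sketch: slicing a longer recording into fixed two-second windows, the clip
# length the article attributes to HeAR's training data. The sample rate and
# windowing strategy are assumptions for illustration.
import numpy as np

SAMPLE_RATE = 16_000                      # assumed samples per second
CLIP_SAMPLES = SAMPLE_RATE * 2            # two-second clips, per the article

def slice_into_clips(audio: np.ndarray) -> list[np.ndarray]:
    """Split a mono waveform into non-overlapping two-second clips."""
    return [
        audio[start:start + CLIP_SAMPLES]
        for start in range(0, len(audio) - CLIP_SAMPLES + 1, CLIP_SAMPLES)
    ]

# Stand-in for a real recording: 10 seconds of synthetic audio.
recording = np.random.randn(SAMPLE_RATE * 10).astype(np.float32)
print(len(slice_into_clips(recording)), "clips")   # -> 5 clips
```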

One example of such content is audio recorded at a hospital in Zambia, where patients came in for tuberculosis screenings. In fact, HeAR has been trained on 100 million cough sounds that help it detect tuberculosis.

According to Bloomberg, bioacoustics can offer “near-imperceptible clues” to subtle signs of illness, helping health professionals diagnose patients. The AI model can also detect minute differences in a patient’s cough pattern, allowing it to spot early signs that a condition is improving or worsening.
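
The report doesn’t say how that comparison is done, but one plausible, purely illustrative approach is to represent each cough clip as an embedding and measure how far apart two recordings drift over time; the vectors below are placeholders, not output from HeAR:

```python
# Hypothetical sketch: if an encoder maps each cough clip to a fixed-length
# embedding, changes in a cough over time could be tracked as the distance
# between embeddings from different days. These vectors are made up; in
# practice they would come from a bioacoustic model.
import numpy as np

def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Distance between two embeddings; larger means the cough changed more."""
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

embedding_day_1 = np.random.randn(512)   # placeholder embedding of a cough clip
embedding_day_7 = np.random.randn(512)   # placeholder embedding a week later
print(f"change score: {cosine_distance(embedding_day_1, embedding_day_7):.3f}")
```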

Google is partnering with Salcit Technologies, an AI healthcare startup based in India. Salcit has its own AI model called Swaasa (which means “breath” in Sanskrit) — and the Indian collaborator is using Swaasa to help HeAR improve its accuracy for tuberculosis and lung health screening.

Swaasa offers a mobile app that allows users to submit a 10-second cough sample. According to Salcit’s co-founder, Manmohan Jain, the app can identify whether an individual has a disease with an accuracy rate of 94 percent.
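
Salcit hasn’t published its API, but the user-facing flow described above roughly amounts to uploading a short recording and getting back a risk estimate; the endpoint and response fields below are invented for illustration and are not Swaasa’s actual interface:

```python
# Hypothetical client-side flow for a cough-screening app: record roughly ten
# seconds of audio, upload it to a screening endpoint, and read the result.
# The URL and response fields are placeholders, not Salcit's real API.
import requests

SCREENING_URL = "https://example.com/api/cough-screen"   # placeholder endpoint

def screen_cough(wav_path: str) -> dict:
    """Upload a ~10-second cough recording and return the screening result."""
    with open(wav_path, "rb") as f:
        response = requests.post(SCREENING_URL, files={"audio": f}, timeout=30)
    response.raise_for_status()
    return response.json()   # e.g. {"condition": "tuberculosis", "risk": 0.87}

# result = screen_cough("cough_sample.wav")
# print(result["risk"])
```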

The sound-based test costs $2.40, far cheaper than a spirometry test, which runs about $35 at a clinic in India.

HeAR doesn’t come without challenges, though. For example, Google and Salcit are still trying to navigate problems with users submitting audio samples with too much background noise.

Google’s bioacoustics-based AI model is nowhere near ready for market, but you have to admit, the concept of pairing AI with sound in medicine is innovative and promising.

Source: Google is training AI to ‘hear’ when you’re sick. Here’s how it works.