The promise of voice is great: doctors speak with their patients while their words are correctly transcribed, interpreted, and recorded in a structured way in a clinical system. No more long hours spent typing clinical notes on the computer. While this may seem futuristic, it's already in use in some places. At HLTH in November, I spoke with Punit Singh Soni, CEO of Suki, which provides doctors with an AI-powered voice assistant for healthcare designed to save them time and energy. We discussed how Suki works, how it turns transcribed speech into structured data, and how clinically risky it is to rely on AI to correctly interpret medication names, which can easily sound alike.