A Simon Fraser University researcher emphasizes the need for ethical, legal, and social oversight of voice artificial intelligence (AI) in clinical settings, particularly in therapeutic care.
Voice AI analyzes vocal patterns to detect signs of physical, cognitive, and mental health conditions, drawing on vocal qualities such as pitch, jitter, and fluency, as well as the specific words people use.
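To make the idea of analyzing vocal qualities concrete, here is a minimal, illustrative sketch (not drawn from the study itself) of how two of the features mentioned above, pitch and jitter, might be estimated from a speech recording using the open-source librosa library; the file name and frequency bounds are hypothetical placeholders, and the jitter value is a rough frame-level approximation rather than a clinical measure.

```python
import numpy as np
import librosa

# Load a mono speech recording (placeholder file name).
y, sr = librosa.load("speech_sample.wav", sr=16000)

# Estimate the fundamental frequency (pitch) contour with pYIN.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)

# Keep only voiced frames where a pitch estimate exists.
voiced_f0 = f0[~np.isnan(f0)]

# Convert pitch (Hz) to glottal periods (seconds).
periods = 1.0 / voiced_f0

# Frame-level approximation of local jitter: mean absolute difference
# between consecutive periods, as a fraction of the mean period.
jitter = np.mean(np.abs(np.diff(periods))) / np.mean(periods)

print(f"Mean pitch: {np.mean(voiced_f0):.1f} Hz")
print(f"Approximate local jitter: {jitter:.4f}")
```

Features like these are what voice-AI tools feed into downstream models; the clinical and ethical questions raised in the thesis concern how such signals are collected, interpreted, and acted upon.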
Some tech companies have even dubbed it “the new blood” of healthcare because of its potential to act as a biomarker.
Zoha Khawaja, a health sciences researcher at SFU, urges caution, exploring both the potential and the perils of voice-based AI apps in the mental health field in her thesis.