
Mental-health professionals explore AI-powered digital therapeutics to treat patients and improve care


AI is being used in the understaffed mental-health-care field to help providers.
AI-powered software can suggest treatments through mobile apps and analyze therapy sessions.
This article is part of “Build IT,” a series about digital tech trends disrupting industries.

The fusion of human ingenuity and machine intelligence is offering an innovative approach to personalized mental-health care. By leveraging AI technology, clinicians and behavioral-health-care facilities can provide tailored treatments for people with conditions such as depression and addiction. They can also use AI to assess the quality of their services and find ways to improve as providers of mental-health care.

These advancements also bring up important ethical and privacy considerations. As technology becomes more involved in mental-health care, ensuring data security, confidentiality, and equitable access to services must be top priorities.

How an AI-powered mobile app provides treatment

Dr. Christopher Romig, the director of innovation at the mental-health clinic Stella, said he saw great potential with AI “aiding in early diagnosis, personalized treatment plans, and monitoring patient progress.”

There’s a reason for this anticipated gain in momentum, he added: “Because there’s such a huge shortage in this country of mental-health-care providers, AI is going to be a key component moving forward in terms of support and interventions.”

Click Therapeutics, a biotechnology company that develops AI-powered software for medical treatments and interventions, helps patients through a mobile app. The software can work independently or in conjunction with pharmacotherapies to treat conditions such as depression, migraines, and obesity.

The company’s algorithm collects and analyzes patient data, including symptom severity and sleep-wake cycles, from the app. It uses this information to identify patterns and correlations to provide tailored treatment strategies.
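Click Therapeutics hasn’t published the inner workings of its algorithm, but the kind of pattern-finding described above can be sketched in a few lines of Python. In this illustrative example, the logged data, field names, and the sleep-severity rule are assumptions, not the company’s actual method:

# Hypothetical sketch: correlating self-reported symptom severity with sleep duration.
# This is not Click Therapeutics' algorithm; it only illustrates the kind of
# pattern-finding the article describes.
from statistics import correlation  # requires Python 3.10+

# Example app logs: one entry per day (symptom severity 0-10, hours slept).
daily_logs = [
    {"severity": 7, "sleep_hours": 4.5},
    {"severity": 5, "sleep_hours": 6.0},
    {"severity": 3, "sleep_hours": 7.5},
    {"severity": 6, "sleep_hours": 5.0},
    {"severity": 2, "sleep_hours": 8.0},
]

severity = [d["severity"] for d in daily_logs]
sleep = [d["sleep_hours"] for d in daily_logs]

# Pearson correlation between hours slept and symptom severity.
r = correlation(sleep, severity)
if r < -0.5:
    print(f"Pattern found (r={r:.2f}): shorter sleep tracks with worse symptoms.")
    print("Suggested focus: sleep-focused module.")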

Click Therapeutics’ mobile app gives a personalized overview of a user’s health journey.

It also leverages digital biomarkers collected through smartphone sensors. For example, the sensors can monitor a patient’s heart rate to detect high stress; the algorithm can then recommend mindfulness exercises, relaxation techniques, or cognitive-behavioral-therapy modules within the app. “It’s bona fide therapeutics that are changing the brain,” Shaheen Lakhan, the chief medical officer of Click Therapeutics, told Business Insider.
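The sensor-to-suggestion loop can likewise be illustrated with a rough sketch. The baseline, thresholds, and module names below are hypothetical stand-ins for logic the company hasn’t disclosed:

# Hypothetical sketch of the heart-rate-to-recommendation step described above.
# Thresholds, module names, and the rule itself are illustrative assumptions.
RESTING_BASELINE_BPM = 65  # assumed per-user baseline from earlier readings

def recommend_module(recent_heart_rates: list[int]) -> str | None:
    """Return an in-app module if readings suggest elevated stress."""
    avg_bpm = sum(recent_heart_rates) / len(recent_heart_rates)
    if avg_bpm > RESTING_BASELINE_BPM * 1.3:   # sustained, large elevation
        return "guided relaxation exercise"
    if avg_bpm > RESTING_BASELINE_BPM * 1.15:  # mild elevation
        return "five-minute mindfulness check-in"
    return None  # no intervention suggested

print(recommend_module([92, 95, 90]))  # -> guided relaxation exercise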

Patients can share these insights with their health-care providers to give a more comprehensive understanding of their conditions and behaviors. The metrics can inform treatment decisions and improve care results. “You’re the active ingredient, meaning you have to engage in it,” Daniel Rimm, the company’s head of product, said.

In January, Click Therapeutics announced the Food and Drug Administration would help accelerate the development of the company’s software for treating schizophrenia. Research suggests that this use case could significantly benefit from digital therapeutics.

Dr. Haig Goenjian, the principal investigator and medical director at CenExel CNS, told BI that patients who used prescription digital therapeutics in a schizophrenia-focused study said the approach “changed the way they socialize” and “that they are better able to navigate their schizophrenia symptoms to function in the real world.”

“At the end of our studies, many patients asked how they can continue to use this digital therapeutic,” he added.

How an AI platform is helping mental-health-care providers improve their services

The AI platform Lyssn is another tech-driven tool for mental-health services. It provides on-demand training modules for clients such as behavioral-health-care providers who want to improve engagement and sessions with their patients.

With their patients’ consent, providers can record therapy sessions and use Lyssn’s AI tech to evaluate factors such as speech patterns and tone from both parties, helping them understand how to converse more effectively and improve their approach to sessions.
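Lyssn hasn’t detailed its models publicly, but simple session-level metrics give a flavor of what “speech patterns and tone from both parties” can mean in practice. This sketch computes two basic conversational features from a made-up transcript; the real system is far more sophisticated:

# Hypothetical sketch of session-level conversational metrics; the transcript
# and the features are illustrative, not Lyssn's actual analysis.
session_turns = [
    ("provider", "How have things been since we last talked?"),
    ("client", "Honestly, pretty rough. Work has been overwhelming."),
    ("provider", "That sounds exhausting. What felt most overwhelming?"),
    ("client", "Mostly the deadlines piling up while I can't sleep."),
]

provider_words = sum(len(text.split()) for speaker, text in session_turns if speaker == "provider")
client_words = sum(len(text.split()) for speaker, text in session_turns if speaker == "client")
provider_questions = sum(
    1 for speaker, text in session_turns if speaker == "provider" and text.strip().endswith("?")
)

print(f"Provider/client talk ratio: {provider_words / client_words:.2f}")
print(f"Questions asked by provider: {provider_questions}")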

“There’s a need for more, and there’s a need for better,” Lyssn’s cofounder and chief psychotherapy-science officer, Zac Imel, said, referring to the countrywide shortage of mental-health workers.

Imel and Lyssn’s chief technology officer, Michael Tanana, said it’s hard to evaluate the quality of service being provided because sessions between mental-health-care professionals and patients are private and, therefore, difficult to monitor. Lyssn aims to hold providers accountable for improved care, especially because “the quality of mental-health care is highly variable,” Imel said.

Lyssn’s dashboard shows quantified insights for qualitative factors such as showing empathy to a client during a therapy session.

Tanana, who also cofounded Lyssn, added that “we need ways to ensure quality” as more people seek access to mental-health services. The developers at Lyssn keep this in mind as they train their AI tech to recognize both problematic and successful conversation styles, Imel said.

For example, Lyssn can analyze a provider’s responses during conversations that require cultural sensitivity; this includes gauging how curious they are about the client’s experience and whether they’re anxious when talking about such topics. Based on its evaluation, the platform can give providers immediate feedback on their skills and suggest certain training and tools to help them learn and improve.
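The feedback step can be pictured as a mapping from per-skill scores to suggested training. The skill names, thresholds, and catalog below are illustrative assumptions rather than Lyssn’s actual rubric:

# Hypothetical sketch: turning per-skill scores into training suggestions.
TRAINING_CATALOG = {
    "cultural_curiosity": "module: exploring client identity with open questions",
    "reflective_listening": "module: simple and complex reflections",
    "empathy": "module: expressing empathy without judgment",
}

def suggest_training(scores: dict[str, float], threshold: float = 0.6) -> list[str]:
    """Suggest a module for any skill scored below the threshold (0-1 scale)."""
    return [TRAINING_CATALOG[skill] for skill, score in sorted(scores.items()) if score < threshold]

print(suggest_training({"cultural_curiosity": 0.4, "reflective_listening": 0.8, "empathy": 0.55}))
# -> suggestions for cultural_curiosity and empathy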

Darin Carver, a licensed therapist and assistant clinical director at Weber Human Services, uses Lyssn to improve patient outcomes. “Clinicians have near-immediate access to session-specific information about how to improve their clinical work,” he told BI.

He added that supervisors also have access to skills-based feedback generated from session reports, which they use to turn clinicians’ fuzzy recollections into hard facts about which skills they used and which need improvement.

Carver said feedback and advanced analytics are essential to treatment decisions. “We can drill down to what our real training needs are and which clinicians and areas need help,” he said. “It’s been a game changer.”

Concerns with AI in mental health

There’s still a need for human-led regulation when using AI in mental-health services. AI algorithms can perpetuate biases and stereotypes from the data they’re trained on.

To address these issues, Lyssn creates a detailed annual report that evaluates the performance of its training and quality-assurance models for helping people from historically marginalized communities. The company also partners with leading universities to assess the tech’s multicultural competency.
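A bias evaluation of this kind typically compares model performance across groups. The sketch below, with invented numbers and group labels, shows the sort of per-group check such a report might contain:

# Hypothetical sketch of a per-group performance check; group labels and
# accuracy figures are invented for illustration.
model_accuracy_by_group = {
    "group_a": 0.91,
    "group_b": 0.88,
    "group_c": 0.84,
}

overall = sum(model_accuracy_by_group.values()) / len(model_accuracy_by_group)
for group, acc in model_accuracy_by_group.items():
    gap = acc - overall
    flag = "  <-- review" if gap < -0.03 else ""
    print(f"{group}: accuracy {acc:.2f} (gap {gap:+.2f}){flag}")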

Stringent compliance regulations are also needed to protect patient privacy and confidentiality. Lyssn, for instance, uses encrypted data transfers and storage, two-factor authentication, and regular external compliance audits to help thwart data leaks. As tech-driven care evolves, Carver said, mental-health professionals have a duty to use AI ethically to improve people’s health and well-being.

Read the original article on Business Insider