A paper in JAMA Psychiatry says mental health providers should ask if patients are using artificial intelligence chatbots, just as they would ask patients about sleep habits and substance use.
Younger Americans are more likely to use social media at least sometimes for health information than their older peers.
It's becoming common to use artificial intelligence for therapy and mental health advice. But is it safe?
Research shows that media coverage of AI chatbot use in mental health contexts focuses on instances of user psychosis and suicide.