
OpenAI Warns About Emotional Bonds with ChatGPT’s Realistic Voice

OpenAI raises concerns about users forming emotional bonds with ChatGPT’s realistic voice, highlighting potential risks and long-term impacts.

OpenAI is an American artificial intelligence (AI) research laboratory.


11 Aug 2024 4:07 PM IST

OpenAI has expressed concerns about its realistic voice feature in ChatGPT, noting that it could lead users to form emotional bonds with the AI, potentially at the expense of real human interactions.

In its safety report, the company noted that users have been observed developing connections with the model, attributing human-like traits to it, and expressing sentiments such as regret over their "last day together."

The advanced voice capabilities of GPT-4o may make interactions feel more personal, increasing the risk of anthropomorphization, that is, treating the AI as if it were human.


This can result in misplaced trust and emotional attachments. OpenAI plans further studies to understand the long-term impact of these interactions.

OpenAI also expressed concerns that AI could affect social norms and lead to excessive dependence on technology.

The report also noted issues with AI spreading false information and conspiracy theories. OpenAI intends to continue testing to assess and address these risks.
