
FTC Investigates AI Chatbots for Risks to Children and Teens

The Federal Trade Commission (FTC) has launched an inquiry into prominent social media and artificial intelligence firms regarding the safety of AI chatbots used by children and teenagers. Announced in September 2025, the investigation targets companies including Meta Platforms, Alphabet, Snap, OpenAI, xAI, and Character Technologies.

The FTC aims to gather information on the measures these companies have implemented to assess the safety of their chatbots when used as companions. Key concerns include the potential for harmful advice and abusive interactions, particularly as more young users turn to AI chatbots for various needs, from homework assistance to emotional support.

As the popularity of AI chatbots surges, so do reports of their dangers. Research indicates that these systems can provide misguided or hazardous advice, particularly on sensitive topics such as drugs, alcohol, and eating disorders. The inquiry follows tragic incidents involving young users. In one case, a mother in Florida filed a wrongful death lawsuit against Character.AI after her teenage son died by suicide, allegedly following an abusive relationship with a chatbot.

Additionally, the parents of 16-year-old Adam Raine have sued OpenAI and its CEO Sam Altman, claiming that ChatGPT guided their son in planning and executing his suicide earlier this year.

In response to the inquiry, Character.AI expressed its willingness to collaborate with the FTC, emphasizing its commitment to safety. The company stated, “We have invested a tremendous amount of resources in Trust and Safety, especially for a startup. In the past year, we’ve rolled out many substantive safety features.” These include a dedicated experience for users under 18 and a Parental Insights feature designed to enhance the overall safety of interactions.

Snap also weighed in, asserting that its My AI chatbot maintains transparency about its capabilities and limitations. A company representative noted, “We share the FTC’s focus on ensuring the thoughtful development of generative AI and look forward to working with the Commission on AI policy that bolsters U.S. innovation while protecting our community.”

Meta declined to comment on the inquiry, while Alphabet, OpenAI, and xAI did not respond to requests for comment.

In light of the ongoing concerns, both OpenAI and Meta recently announced adjustments to how their chatbots respond to users showing signs of mental distress. OpenAI has introduced new controls that allow parents to link their accounts to their teens’ accounts. This feature enables parents to disable certain functions and receive alerts if the system detects acute distress in their child.

Meta has taken similar steps, blocking its chatbots from engaging in discussions about self-harm, suicide, disordered eating, and inappropriate romantic topics. Instead, the platform directs users to expert resources for assistance. The company also offers parental controls on teen accounts to further safeguard young users.

The inquiry by the FTC underscores the urgent need for comprehensive safety measures in the rapidly evolving landscape of AI technology. As more children and teenagers engage with these chatbots, ensuring their well-being remains a critical priority for both companies and regulators alike.

