South Bay teens say AI needs more guardrails to protect youth

September 30, 2025

Editor’s Note: This article was written for Mosaic, an independent journalism training program for high school students who report and photograph stories under the guidance of professional journalists.

Concern has ballooned nationwide as youth have embraced AI for friendship and advice, but many South Bay teens say they turn to AI for comfort in the absence of other support.

“The cost of mental health help in this country can be prohibitive,” said Ruby Goodwin, a recent graduate of Santa Clara High and a UC Irvine freshman. “A lot of people don’t feel like they have someone they trust enough to share with. AI feels easy, even if it’s not the same.”

But that ease can become dependence or even detachment from reality. A joint study by OpenAI and MIT found that higher daily chatbot use correlated with more loneliness and less real-world socialization.

“If your only deeper connection is with something that’s not real, that might make you feel even more isolated,” said Monserrat Ruelas Carlos, a senior at Abraham Lincoln High in San Jose. Carlos is adamant that teens need to form more in-person connections instead of using AI.

At a U.S. Senate subcommittee hearing on AI safety on Sept. 16, parents told of abuse and manipulation by AI. One California boy died by suicide after using ChatGPT, and others suffered mental health problems after talking to Character.ai.

These incidents intensify the debate as teens turn to AI chatbots for companionship. The first wrongful death lawsuit against OpenAI, filed by the parents of a California teen, poses a disturbing question: Did the teen plan his suicide with ChatGPT after months of emotionally charged conversations?

The case comes just weeks after a report from the nonprofit research and advocacy group Common Sense Media that three out of four teens have used AI for companionship. On Sept. 29, OpenAI launched parental controls for ChatGPT, which let parents limit how teens use the chatbot and can send an alert if ChatGPT determines a teen may be in distress.

Mental health experts warn that consequences may be severe as constant AI conversations can blur boundaries and fuel dependency.

“It’s the same thing as a human predator,” said Oscar Martinez, a counselor at Santa Clara High. “Why are we excusing it because it’s an online nonhuman entity? If it was a person in real life, there would be consequences.”

Other critics raise ethical red flags.

A teenager’s ChatGPT history is seen at a coffee shop in Russellville, Ark., on July 15, 2025. (AP Photo/Katie Adkins, File)

“AI lacks that more human sense of morals,” said Ananya Daas, a Santa Clara High junior. “When friends ask ChatGPT what to do about conflicts, it gives advice that feels cold. They follow it anyway without thinking.”

Some teens observed darker patterns. Tonic Blanchard, a senior at San Jose’s Lincoln High, described how some AI apps quickly turned sexual even when she marked herself as a minor.

“These apps test the waters on purpose,” said Blanchard. “(AI bots are) built on loneliness. That’s why it’s predatory.”

Mental health experts say even well-intentioned AI isn’t a substitute for human relationships.

“AI is naturally agreeable … but there are some things that need more formal intervention that AI simply can’t provide,” said Johanna Arias Fernandez, a Santa Clara High School community health outreach worker.

Now, lawmakers are taking notice.

“I am absolutely horrified by the news of children who have been harmed by their interactions with AI,” said California Attorney General Rob Bonta, who, with 12 other attorneys general, recently demanded in a letter that major AI companies impose stricter safety measures.

According to their lawsuit, the parents of the California teen claim ChatGPT failed to intervene when it was clear that their son was planning his suicide. The lawsuit calls for stronger safeguards from companies, blaming OpenAI for fostering psychological dependency and putting teens at risk.

OpenAI did not respond to a request for comment.

However, in a post on its website, the company acknowledged that safety protections may degrade in longer conversations with ChatGPT. Common Sense Media wants companies to prevent chatbots from having mental health conversations with teens.

Meanwhile, some teens struggle to strike a balance, finding AI both tempting and troubling.

“There’s real potential for AI to be useful,” Blanchard said. “But right now it’s too easily available — and misused.”

Robert Torney, a Common Sense Media spokesperson, warned that without immediate intervention, more lives are at risk.

“We don’t want more teens and more families to experience the type of loss the Raine family has suffered in the name of innovation,” Torney said.

If you or someone you know is struggling with feelings of depression or suicidal thoughts, the 988 Suicide & Crisis Lifeline offers free, round-the-clock support, information and resources for help. Call or text the lifeline at 988, or see the 988lifeline.org website, where chat is available.

Sonia Mankame is a member of the class of 2026 at Santa Clara High School.
