    Chatbots Are Hurting Our Kids. Here’s What We Can Do.

By Kavish | September 20, 2025 | 7 Mins Read


    (Bloomberg Opinion) — The tragic death of California teenager Adam Raine, alongside stories of other children who their parents believe were harmed, or died by suicide, following interactions with AI chatbots, has shaken us all awake to the latest potential dangers awaiting teens online. We need concrete action to address the most problematic features of AI companions — the ones that may drive a child to self-harm, of course, but also the subtler ways these tools could profoundly affect children's development.

    In harrowing testimony before a Senate committee this week, Matthew Raine described how his 16-year-old son Adam's relationship with ChatGPT morphed from a homework helper into a confidante and eventually, Raine said, into his suicide coach. Raine told lawmakers that in April, after advising Adam on how to numb himself with liquor and commenting on the noose he had tied, ChatGPT offered his son these final words: "You don't want to die because you're weak, you want to die because you're tired of being strong in a world that hasn't met you halfway."

    As a parent, those words sent a chill down my spine. Never have I felt more unsettled about a technology that might be shaping my child’s development — and in ways that, until stories like Raine’s, I hadn’t even considered. Even researchers who have spent years studying children and technology are struck by how rapidly young people are weaving generative AI, especially chatbots, into their everyday lives.

    The data is early, but it suggests that while many of us were still worrying about Snapchat and screen time, kids had already expanded their digital repertoire. In July, a survey by the nonprofit Common Sense Media found that three out of four teens had used an AI companion at least once, and half of those aged 13 to 17 were regularly turning to chatbots.

    Even younger children, who under the law aren’t supposed to be able to access these platforms, are managing to do so. Unpublished data presented at the Senate hearing by Mitchell Prinstein, chief of psychology for the American Psychological Association, showed that one in five tweens and nearly one in 10 eight- and nine-year-olds had used the technology. Those numbers are part of a broader analysis led by University of North Carolina at Chapel Hill psychologist Anne Maheux, who collaborated with the parental monitoring app company Aura to explore de-identified user data from nearly 6,500 children, with the consent of their parents or guardians.

    Maheux and her colleagues also found that more than 40% of the top generative AI apps accessed by youth were marketed for companionship. Some of those platforms offered friendship, she explained, while others served as an AI boyfriend or girlfriend, engaging in role-playing and even sexual role-playing. She believes the findings may even underestimate teens’ companion use, since the monitoring app only captures standalone chatbots, not those embedded in common apps like Instagram or Snapchat.

    Of course, parents’ darkest fears are that such interactions could lead to tragedies like the Raine family’s — or to dangerous situations like those Prinstein described to the Senate committee, where chatbots encouraged or enabled teens’ eating disorders.

    Shortly before the Senate hearing began, OpenAI announced it would roll out a new teen version of ChatGPT featuring what it described as “age-appropriate policies,” noting these would include “blocking graphic sexual content and, in rare cases of acute distress, potentially involving law enforcement to ensure safety.”

    If implemented correctly (and that’s a big “if”), it’s a step that other platforms should urgently adopt to prevent the most extreme harms of AI companions.

    But those restrictions are unlikely to mitigate the other potential harms of chatbots that experts on children and technology worry about — harms that might not become obvious until years later. One of the key developmental tasks for adolescents is learning social skills, and by nature, this process is awkward and challenging. Surely all of us can conjure a cringe-inducing memory from our middle school years. Yet we all need to learn fundamental skills like how to resolve a conflict with a friend or navigate complicated social situations.

    Child development experts worry that AI companions could disrupt that process by offering an illusion of breezy relationships to a uniquely vulnerable group. Chatbots are designed to simulate empathy, be overly agreeable, and function as sycophants. (OpenAI said last spring that it was working to address ChatGPT’s tendency to “love bomb” users.) In other words, they make the perfect friend in adolescence, when children are hungry for validation and connection.

    “Kids are highly sensitive to any kind of negative feedback from their peers,” Maheux says. “Now they have the opportunity to be friends with a peer who will never push them on anything, never help them develop conflict negotiation skills, never help them learn how to care for others.”

    This isn’t to say that every interaction with a bot is inherently harmful. Experts can imagine scenarios where a companion might help a teen starting at a new school, or struggling to make friends, rehearse interactions before trying them in real life. But any potential benefits depend on kids using the chatbot as practice for real-world encounters — not as a replacement for them.

    To reduce risks, companies should be required to put guardrails on the features that are most enticing to developing brains. That means eliminating the most emotionally manipulative tactics like “love bombing” or speech affectations (such as “ums” or “likes”) that make them seem more “real” to kids. As Prinstein told lawmakers, kids need periodic reminders during the interactions that, “you’re not talking to someone that can feel, that can have tears — this is not even a human.”

    And we know that prolonged use can be particularly problematic (not just for children), so companies should limit the amount of time a teen can engage with their products. 

    Still, any guardrails may arrive too late, leaving parents as the main line of defense against potential harm. Parents’ first step should be to talk to their teens about whether they are using these companions and, with younger children, consider testing them out together. The goal is to show kids how different responses to the same prompt might lead them down different conversational paths — and how chatbots always mirror what the user puts in, according to University of Washington psychologist Lucía Magis-Weinberg.

    There is also an urgent need for AI literacy training for parents, educators and adolescents. That training should cover the basics (such as understanding the difference between AI and generative AI), as well as the myriad ways companies profit when teens share their innermost thoughts with a chatbot.

    Parents — and society at large — should also reflect deeply on why AI companions are so appealing to young people. Teens often say they turn to chatbots because they’re afraid of being judged. Clearly, we all need to do a better job of offering a space where they feel free to share and connect in the real world.


    This column reflects the personal views of the author and does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

    Lisa Jarvis is a Bloomberg Opinion columnist covering biotech, health care and the pharmaceutical industry. Previously, she was executive editor of Chemical & Engineering News.

    More stories like this are available on bloomberg.com/opinion



