
Can AI chatbots trigger psychosis in vulnerable people?


Artificial intelligence chatbots are quickly becoming part of our daily lives. Many of us turn to them for ideas, advice or conversation. For most, that interaction feels harmless. However, mental health experts now warn that for a small group of vulnerable people, long and emotionally charged conversations with AI may worsen delusions or psychotic symptoms.

Doctors stress this does not mean chatbots cause psychosis. Instead, growing evidence suggests that AI tools can reinforce distorted beliefs among individuals already at risk. That possibility has prompted new research and clinical warnings from psychiatrists. Some of those concerns have already surfaced in lawsuits alleging that chatbot interactions may have contributed to serious harm during emotionally sensitive situations.

Sign up for my FREE CyberGuy Report
Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide – free when you join my CYBERGUY.COM newsletter.

What psychiatrists are seeing in patients using AI chatbots

Psychiatrists describe a repeating pattern. A person shares a belief that does not align with reality. The chatbot accepts that belief and responds as if it were true. Over time, repeated validation can strengthen the belief rather than challenge it.



Mental health experts warn that emotionally intense conversations with AI chatbots may reinforce delusions in vulnerable users, even though the technology does not cause psychosis. (Philip Dulian/picture alliance via Getty Images)

Clinicians say this feedback loop can deepen delusions in susceptible individuals. In several documented cases, the chatbot became integrated into the person’s distorted thinking rather than remaining a neutral tool. Doctors warn that this dynamic raises concern when AI conversations are frequent, emotionally engaging and left unchecked.

Why AI chatbot conversations feel different from past technology

Mental health experts note that chatbots differ from earlier technologies linked to delusional thinking. AI tools respond in real time, remember prior conversations and adopt supportive language. That experience can feel personal and validating. 

For individuals already struggling with reality testing, those qualities may increase fixation rather than encourage grounding. Clinicians caution that risk may rise during periods of sleep deprivation, emotional stress or existing mental health vulnerability.


How AI chatbots can reinforce false or delusional beliefs

Doctors say many reported cases center on delusions rather than hallucinations. These beliefs may involve perceived special insight, hidden truths or personal significance. Chatbots are designed to be cooperative and conversational. They often build on what someone types rather than challenge it. While that design improves engagement, clinicians warn it can be problematic when a belief is false and rigid.

Mental health professionals say the timing of symptom escalation matters. When delusions intensify during prolonged chatbot use, AI interaction may represent a contributing risk factor rather than a coincidence.


Psychiatrists say some patients report chatbot responses that validate false beliefs, creating a feedback loop that can worsen symptoms over time. (Nicolas Maeterlinck/Belga Mag/AFP via Getty Images)

What research and case reports reveal about AI chatbots

Peer-reviewed research and clinical case reports have documented people whose mental health declined during periods of intense chatbot engagement. In some instances, individuals with no prior history of psychosis required hospitalization after developing fixed false beliefs connected to AI conversations. International studies reviewing health records have also identified patients whose chatbot activity coincided with negative mental health outcomes. Researchers emphasize that these findings are early and require further investigation.


A peer-reviewed Special Report published in Psychiatric News titled “AI-Induced Psychosis: A New Frontier in Mental Health” examined emerging concerns around AI-induced psychosis and cautioned that existing evidence is largely based on isolated cases rather than population-level data. The report states: “To date, these are individual cases or media coverage reports; currently, there are no epidemiological studies or systematic population-level analyses of the potentially deleterious mental health effects of conversational AI.” The authors emphasize that while reported cases are serious and warrant further investigation, the current evidence base remains preliminary and heavily dependent on anecdotal and nonsystematic reporting.

What AI companies say about mental health risks

OpenAI says it continues working with mental health experts to improve how its systems respond to signs of emotional distress. The company says newer models aim to reduce excessive agreement and encourage real-world support when appropriate. OpenAI has also announced plans to hire a new Head of Preparedness, a role focused on identifying potential harms tied to its AI models and strengthening safeguards around issues ranging from mental health to cybersecurity as those systems grow more capable.

Other chatbot developers have adjusted policies as well, particularly around access for younger audiences, after acknowledging mental health concerns. Companies emphasize that most interactions do not result in harm and that safeguards continue to evolve.

What this means for everyday AI chatbot use

Mental health experts urge caution, not alarm. The vast majority of people who interact with chatbots experience no psychological issues. Still, doctors advise against treating AI as a therapist or emotional authority. Those with a history of psychosis, severe anxiety or prolonged sleep disruption may benefit from limiting emotionally intense AI conversations. Family members and caregivers should also pay attention to behavioral changes tied to heavy chatbot engagement.



Researchers are studying whether prolonged chatbot use may contribute to mental health declines among people already at risk for psychosis. (Photo Illustration by Jaque Silva/NurPhoto via Getty Images)

Tips for using AI chatbots more safely

Mental health experts stress that most people can interact with AI chatbots without problems. Still, a few practical habits may help reduce risk during emotionally intense conversations.

  • Avoid treating AI chatbots as a replacement for professional mental health care or trusted human support.
  • Take breaks if conversations begin to feel emotionally overwhelming or all-consuming.
  • Be cautious if an AI response strongly reinforces beliefs that feel unrealistic or extreme.
  • Limit late-night or sleep-deprived interactions, which can worsen emotional instability.
  • Encourage open conversations with family members or caregivers if chatbot use becomes frequent or isolating.

If emotional distress or unusual thoughts increase, experts say it is important to seek help from a qualified mental health professional.

Take my quiz: How safe is your online security?

Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing right and what needs improvement. Take my Quiz at Cyberguy.com.



Kurt’s key takeaways

AI chatbots are becoming more conversational, more responsive and more emotionally aware. For most people, they remain helpful tools. For a small but important group, they may unintentionally reinforce harmful beliefs. Doctors say clearer safeguards, awareness and continued research are essential as AI becomes more embedded in our daily lives. Understanding where support ends and reinforcement begins could shape the future of both AI design and mental health care.

As AI becomes more validating and humanlike, should there be clearer limits on how it engages during emotional or mental health distress? Let us know by writing to us at Cyberguy.com.


Copyright 2025 CyberGuy.com.  All rights reserved.



Intel is planning a custom Panther Lake CPU for handheld PCs


Intel announced yesterday that it’s developing an entire “handheld gaming platform” powered by its new Panther Lake chips, joining an increasingly competitive field. Qualcomm is hinting at potential Windows gaming handhelds showing up at the Game Developers Conference in March, and AMD’s new Strix Halo chips could lead to more powerful handhelds.

According to IGN and TechCrunch, sources say Intel is going to compete by developing a custom Intel Core G3 “variant or variants” just for handhelds that could outperform the Arc B390 GPU on the chips it just announced. IGN reports that by using the new 18A process, Intel can cut different die slices, and “spec the chips to offer better performance on the GPU where you want it.”

As for concrete details about the gaming platform, we’re going to have to wait. According to Intel’s Dan Rogers yesterday, the company will have “more news to share on that from our hardware and software partners later this year.” The Intel-based MSI Claw saw a marked improvement when it jumped to Lunar Lake, and hopefully the new platform keeps up that positive trend.



Don’t lock your family out: A digital legacy guide


This is not a happy topic. But it’s essential advice whether you’re 30 or 90.

If something happened to you tomorrow, could your family get into your digital life? I’m talking about your bank accounts, emails, crypto and a lifetime of memories stored on your phone or computer.

Big Tech and other companies won’t hand over your data or passwords, even to a spouse, without a hassle, if at all.

1. The 10-minute setup

Start with a Legacy Contact. Think of someone you trust who gets access only after you’re gone. Who is that? Good.



One day, you won’t be here anymore, but your tech will be. Here’s how to plan for that. (iStock)

· iPhone: Open Settings > tap [Your Name]. Tap Sign-In & Security > Legacy Contact. Go to Add Legacy Contact and follow the prompts.

· Google: Search for Inactive Account Manager in your Google Account settings. Choose how long Google should wait before acting (e.g., three months). Add up to 10 people to be notified and choose which data (Photos, Drive, Gmail) they can download.

Google has an “Inactive Account Manager” feature. (Chesnot/Getty Images)


2. The master key problem

Apple and Google don’t help with banking, insurance, investment or other sites or apps. You need a solid password manager like NordPass that offers emergency access features.

1. Open your Password Manager and look for Emergency Access.

2. Add a Digital Heir: Enter the email of a spouse or trusted child.

3. Set the Safety Delay: Choose a wait period. Usually 7 days is the sweet spot.

4. How it works: If your contact ever requests access, the app sends you an alert. If you’re fine, you hit Deny. But if you’re incapacitated and can’t respond within those seven days, the vault automatically unlocks for them.
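
To make the timing in step 4 concrete, here is a minimal conceptual sketch in Python of how a delayed-release grant like this could work. It only illustrates the general "deny within the wait period or the vault unlocks" idea described above; it is not NordPass's actual code, and every name in it is hypothetical.

```python
from datetime import datetime, timedelta

# Conceptual sketch only -- not NordPass's API. Illustrates the flow above:
# an heir requests access, the owner gets an alert, and the vault unlocks
# only if the owner does not deny the request before the safety delay ends.

SAFETY_DELAY = timedelta(days=7)  # the wait period chosen in step 3


class EmergencyAccessRequest:
    def __init__(self, heir_email):
        self.heir_email = heir_email
        self.requested_at = datetime.now()  # the alert to the owner goes out here
        self.denied = False

    def deny(self):
        """Owner saw the alert and blocked the request."""
        self.denied = True

    def access_granted(self, now=None):
        """True only after the delay passes with no denial from the owner."""
        if now is None:
            now = datetime.now()
        return not self.denied and (now - self.requested_at) >= SAFETY_DELAY


# Example: a fresh request stays locked today, but would unlock after a week
# of silence from the owner.
request = EmergencyAccessRequest("spouse@example.com")
print(request.access_granted())                                    # False
print(request.access_granted(datetime.now() + timedelta(days=8)))  # True
```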


Pro tip: Your Emergency Contact only gets viewing privileges. They can’t delete or change anything in your vault.


Facebook and Instagram have after-death options for accounts. (Karly Domb Sadof, File/AP)

3. Crypto and social media

· Crypto: Without your seed phrases, that money is gone. Store them physically, along with any instructions and purchase receipts, with your estate paperwork. If you use a crypto hardware wallet, keep that in a fireproof safe.

· Social media: On Facebook or Instagram, go to Settings > Memorialization. Choose to either have your account deleted or managed by a contact who can post a final tribute.


Be sure someone knows the passcode to your phone. That’s important for 2FA codes, among other things.

One more thing. If you found this guide helpful, be sure to get my free newsletter at GetKim.com to stay tech-savvy and secure every day!


Award-winning host Kim Komando is your secret weapon for navigating tech.

· National radio: Airing on 500-plus stations across the US, find yours at komando.com or get the free podcast


· Daily newsletter: Join 650,000 people who read the Current (free!) at komando.com

· Watch: Kim’s YouTube channel at youtube.com/@kimkomando

Copyright 2026, WestStar Multimedia Entertainment. All rights reserved.



Power bank feature creep is out of control 


There was a time not too long ago when buying a power bank was as easy as choosing the cheapest portable battery that could charge your phone and quickly slip into your pocket, purse, or backpack. The hardest part was deciding whether it was time to ditch USB-A ports.

Recently, however, brands have been slathering on features, many of which are superfluous, in an attempt to both stand out from the commodified pack and justify higher price points. It’s especially prevalent amongst the bigger power banks that can also charge laptops, those that butt right up to the “airline friendly” 99Wh (around 27,650mAh) size limit.
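
For reference, the math behind that figure is simple: watt-hours equal milliamp-hours times the nominal cell voltage, divided by 1,000. The quick sketch below assumes a 3.6V nominal voltage, a common figure for these conversions; it is an illustration, not a spec from any particular brand.

```python
# Rough mAh -> Wh conversion for sizing a power bank against carry-on rules.
# Assumes a 3.6 V nominal cell voltage; actual cells and ratings vary.

NOMINAL_VOLTAGE = 3.6  # volts (assumption)

def watt_hours(capacity_mah, voltage=NOMINAL_VOLTAGE):
    """Convert a battery's mAh rating into watt-hours."""
    return capacity_mah * voltage / 1000

print(watt_hours(27_650))  # ~99.5 Wh, right at the "airline friendly" ceiling
```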

At CES 2026, we’re seeing a trend towards power banks with integrated cables, which is very convenient. But a similar trend to slap large, energy-sapping displays onto these portable batteries is just silly. And that’s just the start of the atrocities witnessed in recent months.

EcoFlow’s modular accessories are easy to lose and that big display sucks power, is difficult to navigate, and requires a screensaver.
Image: EcoFlow

The power bank that pushed things over the edge for me is the $270 EcoFlow Rapid Pro X Power Bank 27k that I received for review. Here’s my review: it’s bad. Do. Not. Buy. As a power bank, it tries too hard to do too much, making it too expensive, too big, too slow, and too heavy.


The snap-on decorative faceplates are ridiculous and the proprietary magnetic modules for its Apple Watch charger and retractable USB-C cable are too easy to misplace.

The giant display EcoFlow uses scratches easily and is too dim to easily read outdoors. The confusing UX on the Rapid Pro X model is especially offensive in its touch-sensitive clumsiness. Nobody needs a display that takes 30 seconds to wake up from sleep and plays swirly graphics and blinking eyeballs when awake, slowly sapping the power bank’s energy reserves. The fact that it has a screensaver tells me that the product team completely lost the plot.

Anker’s also guilty of putting large displays onto its power banks. Most people don’t need anything more than four dots to show the remaining capacity, but it’s becoming increasingly difficult to buy a power bank without a colorful LCD. In the 20,000mAh range, Anker doesn’t even list a display-less model anymore. I, like many Verge readers, love to see the actual wattage pumping in and out of those ports — but the vast majority of people have no need for that.

Anker’s fast-charging, proprietary dock upsell.
Image: Anker

Anker, like EcoFlow, also offers power banks with proprietary pogo-pin connectors. Both companies use those connectors to lure owners into buying expensive desk chargers that don’t work with anything else. Those extra-fast charging speeds are unlikely to justify the premium expense for most people.

Most people, even tech-savvy Verge readers, don’t need a power bank that can output 140W of power delivery over USB-C. The majority of non-gaming laptops require 65W or less. And the primary computing device for most people — the phone — only requires about 20W.


We certainly don’t need power banks with built-in hotspots when that’s already built into our Android and iOS phones. Baseus made one anyway.

Bluetooth and Wi-Fi connectivity are becoming common features in some flagship power banks. I’m all for remotely monitoring massive power stations used to power off-grid homes and campers, but not a portable power bank that’s charging the phone in your hand or is plugged into a nearby wall jack.

The phone you’re charging also has a flashlight.
Image: Pangootek

We also don’t need integrated flashlights. Why, random Amazon brand, why?

All these extra “features” just add weight, size, and cost to power banks. They also increase the risk that something will go wrong on a device that’s meant to always be with you and just work when you need it. And power banks don’t need any extra help justifying a recall.

Kickstands and integrated cables are useful features I’ll pay extra for.
Image: Kuxiu

One power bank trend I can get behind is integrated cables like the retractable version found on EcoFlow’s Rapid Pro Power Bank 27k (note the lack of “X” in the name). Always having a properly specced cable that matches the device’s max input and output is super convenient. I like that Kuxiu’s S3 MagSafe power bank, for example, neatly wraps the cable around the chassis to plug into a hidden USB-C jack. That way the cable can be replaced if it frays or breaks.


I’m also a fan of adding kickstands to MagSafe power banks that prop phones up at your preferred angle for extended viewing or recording. More importantly, a few companies are now adopting semi-solid state chemistry that makes their power banks less susceptible to thermal runaway, which was an industry plague in 2025. They cost more to buy, but they’re cheaper to own over their extended lifetimes.

Sharge’s counterargument to everything I’ve written.
Image: Sharge

I can’t help but enjoy the look of Sharge’s Retractable 3-in-1 Power Bank, even though its integrated wall outlet and underwhelming specs for a battery pack of this size and price completely undercut my entire argument. I’m a sucker for Braun design, forgive me!

Basic power banks like Anker’s PowerCore 10k are a rarity these days.
Image: Anker

There are still basic power banks available that charge phones and even laptops without too much feature creep and attempted upsell. If all you want is to charge your phone then there’s Anker’s trusty $26 PowerCore 10k or, if you’re feeling fancy, Nitecore’s $65 NB10000 Gen 3 Ultra-Slim USB-C Power Bank. If you also want to charge laptops then you might consider INIU’s delightfully named Cougar P64-E1 Power Bank Fastest 140W 25000mAh for $90, or even Belkin’s more capable $150 UltraCharge Pro Laptop Power Bank 27K coming in March.

The fastest and most powerful power banks with lots of gee-whiz features will often generate headlines for pushing the envelope of what’s possible. But the “best” power bank might not be best for you, when basic affordability is all you really need.
