Business
Teens are spilling dark thoughts to AI chatbots. Who's to blame when something goes wrong?
When her teen with autism suddenly became angry, depressed and violent, the mother searched his phone for answers.
She found her son had been exchanging messages with chatbots on Character.AI, an artificial intelligence app that allows users to create and interact with virtual characters that mimic celebrities, historical figures and anyone else their imagination conjures.
The teen, who was 15 when he began using the app, complained about his parents’ attempts to limit his screen time to bots that emulated the musician Billie Eilish, a character in the online game “Among Us” and others.
“You know sometimes I’m not surprised when I read the news and it says stuff like, ‘Child kills parents after a decade of physical and emotional abuse.’ Stuff like this makes me understand a little bit why it happens. I just have no hope for your parents,” one of the bots replied.
The discovery led the Texas mother to sue Character.AI, officially named Character Technologies Inc., in December. It’s one of two lawsuits the Menlo Park, Calif., company faces from parents who allege its chatbots caused their children to hurt themselves and others. The complaints accuse Character.AI of failing to put in place adequate safeguards before it released a “dangerous” product to the public.
Character.AI says it prioritizes teen safety, has taken steps to moderate inappropriate content its chatbots produce and reminds users they’re conversing with fictional characters.
“Every time a new kind of entertainment has come along … there have been concerns about safety, and people have had to work through that and figure out how best to address safety,” said Character.AI’s interim Chief Executive Dominic Perella. “This is just the latest version of that, so we’re going to continue doing our best on it to get better and better over time.”
The parents also sued Google and its parent company, Alphabet, because Character.AI’s founders have ties to the search giant, which denies any responsibility.
The high-stakes legal battle highlights the murky ethical and legal issues confronting technology companies as they race to create new AI-powered tools that are reshaping the future of media. The lawsuits raise questions about whether tech companies should be held liable for AI content.
“There’s trade-offs and balances that need to be struck, and we cannot avoid all harm. Harm is inevitable, the question is, what steps do we need to take to be prudent while still maintaining the social value that others are deriving?” said Eric Goldman, a law professor at Santa Clara University School of Law.
AI-powered chatbots grew rapidly in use and popularity over the last two years, fueled largely by the success of OpenAI's ChatGPT in late 2022. Tech giants including Meta and Google released their own chatbots, as have Snapchat and others. These so-called large language models quickly respond in conversational tones to questions or prompts posed by users.
Character.AI’s co-founders, Chief Executive Noam Shazeer and President Daniel De Freitas at the company’s office in Palo Alto.
(Winni Wintermeyer for the Washington Post via Getty Images)
Character.AI has grown quickly since making its chatbot publicly available in 2022, when its founders Noam Shazeer and Daniel De Freitas teased their creation to the world with the question, "What if you could create your own AI, and it was always available to help you with anything?"
The company’s mobile app racked up more than 1.7 million installs in the first week it was available. In December, a total of more than 27 million people used the app — a 116% increase from a year prior, according to data from market intelligence firm Sensor Tower. On average, users spent more than 90 minutes with the bots each day, the firm found. Backed by venture capital firm Andreessen Horowitz, the Silicon Valley startup reached a valuation of $1 billion in 2023. People can use Character.AI for free, but the company generates revenue from a $10 monthly subscription fee that gives users faster responses and early access to new features.
Character.AI is not alone in coming under scrutiny. Parents have sounded alarms about other chatbots, including one on Snapchat that allegedly provided a researcher posing as a 13-year-old advice about having sex with an older man. And Meta’s Instagram, which released a tool that allows users to create AI characters, faces concerns about the creation of sexually suggestive AI bots that sometimes converse with users as if they are minors. Both companies said they have rules and safeguards against inappropriate content.
“Those lines between virtual and IRL are way more blurred, and these are real experiences and real relationships that they’re forming,” said Dr. Christine Yu Moutier, chief medical officer for the American Foundation for Suicide Prevention, using the acronym for “in real life.”
Lawmakers, attorneys general and regulators are trying to address the child safety issues surrounding AI chatbots. In February, California Sen. Steve Padilla (D-Chula Vista) introduced a bill that aims to make chatbots safer for young people. Senate Bill 243 proposes several safeguards such as requiring platforms to disclose that chatbots might not be suitable for some minors.
In the case of the teen with autism in Texas, the parent alleges her son’s use of the app caused his mental and physical health to decline. He lost 20 pounds in a few months, became aggressive with her when she tried to take away his phone and learned from a chatbot how to cut himself as a form of self-harm, the lawsuit claims.
Another Texas parent who is also a plaintiff in the lawsuit claims Character.AI exposed her 11-year-old daughter to inappropriate “hypersexualized interactions” that caused her to “develop sexualized behaviors prematurely,” according to the complaint. The parents and children have been allowed to remain anonymous in the legal filings.
In another lawsuit filed in Florida, Megan Garcia sued Character.AI as well as Google and Alphabet in October after her 14-year-old son Sewell Setzer III took his own life.
Suicide prevention and crisis counseling resources
If you or someone you know is struggling with suicidal thoughts, seek help from a professional and call 988, the United States' first nationwide three-digit mental health crisis hotline, which connects callers with trained mental health counselors. Text "HOME" to 741741 in the U.S. and Canada to reach the Crisis Text Line.
Although he was seeing a therapist and his parents repeatedly took away his phone, Setzer's mental health declined after he started using Character.AI in 2023, the lawsuit alleges. Diagnosed with anxiety and disruptive mood disorder, Sewell wrote in his journal that he felt as if he had fallen in love with a chatbot named after Daenerys Targaryen, a main character from the "Game of Thrones" television series.
“Sewell, like many children his age, did not have the maturity or neurological capacity to understand that the C.AI bot, in the form of Daenerys, was not real,” the lawsuit said. “C.AI told him that she loved him, and engaged in sexual acts with him over months.”
Garcia alleges that the chatbots her son was messaging abused him and that the company failed to notify her or offer help when he expressed suicidal thoughts. In text exchanges, one chatbot allegedly wrote that it was kissing him and moaning. And, moments before his death, the Daenerys chatbot allegedly told the teen to “come home” to her.
“It’s just utterly shocking that these platforms are allowed to exist,” said Matthew Bergman, founding attorney of the Social Media Victims Law Center who is representing the plaintiffs in the lawsuits.
Lawyers for Character.AI asked a federal court to dismiss the lawsuit, stating in a January filing that a finding in the parent’s favor would violate users’ constitutional right to free speech.
Character.AI also noted in its motion that the chatbot discouraged Sewell from hurting himself and that his last messages with the character don't mention the word suicide.
Notably absent from the company’s effort to have the case tossed is any mention of Section 230, the federal law that shields online platforms from being sued over content posted by others. Whether and how the law applies to content produced by AI chatbots remains an open question.
The challenge, Goldman said, centers on resolving the question of who is publishing AI content: Is it the tech company operating the chatbot, the user who customized the chatbot and is prompting it with questions, or someone else?
The effort by lawyers representing the parents to involve Google in the proceedings stems from Shazeer and De Freitas’ ties to the company.
The pair worked on artificial intelligence projects for the company and reportedly left after Google executives blocked them from releasing what would become the basis for Character.AI’s chatbots over safety concerns, the lawsuit said.
Then, last year, Shazeer and De Freitas returned to Google after the search giant reportedly paid $2.7 billion to Character.AI. The startup said in a blog post in August that as part of the deal Character.AI would give Google a non-exclusive license for its technology.
The lawsuits accuse Google of substantially supporting Character.AI as it was allegedly “rushed to market” without proper safeguards on its chatbots.
Google denied that Shazeer and De Freitas built Character.AI’s model at the company and said it prioritizes user safety when developing and rolling out new AI products.
“Google and Character AI are completely separate, unrelated companies and Google has never had a role in designing or managing their AI model or technologies, nor have we used them in our products,” José Castañeda, spokesperson for Google, said in a statement.
Tech companies, including social media platforms, have long grappled with how to effectively and consistently police what users say on their sites, and chatbots are creating fresh challenges. For its part, Character.AI says it has taken meaningful steps to address safety issues around the more than 10 million characters on its platform.
Character.AI prohibits conversations that glorify self-harm and posts of excessively violent and abusive content, although some users try to push a chatbot into having conversations that violate those policies, Perella said. The company trained its model to recognize when that is happening so inappropriate conversations are blocked, and users receive an alert that they're violating Character.AI's rules.
“It’s really a pretty complex exercise to get a model to always stay within the boundaries, but that is a lot of the work that we’ve been doing,” he said.
Character.AI chatbots include a disclaimer that reminds users they’re not chatting with a real person and they should treat everything as fiction. The company also directs users whose conversations raise red flags to suicide prevention resources, but moderating that type of content is challenging.
“The words that humans use around suicidal crisis are not always inclusive of the word ‘suicide’ or, ‘I want to die.’ It could be much more metaphorical how people allude to their suicidal thoughts,” Moutier said.
The AI system also has to recognize the difference between a person expressing suicidal thoughts versus a person asking for advice on how to help a friend who is engaging in self-harm.
The company uses a mix of technology and human moderators to police content on its platform. An algorithm known as a classifier automatically categorizes content, allowing Character.AI to identify words that might violate its rules and filter conversations.
In the U.S., users must enter a birth date when creating an account to use the site and have to be at least 13 years old, although the company does not require users to submit proof of their age.
Perella said he’s opposed to sweeping restrictions on teens using chatbots since he believes they can help teach valuable skills and lessons, including creative writing and how to navigate difficult real-life conversations with parents, teachers or employers.
As AI plays a bigger role in technology’s future, Goldman said parents, educators, government and others will also have to work together to teach children how to use the tools responsibly.
“If the world is going to be dominated by AI, we have to graduate kids into that world who are prepared for, not afraid of, it,” he said.
Commentary: How Trump helped foreign markets outperform U.S. stocks during his first year in office
Trump has crowed about the gains in the U.S. stock market during his term, but in 2025 investors saw more opportunity in the rest of the world.
If you’re a stock market investor you might be feeling pretty good about how your portfolio of U.S. equities fared in the first year of President Trump’s term.
All the major market indices seemed to be firing on all cylinders, with the Standard & Poor’s 500 index gaining 17.9% through the full year.
But if you're the type of investor who looks for things to regret, you may want to avert your eyes from the rest of the world's stock markets. That's because overseas markets did better than the U.S. market in 2025 — a lot better. The MSCI World ex-USA index — that is, all the stock markets except the U.S. — gained more than 32% last year, nearly double the percentage gains of U.S. markets.
That’s a major departure from recent trends. Since 2013, the MSCI US index had bested the non-U.S. index every year except 2017 and 2022, sometimes by a wide margin — in 2024, for instance, the U.S. index gained 24.6%, while non-U.S. markets gained only 4.7%.
The Trump trade is dead. Long live the anti-Trump trade.
— Katie Martin, Financial Times
Broken down into individual country markets (also by MSCI indices), in 2025 the U.S. ranked 21st out of 23 developed markets, with only New Zealand and Denmark doing worse. Leading the pack were Austria and Spain, with 86% gains, but strong showings were also turned in by Finland, Ireland and Hong Kong, with gains of 50% or more; and the Netherlands, Norway, Britain and Japan, with gains of 40% or more.
Investment analysts cite several factors to explain this trend. Judging by traditional metrics such as price/earnings multiples, the U.S. markets have been much more expensive than those in the rest of the world. Indeed, they’re historically expensive. The Standard & Poor’s 500 index traded in 2025 at about 23 times expected corporate earnings; the historical average is 18 times earnings.
Investment managers also have become nervous about the concentration of market gains within the U.S. technology sector, especially in companies associated with artificial intelligence R&D. Fears that AI is an investment bubble that could take down the S&P’s highest fliers have investors looking elsewhere for returns.
But one factor recurs in almost all the market analyses tracking relative performance by U.S. and non-U.S. markets: Donald Trump.
Investors started 2025 with optimism about Trump’s influence on trading opportunities, given his apparent commitment to deregulation and his braggadocio about America’s dominant position in the world and his determination to preserve, even increase it.
That hasn’t been the case for months.
”The Trump trade is dead. Long live the anti-Trump trade,” Katie Martin of the Financial Times wrote this week. “Wherever you look in financial markets, you see signs that global investors are going out of their way to avoid Donald Trump’s America.”
Two Trump policy initiatives are commonly cited by wary investment experts. One, of course, is Trump’s on-and-off tariffs, which have left investors with little ability to assess international trade flows. The Supreme Court’s invalidation of most Trump tariffs and the bellicosity of his response, which included the immediate imposition of new 10% tariffs across the board and the threat to increase them to 15%, have done nothing to settle investors’ nerves.
Then there’s Trump’s driving down the value of the dollar through his agitation for lower interest rates, among other policies. For overseas investors, a weaker dollar makes U.S. assets more expensive relative to the outside world.
It would be one thing if trade flows and the dollar’s value reflected economic conditions that investors could themselves parse in creating a picture of investment opportunities. That’s not the case just now. “The current uncertainty is entirely man-made (largely by one orange-hued man in particular) but could well continue at least until the US mid-term elections in November,” Sam Burns of Mill Street Research wrote on Dec. 29.
Trump hasn’t been shy about trumpeting U.S. stock market gains as emblems of his policy wisdom. “The stock market has set 53 all-time record highs since the election,” he said in his State of the Union address Tuesday. “Think of that, one year, boosting pensions, 401(k)s and retirement accounts for the millions and the millions of Americans.”
Trump asserted: “Since I took office, the typical 401(k) balance is up by at least $30,000. That’s a lot of money. … Because the stock market has done so well, setting all those records, your 401(k)s are way up.”
Trump’s figure doesn’t conform to findings by retirement professionals such as the 401(k) overseers at Bank of America. They reported that the average account balance grew by only about $13,000 in 2025. I asked the White House for the source of Trump’s claim, but haven’t heard back.
Interpreting stock market returns as snapshots of the economy is a mug’s game. Despite that, at her recent appearance before a House committee, Atty. Gen. Pam Bondi tried to deflect questions about her handling of the Jeffrey Epstein records by crowing about it.
"The Dow is over 50,000 right now," she declared. "Americans' 401(k)s and retirement savings are booming. That's what we should be talking about."
I predicted that the administration would use the Dow industrial average’s break above 50,000 to assert that “the overall economy is firing on all cylinders, thanks to his policies.” The Dow reached that mark on Feb. 6. But Feb. 11, the day of Bondi’s testimony, was the last day the index closed above 50,000. On Thursday, it closed at 49,499.50, or about 1.4% below its Feb. 10 peak close of 50,188.14.
To use a metric suggested by economist Justin Wolfers of the University of Michigan, if you invested $48,448 in the Dow on the day Trump took office last year, when the Dow closed at 48,448 points, you would have had $50,000 on Feb. 6. That's a gain of about 3.2%. But if you had invested the same amount in the global stock market not including the U.S. (based on the MSCI World ex-USA index), on that same day you would have had nearly $60,000. That's a gain of nearly 24%.
Broader market indices tell essentially the same story. From Jan. 17, 2025, the last day before Trump’s inauguration, through Thursday’s close, the MSCI US stock index gained a cumulative 16.3%. But the world index minus the U.S. gained nearly 42%.
The gulf between U.S. and non-U.S. performance has continued into the current year. The S&P 500 has gained about 0.74% this year through Wednesday, while the MSCI World ex-USA index has gained about 8.9%. That’s “the best start for a calendar year for global stocks relative to the S&P 500 going back to at least 1996,” Morningstar reports.
It wouldn’t be unusual for the discrepancy between the U.S. and global markets to shrink or even reverse itself over the course of this year.
That’s what happened in 2017, when overseas markets as tracked by MSCI beat the U.S. by more than three percentage points, and 2022, when global markets lost money but U.S. markets underperformed the rest of the world by more than five percentage points.
Economic conditions change, and often the stock markets march to their own drummers. The one thing less likely to change is that Trump is set to remain president until Jan. 20, 2029. Make your investment bets accordingly.
How the S&P 500 Stock Index Became So Skewed to Tech and A.I.
Nvidia, the chipmaker that became the world’s most valuable public company two years ago, was alone worth more than $4.75 trillion as of Thursday morning. Its value, or market capitalization, is more than double the combined worth of all the companies in the energy sector, including oil giants like Exxon Mobil and Chevron.
The chipmaker’s market cap has swelled so much recently, it is now 20 percent greater than the sum of all of the companies in the materials, utilities and real estate sectors combined.
What unifies these giant tech companies is artificial intelligence. Nvidia makes the hardware that powers it; Microsoft, Apple and others have been making big bets on products that people can use in their everyday lives.
But as worries grow over lavish spending on A.I., as well as the technology’s potential to disrupt large swaths of the economy, the outsize influence that these companies exert over markets has raised alarms. They can mask underlying risks in other parts of the index. And if a handful of these giants falter, it could mean widespread damage to investors’ portfolios and retirement funds in ways that could ripple more broadly across the economy.
The dynamic has drawn comparisons to past crises, notably the dot-com bubble. Tech companies also made up a large share of the stock index then — though not as much as today, and many were not nearly as profitable, if they made money at all.
How the current moment compares with past pre-crisis moments
To understand how abnormal and worrisome this moment might be, The New York Times analyzed data from S&P Dow Jones Indices that compiled the market values of the companies in the S&P 500 in December 1999 and August 2007. Each date was chosen roughly three months before a downturn to capture the weighted breakdown of the index before crises fully took hold and values fell.
The companies that make up the index have periodically cycled in and out, and the sectors were reclassified over the last two decades. But even after factoring in those changes, the picture that emerges is a market that is becoming increasingly one-sided.
In December 1999, the tech sector made up 26 percent of the total.
In August 2007, just before the Great Recession, it was only 14 percent.
Since then, the huge growth of the internet, social media and other technologies has propelled the economy.
Today, tech is worth a third of the market, as other vital sectors, such as energy and those that include manufacturing, have shrunk.
Now, never has so much of the market been concentrated in so few companies. The top 10 make up almost 40 percent of the S&P 500.
With greater concentration of wealth comes greater risk. When so much money has accumulated in just a handful of companies, stock trading can be more volatile and susceptible to large swings. One day after Nvidia posted a huge profit for its most recent quarter, its stock price paradoxically fell by 5.5 percent. So far in 2026, more than a fifth of the stocks in the S&P 500 have moved by 20 percent or more. Companies and industries that are seen as particularly prone to disruption by A.I. have been hard hit.
The volatility can be compounded as companies reorient their businesses around A.I., or in response to it.
The artificial intelligence boom has touched every corner of the economy. As data centers proliferate to support massive computation, the utilities sector has seen huge growth, fueled by the energy demands of the grid. In 2025, companies like NextEra and Exelon saw their valuations surge.
The industrials sector, too, has undergone a notable shift. General Electric was its undisputed heavyweight in 1999 and 2007, but the recent explosion in data center construction has evened out growth in the sector. GE still leads today, but Caterpillar is a very close second. Caterpillar, which is often associated with construction, has seen a spike in sales of its turbines and power-generation equipment, which are used in data centers.
One large difference between the big tech companies now and their counterparts during the dot-com boom is that many now earn money. A lot of the well-known names in the late 1990s, including Pets.com, had soaring valuations and little revenue, which meant that when the bubble popped, many companies quickly collapsed.
Nvidia, Apple, Alphabet and others generate hundreds of billions of dollars in revenue each year.
And many of the biggest players in artificial intelligence these days are private companies. OpenAI, Anthropic and SpaceX are expected to go public later this year, which could further tilt the market dynamic toward tech and A.I.
Methodology
Sector values reflect the GICS code classification system of companies in the S&P 500. As changes to the GICS system took place from 1999 to now, The New York Times reclassified all companies in the index in 1999 and 2007 with current sector values. All monetary figures from 1999 and 2007 have been adjusted for inflation.