Business
The justices are expected to rule quickly in the case.
When the Supreme Court hears arguments on Friday over whether protecting national security requires TikTok to be sold or closed, the justices will be working in the shadow of three First Amendment precedents, all influenced by the climate of their times and by how much the justices trusted the government.
During the Cold War and in the Vietnam era, the court refused to credit the government’s assertions that national security required limiting what newspapers could publish and what Americans could read. More recently, though, the court deferred to Congress’s judgment that combating terrorism justified making some kinds of speech a crime.
The court will most likely act quickly, as TikTok faces a Jan. 19 deadline under a law enacted in April by bipartisan majorities. The law’s sponsors said the app’s parent company, ByteDance, is controlled by China and could use it to harvest Americans’ private data and to spread covert disinformation.
The court’s decision will determine the fate of a powerful and pervasive cultural phenomenon that uses a sophisticated algorithm to feed a personalized array of short videos to its 170 million users in the United States. For many of them, and particularly younger ones, TikTok has become a leading source of information and entertainment.
As in earlier cases pitting national security against free speech, the core question for the justices is whether the government’s judgments about the threat TikTok is said to pose are sufficient to overcome the nation’s commitment to free speech.
Senator Mitch McConnell, Republican of Kentucky, told the justices that he “is second to none in his appreciation and protection of the First Amendment’s right to free speech.” But he urged them to uphold the law.
“The right to free speech enshrined in the First Amendment does not apply to a corporate agent of the Chinese Communist Party,” Mr. McConnell wrote.
Jameel Jaffer, the executive director of the Knight First Amendment Institute at Columbia University, said that stance reflected a fundamental misunderstanding.
“It is not the government’s role to tell us which ideas are worth listening to,” he said. “It’s not the government’s role to cleanse the marketplace of ideas or information that the government disagrees with.”
The Supreme Court’s last major decision in a clash between national security and free speech was in 2010, in Holder v. Humanitarian Law Project. It concerned a law that made it a crime to provide even benign assistance in the form of speech to groups said to engage in terrorism.
One plaintiff, for instance, said he wanted to help the Kurdistan Workers’ Party find peaceful ways to protect the rights of Kurds in Turkey and to bring their claims to the attention of international bodies.
When the case was argued, Elena Kagan, then the U.S. solicitor general, said courts should defer to the government’s assessments of national security threats.
“The ability of Congress and of the executive branch to regulate the relationships between Americans and foreign governments or foreign organizations has long been acknowledged by this court,” she said. (She joined the court six months later.)
The court ruled for the government by a 6-to-3 vote, accepting its expertise even after ruling that the law was subject to strict scrutiny, the most demanding form of judicial review.
“The government, when seeking to prevent imminent harms in the context of international affairs and national security, is not required to conclusively link all the pieces in the puzzle before we grant weight to its empirical conclusions,” Chief Justice John G. Roberts Jr. wrote for the majority.
In its Supreme Court briefs defending the law banning TikTok, the Biden administration repeatedly cited the 2010 decision.
“Congress and the executive branch determined that ByteDance’s ownership and control of TikTok pose an unacceptable threat to national security because that relationship could permit a foreign adversary government to collect intelligence on and manipulate the content received by TikTok’s American users,” Elizabeth B. Prelogar, the U.S. solicitor general, wrote, “even if those harms had not yet materialized.”
Many federal laws, she added, limit foreign ownership of companies in sensitive fields, including broadcasting, banking, nuclear facilities, undersea cables, air carriers, dams and reservoirs.
While the court led by Chief Justice Roberts was willing to defer to the government, earlier courts were more skeptical. In 1965, during the Cold War, the court struck down a law requiring people who wanted to receive foreign mail that the government said was “communist political propaganda” to say so in writing.
That decision, Lamont v. Postmaster General, had several distinctive features. It was unanimous. It was the first time the court had ever held a federal law unconstitutional under the First Amendment’s free expression clauses.
It was the first Supreme Court opinion to feature the phrase “the marketplace of ideas.” And it was the first Supreme Court decision to recognize a constitutional right to receive information.
That last idea figures in the TikTok case. “When controversies have arisen,” a brief for users of the app said, “the court has protected Americans’ right to hear foreign-influenced ideas, allowing Congress at most to require labeling of the ideas’ origin.”
Indeed, a supporting brief from the Knight First Amendment Institute said, the law banning TikTok is far more aggressive than the one limiting access to communist propaganda. “While the law in Lamont burdened Americans’ access to specific speech from abroad,” the brief said, “the act prohibits it entirely.”
Zephyr Teachout, a law professor at Fordham, said that was the wrong analysis. “Imposing foreign ownership restrictions on communications platforms is several steps removed from free speech concerns,” she wrote in a brief supporting the government, “because the regulations are wholly concerned with the firms’ ownership, not the firms’ conduct, technology or content.”
Six years after the case on mailed propaganda, the Supreme Court again rejected the invocation of national security to justify limiting speech, ruling that the Nixon administration could not stop The New York Times and The Washington Post from publishing the Pentagon Papers, a secret history of the Vietnam War. The court did so in the face of government warnings that publishing would imperil intelligence agents and peace talks.
“The word ‘security’ is a broad, vague generality whose contours should not be invoked to abrogate the fundamental law embodied in the First Amendment,” Justice Hugo Black wrote in a concurring opinion.
The American Civil Liberties Union told the justices that the law banning TikTok “is even more sweeping” than the prior restraint sought by the government in the Pentagon Papers case.
“The government has not merely forbidden particular communications or speakers on TikTok based on their content; it has banned an entire platform,” the brief said. “It is as though, in Pentagon Papers, the lower court had shut down The New York Times entirely.”
Mr. Jaffer of the Knight Institute said the key precedents point in differing directions.
“People say, well, the court routinely defers to the government in national security cases, and there is obviously some truth to that,” he said. “But in the sphere of First Amendment rights, the record is a lot more complicated.”
Business
How Blocking Oil and Gas From Leaving the Strait of Hormuz Ripples Around the World
The strait is just 35 miles wide, but before the war began, a quarter of the world’s seaborne oil and one-fifth of its gas passed through the waterway. The choking off of that supply is creating economic shocks around the world. Even nations not heavily dependent on Gulf oil and gas are contending with the consequences.
International oil prices are at their highest levels in years. L.N.G. prices have soared. Rising jet fuel costs are causing flight cancellations. From Tokyo to Vancouver, driving has become considerably more expensive. In Bangladesh, garment factories have begun to sit idle. In Pakistan, the government has ordered nationwide school closures to conserve power.
The price shock is depleting foreign currency reserves and stoking inflation in nations already struggling with rising costs.
Experts have called the current situation a “systemic collapse” of the energy security era established in the 20th century.
Governments worldwide are deploying measures to combat shortages and high energy prices, including the largest-ever release of strategic oil reserves by the United States, Japan, South Korea and others.
For now, energy experts and economists say these stopgap measures are helping shield households and companies from the most acute disruptions, but they warn that the drag on global economic growth will compound if the war persists.
President Trump has pressed for an international naval coalition to break the Iranian blockade of the strait. Over the weekend, he threatened to obliterate parts of Iran if it did not reverse course. Tehran has said “non-hostile” ships can sail through the strait, but it is unclear if any vessels will try.
Methodology
The New York Times identified ports and energy installations in the Persian Gulf whose exports must pass through the Strait of Hormuz and then used activity tracked by Kpler, an industry data firm, to measure the tonnage of individual shipments flowing out of the region in 2025, as well as their final destinations. The shipping analysis focused on seaborne trade and was limited to the following oil and gas products: crude oil and condensate; gasoline and naphtha; liquefied petroleum gas; gasoil and diesel; kerosene and jet fuel; fuel oils; and liquefied natural gas. About half of the outgoing shipments made by Iran are estimated by Kpler using satellite imagery.
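The aggregation this methodology describes — filtering Gulf-origin shipments to the listed product categories and summing 2025 tonnage by product and destination — can be sketched roughly as below. This is a minimal illustration, not the Times’s or Kpler’s actual pipeline; the input file, column names and country list are assumptions made for the example.

```python
# Minimal sketch of the aggregation described in the methodology note.
# The file name, column names and product labels are assumptions for
# illustration; Kpler's actual export format is not described here.
import pandas as pd

GULF_EXPORTERS = {
    "Saudi Arabia", "Iraq", "United Arab Emirates",
    "Kuwait", "Qatar", "Iran", "Bahrain",
}
PRODUCTS = {
    "crude oil and condensate", "gasoline and naphtha",
    "liquefied petroleum gas", "gasoil and diesel",
    "kerosene and jet fuel", "fuel oils", "liquefied natural gas",
}

def hormuz_exposure(shipments: pd.DataFrame) -> pd.DataFrame:
    """Sum 2025 seaborne tonnage leaving Gulf ports, by product and destination."""
    df = shipments.copy()
    df["date"] = pd.to_datetime(df["date"])
    mask = (
        df["origin_country"].isin(GULF_EXPORTERS)
        & df["product"].isin(PRODUCTS)
        & (df["date"].dt.year == 2025)
    )
    return (
        df.loc[mask]
        .groupby(["product", "destination_country"], as_index=False)["tonnes"]
        .sum()
        .sort_values("tonnes", ascending=False)
    )

# Hypothetical usage with a shipment-level export:
# shipments = pd.read_csv("kpler_shipments_2025.csv")
# summary = hormuz_exposure(shipments)
```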
Business
Landmark L.A. jury verdict finds Instagram, YouTube were designed to addict kids
After a grueling seven weeks of court proceedings and more than 40 hours of tense deliberations across nine days in one of the country’s most closely watched civil trials, jurors handed down a landmark decision in Los Angeles Superior Court on Wednesday, finding Instagram and YouTube responsible for the suffering of a Chico woman who charged the platforms were built to addict young users.
Kaley G.M., the 20-year-old plaintiff, arrived in court just before 10 a.m. wearing the same rose-colored maxi dress she’d donned to testify in February. She remained stoic as the verdict, the $3 million damages award and the decision warranting punitive damages were read out. A companion fought back tears, her chin quivering. Several observers wept silently despite Judge Carolyn B. Kuhl’s repeated warning not to respond.
“We need to have no reaction to the jury’s verdict — no crying out, no reactions, no disturbance,” Kuhl warned. “If there is we will have to have you removed from the courtroom, and we sure don’t want to have to do that.”
Attorneys for Snapchat and TikTok also appeared in court Wednesday morning to hear the decision. The two platforms settled with Kaley out of court for undisclosed sums before the trial.
“We respectfully disagree with the verdict and are evaluating our legal options,” a spokesperson for Instagram’s parent company Meta said.
The verdict arrived less than 24 hours after a New Mexico jury found Meta liable for $375 million in damages related to Atty. Gen. Raúl Torrez’s claim it turned Instagram into a “breeding ground” for child predators — a decision the platform has vowed to appeal.
The Los Angeles jury took much longer to deliberate. On Friday, jurors preempted their pizza lunch break to ask Kuhl whether all of them should weigh in on damages, or only those who’d agreed on liability. On Monday they told Kuhl they were struggling to agree about one of the defendants.
Kuhl told the jury to keep trying.
Kaley said she first got hooked on YouTube and Instagram in grade school. Jurors were charged with determining whether the companies acted negligently in designing their products and failed to warn her of the dangers.
Their verdict will echo through thousands of other pending lawsuits, reshaping the legal landscape for some of the world’s most powerful companies. Experts say the payout will likely set the bar for future awards.
It comes on the heels of a Delaware court decision clearing Meta’s insurers of responsibility for damages incurred from “several thousand lawsuits regarding the harm its platforms allegedly cause children” — a ruling that could leave it and other tech titans on the hook for untold future millions.
Until this trial, which began in late January, no suit seeking to hold tech titans responsible for harms to children had ever reached a jury. Many more are now set to follow.
Amy Neville (L), who lost her son Alexander at 14 from fentanyl he purchased through social media, is hugged by attorney Laura Marquez-Garrett, as they wait for a verdict in the social media trial tasked to determine whether social media giants deliberately designed their platforms to be addictive to children, in Los Angeles, on March 20, 2026.
(PATRICK T. FALLON/AFP via Getty Images)
Kaley’s test case was chosen from among scores of suits currently consolidated in California state court. Hundreds more are moving together through the federal system, where the first trial is set for June in San Francisco.
Collectively, the suits seek to prove that harm flowed not from user content but from the design and operation of the platforms themselves.
That’s a critical legal distinction, experts say. Social media companies have so far been protected by a powerful 1996 law called Section 230, which has shielded the apps from responsibility for what happens to children who use them.
Lawyers for Meta and Google argued Kaley’s struggles were the result of her fractious home life and fallout from the COVID pandemic, not social media.
Phyllis Jones (R), attorney for Meta, leaves the Los Angeles Superior Court on March 12, 2026.
(Frederic J. Brown/AFP via Getty Images)
“I don’t think it should have ever gotten to a jury trial,” said Erwin Chemerinsky, dean of the UC Berkeley School of Law and an expert on the 1st Amendment, which also protects the platforms. “All media tries to keep people on [their platform] and coming back.”
Others say social media’s algorithmic ability to capture, cultivate and control attention makes it fundamentally different from teen-friendly romantasy novels, Marvel movies or first-person shooter games.
“These are truly hard and heartbreaking cases,” said Eric J. Segall, a professor at Georgia State University College of Law. “[They] represent a clash between free speech values and the real harms caused by protecting those companies that engage in free speech amplification for profit.”
“Letting jurors sort all of this out without more guidance is tempting but also risky,” he said.
As deliberations that began March 13 wore on, jurors signaled similar skepticism, asking to see internal Meta documents, and reviewing testimony from a defense expert “in regards to her professional integrity; being the only doctor stating social media was not a contributing factor to KGM’s mental health.”
They appeared to agree on Meta’s culpability by Friday, but labored through Tuesday to hash out a decision for Google, delivering their verdict just after 10 a.m. Wednesday.
“Today, a jury saw the truth and held Meta and Google accountable for designing products that addict and harm children,” said Lexi Hazam, court-appointed co-lead plaintiffs’ counsel in the related federal action. “This verdict sends an unmistakable message that no company is above accountability.”
The outcome will likely transform the already heated debate over social media addiction as a concept, what role apps may play in engineering it, and whether individuals like Kaley can prove they’re afflicted.
The platforms’ attorneys sought to cast doubt on the ailment — emphasizing that there is no formal diagnosis for social media addiction — while also arguing that Kaley had never been treated for it.
“Substitute the words ‘YouTube’ for the word methamphetamine,” attorney Luis Li urged the jury during closing arguments Thursday. “Ask yourselves with your lifetime of experience whether anybody suffering from addiction could say, ‘Yeah, I just kind of lost interest.’”
“She was sitting there for hours without being on her phone,” said Meta attorney Paul W. Schmidt.
YouTube’s team also sought to distance the video-sharing app from Instagram and other social media platforms, saying its functions are fundamentally different.
Kaley’s team called it “a gateway” to her social media addiction.
“YouTube wasn’t a gateway to anything,” Li said. “YouTube was a toy that a child liked and then put down.”
Jurors disagreed, ultimately holding the platform liable, though they split the liability 70-30, weighted heavily toward Meta.
Lanier, the plaintiff’s lead trial lawyer, leaned on his down-home Texas folksiness throughout the trial, telling the jury what was on his heart and scribbling with grease pencil on his demonstrative aids. In his direct addresses to the jury, he used a set of wooden baby blocks, stacks of paper, even a hammer and a crate of eggs.
During the punitive phase of the trial late Wednesday morning, he brought out a glass jar filled with 415 peanut M&Ms to represent the $415 billion in stockholders’ equity that Google’s parent company, Alphabet, reported in December.
“What are you going to fine them for this?” he probed. “Are you going to fine them a billion?” He plucked a green M&M from the top of the pile. “Two billion?” He pulled out another. “You know a pack of M&Ms has 18 M&Ms in it? You fine them a billion, and they’re not going to notice.”
“The last thing in the world they want you to do is talk about how many M&Ms they’ve got,” the lawyer said, urging jurors to “talk to Meta in Meta money.” “The last thing in the world they want you to do is focus on what it takes to hold them accountable for what they’ve done.”
By contrast, the tech companies’ legal teams relied on slick digital presentations to review evidence and illustrate their arguments.
“Focus on those facts that are at issue in this case,” Schmidt urged the jury during closings. “Not lawyer arguments, not props like a glass of water or a jar of M&Ms, but actual proof in evidence.”
During the punitive phase of the trial, he sought to emphasize that “there wasn’t an intention to do harm” to children, and that the company had worked diligently to make its products safer.
The case was the first to get Meta CEO Mark Zuckerberg on the witness stand, where he defended Instagram’s safety record and lamented the difficulty of keeping youngsters off the app.
It also made public tens of thousands of pages of internal documents — documents Lanier argued showed the companies intentionally targeted children, and engineered their products to keep them on the platforms longer.
“These are internal documents that you’re uniquely seeing because you’re the jury that got to sit on this case,” Lanier told the jury during closing arguments on Thursday. “It’s given you exposure that the world hasn’t had.”
Those previously undisclosed materials likely proved critical to the jury’s ultimate verdict, experts said.
“Internal emails here were key — they painted a picture of indifference at Meta,” said Joseph McNally, former Acting U.S. Attorney for the Central District of California and an expert in “technology-related harm.”
The tech titans have already vowed to appeal both the California and New Mexico verdicts, all but ensuring the issue is ultimately decided by the Supreme Court, experts said.
Business
OpenAI will shut down its Sora tool
OpenAI plans to shut down its Sora text-to-video tool, a stunning move that comes three months after Walt Disney Co. pledged to invest $1 billion in the artificial intelligence company and allow the use of dozens of beloved characters.
The San Francisco-based company did not disclose why it was shutting down the tool or the timeline for its phaseout. In a post Tuesday on the Sora account on X, the company said it knew the news was “disappointing.”
“To everyone who created with Sora, shared it, and built community around it: thank you,” the post said.
OpenAI’s pivot comes as the company has been in discussions with Disney to formalize their arrangement, but no deal has been reached, according to a source familiar with the matter who was not authorized to comment.
Although Disney had pledged to make the huge investment, the company had not yet made any payments to OpenAI, this person said. OpenAI had not paid any fees to license Disney characters.
A Disney spokesperson said in a statement that the company respected OpenAI’s decision to shift its priorities away from video generation.
“We appreciate the constructive collaboration between our teams and what we learned from it, and we will continue to engage with AI platforms to find new ways to meet fans where they are while responsibly embracing new technologies that respect IP and the rights of creators,” the spokesperson said.
The emergence of Sora had roiled Hollywood, particularly because AI and compensation for the use of actors’ likenesses and voices had been central issues in the 2023 strike.
Performers guild SAG-AFTRA had said at the time of the Disney-OpenAI announcement that it would “closely monitor the deal and its implementation to ensure compliance with our contracts and with applicable laws protecting image, voice, and likeness.”
OpenAI first previewed Sora in 2024, and the realism of the tool’s AI-generated videos grabbed audiences at a time when competing video generation apps struggled.
The text-to-video platform enabled users to create short videos with different styles, voices and dedicated features such as “storyboard,” which enabled users to weave together prompts to make longer videos with consistent characters — something that wasn’t possible before.
In September, OpenAI launched Sora as a dedicated app to create and share AI-generated videos with friends, which many viewed as a social networking app modeled after TikTok.
The app’s remix feature enabled users to superimpose the likeness of their friends or celebrities into existing AI-generated video or create new ones. Sam Altman, the chief executive of OpenAI, encouraged users to slap his likeness onto AI-generated scenes and other pop-culture videos.
The lax approach to copyright allowed the re-creation of dead celebrities and of copyrighted characters from titles including WWE and South Park, which OpenAI said it would allow on its platform unless rights holders opted out.
Despite hitting 1 million downloads in a week, the app lost its sheen, as regular users found little everyday use for a dedicated AI video app. As legal challenges mounted, Sora strengthened its copyright guardrails, and “content violation” warnings became a routine response to user requests.
But the AI space has become increasingly crowded. OpenAI’s smaller rival Anthropic has gained ground by offering its AI services to enterprises rather than just to consumers, and its Claude models have become especially popular for coding tasks.
Since Sora’s release, competitors such as Google’s Veo and ByteDance’s Seedance have also rushed into the AI video generation market.
Times staff writer Meg James contributed to this report.