This is Optimizer, a weekly newsletter sent every Friday from Verge senior reviewer Victoria Song that dissects and discusses the latest gizmos and potions that swear they’re going to change your life. We’ll be taking a break next week and will be back March 20th. Opt in for Optimizer here.
Technology
Trump’s surgeon general nominee is running the wellness grifter playbook perfectly
On the surface, the wellness to MAHA pipeline can appear baffling. How does one get from wanting to be healthy to eschewing vaccines, drinking raw milk, and opting for beef tallow over sunscreen? The simple answer would be: widespread misinformation on online platforms, particularly from influencers.
I’d argue the real answer is slightly more nuanced — and something that I’ve been ruminating over since last week’s confirmation hearing for Casey Means.
Means is President Trump’s controversial nominee for surgeon general, a role often described as the “nation’s doctor.” It entails being America’s foremost spokesperson on public health, as well as educating the public using the best scientific information available. You’re probably most familiar with the surgeon general’s warning on cigarette packs and alcohol labels.
Some of the backlash is because Means currently doesn’t hold an active medical license, is not currently practicing, and never finished her surgical residency — all of which are generally considered prerequisites for the post. She’s primarily known for being a wellness influencer with, as many of her detractors point out, dubious beliefs and an inconsistent record of disclosing financial relationships with brands. For example, Trump’s former Surgeon General Jerome Adams has penned an opinion piece directly criticizing her stance on vaccines and her history as a tech entrepreneur who recommends supplements. As the cofounder of Levels, a continuous glucose monitor (CGM) startup that’s aimed toward non-diabetics, Means has frequently used her platform to promote CGM use. That’s not inherently bad, but there’s a lack of evidence for its use in non-diabetic populations. There’s also no consensus among experts on how to interpret CGM data in non-diabetics. Aside from a lack of qualifications, Means’ influencer background presents several ethical red flags.
Means tempered her beliefs during her confirmation hearing, despite having previously challenged vaccines, railed against birth control, and endorsed raw milk. I could probably write a separate Optimizer about each of those stances. But what I want to focus on here is the wellness to MAHA pipeline. Not only is it wildly profitable, but it’s got a very specific playbook.
Step one: establish credibility with selective science
If there’s one thing wellness influencers do well, it’s mixing actual science-based facts with emotional truths to lead their audience to potentially misleading conclusions. This is the most important part of any wellness influencer’s game.
Take Means’ book Good Energy, a New York Times bestseller cowritten with her brother Calley Means. The latter is a key figure in the MAHA movement and serves as a senior adviser to RFK Jr. The book’s primary thesis is that metabolic dysfunction is at the root of every ailment you can think of, from acne to cancer. The front half of the book cites many true things about metabolism. For example, it goes into how mitochondria — the ol’ “powerhouse of the cell” — turn nutrients into cellular energy. She explains in digestible terms how mitochondria produce ATP, what ATP is used for in various bodily processes, and then goes into how certain factors of modern life may lead to “mitochondrial dysfunction.” She also goes into concepts like insulin resistance — when your body, over time, gets less responsive to the hormone, leading to less efficient use of blood sugar — and how it is heavily tied to conditions like diabetes, obesity, cardiovascular disease, and non-alcoholic fatty liver disease. If you remember high school biology, or even search these basic claims on Google, much of the information passes the smell test.
Throughout the book, Means also dispenses some solid, common-sense health advice. Things like sleeping eight hours a day, exercising, and opting for whole, unprocessed foods whenever possible. At the end of each chapter, Means includes a link to her references. Combined with Casey Means’ background as a graduate from Stanford School of Medicine, this can easily give the impression of a well-researched book by an expert with ample scientific backing.
The problem is those facts are interspersed with less convincing assertions, which all get tied together in service of questionable or misleading conclusions. For example, the book’s premise is that metabolic issues are often the culprit behind many ailments. That means, so long as you practice “good energy” habits that keep your mitochondria functioning, you can prevent cancer and a long list of other illnesses. The reality is scientists find combating mitochondrial dysfunction extremely challenging.
Here’s another example: erectile dysfunction. According to Means, erectile dysfunction is “generally rooted in metabolic disease, with reduced blood flow to the capillaries and nerves of the penis being a key factor, driven by the impact of insulin resistance on forming arterial blockages (called atherosclerosis) and blood vessel dilation.” She quotes another doctor, Sara Gottfried, as saying that erectile dysfunction is a “neon sign” for metabolic disorders. In the scientific references for the chapter, Means quotes her own blog for Levels on the subject as well as some other studies supporting some of the claims.
It is true that metabolic issues can lead to erectile dysfunction. But there are many other causes too. Many a standup comedian has opined about how performance anxiety, stress, or even too much alcohol can impact sexual performance on a given night. Certain medications or conditions like Parkinson’s disease can also contribute to it. Meanwhile, Gottfried is another doctor/wellness influencer who practices functional medicine like Means. Functional medicine is a controversial healthcare approach that attempts to take a holistic look at treatment, focusing on the root cause of a health problem instead of managing symptoms. There’s nothing inherently wrong with that, and some medical institutions like Cleveland Clinic have come to embrace it in recent years. But its critics have accused functional medicine of being a thinly disguised type of alternative medicine that depends on unnecessary blood testing, restrictive diets, and a ton of expensive supplements.

This is a lot of nuance that could easily fly over a reader’s head if they’re not familiar with the subjects at hand. There are some scientific truths in the mix, which give credence to other suspicious assertions that Means will make down the line.
By the end of the book, you might not blink twice that oral antibiotics, birth control, ibuprofen, fluoride toothpaste, scented candles, and perfume are listed as toxins. You might even find yourself at a dinner party, sharing a “factoid” that C-sections are suboptimal for a baby’s gut microbiome, because the infant doesn’t get the chance to ingest the mother’s vaginal organisms. (The truth is more nuanced.) Heck, you might just heed Means’ advice and rehome your pet if they keep interrupting your sleep by daring to sleep on the bed. All of that is “bad energy.”
Step two: cast doubt on institutions
In her book and across her platforms, Means has touted the same origin story. After becoming disillusioned with the medical establishment, Means left to find a better way. To tell that story, she uses powerful anecdotes of her mother’s frustrating experience with the traditional medical establishment — as well as her own experiences as a surgical resident.
Means then pairs those emotional stories with other truths. Like the fact that pharmaceutical companies are greedy and do lobby legislators in Washington. Doctors have said they feel pressured to “overtreat” patients due to a number of factors, including financial incentives. From there, she makes the assertion that conventional medicine might be alright for treating acute ailments (e.g., saving your life after a car accident), but you should ignore doctors for chronic illnesses. Chapter three of Means’ book Good Energy is literally titled “Trust yourself, not your doctor.”
This is a potent narrative. Never mind that Means hedges in her book, saying that she “deeply respects doctors.” The seed of doubt has been planted. It’s not a huge logical leap to “This is the secret the establishment is not telling you.” Or, “You don’t need all those medications because the real profit is in keeping you sick.” It’s right there on Means’ website. In a section detailing her controversies, Means asserts that she’s considered controversial in part because “she criticizes ‘sick care’ medicine for profiting from disease management, calls for reform of the Farm Bill, pharmaceutical incentives, food culture, and industrial agriculture.” Here, she’s painted herself as a warrior for health, someone who challenges the status quo because she couldn’t bring herself to participate in the system.
The Los Angeles Times reported on apparent holes in Means’ origin story, including that her former department chair said she quit her residency because of anxiety, not a disillusionment with the system.
But again, this requires the average person to dig deep on their own. All the influencer has to do is present themselves as a more genuine truthteller, exhort you to “do your own research” from links they provide, and offer up a product that will empower you to “take your health into your own hands” — a narrative RFK Jr. has used as well.
Conveniently, there’s an easy built-in counter for anyone who tries to refute these claims with information from reputable institutions: They are corrupt and lying to you.
Step three: offer ‘simple’ solutions that lead to profit
At this point, Means has established that she does research (even if the conclusions are at times questionable) and has a medical background. She’s consistently messaged that medical institutions aren’t trustworthy. The last step is to tell her audience she has the real answer to why everyone is sick (metabolic dysfunction) and how to fix it (several products).
As a wellness influencer, Means sells a lot of things. First and foremost, her philosophy of “good energy” and metabolic health, which has spawned a book and newsletter, complete with affiliate links for the “clean” products and supplements she recommends. In one of her “Good Energy” newsletters, Means recommends blood tests from Function Health — a standard part of her methodology — plus supplements like WeNatal and ENERGYBits, a form of spirulina algae and chlorella. (Never mind that ENERGYBits was eviscerated both on Shark Tank and by the American Council on Science and Health for allegedly citing junk science and for misleading product marketing. Studies have also not conclusively found health benefits to spirulina supplements.)

Means has financial relationships with all three brands, including newsletter sponsorships and partnership fees. Influencers are expected to be selling something, but the problem is that there were no disclosures for any of those three brands in that newsletter.
That’s not an outlier, either. While reading the Good Energy book, the only brand relationship I saw Means disclose was that she cofounded Levels: once in the text itself, and once in the acknowledgements. Conversely, she recommended Function Health three times in the book and never disclosed that she’s an investor. Other brands she promotes but doesn’t disclose relationships to in the book include, once again, WeNatal, as well as Daily Harvest, a health food delivery service.

An Associated Press investigation claimed that while Means did disclose newsletter sponsors, she failed to disclose affiliate links in a buying guide on her site. Meanwhile, Public Citizen, a nonprofit consumer advocacy organization, wrote a letter to the FTC calling on the agency to investigate Means for allegedly failing to follow advertising disclosure standards. The nonprofit found that, with regard to affiliate links, Means neglected to disclose financial relationships 56 percent of the time.
The problem with wellness trends
It’s not guaranteed that Means will become surgeon general, but you can already see the impact of this common influencer playbook shifting public health. This strategy is why gray market peptides are popular. It’s why you see people starting to doubt vaccines and other medical treatments with decades of evidence.
It has an impact on health tech too. It’s why we’re starting to see gadgets that seem to spring directly from wellness trends. Hormone balancing and inflammation are two dubious wellness trends that are likely why I saw so many urine, blood, and saliva testing kits pop up at CES. Metabolism and nutrition are two areas that wearable and fitness tech makers are diving into with AI coaches.
The scariest thing about Casey Means and other wellness influencers is that some of what they say is true. They are rightfully homing in on genuine frustrations people have with our broken healthcare system and the overwhelming amount of contradictory information online. But where science says “the truth is complicated,” wellness influencers propose a simple solution: All you have to do is take out your wallet.
Technology
Microsoft starts removing Copilot buttons from Windows 11 apps
Microsoft is starting to remove “unnecessary” Copilot buttons from its Windows 11 apps. In the latest version of the Notepad app for Windows Insiders, Microsoft has removed the Copilot button in favor of a “writing tools” menu. The Copilot button in the Snipping Tool app also no longer appears when you select an area to capture.
The change is part of “reducing unnecessary Copilot entry points, starting with apps like Snipping Tool, Photos, Widgets and Notepad,” which Microsoft promised as part of its broader plan to fix Windows 11. While the Copilot buttons are being removed, the underlying AI features appear to be here to stay.
The Copilot button has been removed from Notepad, but the writing tools replacement still uses AI-powered features and presents what looks like the same menu of options as before. I still think these features are largely unnecessary in what’s supposed to be a lightweight text app, but removing the superfluous Copilot branding is a good first step.
Technology
AI chatbots refilling psych meds sparks debate
If you have ever waited weeks just to renew a mental health prescription, you already know how frustrating the system can feel. Now imagine handling that refill through a chatbot instead of a doctor.
That kind of thing is already starting to happen. In Utah, a new pilot program is allowing an artificial intelligence system from Legion Health to renew certain psychiatric medications without direct approval from a physician each time. State officials say this could speed things up and reduce costs.
Many psychiatrists are not convinced. They are asking whether this actually solves the problem it claims to fix.
Sign up for my FREE CyberGuy Report
- Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox.
- For simple, real-world ways to spot scams early and stay protected, visit CyberGuy.com – trusted by millions who watch CyberGuy on TV daily.
- Plus, you’ll get instant access to my Ultimate Scam Survival Guide free when you join.
Utah launches AI chatbot to renew select psychiatric prescriptions, raising questions about safety and oversight. (pocketlight/Getty Images)
How the AI prescription system works
Before this starts sounding like a robot psychiatrist, it’s worth noting that the program is tightly limited. The AI only renews a short list of lower-risk medications that a doctor has already prescribed. These include commonly used antidepressants like Prozac, Zoloft and Wellbutrin.
To qualify, patients must meet strict requirements. You need to be stable on your current medication. Recent dosage changes or a psychiatric hospitalization will disqualify you. You also need to check in with a healthcare provider after a set number of refills or within a certain time frame.
During the process, the chatbot asks about symptoms, side effects and warning signs such as suicidal thoughts. If anything raises concern, it sends the case to a real doctor before approving a refill. According to an agreement filed with Utah’s Office of Artificial Intelligence Policy, the pilot includes strict safeguards, including human review thresholds and automatic escalation for higher-risk cases. The system cannot prescribe new medications or manage drugs that require close monitoring. As a result, it leaves out many complex conditions from the pilot.
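The eligibility rules and escalation path described above amount to a simple triage policy. Here is a minimal sketch of that logic in Python; the function name, medication list, and parameters are all illustrative assumptions, not details of Legion Health's actual system.

```python
# Hypothetical sketch of the pilot's refill triage as described in the article:
# only a short list of lower-risk medications qualifies, disqualifiers route to
# a physician, and any warning sign triggers automatic escalation.
# All names and values here are illustrative assumptions.

LOWER_RISK_MEDS = {"fluoxetine", "sertraline", "bupropion"}  # generics for Prozac, Zoloft, Wellbutrin

def triage_refill(med, stable_on_med, recent_dose_change,
                  recent_hospitalization, warning_signs):
    """Return 'approve' for routine refills, 'escalate' for human review."""
    if med not in LOWER_RISK_MEDS:
        return "escalate"  # system cannot manage drugs needing close monitoring
    if not stable_on_med or recent_dose_change or recent_hospitalization:
        return "escalate"  # stability disqualifiers go to a real doctor
    if warning_signs:      # e.g. suicidal thoughts or concerning side effects
        return "escalate"
    return "approve"

# A stable patient on a qualifying medication with no warning signs is approved:
print(triage_refill("sertraline", True, False, False, []))  # approve
```

The key design point, reflected in the article's description, is that the system only ever makes one of two calls: approve a routine refill or hand the case to a human. It never prescribes or adjusts anything on its own.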
Why some experts are pushing back
Even with those guardrails, many psychiatrists are uneasy. Brent Kious, a psychiatrist and professor at the University of Utah School of Medicine, has questioned whether AI systems like this actually solve the access problem they are designed to address.
He has suggested that the benefits of an AI-based refill system may be overstated, especially since patients must already be stable and under care to qualify. Kious has also raised concerns about how much these systems rely on self-reported answers. Patients may not recognize side effects, may answer inaccurately, or may adjust their responses to get the outcome they want.
He has further questioned whether current AI tools can safely handle even routine parts of psychiatric care, noting that treatment decisions often depend on factors that go beyond simple screening questions. He has also pointed to a lack of transparency in how these systems operate, which can make it harder for doctors and patients to fully trust them.
A new pilot program allows AI to handle some mental health medication refills without direct doctor approval. (Sezeryadigar/Getty Images)
The promise behind the technology
Supporters of the program are focused on access. A lot of people in Utah still struggle to get mental health care. Wait times can stretch for weeks. In some areas, there simply are not enough providers available. The idea is that AI can take care of routine refill requests so doctors have more time to focus on patients with more complex needs. That could help take some pressure off the system.

Legion Health is also leaning into convenience. The service is expected to cost about $19 a month and is designed to make refills quicker and easier for patients who qualify. From a big-picture view, that could help. From a patient’s point of view, the tradeoff may feel a little more complicated. We reached out to Legion Health for comment, but did not hear back before our deadline.
What this means to you
If you rely on mental health medication, this kind of system could change how you manage your care. You may be able to get refills more quickly if your condition is stable and your treatment plan is not changing. At the same time, this does not replace your doctor. It does not handle new diagnoses or complex decisions. It also adds another layer between you and your care. Instead of a conversation, you are interacting with a system that depends on how you answer a series of questions. Mental health treatment often depends on small details. Changes in mood, sleep or behavior can matter more than a simple yes or no response. That is where some experts believe human care still has a clear advantage.
The bigger question about AI in healthcare
This pilot is only one step in a much larger shift. Utah is already experimenting with AI in other areas of healthcare. Companies like Legion are signaling plans to expand beyond a single state. What starts with simple refills could eventually move into more complex decisions. That is where the conversation becomes more urgent. Is this a practical way to improve access to care, or does it risk reducing something deeply personal into a transaction driven by software?
Psychiatrists question whether AI prescription refills address access issues or create new risks for patients. (SDI Productions/Getty Images)
Kurt’s key takeaways
There is no question that access to mental health care needs improvement. Long wait times and limited availability are real problems that affect millions of people. AI may help in specific situations, especially when the task is routine and the patient is stable. Still, convenience should not be confused with quality. For now, this system is narrow in scope and closely monitored. That makes it easier to test. It also highlights how early we are in this transition. The technology will continue to evolve. The real question is whether the safeguards, oversight and transparency will evolve at the same pace.
Would you feel comfortable letting a chatbot handle part of your mental health care, or is that a line you do not want technology to cross? Let us know by writing to us at Cyberguy.com.
Copyright 2026 CyberGuy.com. All rights reserved.
Technology
ChatGPT has a new $100 per month Pro subscription
OpenAI has announced a new version of its ChatGPT Pro subscription that costs $100 per month. The new Pro tier offers “5x more” usage of its Codex coding tool than the $20 per month Plus subscription and “is best for longer, high-effort Codex sessions,” OpenAI says.
The company is introducing the new tier as it tries to win over users from Anthropic and its popular Claude Code tool. ChatGPT’s $100 per month option will directly compete with Anthropic’s “Max” tier for Claude, which costs the same price. It also offers a middle ground between the $20 per month Plus tier and the $200 version of the Pro tier.
(Yes, there are now two tiers of “Pro”; while the new tier “still offers access to all Pro features,” OpenAI says that the more expensive one has even higher usage limits.)
According to OpenAI, ChatGPT Plus “will continue to be the best offer at $20 for steady, day-to-day usage of Codex, and the new $100 Pro tier offers a more accessible upgrade path for heavier daily use.” OpenAI also offers an $8 per month Go tier and a free tier.