
Chicago nurse is finally free of COVID-19-related PTSD and depression after electrical brain tapping therapy

A Chicago nurse has been liberated from her own mind, thanks to a brain-tapping technology called deep TMS.

Gulden, who requested to omit her surname for privacy reasons, worked as a nurse for more than 40 years before COVID-19 rocked the hospital system and took a toll on her mental health.

The mother of four worked at Advocate South Suburban Hospital in Hazel Crest, Illinois, as an ICU and ER nurse.

In an interview with Fox News Digital, Gulden described the “massive chaos” that the 2020 coronavirus pandemic brought to the hospital.

“No matter what we did, it was like a failure,” she said. “We were not prepared [for] the onslaught of patients.”

Housekeeper Tonia Harvey changes a bed in the Roseland Community Hospital intensive care unit after a COVID-19 patient passed away, April 17, 2020. (E. Jason Wambsgans/Chicago Tribune/Tribune News Service via Getty Images)

“The predictable outcome of coming in through the ER and leaving in a body bag was just devastating.”

Despite her many years of medical work, New York City-born Gulden admitted that she “could not cope with it.” 

By September 2020, she was a “different person,” she said.

“I was on autopilot. I lived at work and when I came home, I was not functioning … My organization and concentration skills were gone.” 

“It was very, very unlike me, because I’m a single mom. I’ve raised four kids all by myself … but I started to notice that I could not let go of what had transpired during the day.”

Gulden told her primary care provider about her symptoms, including “horrible nightmares” that prevented her from sleeping and constant “weeping” that came “from her soul.”

Gulden, pictured here, said that working in a hospital during the coronavirus pandemic turned her into a “different person.” (Melanie Eilers)

In the span of two years, the doctor prescribed Gulden eight different medications for sleep, PTSD and major depressive disorder, along with cognitive behavioral therapy — but nothing worked.

Even after the pandemic began to slow down, the nurse described how she hit a “spiral” when she realized COVID-19 created a “chain reaction.”

“[There] was a 51-year-old who had bilateral tumors and needed a mastectomy,” she shared. “She’d gone through all her chemo and radiation, and she was ready for her mastectomy, but she had to wait like 11 months.”

Added Gulden, “By the time she came back, her tumors had grown back, and that’s when I was like, This is never going to be over.”

Gulden mentioned that screenings for major health complications were down at least 84% during the pandemic, feeding into a “ripple” of patients who received care too late.

Tamara Jones gives antibiotics to James Davis as he recovers from COVID-19 in the intensive care unit at Roseland Community Hospital on Dec. 16, 2020, in Chicago, Illinois. (Scott Olson/Getty Images)

The nurse said through tears that she decided to leave the hospital and retire, since she “just couldn’t function there.”

After leaving, she fell into a “hibernation state” of sleeping 16 to 18 hours a day.

“The only reason I got up was to go to the bathroom,” she said. “And I’m embarrassed to say I would go weeks without showering.”

“I lost 54 pounds — I got to the point where I couldn’t eat, because everything in the refrigerator reminded me of what was on patients’ trays.”

Gulden’s “incredibly vivid, horrible nightmares” continued along with other symptoms, including the inability to stay awake. She called it a “complete shutdown.”

Gulden received deep TMS treatment at Relief Mental Health in Orland Park, Illinois. (Melanie Eilers)

After Gulden spent three years in “hibernation,” a friend introduced her to a new type of mental health treatment called deep TMS (transcranial magnetic stimulation) — a magnetized tapping of the brain used to treat various disorders and diseases.

Gulden agreed to visit Dr. Teresa Poprawski, the chief medical officer of Relief Mental Health in Orland Park, Illinois, who helped “put the threads together” on what was triggering her PTSD and other symptoms.

What is deep TMS?

Dr. Aaron Tendler, a psychiatrist and chief medical officer of BrainsWay, a brain disorder treatment company, discussed how the therapy works in an interview with Fox News Digital.

Tendler is based in West Palm Beach, Florida, and was not involved in Gulden’s care. He said the brain is primarily an “electrochemical organ” that sends messages to different parts of the body.

Most symptoms, including depression and anxiety, are controlled by changes in the brain, Tendler said, which can be treated electrically.

Deep TMS is a more “targeted” approach than electroconvulsive (“electroshock”) therapy, he told Fox News Digital.

Gulden described the sensation of deep TMS as “tapping on specific parts of the brain.” (iStock; BrainsWay)

“Transcranial magnetic stimulation uses the principle of electromagnetic induction, where magnetic pulses induce an electrical current inside of neurons,” he said.

“Essentially, we are changing the electrical activity in a group of neurons in an area of the brain.”

These magnetic pulses only stimulate a specific area of the brain for “a brief period of time,” he said, with treatments lasting anywhere from six to 20 minutes. Patients undergo treatments for a series of days, depending on what’s necessary.

Tendler described the therapy as a “learning experience” that changes “the state of the brain” through repetitive treatment.

Deep TMS interrupts activity in the brain that is creating unwanted patterns, an expert said. (BrainsWay)

Gulden received deep TMS treatments five days a week for six to eight weeks. She described the sensation as “tapping on specific parts of the brain.”

After three weeks, she reported a noticeable difference in her cognitive state.

“I realized, ‘Oh my gosh, it’s been three years since I’ve heard the birds,’” she said. “I see life again. I see my flowers. Before, I couldn’t even look at the flowers because they just reminded me of funerals.”

Gulden described her quality of life as “just so much better” since receiving treatment.

She still attends cognitive behavioral therapy sessions to hone her coping skills, she said.

“And if I need deep TMS again, I will be back there in a heartbeat,” she added.

Deep TMS is covered by “every insurer” across the country, according to one expert. (BrainsWay)

‘Very useful tool’

Gulden’s goal is to teach others to not feel ashamed about seeking help for their mental health struggles.

“I want people to know that there are interventions,” she said. 

“The meds did not work for me. Had I not had this treatment today, I don’t know where I’d be.”

Although deep TMS technology was developed in the 1980s, the first treatment application for depression was FDA-cleared in 2009. (BrainsWay)

Most patients experience a 40% to 50% improvement after four weeks of treatment, according to Tendler.

After completing a typical course of 36 treatments, patients have shown 75% to 80% improvement, he said.

Deep TMS is “not a cure,” Tendler said — but many patients are able to regain normal function for months or years at a time.

The electrical therapy doesn’t have the potential side effects that antidepressants and other treatments can cause, Tendler said, noting that the brain manipulation is “temporary.”

“Had I not had this treatment today, I don’t know where I’d be,” Gulden said. (Melanie Eilers)

“I know this might sound like a disadvantage, but it is also an advantage,” he said. “We don’t do anything to the person’s brain that’s permanent. We’re changing the state of the brain temporarily.”

He added, “Generally, we get you out of the state that you were in … and then nature takes its course.”

Deep TMS can also be paired with other medications, such as antidepressants, Tendler added.

Fox News medical contributor Dr. Marc Siegel cautioned that deep TMS could potentially cause some cognitive and behavioral changes, but called it a “very useful tool” overall. (Dr. Marc Siegel)

Fox News medical contributor Dr. Marc Siegel cautioned that deep TMS could potentially cause some cognitive and behavioral changes, but called it a “very useful tool” overall.

He told Fox News Digital that deep TMS is also “very useful for movement disorders like Parkinson’s, with a high rate of success.”  

“[Deep TMS is] still being investigated for various purposes to interrupt aberrant nerve conduction,” he said.

For other medical professionals suffering from mental health issues, Gulden stressed the importance of having a “healthy health care team,” especially following the pandemic.

“I don’t care how tough you think you are,” she said. “You need to know what the signs are, and you need to know what treatments are available.”

For more Health articles, visit foxnews.com/health.

Punch the monkey, viral star, experiences dramatic breakthrough among zoo mates

In a dramatic turn of events that’s captured the attention of animal lovers worldwide, Punch — the young macaque at a zoo in Japan famous for his inseparable bond with a stuffed orangutan toy — has reached a major milestone in his journey toward social integration.

On Thursday, visitors and staff at the Ichikawa Zoological and Botanical Garden witnessed a breakthrough: Punch was seen cuddling with and hitching a ride on the back of a fellow macaque.

Punch’s story began with hardship. He was abandoned by his mother shortly after his birth in July 2025 — and to ensure his survival, zookeepers stepped in to hand-rear the primate.

On Jan. 19, 2026, the zoo officially began the process of reintegrating Punch into the “monkey mountain” enclosure.

The transition was initially fraught with tension. 

Punch’s story began with hardship when he was abandoned by his mother shortly after he was born. To help him, zookeepers gave him a stuffed toy that he began dragging around everywhere he went.  (David Mareuil/Anadolu via Getty Images)

As a hand-reared infant, Punch was bullied and ignored by the established group of monkeys.

He was often seen huddled alone with his orange plush companion while the rest of the troop interacted.

In an official statement released Feb. 27, the Ichikawa Zoological and Botanical Garden detailed the meticulous care behind this process.

Previous viral videos showed Punch bullied by the rest of the troop, running to his plushy toy for comfort. (David Mareuil/Anadolu via Getty Images)

“From an animal welfare perspective, our primary goal is to reintegrate Punch with the troop,” the zoo said. 

The strategy involved nursing Punch within the enclosure, so the troop could recognize him as one of their own, and pairing him with a gentle young female macaque prior to his full release to build his confidence.

The latest footage, captured by X user @tate_gf, suggested the zoo’s patience is paying off. 

The video shows Punch seeking physical contact not from his toy, but from another monkey — eventually climbing onto its back in a “piggyback ride,” a vital social behavior for young macaques.

The zoo’s strategy appears to be paying off: Punch, shown at far left, was recently seen riding on the back of a fellow macaque. (David Mareuil/Anadolu via Getty Images)

While Punch still carries his stuffed toy for comfort during moments of perceived danger, the zoo remains optimistic about his progress. 

The organization cited the 2009 case of Otome, another hand-reared macaque who eventually outgrew her stuffed toy, successfully integrated with the troop and went on to raise four offspring of her own.

The zoo has had crowds coming to see Punch, with hundreds of people lining up to get inside to see the young star, according to reports. 

“I’m hoping Punch has a good life like everybody else does, and think he’s a cute little guy,” one person commented online. 

“Such a precious baby,” another person wrote. 

ChatGPT could miss your serious medical emergency, new study suggests

This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).

Artificial intelligence has been touted as a boon to healthcare, but a new study has revealed its potential shortcomings when it comes to giving medical advice.

In January, OpenAI launched ChatGPT Health, the medical-focused version of the popular chatbot tool. 

The company introduced the tool as “a dedicated experience that securely brings your health information and ChatGPT’s intelligence together, to help you feel more informed, prepared and confident navigating your health.”

But researchers at the Icahn School of Medicine at Mount Sinai have found that the tool failed to recommend emergency care for a “significant number” of serious medical cases.

The study, published in the journal Nature Medicine on Feb. 23, aimed to explore how ChatGPT Health — which is reported to have about 40 million users daily — handles situations where people are asking whether to seek emergency care.

Artificial intelligence has been touted as a boon to healthcare, but a new study has revealed its potential shortcomings when it comes to giving medical advice. (iStock)

“Right now, no independent body evaluates these products before they reach the public,” lead author Ashwin Ramaswamy, M.D., instructor of urology at the Icahn School of Medicine at Mount Sinai in New York City, told Fox News Digital.

“We wouldn’t accept that for a medication or a medical device, and we shouldn’t accept it for a product that tens of millions of people are using to make health decisions.”

Emergency scenarios

The team created 60 clinical scenarios across 21 medical specialties, ranging from minor conditions to true medical emergencies.

Three independent physicians then assigned an appropriate level of urgency for each case, based on published clinical practice guidelines from 56 medical societies.

The researchers conducted 960 interactions with ChatGPT Health to see how the tool responded, taking into account gender, race, barriers to care and “social dynamics.”

While “clear-cut emergencies” — such as stroke or severe allergy — were generally handled well, the researchers found that the tool “under-triaged” many urgent medical issues.  

The team created 60 clinical scenarios across 21 medical specialties, ranging from minor conditions to true medical emergencies. (iStock)

For example, in one asthma scenario, the system acknowledged that the patient was showing early signs of respiratory failure — but still recommended waiting instead of seeking emergency care.

“ChatGPT Health performs well in medium-severity cases, but fails at both ends of the spectrum — the cases where getting it right matters most,” Ramaswamy told Fox News Digital. “It under-triaged over half of genuine emergencies and over-triaged roughly two-thirds of mild cases that clinical guidelines say should be managed at home.”

Under-triage can be life-threatening, the doctor noted, while over-triage can overwhelm emergency departments and delay care for those in real need.

Researchers also identified inconsistencies in suicide risk alerts. In some cases, the tool directed users to the 988 Suicide and Crisis Lifeline in lower-risk scenarios; in others, it failed to offer that recommendation even when a person described suicidal ideation.

“The suicide guardrail failure was the most alarming,” study co-author Girish N. Nadkarni, M.D., chief AI officer of the Mount Sinai Health System, told Fox News Digital.

ChatGPT Health is designed to show a crisis intervention banner when someone describes thoughts of self-harm, the researcher noted.

OpenAI launched ChatGPT Health, the medical-focused version of the popular chatbot tool, in January 2026. (Gabby Jones/Bloomberg via Getty Images)

“We tested it with a 27-year-old patient who said he’d been thinking about taking a lot of pills,” Nadkarni said. “When he described his symptoms alone, the banner appeared 100% of the time. Then we added normal lab results — same patient, same words, same severity — and the banner vanished.” 

“A safety feature that works perfectly in one context and completely fails in a nearly identical context … is a fundamental safety problem.”

The researchers were also surprised by the social influence aspect.

“When a family member in the scenario said ‘it’s nothing serious’ — which happens all the time in real life — the system became nearly 12 times more likely to downplay the patient’s symptoms,” Nadkarni said. “Everyone has a spouse or parent who tells them they’re overreacting. The AI shouldn’t be agreeing with them during a potential emergency.”

Fox News Digital reached out to OpenAI, creator of ChatGPT, requesting comment.

Physicians react

Dr. Marc Siegel, Fox News senior medical analyst, called the new study “important.” 

“It underlines the principle that while large language models can triage clear-cut emergencies, they have much more trouble with nuanced situations,” Siegel, who was not involved in the study, told Fox News Digital. 

ChatGPT and other LLMs can be helpful tools, a doctor said, but they “should not be used to give medical direction.” (iStock)

“This is where doctors and clinical judgment come in — knowing the nuances of a patient’s history and how they report symptoms and their approach to health.”

ChatGPT and other LLMs can be helpful tools, Siegel said, but they “should not be used to give medical direction.”

“Machine learning and continued input of data can help, but will never compensate for the essential problem – human judgment is needed to decide whether something is a true emergency or not.”

Dr. Harvey Castro, an emergency physician and AI expert in Texas, echoed the importance of the study, calling it “exactly the kind of independent safety evaluation we need.”

“Innovation moves fast. Oversight has to move just as fast,” Castro, who also did not work on the study, told Fox News Digital. “In healthcare, the most dangerous mistakes happen at the extremes, when something looks mild but is actually catastrophic. That’s where clinical judgment matters most, and where AI must be stress-tested.”

Study limitations

The researchers acknowledged some potential limitations in the study design.

“We used physician-written clinical scenarios rather than real patient conversations, and we tested at a single point in time — these systems update frequently, so performance may change,” Ramaswamy told Fox News Digital.

Additionally, most of the missed emergencies happened in situations where the danger depended on how the condition was changing over time. It’s not clear whether the same problem would happen with acute medical emergencies.

Because the system had to choose just one fixed urgency category, the test may not reflect the more nuanced advice it might give in a back-and-forth conversation, the researchers noted. 

ChatGPT Health is designed to show a crisis intervention banner when someone describes thoughts of self-harm. (iStock)

Also, the study wasn’t large enough to confidently detect small differences in how recommendations might vary by race or gender.

“We need continuous auditing, not one-time studies,” Castro noted. “These systems update frequently, so evaluation must be ongoing.”

‘Don’t wait’

The researchers emphasized the importance of seeking immediate care for serious issues.

“If something feels seriously wrong — chest pain, difficulty breathing, a severe allergic reaction, thoughts of self-harm — go to the emergency department or call 988,” Ramaswamy advised. “Don’t wait for an AI to tell you it’s OK.”

The researchers noted that they support the use of AI to improve healthcare access, and that they didn’t conduct the study to “tear down the technology.”

“These tools can be genuinely useful for the right things — understanding a diagnosis you’ve already received, looking up what your medications do and their side effects, or getting answers to questions that didn’t get fully addressed in a short doctor’s visit,” Ramaswamy said. 

“That’s a very different use case from deciding whether you need emergency care. Treat them as a complement to your doctor, not a replacement.”

Castro agreed that the benefits of AI health tools should be weighed against the risks.

“AI health tools can increase access, reduce unnecessary visits and empower patients with information,” he said. “They are not inherently unsafe, but they are not yet substitutes for clinical judgment.”

“This study doesn’t mean we abandon AI in healthcare,” he went on. “It means we mature it. Independent testing and stronger guardrails will determine whether AI becomes a safety net or a liability.”
