First came the AI beauty pageant. Then the AI music contests. Now, there is an award for AI Personality of the Year — perhaps the inevitable next step for the AI influencer economy as it transforms from quirky novelty into a serious and lucrative industry.
Technology
Creepy robot mom that gives birth is training future midwives
Most hospital training labs use basic dummies or simple mannequins to teach medical skills. Students practice procedures, learn techniques and move on to real patients later. But a new childbirth simulator called Mama Anne takes training to a very different level. This lifelike robot blinks, breathes and even talks while helping midwifery students practice delivering babies before they ever step into a real delivery room. And if the idea of a robot going into labor feels a little creepy, you are not alone.
At York St John University in York, England, educators have introduced the simulator as part of a new approach to hands-on medical training. The technology allows students to experience complex labor scenarios in a safe environment where mistakes become learning moments instead of medical emergencies. And yes, the robot actually gives birth.
Sign up for my FREE CyberGuy Report. Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide — free when you join my CYBERGUY.COM newsletter.
ROBOTS POWER BREAKTHROUGH IN PREGNANCY RESEARCH, BOOSTING IVF SUCCESS RATES
Mama Anne is a high-fidelity childbirth simulator used to train midwifery students in realistic labor and delivery scenarios before they work with real patients. (Laerdal Medical)
How the robot childbirth simulator trains future midwives
The simulator known as Mama Anne looks and behaves much like a real patient in labor. Developed by Laerdal Medical, the high-fidelity mannequin was designed to recreate real childbirth conditions with startling realism.
Students interact with Mama Anne as if she were an actual patient. Her eyes blink and react to light. Her chest rises and falls as she breathes. She even has pulses that can be felt in multiple places across the body. Most importantly, she can deliver a baby mannequin during a simulated birth.
Unlike older training models that stayed mostly static, this simulator moves and reacts during labor. It can deliver in several positions, including lying back or on all fours. It can also display vital signs that change in response to medical complications. In short, it turns a classroom exercise into something that feels much closer to a real hospital scenario.
Why robot childbirth simulators are becoming essential
For decades, midwifery training relied heavily on textbooks, observation and limited hands-on practice. That approach left a major gap. Many students encountered their first true emergencies only after they began working in clinical settings.
Now technology is filling that gap. Simulation tools like Mama Anne allow students to practice high-risk situations repeatedly before they ever treat a real patient. As a result, students build confidence while instructors guide them through difficult scenarios.
For example, the simulator can recreate several dangerous childbirth complications, including:
- Postpartum hemorrhage with realistic blood loss
- Shoulder dystocia when a baby becomes stuck during delivery
- Pre-eclampsia and eclampsia with changing vital signs
- Sepsis symptoms that require rapid treatment
Students also practice everyday clinical skills such as monitoring fetal heart rate, giving injections and managing labor from start to finish. Because the training environment is controlled, instructors can pause a scenario, explain a mistake and run it again.
The robot even teaches communication skills
Medical training is not only about technical procedures. Communication with patients matters just as much. Mama Anne helps with that, too.
The simulator can speak using recorded responses or real-time dialogue through hidden speakers. Students must explain procedures, ask for consent and reassure their patient just as they would in a real delivery room.
If someone touches the simulator without asking first, it can react and vocalize discomfort. That feature reinforces one of the most important lessons in modern healthcare: patient consent and respectful care always come first.
REMOTE ROBOT SURGERY REMOVES CANCER 1,500 MILES AWAY
The lifelike simulator can blink, breathe, display vital signs and deliver a baby mannequin to recreate complex childbirth situations. (Laerdal Medical)
Why universities are investing in this technology
Educators believe simulation training dramatically improves how healthcare students prepare for the real world. Rebecca Beggan, midwifery program lead at York St John University, says hands-on simulation helps students build both competence and confidence before clinical placements.
Students can experience an entire labor scenario from beginning to end. They learn antenatal care, labor management and postnatal care in a single immersive exercise. Instructors also say the technology helps protect students from the emotional shock of encountering their first medical emergency without preparation. Instead of facing those situations cold, students enter clinical placements with real practice under their belt.
The future of childbirth training
The arrival of hyper-realistic simulators like Mama Anne suggests medical education is entering a new era. Instead of learning mostly through observation and experience, future healthcare professionals may train through realistic simulations that mirror real hospital conditions.
That shift could change everything from how nurses train to how surgeons rehearse complex procedures. Technology will never replace human caregivers. However, it can help prepare them better than ever before.
What this means to you
Even if you never step into a medical classroom, this technology could still affect your life. Better training often leads to better patient outcomes. When healthcare providers practice emergency scenarios in advance, they react faster and make fewer mistakes during real emergencies.
For expectant parents, that can mean safer deliveries and more confident medical teams in the room. Simulation training also reflects a broader shift in healthcare education across the United States. Many hospitals and universities are adopting high-fidelity simulators for surgery, emergency care and trauma response. The goal is simple: Let students practice difficult situations before lives are on the line.
Take my quiz: How safe is your online security?
Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing right and what needs improvement. Take my Quiz here: Cyberguy.com.
Kurt’s key takeaways
A robot that gives birth may seem a little creepy at first. Still, tools like this could become common in medical training down the road. Students gain hands-on experience. Instructors guide them through emergencies. Patients benefit from better-prepared medical teams. The next generation of midwives may enter the delivery room with far more practice than any class before them. As medical simulators grow more realistic and more widespread, one question naturally follows.
Students use the simulator to practice emergencies like postpartum hemorrhage, shoulder dystocia and other complications in a safe training environment. (Laerdal Medical)
If robots can train doctors to deliver babies today, what other parts of healthcare might soon be practiced first in simulation labs instead of hospitals? Let us know by writing to us at Cyberguy.com.
CLICK HERE TO DOWNLOAD THE FOX NEWS APP
Copyright 2026 CyberGuy.com. All rights reserved.
Technology
Two of my favorite color e-book readers are the cheapest they’ve been in months
Color isn’t essential in an e-reader, but let’s be honest, it’s a nice perk that can bring digital books, magazines, comics, cookbooks, and other publications to life. The catch is that color e-book readers tend to be substantially pricier, which makes today’s deals stand out. Right now, the Kindle Colorsoft (16GB) and Kobo Libra Colour are matching their lowest prices to date, with the Amazon e-reader going for $169.99 ($80 off) at Amazon and Best Buy, and the Libra Colour going for $199.99 ($30 off) via Rakuten’s online storefront.
At their core, both are excellent e-readers with 7-inch, 300ppi E Ink displays, which drop to 150ppi when viewing color. The Colorsoft’s display is slightly more vibrant in most instances, but the difference isn’t dramatic. Each also offers IPX8 water resistance, so you don’t need to worry about spills and can comfortably read in the bath or by the pool.
Which one makes more sense for you largely depends on where you buy your books, how much storage you need, and whether you like to take notes. The Colorsoft is great if you’re heavily embedded in Amazon’s ecosystem, as buying and accessing Kindle books is intuitive and doesn’t require any sideloading. As the more affordable option in Amazon’s lineup, the standard Colorsoft delivers a nearly identical reading experience to the Signature Edition, and it supports Amazon’s “Send to Alexa Plus” feature, which lets you send notes or documents to Amazon’s AI-powered assistant for summaries, to-do lists, reminders, and more. The downside is that it lacks wireless charging and an auto-adjusting front light — which are standard on the step-up model — and comes with 16GB of storage instead of 32GB.
That said, if I didn’t already own so many Kindle books, the Libra Colour would be my pick. It offers double the storage at 32GB and includes intuitive physical page-turn buttons. You can also write notes while reading, given that it offers stylus support, and it includes built-in notebook templates, as well as the ability to convert handwriting to typed text. It also supports EPUB and a wider range of file formats, and lets you save articles for offline reading with Instapaper. And it also offers adjustable warm lighting, which makes reading at night a little easier on the eyes.
Technology
Robot plays tennis with humans in real time
A humanoid robot is now rallying tennis shots with a human in real time. It runs without a script or remote control, so it can react instantly on a tennis court.
The robot stands about 4 feet tall, giving it a compact, human-like frame. Galbot Robotics released a video showing its robot going shot-for-shot with a human player. The system behind it is called LATENT and runs on the Unitree G1.
And it is not just returning the ball. It is moving, adjusting and competing during live play.
CHINA’S COMPACT HUMANOID ROBOT SHOWS OFF BALANCE AND FLIPS
A humanoid robot rallies tennis shots with a human in real time, reacting without scripts or remote control during live play. (Galbot Robotics)
Why this tennis robot is different from others
Most athletic robots you have seen follow scripts. They perform pre-programmed actions or rely on a remote control. This one operates differently. It reacts to a human opponent in real time, tracking fast-moving balls, shifting across the court and returning shots with surprising accuracy. It also adjusts to changing trajectories and unpredictable shots during rallies. Researchers say it can sustain long rallies with millisecond-level reactions and full-body coordination. That marks a major step forward.
How the AI learned to play tennis
Training a robot to play tennis is extremely complex. Tennis involves:
- Ball speeds of up to 67 miles per hour
- Split-second racket contact
- Constant movement across a large court
Capturing complete human gameplay data is difficult. So the researchers used a different method.
Training the robot using motion fragments
Instead of recording full matches, they focused on small segments of movement:
- Forehands
- Backhands
- Side steps
They gathered about five hours of motion data from five players. The sessions took place on a compact 10-by-16-foot court. That is roughly one-seventeenth the area of a standard tennis court.
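A quick arithmetic check of that size comparison, assuming a standard 78-by-36-foot doubles court (the run-off areas around the lines would make the gap even larger):

```python
# Compare the compact training space with a standard doubles court.
# Assumes the court proper (78 ft x 36 ft), not the surrounding run-off.
standard_area = 78 * 36   # 2,808 sq ft
training_area = 10 * 16   # 160 sq ft

ratio = standard_area / training_area
print(f"{ratio:.2f}")  # 17.55 -- the training space is about 1/17th the area
```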
RESTAURANT ROBOT GOES HAYWIRE, SENDS TABLEWARE FLYING BEFORE BREAKING OUT IN DANCE MOVES
Humanoid robots designed by Galbot Robotics select items from a shelf at the Shanghai New Expo Center in Shanghai, China, on July 26, 2025. Galbot Robotics also designed the tennis-playing robot that learns movement fragments and applies them in live competition. (Ying Tang/NurPhoto via Getty Images)
How the robot plays tennis during live rallies
The system first learns individual movements. Then it combines them into coordinated sequences. That allows the robot to:
- Move toward the ball
- Strike it with control
- Recover and reposition
To improve performance, the team trained the model in simulation. They varied physical conditions such as mass, friction and aerodynamics. This helps the robot adapt to real-world unpredictability. As a result, the system responds dynamically instead of following a fixed routine.
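Varying physical conditions across simulated episodes is a common robotics technique known as domain randomization. The sketch below is purely illustrative: the function names and parameter ranges are hypothetical placeholders, not Galbot's actual training setup.

```python
import random

def randomize_physics():
    """Sample a fresh set of physical parameters for one simulated rally.
    All ranges here are hypothetical placeholders."""
    return {
        "ball_mass_kg": random.uniform(0.0560, 0.0594),  # regulation ball is 56.0-59.4 g
        "court_friction": random.uniform(0.5, 0.8),
        "air_drag_coeff": random.uniform(0.45, 0.60),
    }

def run_training_episode(physics):
    # Placeholder: a real pipeline would step a physics simulator configured
    # with these parameters and update the control policy from the result.
    return physics

# Because every episode sees slightly different mass, friction and drag,
# the learned policy cannot overfit to one fixed simulator configuration,
# which is what helps it cope with real-world unpredictability.
for _ in range(1000):
    run_training_episode(randomize_physics())
```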
How well does it actually perform against humans?
In testing, the system achieved up to 96% success on forehand shots in simulation. In real-world trials, the robot can sustain rallies with a human and consistently return the ball across the net.
Watching the demo, it appears competitive. At times, the robot places shots away from the human player. That suggests more than a simple reaction. It points toward early forms of decision-making.
There are still limits. The robot can look unstable at times. Its motion is not yet as fluid as a trained athlete. High or unpredictable shots may still present challenges. Even so, the progress is clear.
Why this matters beyond tennis
This breakthrough goes far beyond tennis. It shows how robots can learn complex human skills without perfect data. The same approach could apply to:
- Football
- Badminton
- Industrial work
- Search and rescue
Any task that lacks complete motion data could benefit from this method. That is the bigger picture.
WORLD’S FASTEST HUMANOID ROBOT RUNS 22 MPH
A robot dances at the launch ceremony of a Galbot Robotics retail store in Beijing, China, on August 7, 2025. The company has also designed a 4-foot robot that returns tennis shots with millisecond reactions and full-body coordination. (VCG/VCG via Getty Images)
Could robots compete with humans one day?
The path forward is becoming clearer. Today, the robot rallies. Next, it competes. In time, robots could train with or challenge professional athletes. Exhibition matches between humans and machines may become part of the sport. That future no longer feels far away.
Kurt’s key takeaways
This demo shows how quickly things are changing. Robots are no longer stuck following scripts. They can now react, adjust and compete in real situations. What used to feel far off is starting to show up right in front of us.
So here is the question: If a robot could outplay you on the court, would you still want to compete, or would you rather train with it? Let us know by writing to us at Cyberguy.com.
Technology
AI influencer awards season is upon us
The contest, a joint venture between generative AI studio OpenArt and AI-powered creator platform Fanvue, with backing from AI voice company ElevenLabs, opens on Monday and runs for a month. The organizers said it is intended to “celebrate the creative talent ‘behind’ AI Influencers” and recognize their growing commercial and cultural clout.
Contestants will compete for a total prize fund of $20,000, which will be split between an overall winner and individual categories of fitness, lifestyle, comedian, music and dance entertainer, and fictional cartoon, anime, or fantasy personality. Victors will be celebrated at an event in May that the organizers are dubbing the “‘Oscars’ for AI personalities.”
To enter, you must develop your AI influencer on OpenArt’s platform and submit it at www.AIpersonality.ai. You’ll be asked for social media handles across TikTok, X, YouTube, and Instagram, as well as the story behind the character, your motivations for creating it, and details of any brand work.
Among those assessing contestants are 13‑time Emmy‑winning comedy writer Gil Rief, the creators of Spanish AI model Aitana Lopez, and Christopher “Topher” Townsend, the MAGA rapper behind AI-generated gospel singer Solomon Ray. According to a copy of the judges’ briefing seen by The Verge, contestants will be scored on four criteria: quality, social clout, brand appeal, and the inspiration behind the avatar. Specific points include reliably engaging with followers, portraying a consistent look across social channels, accurate details like having the “right number of fingers and thumbs,” and having “an authentic narrative” behind the avatar.
The contest is open to established creators and novices alike, though existing AI influencers will still need to submit material produced on OpenArt’s platform, Matt Jones, head of brand at Fanvue, told The Verge.
Despite being designed to celebrate creators of virtual influencers, Jones said that entrants don’t need to publicly identify themselves. “If a person who created this amazing piece of work wants nothing to do with the press or to expose themselves or to have their name out there, that’s obviously fine,” he said. “There would be no need to thrust anybody into the limelight here. We would just celebrate the piece of work.”
That creators can remain anonymous feels odd for a contest judging authenticity, particularly in an AI influencer ecosystem built on fictional people, fake personas, and fabricated backstories. That same anonymity has also helped grifts flourish with little accountability, from the AI white nationalist rapper Danny Bones to MAGA fantasy girl Jessica Foster.
There’s familiar baggage too, including persistent questions about originality, whether AI-generated work, or even a likeness, has been lifted from real creators, and whether these tools simply reproduce the same old biases in synthetic form. Organizer Fanvue has already faced criticism for this in the past: in 2024, a Guardian columnist described its “Miss AI” beauty pageant as something that “take(s) every toxic gendered beauty norm and bundle(s) them up into a completely unrealistic package.”
To Fanvue’s Jones, creators inevitably leave something of themselves in the AI characters they make. “You can’t help but put a little bit of yourself into the stories that you tell and the characters that you make,” he said, urging creators to “lean into that.” The idea feels at home in the influencer economy: not strictly real, but a form of synthetic authenticity the internet already knows how to handle.