Meet the humanoid robot that learns from natural language, mimics human emotions

What would it be like to have a robot friend that can do things like take selfies, toss a ball, eat popcorn and play air guitar?

Well, you might not have to wait too long.

Researchers at the University of Tokyo have created a robot that can do all that and more, thanks to the power of GPT-4, one of the most advanced large language models (LLMs) available.

A researcher gives Alter3, a humanoid robot, verbal instructions. (University of Tokyo)

What is the Alter3 humanoid robot, and how does it work?

Alter3 is a humanoid robot that was first introduced in 2016 as a platform for exploring the concept of life in artificial systems. It has a realistic appearance and can move its upper body, head and facial muscles with 43 axes controlled by air actuators. It also has a camera in each eye that allows it to see and interact with humans and the environment.

Alter3 interacts with a human. (University of Tokyo)

But what makes Alter3 really special is that it can now use GPT-4, a deep learning model that can generate natural-language text and code from any given prompt, to control its movements and behaviors. This means that instead of having to program every single action for the robot, the researchers can simply give it verbal instructions and let GPT-4 generate the corresponding Python code that drives the android's motion engine.

For example, to make Alter3 take a selfie, the researchers can say something like:

“Create a big, joyful smile and widen your eyes to show excitement. Swiftly turn the upper body slightly to the left, adopting a dynamic posture. Raise the right hand high, simulating a phone. Flex the right elbow, bringing the phone closer to the face. Tilt the head slightly to the right, giving a playful vibe.”

And GPT-4 will produce the code that makes Alter3 do exactly that.
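
To make that pipeline concrete, here is a rough sketch of how a spoken instruction could become motion commands. This is our own illustration, not the team's actual code; the set_axis() helper, the prompt wording and the axis numbering are hypothetical stand-ins for Alter3's real control interface.

```python
# A minimal sketch (our illustration, not the team's code) of the
# text-to-motion loop: a verbal instruction goes to GPT-4, which returns
# Python that drives the robot's joints. set_axis() is a hypothetical
# stand-in for Alter3's real android motion engine.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def set_axis(axis: int, value: int) -> None:
    """Hypothetical stand-in: command one of the robot's 43 axes (0-255)."""
    print(f"axis {axis} -> {value}")

SYSTEM_PROMPT = (
    "You control a humanoid robot with 43 axes, each set via "
    "set_axis(axis_number, value) with value in 0-255. "
    "Respond only with Python calls to set_axis()."
)

instruction = (
    "Raise the right hand high, simulating a phone, flex the right elbow, "
    "and tilt the head slightly to the right."
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": instruction},
    ],
)

generated_code = response.choices[0].message.content
exec(generated_code)  # run the generated set_axis(...) calls on the robot
```

The real system is more sophisticated, but the core idea is the same: the language model, not a human programmer, writes the low-level motion code.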

Alter3 mimics taking a selfie. (University of Tokyo)

What can the Alter3 humanoid robot do with GPT-4?

The researchers have tested Alter3 with GPT-4 in various scenarios, such as tossing a ball, eating popcorn, and playing air guitar. They have also experimented with different types of feedback, such as linguistic, visual, and emotional, to improve the robot’s performance and adaptability.

Alter3 mimics playing a guitar. (University of Tokyo)

One of the most interesting aspects of Alter3’s behavior is that it can learn from its own memory and from human responses. For instance, if the robot does something that makes a human laugh or smile, it will remember that and try to repeat it in the future. This is similar to how newborn babies imitate their parents’ expressions and gestures.
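
Here is a rough sketch of what that kind of feedback loop could look like in code. It is our illustration, not the researchers' implementation: a simple motion memory that reinforces actions that drew a laugh or a smile.

```python
# A rough sketch (ours, not the researchers' code) of feedback-driven motion
# memory: actions that earn a positive human reaction get reinforced and are
# preferred the next time a similar request comes in.

motion_memory: dict[str, float] = {}  # action -> learned score

def record_feedback(action: str, reaction: str) -> None:
    """Reinforce actions that drew a laugh or smile; penalize the rest."""
    reward = 1.0 if reaction in ("laugh", "smile") else -0.5
    motion_memory[action] = motion_memory.get(action, 0.0) + reward

def choose_action(candidates: list[str]) -> str:
    """Pick the candidate with the best remembered feedback."""
    return max(candidates, key=lambda a: motion_memory.get(a, 0.0))

record_feedback("play air guitar", "laugh")
record_feedback("stiff wave", "frown")
print(choose_action(["play air guitar", "stiff wave"]))  # -> play air guitar
```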

Alter3 mimics jogging. (University of Tokyo)

The researchers have also added some humor and personality to Alter3's actions. In one case, the robot pretends to eat a bag of popcorn, only to realize that it belongs to the person sitting next to it. It then shows a surprised, embarrassed expression and makes an apologetic gesture with its arms.

Alter3, the humanoid robot (University of Tokyo)

Why is this AI-powered humanoid robot important, and what are the implications?

The research team behind Alter3 believes that this is a breakthrough in the field of robotics and artificial intelligence, as it shows how large language models can be used to bridge the gap between natural language and robot control. This opens up new possibilities for human-robot collaboration and communication, as well as for creating more intelligent, adaptable, and personable robotic entities.

Alter3 mimics seeing a pretend snake. (University of Tokyo)

The paper, titled “From Text to Motion: Grounding GPT-4 in a Humanoid Robot ‘Alter3,’” was written by Takahide Yoshida, Atsushi Masumori and Takashi Ikegami and is available on the preprint server arXiv. The authors hope that their work will inspire more research and development in this direction and that one day we might be able to have robot friends that can understand us and share our interests and emotions.

Kurt’s key takeaways

Alter3 is an example of how natural language processing and robotics can work together to create pretty incredible interactions. By using GPT-4, the robot can perform a variety of tasks and behaviors based on verbal commands, without requiring extensive programming or manual control. This also allows the robot to learn from its own experience and from human feedback and to express some humor and personality. Alter3 demonstrates the potential of large language models to improve the field of robotics and artificial intelligence as well as bring us closer to having robot friends that can relate to us and entertain us.

What do you think of Alter3 and its abilities? Would you like to have a robot like that in your life? Let us know by writing us at Cyberguy.com/Contact.

For more of my tech tips & security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter.


Copyright 2024 CyberGuy.com. All rights reserved.

Google’s annual revenue tops $400 billion for the first time

Google’s parent company, Alphabet, has earned more than $400 billion in annual revenue for the first time. The company announced the milestone as part of its Q4 2025 earnings report released on Wednesday, which highlights the 15 percent year-over-year increase as its cloud business and YouTube continue to grow.
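
As a quick sanity check on those numbers (our arithmetic, not Alphabet's reporting), a 15 percent year-over-year increase ending just above $400 billion implies roughly $350 billion in revenue the year before:

```python
# Back-of-the-envelope check (ours): what prior-year revenue does a 15%
# year-over-year rise to ~$400B imply?
revenue_2025_b = 400        # reported: "more than $400 billion"
growth = 0.15               # reported: 15% year over year
implied_prior_b = revenue_2025_b / (1 + growth)
print(f"Implied prior-year revenue: ~${implied_prior_b:.0f}B")  # ~$348B
```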

As noted in the earnings report, Google’s Cloud business reached a $70 billion run rate in 2025, while YouTube’s annual revenue soared beyond $60 billion across ads and subscriptions. Alphabet CEO Sundar Pichai told investors that YouTube remains the “number one streamer,” citing data from Nielsen. The company also now has more than 325 million paid subscribers, led by Google One and YouTube Premium.

Additionally, Pichai noted that Google Search saw more usage over the past few months “than ever before,” adding that daily AI Mode queries have doubled since launch. Google will soon take advantage of the popularity of its Gemini app and AI Mode, as it plans to build an agentic checkout feature into both tools.

Waymo under federal investigation after child struck

Federal safety regulators are once again taking a hard look at self-driving cars after a serious incident involving Waymo, the autonomous vehicle company owned by Alphabet.

This time, the investigation centers on a Waymo vehicle that struck a child near an elementary school in Santa Monica, California, during morning drop-off hours. The crash happened Jan. 23 and raised immediate questions about how autonomous vehicles behave around children, school zones and unpredictable pedestrian movement.

On Jan. 29, the National Highway Traffic Safety Administration confirmed it had opened a new preliminary investigation into Waymo’s automated driving system.

Waymo operates Level 4 self-driving vehicles in select U.S. cities, where the car controls all driving tasks without a human behind the wheel. (AP Photo/Terry Chea, File)

What happened near the Santa Monica school?

According to documents posted by NHTSA, the crash occurred within two blocks of an elementary school during normal drop-off hours. The area was busy. There were multiple children present, a crossing guard on duty and several vehicles double-parked along the street.

Investigators say the child ran into the roadway from behind a double-parked SUV while heading toward the school. The Waymo vehicle struck the child, who suffered minor injuries. No safety operator was inside the vehicle at the time.

NHTSA’s Office of Defects Investigation is now examining whether the autonomous system exercised appropriate caution given its proximity to a school zone and the presence of young pedestrians.

Federal investigators are now examining whether Waymo’s automated system exercised enough caution near a school zone during morning drop-off hours. (Waymo)

Why federal investigators stepped in

The NHTSA says the investigation will focus on how Waymo’s automated driving system is designed to behave in and around school zones, especially during peak pickup and drop-off times.

That includes whether the vehicle followed posted speed limits, how it responded to visual cues like crossing guards and parked vehicles, and whether its post-crash response met federal safety expectations. The agency is also reviewing how Waymo handled the incident after it occurred.

Waymo said it voluntarily contacted regulators the same day as the crash and plans to cooperate fully with the investigation. In a statement, the company said it remains committed to improving road safety for riders and everyone sharing the road.

Waymo responds to the federal investigation

We reached out to Waymo for comment, and the company provided the following statement:

“At Waymo, we are committed to improving road safety, both for our riders and all those with whom we share the road. Part of that commitment is being transparent when incidents occur, which is why we are sharing details regarding an event in Santa Monica, California, on Friday, January 23, where one of our vehicles made contact with a young pedestrian. Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day. NHTSA has indicated to us that they intend to open an investigation into this incident, and we will cooperate fully with them throughout the process. 

“The event occurred when the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made. 

“To put this in perspective, our peer-reviewed model shows that a fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph. This significant reduction in impact speed and severity is a demonstration of the material safety benefit of the Waymo Driver.

“Following contact, the pedestrian stood up immediately, walked to the sidewalk and we called 911. The vehicle remained stopped, moved to the side of the road and stayed there until law enforcement cleared the vehicle to leave the scene. 

“This event demonstrates the critical value of our safety systems. We remain committed to improving road safety where we operate as we continue on our mission to be the world’s most trusted driver.”
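
Some rough physics helps put those speeds in perspective. Impact energy scales with the square of speed, so contact at under 6 mph carries far less energy than contact at the modeled 14 mph. Here is a quick back-of-the-envelope comparison, our calculation based on the figures in Waymo's statement:

```python
# Back-of-the-envelope comparison (ours, based on the figures in Waymo's
# statement): impact energy scales with speed squared, so relative severity
# is roughly (v_waymo / v_human_model) ** 2.

waymo_contact_mph = 6.0    # "under 6 mph" at contact, per Waymo
human_model_mph = 14.0     # modeled attentive-human contact speed, per Waymo

relative_energy = (waymo_contact_mph / human_model_mph) ** 2
print(f"Relative impact energy: {relative_energy:.0%}")  # ~18%
```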

Understanding Waymo’s autonomy level

Waymo vehicles fall under Level 4 autonomy on NHTSA’s six-level scale.

At Level 4, the vehicle handles all driving tasks within specific service areas. A human driver is not required to intervene, and no safety operator needs to be present inside the car. However, these systems do not operate everywhere and are currently limited to ride-hailing services in select cities.

The NHTSA has been clear that Level 4 vehicles are not available for consumer purchase, even though passengers may ride inside them.
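
For reference, here is the full six-level taxonomy (SAE J3016, the scale NHTSA uses), summarized as a small lookup table; the one-line descriptions are our paraphrase:

```python
# The six driving-automation levels (SAE J3016, used by NHTSA), paraphrased.
SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: steering OR speed support, human drives",
    2: "Partial automation: steering AND speed support, human supervises",
    3: "Conditional automation: system drives, human takes over on request",
    4: "High automation: system drives fully within a limited service area",
    5: "Full automation: system drives everywhere, in all conditions",
}

print(SAE_LEVELS[4])  # where Waymo's robotaxis operate today
```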

This is not Waymo’s first federal probe

This latest investigation follows a previous NHTSA evaluation that opened in May 2024. That earlier probe examined reports of Waymo vehicles colliding with stationary objects like gates, chains and parked cars. Regulators also reviewed incidents in which the vehicles appeared to disobey traffic control devices.

That investigation was closed in July 2025 after regulators reviewed the data and Waymo’s responses. Safety advocates say the new incident highlights unresolved concerns.

No safety operator was inside the vehicle at the time of the crash, raising fresh questions about how autonomous cars handle unpredictable situations involving children. (Waymo)

What this means for you

If you live in a city where self-driving cars operate, this investigation matters more than it might seem. School zones are already high-risk areas, even for attentive human drivers. Autonomous vehicles must be able to detect unpredictable behavior, anticipate sudden movement and respond instantly when children are present.

This case will likely influence how regulators set expectations for autonomous driving systems near schools, playgrounds and other areas with vulnerable pedestrians. It could also shape future rules around local oversight, data reporting and operational limits for self-driving fleets.

For parents, commuters and riders, the outcome may affect where and when autonomous vehicles are allowed to operate.

Kurt’s key takeaways

Self-driving technology promises safer roads, fewer crashes and less human error. But moments like this remind us that the hardest driving scenarios often involve human unpredictability, especially when children are involved. Federal investigators now face a crucial question: Did the system act as cautiously as it should have in one of the most sensitive driving environments possible? How they answer that question could help define the next phase of autonomous vehicle regulation in the United States.

Do you feel comfortable sharing the road with self-driving cars near schools, or is that a line technology should not cross yet? Let us know by writing to us at Cyberguy.com

Copyright 2026 CyberGuy.com. All rights reserved.

Adobe actually won’t discontinue Animate

Adobe is no longer planning to discontinue Adobe Animate on March 1st. In an FAQ, the company now says that Animate will be in maintenance mode and that it has “no plans to discontinue or remove access” to the app. Animate will still receive “ongoing security and bug fixes” and will still be available for “both new and existing users,” but it won’t get new features.

An announcement email that went out to Adobe Animate customers about the discontinuation did “not meet our standards and caused a lot of confusion and angst within the community,” according to a Reddit post from Adobe community team member Mike Chambers.

Animate will be available in maintenance mode “indefinitely” to “individual, small business, and enterprise customers,” according to Adobe. Before the change, Adobe said that non-enterprise customers could access Animate and download content until March 1st, 2027, while enterprise customers had until March 1st, 2029.
