
Google DeepMind’s new AI can model DNA, RNA, and ‘all life’s molecules’


Google DeepMind is introducing an improved version of its AI model that predicts not just the structure of proteins, but also the structure of “all life’s molecules.” The work from the new model, AlphaFold 3, will help researchers in medicine, agriculture, materials science, and drug development test potential discoveries.

Previous versions of AlphaFold only predicted the structures of proteins. AlphaFold 3 goes beyond that and can model DNA, RNA, and smaller molecules called ligands, expanding the model’s capability for scientific use.

DeepMind says the new model shows a 50 percent improvement in prediction accuracy compared to its previous models. “With AlphaFold 2, it was a big milestone moment in structural biology and has unlocked all kinds of amazing research,” DeepMind CEO Demis Hassabis told reporters in a briefing. “AlphaFold 3 is a step along the path in terms of using AI to understand and model biology.”

AlphaFold 3 draws on a library of molecular structures. Researchers input a list of molecules they want to combine, and AlphaFold 3 uses a diffusion method to generate a 3D model of the resulting structure. Diffusion is the same type of AI technique that image generators like Stable Diffusion use to assemble images.
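To make the diffusion idea concrete, here is a minimal, illustrative Python sketch of a diffusion-style sampling loop over 3D atom coordinates. This is not DeepMind's code: the toy `denoise_step` stands in for AlphaFold 3's learned denoising network, and every number here is a placeholder.

```python
import numpy as np

def denoise_step(coords: np.ndarray, noise_level: float) -> np.ndarray:
    """Stand-in for a learned denoising network. A real model would
    predict how to nudge every atom toward a physically plausible
    structure; here we simply shrink the coordinates as a placeholder."""
    return coords * (1.0 - 0.1 * noise_level)

def generate_structure(num_atoms: int, steps: int = 50, seed: int = 0) -> np.ndarray:
    """Toy diffusion sampler: start from pure noise, refine iteratively."""
    rng = np.random.default_rng(seed)
    coords = rng.normal(size=(num_atoms, 3))  # random 3D "atom cloud"
    for t in range(steps, 0, -1):
        noise_level = t / steps
        coords = denoise_step(coords, noise_level)
        if t > 1:
            # Diffusion samplers typically re-inject a little noise on
            # every step but the last, to keep sampling from collapsing.
            coords += rng.normal(scale=0.01 * noise_level, size=coords.shape)
    return coords

structure = generate_structure(num_atoms=100)
print(structure.shape)  # (100, 3): one x/y/z position per atom
```

The key design point is the loop itself: generation starts from pure noise and is refined step by step, which is exactly how image diffusion models work, just over atom positions instead of pixels.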

DeepMind says Isomorphic Labs, a drug discovery company founded by Hassabis, has been using AlphaFold 3 for internal projects. So far, the model helped Isomorphic Labs improve its understanding of new disease targets.

Along with the model, DeepMind is also making the research platform AlphaFold Server available to some researchers for free. The server, powered by AlphaFold 3, lets scientists generate biomolecular structure predictions regardless of their access to compute power. Hassabis says the server is available for academic, non-commercial uses, but Isomorphic Labs is working with pharmaceutical partners to use AlphaFold models for drug discovery programs.

Google says it is working with the scientific community and policy leaders to deploy the model responsibly. In a paper, Google notes that some biosecurity experts believe AI models “may lower the barrier for threat actors and enable them, in concert with other technologies, to design and engineer pathogens and toxins that are more transmissible or harmful.”

The company says it worked with domain experts in biosecurity, research, and industry to assess the risks around AlphaFold 3 even before its launch.


The Mac Pro and Studio won’t get the M4 nod until mid-2025


Throughout 2024, though, all of Apple’s laptops (except the MacBook Air) will move to the M4 chip that the company just gave the iPad Pro, Gurman writes. Amusingly, this herky-jerky chip upgrade cycle means that the iPad Pro is currently the single-core performance champ of Apple’s lineup, and it will stay ahead of the Mac Studio and Mac Pro for about another year.

This is a silly comparison, of course. The current crop of Mac Studios and Mac Pros are incredible computers that hold more RAM, have more ports, and won’t throttle as quickly as the iPad Pro, even with that heat-conducting Apple logo. They also don’t have an operating system that stands squarely in the way of pushing their hardware, and high-end Mac users should be used to waiting a while between revisions. Still, I’m sure more than a few people will appreciate the upgrade when it comes.


Boston Dynamics' creepy robotic canine dances in sparkly blue costume


As the world celebrated #InternationalDanceDay, a unique duo took the stage, or rather, the screen, to showcase a different kind of choreography. 

Spot, the quadruped robot developed by Boston Dynamics, found a new friend in Sparkles, a dazzlingly dressed counterpart designed to explore the fusion of robotics, art and entertainment.

Sparkles and Spot canine robots dance (Boston Dynamics)

A cartoon come to life

At first glance, the video in question seems like a whimsical animation straight out of a children’s show. Yet, this is no fiction. The footage is a testament to how far robotics has come, featuring Spot adorned in a blue, sparkly, albeit slightly creepy costume, performing a dance routine that could rival any animated character.



Meet Sparkles

“Spot is meeting another strange dog and making friends through the power of dance. Meet Sparkles!” Boston Dynamics announced. The video features two Spots — one in the recognizable black and yellow, and the other, Sparkles, in the blue, sparkly dog costume — engaging in a robotic dance-off that culminates in a mechanical kiss.


Sparkles and Spot canine robots kiss (Boston Dynamics)

This display of robotic affection and agility has sparked conversations about the potential applications of such technology in entertainment venues like theme parks, where robots could add a layer of realism to character interactions.

Social media’s mixed moves

The reception on social media was as varied as the dance moves displayed. Some viewers were enchanted, praising the mobility and innovation, while others expressed discomfort, humorously suggesting that the cute facade could well be the stuff of nightmares.


Kurt’s key takeaways

After watching Spot and Sparkles bust a move together, it’s pretty wild to think about where robotics is heading. It’s like we’re watching a live-action cartoon, isn’t it? These robots are not just showing off some fancy footwork; they’re opening our eyes to a whole new world of possibilities. Whether they’re making us smile or giving us the heebie-jeebies, they’re proof that creativity knows no limits.

How do you feel about robots displaying human-like behaviors such as dancing and kissing? Does it concern you for the future of human-robot interactions? Let us know by writing us at Cyberguy.com/Contact


The smells and tastes of a great video game


As video games and movies become more immersive, the sensations still missing from the experience become more apparent. Is there a point in Gran Turismo where you wish you could smell the burning rubber and engine exhaust? Would playing beer pong in Horizon Worlds feel incomplete if you couldn’t taste the hops?

On this episode of The Vergecast, the latest in our miniseries about the five senses of video games, we’re tackling the topics of smell and taste in video games, and whether either could actually enhance the virtual experience for gamers. In other words: Smell-O-Vision is back for a new generation of media.

First, we try out a product (actually available to buy today) called the GameScent, an AI-powered scent machine that syncs with your gaming and movie-watching experience. The GameScent works by listening in on the sound design of the content you’re playing or watching and deploying GameScent-approved fragrances that accompany those sounds. We tried the GameScent with games like Mario Kart and Animal Crossing to see if this is really hinting at a scent-infused gaming future.
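For a rough sense of how that audio-triggered pipeline could be structured, here is a hypothetical Python sketch. The event labels, cartridge names, and the classifier stub are all invented for illustration; GameScent has not published its internals.

```python
from typing import Optional

# Hypothetical mapping from detected audio events to scent cartridges.
SCENT_MAP = {
    "gunfire": "gunpowder",
    "engine_roar": "burning_rubber",
    "forest_ambience": "fresh_pine",
}

def classify_audio_chunk(chunk: bytes) -> Optional[str]:
    """Stand-in for the device's audio-event classifier. A real version
    would run a model over the audio; this fake keys off the first byte."""
    if not chunk:
        return None
    return list(SCENT_MAP)[chunk[0] % len(SCENT_MAP)]

def dispatch_scent(event: str) -> None:
    """Pretend to fire the atomizer cartridge matched to the event."""
    cartridge = SCENT_MAP.get(event)
    if cartridge:
        print(f"Releasing scent: {cartridge} (triggered by {event!r})")

# Simulate a stream of audio chunks coming from the game.
for chunk in (b"\x00", b"\x01", b"\x02"):
    event = classify_audio_chunk(chunk)
    if event:
        dispatch_scent(event)
```

The interesting design choice is that the device never talks to the game directly: everything is inferred from sound, which is why it can work with any title or movie.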

On the taste side, we speak to Nimesha Ranasinghe, an assistant professor at the University of Maine working on taste sensations and taste simulation in virtual reality experiences. Ranasinghe walks us through his research on sending electrical pulses to your tongue to manipulate different taste sensations like salty, sweet, sour, and bitter. He also talks about how his research led to experimental gadgets like a “virtual cocktail,” which would allow you to send curated tasting and drinking experiences through digital signals.
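As a rough illustration of the idea, here is a hypothetical Python sketch that maps each basic taste to an electrical pulse profile. The current and frequency values are placeholders, not figures from Ranasinghe’s published research.

```python
from dataclasses import dataclass

@dataclass
class Pulse:
    """One stimulation profile for a tongue electrode. The numbers
    below are illustrative placeholders, not published figures."""
    current_ma: float   # stimulation current, milliamps
    frequency_hz: int   # pulse frequency, hertz

# Hypothetical lookup: each basic taste gets its own pulse profile.
TASTE_PROFILES = {
    "sour":   Pulse(current_ma=0.18, frequency_hz=60),
    "salty":  Pulse(current_ma=0.04, frequency_hz=50),
    "bitter": Pulse(current_ma=0.08, frequency_hz=100),
    "sweet":  Pulse(current_ma=0.03, frequency_hz=200),
}

def simulate_taste(taste: str) -> Pulse:
    """Return the pulse a 'virtual cocktail' device might deliver."""
    try:
        return TASTE_PROFILES[taste]
    except KeyError:
        raise ValueError(f"no profile for taste {taste!r}") from None

print(simulate_taste("sour"))
```

A digital drink, in other words, would just be a recipe of pulse profiles played to the tongue over time rather than a liquid.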

