How Frankenstein’s creature designer found a new look for an iconic monster

For Mike Hill and Guillermo del Toro, it all started with Frankenstein.

Years ago, Hill — a sculptor and special effects artist — was exhibiting his works at a convention in Burbank. Del Toro saw some of Hill’s monstrous creations on display and was so impressed that he decided to get in touch, tracking down Hill’s contact info from an obscure model kit forum. “I don’t know how he found me from some 20-year-old website,” says Hill, who describes del Toro’s investigation as “very Columbo-esque” work. “But he wrote to me, told me who he was, and asked to commission something.”

That first commission turned out to be a sculpture of Boris Karloff having his makeup applied for the iconic 1931 version of Frankenstein, and it would go on to be displayed in the director’s famous Bleak House. “Immediately it was Frankenstein,” Hill says, “our very first job together.”

From there, a fruitful relationship blossomed; Hill went on to design creatures for films like Nightmare Alley and The Shape of Water, and the Netflix anthology Cabinet of Curiosities. But when word got out that del Toro was working on his own long-awaited version of Frankenstein, Hill’s phone remained curiously silent. “I was fretting because I knew he was going to do Frankenstein and hadn’t been in touch with me,” Hill tells me. “It was driving me bonkers.” But del Toro hadn’t forgotten about his partner — in fact, it turns out Hill was vital for the project.

“Guillermo invited me for breakfast and he said: ‘Listen, we’re doing Frankenstein. If you’re not doing it, then I’m not doing it, so it depends on you right now. Eat your eggs and tell me at the end of it if we’re doing the movie.’”

Naturally, Hill said yes.

Jacob Elordi and Mia Goth.
Image: Netflix

Del Toro’s version of Frankenstein had a brief run in theaters and will begin streaming on Netflix on November 7th. It was a particularly challenging project for Hill given how ubiquitous Frankenstein’s creature is. Karloff’s interpretation in the 1931 Frankenstein, designed by legendary Universal makeup artist Jack Pierce, is an indelible part of pop culture, and since then there have been hundreds of variations across stage and screen. “It was very difficult trying to come up with something that no one had ever seen,” Hill says.

The design process was a collaborative one between the director and artist. Del Toro didn’t provide explicit instructions, but instead explained what he didn’t want. The creature shouldn’t be hideous, for example, which meant no heavy, ugly stitching. From there Hill created a few options, and spent some time researching 18th-century surgery techniques, before hitting on the final version. “I just wanted to make him of the period, like he was built in the 1800s,” Hill says. “I wanted it to look like a human being had meticulously done this to him.”

This iteration of the creature is tall and lean, with scars covering his entire body to create an almost geometric pattern. This fits with the story of the film, which really digs into the pseudoscientific process that Victor Frankenstein goes through to build this creature and eventually bring him to life. And that contrast between beauty and horror is a key part of the character, according to Hill. “There’s a certain beauty that Victor was striving for,” he says. “He tried to make a beautiful glass window, it just ended up stained and broken.”

Sketches from Guillermo del Toro’s Frankenstein notebook.
Image: Netflix

In those early stages, Hill had little to go on. There was no script, nor was anyone cast as the creature. Later, he spent eight months designing prosthetics for an actor who eventually left the project due to a conflict. At that point, del Toro sent Hill a list of potential actors he was considering to take over, and one in particular stood out: Jacob Elordi, who eventually took on the role.

Hill cites “his demeanor, his gangliness, his limbs, his doe-like eyes,” as the reasons Elordi was so perfect as the creature. It helps that the Euphoria star is a towering 6-foot-5 and, according to Hill, has the kind of face that makeup artists dream about. “Jacob’s bone structure made things a lot easier,” he says. “He has this very strong jaw, this very strong chin. Speaking as a prosthetics artist, chins are a pain in the ass.” The final version of the design involved 42 different prosthetic pieces, and when Elordi had to wear the full-body kit, it required around 10 hours in the makeup chair.

Mike Hill.
Image: Netflix

One of the most important parts of the final design is how it evolves over the course of the movie. Initially, the creature is bald and nearly naked, signaling his childlike innocence. But after being abandoned by his creator, he takes on a harder look, eventually growing out his hair and wearing a long cloak. Elordi’s demeanor changes as well; he mostly cowers early on before transforming into something much more menacing and terrifying. From a design standpoint, all that really changes is the hair and wardrobe, and yet the transformation is dramatic.

In the end, Frankenstein proved to be an ideal collaboration for Hill and del Toro. The artist tells me that he’s been making monsters since he was a kid, scooping up mud from a nearby riverbank to sculpt them with, and from those early days Mary Shelley’s story was a guiding influence. He went on to create multiple versions of the creature as a professional artist, and is currently working on a short film based on a decade-old sculpture. Just like del Toro, the idea of tackling Frankenstein in his own way was a longtime goal. So while it may have involved a bit of stress waiting for del Toro’s call, it was ultimately worth it.

“I always dreamed that he would make it,” Hill says.

Birdbuddy’s new smart feeders aim to make spotting birds easier, even for beginners

Birdbuddy is introducing two new smart bird feeders: the flagship Birdbuddy 2 and the more compact, cheaper Birdbuddy 2 Mini aimed at first-time users and smaller outdoor spaces. Both models are designed to be faster and easier to use than previous generations, with upgraded cameras that can shoot in portrait or landscape and wake instantly when a bird lands so you’re less likely to miss the good stuff.

The Birdbuddy 2 costs $199 and features a redesigned circular camera housing that delivers 2K HDR video, slow-motion recording, and a wider 135-degree field of view. The upgraded built-in mic should also better pick up birdsong, which could make identifying species easier using both sound and sight.

The feeder itself offers a larger seed capacity and an integrated perch extender, along with support for both 2.4GHz and 5GHz Wi-Fi for more stable connectivity. The new model also adds dual integrated solar panels to help keep it powered throughout the day, as well as a night sleep mode to conserve power.

The Birdbuddy 2 Mini is designed to deliver the same core AI bird identification and camera experience, but in a smaller, more accessible package. At 6.95 inches tall with a smaller seed capacity, it’s geared toward first-time smart birders and smaller outdoor spaces like balconies, and it supports an optional solar panel.

Birdbuddy 2’s first batch of preorders has already sold out, with shipments expected in February 2026 and wider availability set for mid-2026. Meanwhile, the Birdbuddy 2 Mini will be available to preorder for $129 in mid-2026, with shipments planned for late 2026.

Robots learn 1,000 tasks in one day from a single demo

Most robot headlines follow a familiar script: a machine masters one narrow trick in a controlled lab, then comes the bold promise that everything is about to change. I usually tune those stories out. We have heard about robots taking over since science fiction began, yet real-life robots still struggle with basic flexibility. This time felt different.

Researchers highlight the milestone that shows how a robot learned 1,000 real-world tasks in just one day. (Science Robotics)

How robots learned 1,000 physical tasks in one day

A new report published in Science Robotics caught my attention because the results feel genuinely meaningful, impressive and a little unsettling in the best way. The research comes from a team of academic scientists working in robotics and artificial intelligence, and it tackles one of the field’s biggest limitations.

The researchers taught a robot to learn 1,000 different physical tasks in a single day using just one demonstration per task. These were not small variations of the same movement. The tasks included placing, folding, inserting, gripping and manipulating everyday objects in the real world. For robotics, that is a big deal.

Why robots have always been slow learners

Until now, teaching robots physical tasks has been painfully inefficient. Even simple actions often require hundreds or thousands of demonstrations. Engineers must collect massive datasets and fine-tune systems behind the scenes. That is why most factory robots repeat one motion endlessly and fail as soon as conditions change. Humans learn differently. If someone shows you how to do something once or twice, you can usually figure it out. That gap between human learning and robot learning has held robotics back for decades. This research aims to close that gap.

The research team behind the study focuses on teaching robots to learn physical tasks faster and with less data. (Science Robotics)

How the robot learned 1,000 tasks so fast

The breakthrough comes from a smarter way of teaching robots to learn from demonstrations. Instead of memorizing entire movements, the system breaks tasks into simpler phases. One phase focuses on aligning with the object, and the other handles the interaction itself. This method relies on artificial intelligence, specifically an AI technique called imitation learning that allows robots to learn physical tasks from human demonstrations.

The robot then reuses knowledge from previous tasks and applies it to new ones. This retrieval-based approach allows the system to generalize rather than start from scratch each time. Using this method, called Multi-Task Trajectory Transfer, the researchers trained a real robot arm on 1,000 distinct everyday tasks in under 24 hours of human demonstration time.

Importantly, this was not done in a simulation. It happened in the real world, with real objects, real mistakes and real constraints. That detail matters.
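The retrieval-and-transfer idea is easier to see in miniature. Below is a hypothetical Python sketch, assuming demonstrations are stored as end-effector waypoints expressed relative to the object’s pose; the class and method names are invented for illustration and are not from the paper’s code.

```python
# Hypothetical sketch of retrieval-based trajectory transfer.
# A demo is split into an alignment phase (reaching the object) and an
# interaction phase (the manipulation itself, stored relative to the object),
# so one demonstration can be replayed on a similar object at a new pose.

import math

class TrajectoryLibrary:
    def __init__(self):
        self.demos = []  # list of (task_embedding, relative_trajectory)

    def add_demo(self, embedding, obj_pose, trajectory):
        # Store the interaction phase relative to the object's pose.
        relative = [(x - obj_pose[0], y - obj_pose[1]) for x, y in trajectory]
        self.demos.append((embedding, relative))

    def retrieve(self, embedding):
        # Nearest-neighbor retrieval by Euclidean distance in embedding space.
        return min(self.demos, key=lambda d: math.dist(d[0], embedding))[1]

    def transfer(self, embedding, new_obj_pose):
        # Alignment phase: anchor at the new object's pose.
        # Interaction phase: replay the retrieved relative trajectory there.
        relative = self.retrieve(embedding)
        return [(dx + new_obj_pose[0], dy + new_obj_pose[1]) for dx, dy in relative]

lib = TrajectoryLibrary()
# One demonstration of a "grip" on an object located at (1.0, 2.0)
lib.add_demo((0.1, 0.9), obj_pose=(1.0, 2.0),
             trajectory=[(1.0, 2.0), (1.0, 2.5), (1.2, 2.5)])
# Reuse it on a similar object at a new location, with no new demonstration
plan = lib.transfer((0.12, 0.88), new_obj_pose=(4.0, -1.0))
```

In the real system the task embedding would come from perception and the trajectories would be full arm motions rather than 2D points, but the core idea is the same: retrieve the nearest prior demonstration and replay its interaction phase at the new object’s pose instead of learning each task from scratch.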

Why this research feels different

Many robotics papers look impressive on paper but fall apart outside perfect lab conditions. This one stands out because it tested the system through thousands of real-world rollouts. The robot also showed it could handle new object instances it had never seen before. That ability to generalize is what robots have been missing. It is the difference between a machine that repeats and one that adapts.

The robot arm practices everyday movements like gripping, folding and placing objects using a single human demonstration. (Science Robotics)

A long-standing robotics problem may finally be cracking

This research addresses one of the biggest bottlenecks in robotics: inefficient learning from demonstrations. By decomposing tasks and reusing knowledge, the system achieved an order of magnitude improvement in data efficiency compared to traditional approaches. That kind of leap rarely happens overnight. It suggests that the robot-filled future we have talked about for years may be nearer than it looked even a few years ago.

What this means for you

Faster learning changes everything. If robots need less data and less programming, they become cheaper and more flexible. That opens the door to robots working outside tightly controlled environments.

In the long run, this could enable home robots to learn new tasks from simple demonstrations instead of specialist code. It also has major implications for healthcare, logistics and manufacturing.

More broadly, it signals a shift in artificial intelligence. We are moving away from flashy tricks and toward systems that learn in more human-like ways. Not smarter than people. Just closer to how we actually operate day to day.

Kurt’s key takeaways 

Robots learning 1,000 tasks in a day does not mean your house will have a humanoid helper tomorrow. Still, it represents real progress on a problem that has limited robotics for decades. When machines start learning more like humans, the conversation changes. The question shifts from what robots can repeat to what they can adapt to next. That shift is worth paying attention to.

If robots can now learn like us, what tasks would you actually trust one to handle in your own life? Let us know by writing to us at Cyberguy.com

Copyright 2025 CyberGuy.com.  All rights reserved.

Plaud updates the NotePin with a button

Plaud has updated its compact NotePin AI recorder. The new NotePin S is almost identical to the original, except for one major difference: a button. It’s joined by a new Plaud Desktop app for recording audio in online meetings, which is free to owners of any Plaud Note or NotePin.

The NotePin S has the same Fitbit-esque design as the 2024 original and ships with a lanyard, wristband, clip, and magnetic pin, so you can wear it just about any way you please — now all included in the box, whereas before the lanyard and wristband were sold separately.

It’s about the same size as the NotePin, comes in the same colors (black, purple, or silver), offers similar battery life, and still supports Apple Find My. Like the NotePin, it records audio and generates transcriptions and summaries, whether those are meeting notes, action points, or reminders.

But now it has a button. The first NotePin used haptic controls, relying on a long squeeze to start recording and a short buzz to confirm it had worked; the S switches to something simpler. A long press of the button starts recording, and a short tap adds highlight markers. Plaud’s explanation for the change is straightforward: buttons are less ambiguous, so you always know you’ve pressed one and started recording. Some original NotePin users complained that recordings failed because they hadn’t squeezed just right.

AI recorders like this live or die by ease of use, so removing a little friction gives Plaud better odds of survival.

Advertisement

Alongside the NotePin S, Plaud is launching a new Mac and PC application for recording the audio from online meetings. Plaud Desktop runs in the background and activates whenever it detects calls from apps including Zoom, Meet, and Teams, recording both system audio and your microphone. You can set it to either record meetings automatically or require manual activation, and unlike some alternatives it doesn’t create a bot that joins the call with you.

Recordings and notes are synced with those from Plaud’s line of hardware recorders, with the same models used for transcription and generation, creating a “seamless” library of audio from your meetings, both online and off.

Plaud Desktop is available now and is free to anyone who already owns a Plaud Note or NotePin device. The new NotePin S is also available today, for $179 — $20 more than the original, which Plaud says will now be phased out.
