
Technology

Ghosts in the Kinect


Billy Tolley swings a Microsoft Kinect around an abandoned room in sudden, jittery movements. “Whoa!” he says. “Dude, it was so creepy.” On the display, we see an anomaly of arrows, spheres, and red lines that disappears almost as soon as it arrives. For Tolley and Zak Bagans, two members of the Ghost Adventures YouTube channel, this is enough to suggest they should leave the building. Because for this team and other similar enthusiasts, that seemingly innocuous blotter of white arrows means something more terrifying: a glimpse at specters and phantoms invisible to the human eye.

Fifteen years after its release, just about the only people still buying the Microsoft Kinect are ghost hunters like Tolley and Bagans. Though the body-tracking camera, which was discontinued in 2017, started as a gaming peripheral, it also enjoyed a spirited afterlife outside of video games. But in 2025, its most notable application is helping paranormal investigators, like the Ghost Adventures team, in their attempts at documenting the afterlife.

The Kinect’s ability to convert the data from its body-tracking sensors into an on-screen skeletal dummy delights these investigators, who allege the figures it shows in empty space are, in fact, skeletons of the spooky, scary variety. Looking at it in use — the Kinect is particularly popular with ghost-hunting YouTubers — it’s certainly producing results, showing human-like figures where there are none. The question is: why?

With the help of ghost hunters and those familiar with how the Kinect actually works, The Verge set out to understand why perhaps the most misbegotten gaming peripheral has gained such a strong foothold in the search for the paranormal.

Part of the reason is purely technical. “The Kinect’s popularity as a depth camera for ghost hunting stems from its ability to detect depth and create stick-figure representations of humanoid shapes, making it easier to identify potential human-like forms, even if faint or translucent,” says Sam Ashford, founder of ghost-hunting equipment store SpiritShack.


This is made possible by the first-generation Kinect’s structured light system. By projecting a grid of infrared dots into an environment — even a dark one — and reading the resulting pattern, the Kinect can detect deformations in the projection and, through a machine-learning algorithm, discern human limbs within those deformations. The Kinect then converts that data into a visual representation of a stick figure, which, in its previous life, was pumped back into games like Dance Central and Kinect Sports.
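The geometry behind this is ordinary triangulation: a dot projected from an emitter offset from the camera lands at a different pixel depending on how far away the surface is. Here is a minimal sketch of that calculation, with illustrative numbers rather than the Kinect's actual calibration values:

```python
# Toy structured-light depth calculation, the principle behind the
# first-gen Kinect's sensor. Values are illustrative, not Kinect specs.

def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Triangulate depth from how far a projected IR dot has shifted
    relative to its reference position: z = baseline * focal / disparity."""
    if disparity_px <= 0:
        raise ValueError("dot not displaced; depth cannot be resolved")
    return baseline_m * focal_px / disparity_px

# A dot shifted 10 px, with a 7.5 cm emitter-to-camera baseline and a
# 580 px focal length, puts the surface about 4.35 m away.
print(round(depth_from_disparity(0.075, 580, 10), 2))  # 4.35
```

The skeletal-tracking stage sits on top of thousands of these per-dot depth estimates, which is why IR-lit dust, reflections, or camera shake can corrupt the depth map before the body classifier ever runs.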

The Kinect isn’t always seeing what it thinks it is

When it was released in 2010, the first-gen Kinect was cutting-edge technology: a high-powered, robust, and lightweight depth camera that condensed what would usually retail for upward of $6,000 into a $150 peripheral. Today, you can find a Kinect on eBay for around $20. Ghost hunters, however, typically mount it to a carry handle and a tablet and upsell it for around $400 to $600, rebranded as a “structured light sensor” (SLS) camera. “The user will direct the camera to a certain point of the room where they believe activity to be present,” says Andy Bailey, founder of a gear shop for ghost hunters called Infraready. “The subject area will be absent of human beings. However, the camera will often calculate and display the presence of a skeletal image.”

Though this is often touted as proof we’re all bound for an eternity haunting aging hotels and abandoned prisons, Bailey urges caution, telling would-be ghost hunters that the cameras are best paired with other equipment to “provide an additional layer of supporting evidence.” For this, Ghost Hunters Equipment, the retail arm of haunted tour operator Ghost Augustine, recommends that “EMF readings, temperature, baseline readings, and all of that are essential when considering authentication of paranormal activity.”

That’s because the Kinect isn’t always seeing what it thinks it is. But what is it actually seeing? Did Microsoft, while trying to break into a motion-control market monopolized by the Nintendo Wii, accidentally create a conduit through which we might glimpse the afterlife? Sadly, no.


Photo by Joe Raedle/Getty Images

The Kinect is actually a straightforward piece of hardware. It is trained to recognize the human body, and assumes that it’s always looking at one — because that’s what it’s designed to do. Whatever you show it, whether human or humanoid or something entirely different, it will try to discern human anatomy. If the Kinect is not 100 percent sure of its position, it might even look like the figure it displays is moving. “We may recognise the face of Jesus in a piece of toast or an elephant in a rock formation,” says Jon Wood, a science performer who has a show devoted to examining ghost hunting equipment. “Our brains are trying to make sense of the randomness.” The Kinect does much the same, except it cannot overrule its hunches.
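That failure mode is structural, not mysterious. The following toy stand-in (not Microsoft's actual algorithm, which used a trained decision-forest classifier) illustrates the core problem: a model that must pick the most probable body part, with no "nothing here" option, will confidently label even pure noise.

```python
import random

BODY_PARTS = ["head", "torso", "left_arm", "right_arm", "left_leg", "right_leg"]

def classify_pixel(features):
    """Stand-in for a per-pixel body-part classifier. Like any argmax
    classifier with no reject option, it always returns SOME body part,
    even when the input carries no signal at all."""
    scores = {part: random.random() for part in BODY_PARTS}  # noise in...
    return max(scores, key=scores.get)                       # ...confident guess out

# Feed it garbage: every "pixel" still gets labeled as part of a body.
noise_frame = [[random.random() for _ in range(16)] for _ in range(5)]
labels = [classify_pixel(px) for px in noise_frame]
assert all(label in BODY_PARTS for label in labels)
```

Add an abstain threshold and the false skeletons mostly disappear; the Kinect shipped without one because, in a living room, assuming a player is present was the right call.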

That suits ghost hunters just fine, of course: the Kinect’s habit of finding human shapes where there are none is a crowd-pleaser. The Kinect, deployed in dark rooms bathed in infrared light from cameras and torches, wobbling in the hands of excitable ghost hunters as it tries to read a precise grid of infrared points, is almost guaranteed to show them what they want to see.

Much of ghost hunting depends on ambiguity. If you’re searching for proof of something, be it the afterlife or not, logic suggests you’d want tools that can provide the clearest results, the better to cement the veracity of that proof. Ghost hunters, however, prefer technology that will produce results of any kind: murky recordings on 2000s voice recorders that might be mistaken for voices, low-resolution videos haunted by shadowy artifacts, and any cheap equipment that can call into question the existence of dust (sorry, spirit orbs) — bonus points if battery life is temperamental.

“I’ve watched ghost hunters use two different devices for measuring electromagnetic fields (EMF),” Wood says. “One would be an accurate and expensive TriField TF2, that never moves unless it actually encounters an electrical field. The other would be a £15 [$18], no-brand, ‘KII’ device with five lights that go berserk when someone so much as sneezes. Which one was more popular, do you think?”


Glitches aren’t tolerated — they’re encouraged

Given the notoriously unreliable skeletal tracking of the Kinect — most non-gaming applications bypass the Kinect’s default SDKs, preferring to process its raw data by other, less error-prone, means — it would be stranger if it didn’t see figures every time it’s deployed. But that’s the point. Like so much technology ghost hunters use, the Kinect’s flaws aren’t bugs or glitches. They’re not tolerated — they’re encouraged.

“If a person pays good money to enjoy a ghost hunt, what are they after?” Wood asks. “They prime themselves for a ‘spooky encounter’ and open up to the suggestion of anything being ‘evidence of a ghost’ — they want to find a ghost, so they make sure they do.”

If skeletal tracking were all ghost hunters were after, better options now exist that need nothing more than a simple color image. But improved methodology wouldn’t return the false positives that sustain belief, and so skeletal tracking from 2010 is preferred. None of this is likely to move believers toward anything more skeptical. But we do know why the Kinect (or SLS) returns the results it does, and we know it’s not ghosts.

That said, even if its results are erroneous, maybe the Kinect’s new lease on afterlife isn’t a bad thing. Much as ghosts supposedly patrol the same paths over and over until interrupted by ghost hunters, perhaps it’s fitting that the Kinect will continue forevermore to track human bodies — even if the bodies aren’t really there.


Technology

The DJI Romo robovac had security so poor, this man remotely accessed thousands of them


Sammy Azdoufal claims he wasn’t trying to hack every robot vacuum in the world. He just wanted to remote-control his brand-new DJI Romo vacuum with a PS5 gamepad, he tells The Verge, because it sounded fun.

But when his homegrown remote control app started talking to DJI’s servers, it wasn’t just one vacuum cleaner that replied. Roughly 7,000 of them, all around the world, began treating Azdoufal like their boss.

He could remotely control them, and look and listen through their live camera feeds, he tells me, saying he tested that out with a friend. He could watch them map out each room of a house, generating a complete 2D floor plan. He could use any robot’s IP address to find its rough location.

“I found my device was just one in an ocean of devices,” he says.

A map like the one I saw, with robots and packets trickling in.
Image: Gonzague Dambricourt

On Tuesday, when he showed me his level of access in a live demo, I couldn’t believe my eyes. Tens, hundreds, thousands of robots reporting for duty, each phoning home an MQTT data packet every three seconds with its serial number, which rooms it’s cleaning, what it’s seen, how far it’s traveled, when it’s returning to the charger, and the obstacles it encountered along the way.

I watched each of these robots slowly pop into existence on a map of the world. Nine minutes after we began, Azdoufal’s laptop had already cataloged 6,700 DJI devices across 24 different countries and collected over 100,000 of their messages. If you add the company’s DJI Power portable power stations, which also phone home to these same servers, Azdoufal had access to over 10,000 devices.

Azdoufal says he could remote-control robovacs and view live video over the internet.

When I say I couldn’t believe my eyes at first, I mean that literally. Azdoufal leads AI strategy at a vacation rental home company; when he told me he reverse engineered DJI’s protocols using Claude Code, I had to wonder whether AI was hallucinating these robots. So I asked my colleague Thomas Ricker, who just finished reviewing the DJI Romo, to pass us its serial number.

With nothing more than that 14-digit number, Azdoufal could not only pull up our robot, he could correctly see it was cleaning the living room and had 80 percent battery life remaining. Within minutes, I watched the robot generate and transmit an accurate floor plan of my colleague’s house, with the correct shape and size of each room, just by typing some digits into a laptop located in a different country.

Here are two maps of Thomas’ living space. Above is what we pulled from DJI’s servers without authentication; below is what the owner sees on their own phone.
Screenshots by The Verge

Here’s a fuller floor plan from Gonzague Dambricourt, who tried out a read-only version of Azdoufal’s tool.
Image: Gonzague Dambricourt (X)

Separately, Azdoufal pulled up his own DJI Romo’s live video feed, completely bypassing its security PIN, then walked into his living room and waved to the camera while I watched. He also says he shared a limited read-only version of his app with Gonzague Dambricourt, CTO at an IT consulting firm in France; Dambricourt tells me the app let him remotely watch his own DJI Romo’s camera feed before he even paired it.


Azdoufal did all of this without hacking into DJI’s servers, he claims. “I didn’t infringe any rules, I didn’t bypass, I didn’t crack, brute force, whatever.” He says he simply extracted his own DJI Romo’s private token — the key that tells DJI’s servers that you should have access to your own data — and those servers gave him the data of thousands of other people as well. He shows me that he can access DJI’s pre-production server, as well as the live servers for the US, China, and the EU.

DJI has MQTT servers associated with the US, EU, and China. I’m not sure what VG stands for.
Screenshot by Sean Hollister / The Verge

Here’s the good news: On Tuesday, Azdoufal was not able to take our DJI Romo on a joyride through my colleague’s house, see through its camera, or listen through its microphone. DJI had already restricted that form of access after both Azdoufal and I told the company about the vulnerabilities.

And by Wednesday morning, Azdoufal’s scanner no longer had access to any robots, not even his own. It appears that DJI has plugged the gaping hole.

But this incident raises serious questions about DJI’s security and data practices. It will no doubt be used to help retroactively justify fears that led to the Chinese dronemaker getting largely forced out of the US. If Azdoufal could stumble onto these robots without even looking for them, will DJI protect them from people who actually mean harm? If Claude Code can spit out an app that lets you see into someone’s house, what keeps a DJI employee from doing the same? And should a robot vacuum cleaner have a microphone? “It’s so weird to have a microphone on a freaking vacuum,” says Azdoufal.

It doesn’t help that when Azdoufal and The Verge contacted DJI about the issue, the company claimed it had fixed the vulnerability when it was actually only partially resolved.


“DJI can confirm the issue was resolved last week and remediation was already underway prior to public disclosure,” reads part of the original statement provided by DJI spokesperson Daisy Kong. We received that statement on Tuesday morning at 12:28PM ET — about half an hour before Azdoufal showed me thousands of robots, including our review unit, reporting for duty.

Not just robovacs — DJI’s power stations also use this system.
Screenshot by Sean Hollister / The Verge

To be clear, it’s not surprising that a robot vacuum cleaner with a smartphone app would phone home to the cloud. For better or for worse, users currently expect those apps to work outside of their own homes. Unless you’ve built a tunnel into your own home network, that means relaying the data through cloud servers first.

But people who put a camera into their home expect that data to be protected, both in transit and once it reaches the server. Security professionals should know that — but as soon as Azdoufal connected to DJI’s MQTT servers, everything was visible in cleartext. If DJI has merely cut off one particular way into those servers, that may not be enough to protect them if hackers find another way in.

Unfortunately, DJI is far from the only smart home company that’s let people down on security. Hackers took over Ecovacs robot vacuums to chase pets and yell racist slurs in 2024. In 2025, South Korean government agencies reported that Dreame’s X50 Ultra had a flaw that could let hackers view its camera feed in real time, and that another Ecovacs and a Narwal robovac could let hackers view and steal photos from the devices. (Korea’s own Samsung and LG vacuums received high marks, and a Roborock did fine.)

It’s not just vacuums, of course. I still won’t buy a Wyze camera, despite its new security ideas, because that company tried to sweep a remote access vulnerability under the rug instead of warning its customers. I would find it hard to trust Anker’s Eufy after it lied to us about its security, too. But Anker came clean, and sunlight is a good disinfectant.


DJI is not being exceptionally transparent about what happened here, but it did answer almost all our questions. In a new statement to The Verge via spokesperson Daisy Kong, the company now admits to “a backend permission validation issue” that could have theoretically let hackers see live video from its vacuums, and it admits that it didn’t fully patch that issue until after we confirmed the problems were still present.

Here’s that whole statement:

DJI identified a vulnerability affecting DJI Home through internal review in late January and initiated remediation immediately. The issue was addressed through two updates, with an initial patch deployed on February 8 and a follow-up update completed on February 10. The fix was deployed automatically, and no user action is required.

The vulnerability involved a backend permission validation issue affecting MQTT-based communication between the device and the server. While this issue created a theoretical potential for unauthorized access to live video of ROMO device, our investigation confirms that actual occurrences were extremely rare. Nearly all identified activity was linked to independent security researchers testing their own devices for reporting purposes, with only a handful of potential exceptions.

The first patch addressed this vulnerability but had not been applied universally across all service nodes. The second patch re-enabled and restarted the remaining service nodes. This has now been fully resolved, and there is no evidence of broader impact. This was not a transmission encryption issue. ROMO device-to-server communication was not transmitted in cleartext and has always been encrypted using TLS. Data associated with ROMO devices, such as those in Europe, is stored on U.S.-based AWS cloud infrastructure.

DJI maintains strong standards for data privacy and security and has established processes for identifying and addressing potential vulnerabilities. The company has invested in industry-standard encryption and operates a longstanding bug bounty program. We have reviewed the findings and recommendations shared by the independent security researchers who contacted us through that program as part of our standard post-remediation process. DJI will continue to implement additional security enhancements as part of its ongoing efforts.


Azdoufal says that even now, DJI hasn’t fixed all the vulnerabilities he’s found. One of them is the ability to view your own DJI Romo video stream without needing its security PIN. Another one is so bad I won’t describe it until DJI has more time to fix it. DJI did not immediately promise to do so.

And both Azdoufal and security researcher Kevin Finisterre tell me it’s not enough for the Romo to send encrypted data to a US server, if anyone inside that server can easily read it afterward. “A server being based in the US in no way, shape, or form prevents .cn DJI employees from access,” Finisterre tells me. That seems evident, as Azdoufal lives in Barcelona and was able to see devices in entirely different regions.

“Once you’re an authenticated client on the MQTT broker, if there are no proper topic-level access controls (ACLs), you can subscribe to wildcard topics (e.g., #) and see all messages from all devices in plaintext at the application layer,” says Azdoufal. “TLS does nothing to prevent this — it only protects the pipe, not what’s inside the pipe from other authorized participants.”
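Azdoufal's point can be illustrated with MQTT's own topic-matching rules, which are all a broker falls back on when no ACLs are configured. Below is a simplified sketch of the spec's matching behavior; the device topic names are hypothetical, since DJI's real topic scheme isn't public.

```python
def topic_matches(pattern, topic):
    """Simplified MQTT topic filter matching: '+' matches exactly one
    level, '#' matches all remaining levels (per the spec, '#' must be
    the final level of the filter)."""
    p_parts, t_parts = pattern.split("/"), topic.split("/")
    for i, p in enumerate(p_parts):
        if p == "#":
            return True          # multi-level wildcard swallows the rest
        if i >= len(t_parts):
            return False         # filter is deeper than the topic
        if p != "+" and p != t_parts[i]:
            return False         # literal level mismatch
    return len(p_parts) == len(t_parts)

# With no topic-level ACLs, one subscription to '#' sees every device:
device_topics = [
    "romo/SN12345678901234/status",   # hypothetical topic names;
    "romo/SN99999999999999/map",      # DJI's actual scheme isn't public
    "power/SN00000000000001/telemetry",
]
assert all(topic_matches("#", t) for t in device_topics)
assert topic_matches("romo/+/status", device_topics[0])
assert not topic_matches("romo/+/status", device_topics[1])
```

A broker with per-client ACLs would reject the `#` subscription outright, or silently filter delivery to topics the client's token actually owns; without them, TLS only guarantees that eavesdroppers outside the broker can't read the traffic.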

When I tell Azdoufal that some may judge him for not giving DJI much time to resolve the issues before going public, he notes that he didn’t hack anything, didn’t expose sensitive data, and isn’t a security professional. He says he was simply livetweeting everything that happened while trying to control his robot with a PS5 gamepad.

“Yes, I don’t follow the rules, but people stick to the bug bounty program for money. I fucking don’t care, I just want this fixed,” he says. “Following the rules to the end would probably make this breach happen for a way longer time, I think.”


He doesn’t believe that DJI truly discovered these issues by itself back in January, and he’s annoyed the company only ever responded to him robotically in DMs on X, instead of answering his emails.

But he is happy about one thing: He can indeed control his Romo with a PlayStation or Xbox gamepad.


Technology

World’s fastest humanoid robot runs 22 MPH



A full-size humanoid robot just ran faster than most people will ever sprint. 

Chinese robotics firm MirrorMe Technology has unveiled Bolt, a humanoid robot that reached a top speed of 22 miles per hour during real-world testing. This was not CGI or a computer simulation. The footage, shared by the company on X, shows a real humanoid robot running at full speed inside a controlled testing facility.

That milestone makes Bolt the fastest running humanoid robot of its size ever demonstrated outside computer simulations. For robotics, this is a line-crossing moment.




MirrorMe Technology’s humanoid robot Bolt reaches 22 mph during a real-world sprint test inside a controlled facility. (Zhang Xiangyi/China News Service/VCG via Getty Images)

What allows the world’s fastest humanoid robot to run at 22 mph

In the promotional video, the run is shown using a split-screen view. On one side of the screen, Wang Hongtao, the founder of MirrorMe Technology, runs on a treadmill. On the other side, Bolt runs under the same conditions. The comparison makes the difference clear. As the pace increases, Wang struggles to keep up and eventually gives up, while Bolt continues running smoothly, maintaining balance as its stride rate increases.

Bolt takes shorter strides than a human runner but makes up for it with a much faster stride rhythm. That faster rhythm helps the robot stay stable as it accelerates. Engineers say this performance reflects major progress in humanoid locomotion control, dynamic balance and high-performance drive systems. Speed is impressive. Speed with control is the real achievement.
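The trade-off the engineers describe is just arithmetic: running speed is stride length times stride rate, so a shorter stride has to be bought back with faster turnover. A quick sketch with illustrative numbers (Bolt's actual stride data hasn't been published):

```python
MPH_TO_MS = 0.44704  # miles per hour to meters per second

def required_cadence(speed_mph, stride_m):
    """Speed = stride length * stride rate, so the stride rate needed
    to hit a given speed is speed / stride length."""
    return speed_mph * MPH_TO_MS / stride_m

# At 22 mph, a robot with a 1.5 m stride needs roughly 6.6 strides/s,
# while a sprinter covering 2.2 m per stride needs only about 4.5.
robot_cadence = required_cadence(22, 1.5)   # ~6.56 strides/s
human_cadence = required_cadence(22, 2.2)   # ~4.47 strides/s
assert robot_cadence > human_cadence
```

The stride lengths above are assumptions for illustration; the point is only that matching a human's speed on shorter legs forces a much higher cycling rate from the joints and drive system.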

The humanoid robot design choices behind Bolt’s speed

Bolt stands about 5 feet, 7 inches tall and weighs roughly 165 pounds, putting it close to the size and mass of an average adult human. MirrorMe says that similarity is intentional. The company describes this as the ideal humanoid form. 


Rather than oversized limbs or exaggerated mechanics, Bolt relies on newly designed joints paired with a fully optimized power system. The goal is to replicate natural human motion while staying stable at extreme speeds. That combination is what sets Bolt apart.


MirrorMe says Bolt’s 22 mph run highlights stability and control, not just raw speed. ( Cui Jun/Beijing Youth Daily/VCG via Getty Images)

Why Bolt’s sprint reflects years of robotics development

Bolt did not appear overnight. MirrorMe has focused on robotic speed as a long-term priority since 2016. Last year, its Black Panther II robot stunned viewers by sprinting 328 feet in 13.17 seconds during a live television broadcast in China. Reports suggested the performance exceeded comparable tests involving Boston Dynamics machines. 

In 2025, the company also set a record with a four-legged robot that surpassed 22 mph, reinforcing its focus on acceleration, agility and sustained high-speed motion. China’s interest in robotic athletics continues to grow. Beijing even hosted the first World Humanoid Robot Games, where humanoid robots competed in sprint races on a track.


Why MirrorMe says speed is not the end goal

Running at 22 mph grabs attention, but MirrorMe says speed alone is not the point. The engineers behind Bolt care more about what happens at that speed. Balance, reaction time and control matter more than a headline number. Those skills are what let a humanoid robot move like a trained runner instead of a machine on the verge of tipping over.

That is where the athlete angle comes in. MirrorMe envisions Bolt as a training partner that can run alongside elite athletes, hold a steady pace and push limits without getting tired. By matching and slightly exceeding human performance, the robot could help runners fine-tune form, pacing and endurance while collecting precise motion data. In that context, the sprint is not a stunt. It shows how humanoid robots could move beyond demos and into real training and performance settings.

What this means to you

Humanoid robots that can outrun most people are no longer something you only see in demos or concept videos. As these machines get faster and more stable, they start to fit into real-world roles. That includes athletic training, emergency response and physically demanding jobs where speed and endurance make a real difference. At the same time, faster robots bring real concerns. Safety, oversight and clear rules matter even more when machines can move this quickly around people. When robots run this fast, the limits need to be clear.




Engineers say Bolt’s high-speed sprint reflects advances in locomotion control, balance and drive systems. (Photo by Kevin Frayer/Getty Images)

Kurt’s key takeaways

Bolt running at 22 mph is eye-catching, but the speed is not the main takeaway. What matters is what it shows. Robots are starting to move more like people. They can run, adjust and stay upright at speeds that used to knock machines over. That opens the door to real uses, but it also raises real questions. How fast is too fast around people? Who sets the rules? And who is responsible when something goes wrong? The technology is moving quickly. The conversation around it needs to move just as fast.

If humanoid robots can soon outrun and outtrain humans, where should limits be set on how and where they are allowed to operate? Let us know by writing to us at Cyberguy.com.




Copyright 2026 CyberGuy.com. All rights reserved.


Technology

4chan’s creator says ‘Epstein had nothing to do’ with creating infamous far-right board /pol/

Published

on

4chan’s creator says ‘Epstein had nothing to do’ with creating infamous far-right board /pol/

Epstein had nothing to do with the reintroduction of a politics board to 4chan, nor anything else related to the site. The decision to add the board was made weeks beforehand, and the board was added almost 24 hours prior to a first, chance encounter at a social event. His assistant reached out to me afterward, and I met with him one time for an unmemorable lunch meeting. This happened at a time when I was meeting hundreds of people a month while speaking and networking at tech events.

I did not meet him again nor maintain contact. I regret having ever encountered him at all, and have deep sympathy for all of his victims.
