Technology

The future of AI gadgets is just phones

At any given time, there are between five and eight phones on my desk. And by “my desk,” I mean any combination of tables and countertops throughout my house. So when I watched the Humane AI Pin reviews start pouring in last week, I did what any logical person would do: grab the closest phone and try to turn it into my own AI wearable.

Humane would like you to believe that its AI Pin represents consumer tech at its most cutting edge. The reviews and the guts of the pin say otherwise: it uses a Snapdragon processor from four years ago and seems to run a custom version of Android 12.

“It’s a midrange Android phone!” I declared at our next team meeting, waving around a midrange Android phone for effect. “You could just download Gemini and stick this to your shirt!” Simple. Trivial. Give me 10 minutes, and I’ll have a more powerful AI gadget whipped up, I said.

Hardware is hard, y’all.

Ideally, I wanted an outward-facing camera and a decent voice assistant I could use hands-free. An iPhone in a shirt pocket was an intriguing solution but a nonstarter because a) none of my shirts have pockets, and b) Siri is just not that smart. Thus, my earliest prototype was a Motorola Razr Plus clamped to the neckline of my shirt. This, unsurprisingly, did not work, but for reasons I did not anticipate.

First off, you can’t download Gemini from the Play Store on a folding phone. That was news to me. But even once I’d sideloaded it and set it as the default assistant, I ran into another barrier: it’s really hard to use a voice assistant from the cover screen of a flip phone. The Razr wants you to flip the phone open before you can do anything aside from get its attention with “Hey Google.” 

The things we do for content.
Photo by Allison Johnson / The Verge

Running Gemini in Chrome on the cover screen actually got me closer to what I was looking for. But trying to tap buttons on the screen to trigger the assistant wasn’t working very well, and neither was operating Google Lens out of the corner of my eye. Also, Gemini misread “recycle” on a tube of toothpaste as “becicle,” which it confidently told me was an old-timey word for eyeglasses. It is not!

Prototype two was the same Razr flip phone running ChatGPT in conversation mode on the cover screen. This meant the app was constantly running and always listening, so it wasn’t practical. But I gave it a shot anyway, and it was a strange experience talking to an AI chatbot that I couldn’t see. 

I want an AI that can do things for me, not just brainstorm stir-fry ingredients

ChatGPT is a decent conversationalist, but we ran out of things to talk about pretty quickly once I’d exhausted my chatbot go-to’s: dinner recipes and plant care tips. I want an AI that can do things for me, not just brainstorm stir-fry ingredients.

I ditched the foldable concept and picked up a Pixel 8 and a Pixel Watch 2 instead. I set up Gemini as the default assistant on the phone and figured that would somehow apply to the watch, too. Wrong. I had one more card to play, though: a good old pair of wireless earbuds. Life on the cutting edge of technology, baby.

Honestly, earbuds might be the AI wearable of the future.
Photo by Chris Welch / The Verge

You know what, though? It kind of worked. I had to leave Gemini open and running on my phone since Google doesn't fully support Gemini Assistant on headphones. But I took a picture of a Blue Apron recipe I was making for dinner, told Gemini to remember it, and left my phone on the counter. As I moved around the kitchen, I asked Gemini questions I'd normally have to peek back at the recipe to answer, like "How long do I roast the vegetables for?" and "How do I prep the fish?" It gave me the right answers every time.

What was more impressive was that I could ask it tangential questions. It helped me use pantry ingredients to recreate a seasoning mix I didn't have on hand. I asked why the recipe might have me divide the sauce into two portions, and it gave me a plausible answer. And it did something the Humane pin can't do yet: set a timer.

It wasn't perfect. First, I had to unplug the Google Home puck sitting on the counter because it kept trying to butt in. Gemini also told me that it couldn't play an album on Spotify, something that very Google Home speaker has been doing for the better part of a decade. The watch came in handy for that, at least.

What started as a goofy stunt has convinced me of two things: I really do think we’re going to use AI to get more things done in the future, and also, the future of AI gadgets is just phones. It’s phones! 

I love a gadget, but guys, I lived through the era of camera companies trying to convince us that we all needed to carry a compact camera and our phones everywhere. Phones won. Phones already come with powerful processors, decent heat dissipation, and sophisticated wireless connectivity. An AI gadget that operates independently from your phone has to figure all of that out.

And you know what looks a lot less doofy than a pin with a laser on your chest? Earbuds. People willingly wear them throughout the day right now. And the doofy factor definitely matters when it comes to wearables. I’m having a hard time seeing how a separate gadget can beat the humble phone plus a pair of earbuds or something like the Meta Ray Bans. Maybe there’s room in our lives and our pockets for dedicated AI hardware — the gadget lover in me is all for it. But I think it’s more likely that we have all of the ingredients we need to make good AI hardware right in front of us.

Technology

Use this map to find the data centers in your backyard

When Oregon resident Isabelle Reksopuro heard Google was gobbling up public land to fuel its data centers in her home state, she didn’t initially know what to believe. “There’s a lot of misinformation about data centers,” she said. “Google has denied taking that land.”

Technically, she explains, The Dalles, a city near the Washington state border, sought to reclaim that land, “and Google is just a big, unnamed power user.” The city had in fact asked for ownership of a 150-acre portion of Mount Hood National Forest, claiming it needs access to Mount Hood’s watershed to meet municipal needs as its population — 16,010 as of the 2020 census — grows. But critics, including environmentalists, say the city is trying to secure more water for Google, which has a sprawling data center campus in The Dalles that already consumes about one-third of the city’s water supply.

This controversy made Reksopuro curious about the backlash to data centers being built in other communities. So Reksopuro, a student at the University of Washington who studies the connections between tech and public policy, decided to map it out. Using information collected by Epoch AI and data scraped from legislation on data centers, she built an interactive map tracking AI policy around the world. She designed it to be simple enough for anyone to use. “I wanted it to be something that my younger sisters could play through and explore to understand what are the data centers in the area and what’s actually being done about it,” Reksopuro said. She hoped to shift their opinions that way, “instead of like, through TikTok.”

Four times a day, the map searches for new sources and checks them against the existing database Reksopuro built out. “Once it does that, it will write a new summary, add it to the news feed, and populate it on the sidebar,” she said. “I wanted it to be self-updating, since I’m also a student.”
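Reksopuro hasn't shared her implementation, but the loop she describes (search for new sources, check them against the existing database, write a summary, publish to the feed) maps onto a simple scheduled job. The sketch below is purely illustrative: the function names, the SQLite store, and the placeholder fetch step are assumptions, not her actual stack.

```python
import sqlite3
from dataclasses import dataclass


@dataclass
class Source:
    url: str
    title: str


def fetch_new_sources() -> list[Source]:
    # Placeholder for the search step; a real version would query a news API
    # or scrape legislative trackers for fresh data center coverage.
    return [Source("https://example.com/dc-bill", "Example data center bill")]


def update_map(db_path: str) -> list[str]:
    """One update cycle: dedupe fetched sources against the existing
    database, summarize anything new, and return entries for the feed."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sources (url TEXT PRIMARY KEY, title TEXT)"
    )
    feed_entries = []
    for src in fetch_new_sources():
        seen = conn.execute(
            "SELECT 1 FROM sources WHERE url = ?", (src.url,)
        ).fetchone()
        if seen is None:
            conn.execute("INSERT INTO sources VALUES (?, ?)", (src.url, src.title))
            # A real pipeline would call a summarization model here.
            feed_entries.append(f"New policy source: {src.title}")
    conn.commit()
    conn.close()
    return feed_entries
```

Running this four times a day is then just a cron entry or a cloud scheduler trigger; deduping on the URL primary key is what keeps reruns from reposting old items.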

Reksopuro isn’t against data centers, but she thinks tech giants benefit from a lack of transparency around data center policies. “Right now, it’s this really opaque thing — and all of a sudden, there’s a facility,” she said. “I think that if people knew about data centers beforehand, it would give them leverage. They would be able to negotiate: ask for job training programs, tax revenue, environmental monitoring, things to improve their community.”


Technology

Fox News AI Newsletter: Graduation speaker praises AI, gets instantly booed



Welcome to Fox News’ Artificial Intelligence newsletter with the latest AI technology advancements.

IN TODAY’S NEWSLETTER:

– UCF graduates clobber commencement speaker with boos after she says AI is the ‘next Industrial Revolution’

– OPINION: DIRECTOR KASH PATEL: We brought the FBI out of the past and into the AI age

– OpenAI backs creation of global AI governance body led by the U.S. that would include China as a member

TOUGH CROWD: During a recent commencement ceremony at the University of Central Florida, a speaker was met with loud boos from the graduating class after declaring that artificial intelligence represents the next industrial revolution. Fox News Digital reporting captures this tense cultural moment, illustrating the mixed public sentiment and skepticism surrounding AI’s growing footprint in daily life.

A statue on the campus of the University of Central Florida in Orlando, Florida. (iStock)

BADGE MEETS BYTE: Reflecting on the modernization of national security in a Fox News op-ed, FBI Director Kash Patel explores how the bureau must adapt its strategies to address modern threats and advance into the artificial intelligence age.

TECH DIPLOMACY: OpenAI is throwing its support behind the establishment of a new global artificial intelligence governance organization that would be led by the United States while notably including China as a member. Fox News Digital reporting examines the geopolitical dynamics and regulatory implications of this proposed framework as global powers race to set the standards for AI development.

EQUITY ELEVATION: The massive wave of wealth generated by the explosive growth of ChatGPT and the broader AI industry is driving a sudden surge in the San Francisco Bay Area’s luxury real estate market. Fox News Digital reporting breaks down how the influx of new tech capital is reshaping local housing dynamics and fueling a high-end property frenzy.

FBI Director Kash Patel listened as Acting Attorney General Todd Blanche spoke during a press conference at the Department of Justice on April 28, 2026, in Washington, D.C. (Tasos Katopodis/Getty Images)

STRATEGY RESET: Tech giant Cisco is planning to eliminate thousands of jobs as the company shifts its primary focus to accelerate its artificial intelligence initiatives, a move that comes despite the company beating earnings expectations. Fox News Digital reporting details the corporate restructuring and broader economic trends pushing legacy tech firms to aggressively pivot toward AI.

ROAD HAZARD: Waymo is issuing a sweeping recall of its autonomous vehicle fleet following a concerning incident that highlighted significant safety issues with the self-driving technology. Fox News Digital reporting outlines the specifics of the recall, the nature of the safety flaw, and what this setback means for the future of fully autonomous transportation on public roads.

BOTS IN THE BAY: A newly developed, artificial intelligence-powered robot has been engineered to seamlessly change and balance vehicle tires without human intervention. Fox News Digital reporting showcases this latest innovation, exploring how automation and AI mechanics could soon revolutionize the automotive service and repair industry.

OpenAI CEO Sam Altman speaks during the 2026 Infrastructure Summit in Washington, D.C., on March 11, 2026. (Kylie Cooper/Reuters)

 


Technology

Microsoft’s Edge Copilot update uses AI to pull information from across your tabs


Microsoft Edge is adding a new feature that will allow its Copilot AI chatbot to gather information from all of your open tabs. When you start a conversation with Copilot, you can ask the chatbot questions about what’s in your tabs, compare the products you’re looking at, summarize your open articles, and more.

In its announcement, Microsoft says you can “select which experiences you want or leave off the ones you don’t.” The company is retiring Copilot Mode as well, which could similarly draw information from your tabs but offered some agentic features, like the ability to book a reservation on your behalf. Microsoft has since folded these agentic capabilities into its “Browse with Copilot” tool.

Several other AI features are coming to Edge, including an AI-powered “Study and Learn” mode that can turn the article you’re looking at into a study session or interactive quiz. There’s a new tool that turns your tabs into AI-powered podcasts as well, similar to what you’d find on NotebookLM, and an AI writing assistant that will pop up when you start entering text on a webpage.

You can also give Copilot permission to access your browsing history to provide more “relevant, high-quality answers,” according to Microsoft. Copilot in Edge on desktop and mobile will come with “long-term memory” as well, which can tailor its responses based on your previous conversations. And, when you open up a new tab, you’ll see a redesigned page that combines chat, search, and web navigation, along with the Journeys feature, which uses AI to organize your browsing history into categories that you can revisit.

Meanwhile, an update to Edge's mobile app will allow you to share your screen with Copilot and talk through questions about what you're seeing. Microsoft says you'll see "clear visual cues" when Copilot is active, "so you know when it's taking an action, helping, listening, or viewing."
