Samsung Galaxy S24 Ultra review: all that and AI

The Galaxy S24 Ultra is a hell of a phone. As always, Samsung has jammed it full of more high-end hardware than you can shake an S Pen at, and this year it’s also packed with cutting-edge AI features. But it’s expensive, and most of my favorite things about it have very little to do with the AI parts, which aren’t even exclusive to the Ultra.

It comes with a display that’s so easy to use outside in bright light that I want every other manufacturer to copy it. Its camera system is one of the best in the game and comes with a fantastic portrait mode. The built-in stylus remains one of the nicest and fanciest ways to make a grocery list.

Some of the AI features really are impressive: live translation for phone calls could be really helpful for someone who makes a lot of calls in an unfamiliar language. Voice recording summaries are surprisingly good and give my beloved Pixel Recorder a real run for its money. And turning any video into slow motion is just plain fun. Are the results always great? No, but they’re usually delightful. 

But battery performance is just okay, and while I appreciate the new flat-screen design, it leaves some sharp corners that can be uncomfortable in your hand. Above all, the Ultra is expensive — now starting at $1,299, a $100 increase from last year’s model. Samsung’s everything-but-the-kitchen-sink device is still the most feature-packed phone money can buy, but I’m just not seeing an extra $100 worth of improvements, especially considering that the AI features will all be ported to the S23 series in a future software update. 

Ever try to use your phone in direct sunlight and find it turns into a mirror? Reflections bounce right off the glass, and whatever you were trying to look at is suddenly invisible. It’s a real pain in the buns, although it’s been less of a problem as OLED screens have gotten brighter over the past few years. The S24 Ultra goes an extra step and introduces a new anti-glare coating that does a fantastic job of cutting those reflections down. 

The 6.8-inch screen peaks at 2,600 nits in bright light, which makes a real difference when you use your phone outside. I prepared myself to squint at the display when I used it in some bright sunshine, so it was a real treat to realize I could see the screen almost as well as I could indoors. There’s a new Gorilla Glass Armor protecting the screen, and it purports to be much more scratch-resistant than previous versions of Gorilla Glass. It’s hard to judge that in just over a week of testing, but so far, so good. 

Samsung followed Apple’s lead on another screen feature: dimming the wallpaper on the always-on display. You can also add a handful of widgets that will continue to display on the AOD even with the phone locked. It’s a straight-up Apple clone, and I have zero problems with that because I crave information and love widgets. Answering “What’s next on my calendar?” is as simple as glancing at my phone screen.

Widgets on an always-on display? A beautiful thing.

The always-on display is handy, but it does seem to take a significant toll on battery performance. At the end of each day of testing, the phone’s battery diagnostics attributed about 7 percent of the day’s usage to the AOD. Overall, battery life on the S24 Ultra isn’t great, but it’s not a disaster by any means — on days of light use, I got to bedtime with around 50 percent in the tank. Heavier days with closer to five hours of screen-on time pushed my review unit’s battery down to 30 percent. Honestly, that’s about average for a flagship phone these days, and I wish performance were a little better across the board.

In any case, plenty of people will get through a full day on the S24 Ultra just fine, though it’s worth remembering your mileage will dwindle over time as the battery ages. If you’re a power user — as I suspect a lot of people interested in this phone are — you might need an afternoon recharge to avoid late-day battery anxiety.

There’s 45W wired and 15W wireless Qi charging available but no Qi2, which is a real disappointment. If you want to live the MagSafe life with the S24 Ultra, you’ll need to pick up a third-party magnetic case — just don’t use the stylus with a magnetic accessory attached — and a MagSafe-compatible charger, since neither MagSafe nor Qi2 is supported natively.

Curved screens are out; flat screens are in.

The S24 Ultra comes equipped with the Snapdragon 8 Gen 3 chipset no matter where you buy it — not true of the standard S24 and S24 Plus, which come with Exynos chips outside of the US. I can’t find anything wrong with performance on the Ultra. It didn’t get excessively hot in my testing, and with 12GB of RAM, it handled everything I threw at it without a problem.

The S24 Ultra is the first Ultra with a flat screen, and I appreciate it — no longer do I fear running the S Pen over the curved edge. The titanium exterior finish is lovely, but this remains an unapologetically big, heavy phone. After the first few times it slipped out of the pocket of my joggers and onto the wood floor with a thud heard ’round the house, I quit carrying it around with me and left it on the dining room table.

Also, this phone is kind of sharp? The corners where the flat parts of the phone meet the curved edges are pointy, and if you don’t get it situated in your hand just right, they’re pretty uncomfortable. I’m not a case person, but I might consider one with the S24 Ultra for this reason.

That’s a whole lot of phone.

The S24 series ships with One UI 6.1, which is Samsung’s take on Android 14. Like Android 14 itself, One UI 6.1 is a relatively light update, and the things that irritate me about Samsung software persist: a push notification urging me to check out the new Galaxy S24 (lol); lots of proprietary apps and features that I don’t have a use for (I refuse to believe Global Goals has inspired anyone to do anything except uninstall Global Goals); and clickbait links stuffed at the bottom of the weather app (“Tourist Finds Large Diamond at State Park,” really?).

You can de-Samsung a lot of this stuff and live a peaceful existence with One UI, and there’s some good news this year: the company is promising seven years of OS upgrades and seven years of security updates. That’s a great proposition for long-term value and for anyone who wants to get the absolute most years out of their phone.

That’s the gist of the regular phone stuff — a massive screen, okay battery life, and performance fitting of a 2024 flagship. So how about the marquee feature: Galaxy AI? Settle in, because there’s a lot to cover. 

AI and all those cameras.

The short version is this: the S24 Ultra has an impressive collection of AI features. But it’s just that — a collection. It doesn’t feel like a unified set of tools with a collective purpose; it feels like a handful of capabilities scattered throughout the system that are excellent at times and baffling at others.

Take live translation: it happens on-device, and it can act as a real-time interpreter on phone calls. I tried it with my colleague Victoria Song, a fluent speaker of Japanese, and she thought it did an adequate job translating our exchange. It’s best suited for short transactional conversations because it gets impatient with pauses. It doesn’t quite get things right when talking more casually, either — I asked how Vee’s cats were doing, and the translator somehow interpreted her Japanese for “Petey is eating my chair” as “I am eating my chair.” Hilarious, but not ideal!

But if you need to call and ask for some information or make a reservation, it would serve the purpose. And that’s kind of amazing — if you live in a country where you don’t speak the language, I can see this being a hugely useful tool. It’s the kind of situation where AI that’s good enough is better than nothing.

The S24 Ultra doesn’t transcribe as you’re recording, but it does generate the transcription fairly quickly and on-device.

The S24 Ultra’s transcript summaries come with suggested keywords, subheads, and timestamps — pretty good! But it does require a trip to the cloud.

Automatic note and voice transcript summaries are in the same category. I put the S24 Ultra’s new voice recorder features up against my beloved Pixel Recorder by reading them both a passage from the closest book I had on hand, Precious Little Sleep, aka the baby sleep Bible. Overall, I was surprised by how well Samsung kept up with the Pixel. It doesn’t transcribe in real time like the Pixel does, but it’s on-device, and it’s relatively quick after the fact; a six-minute recording took about 90 seconds to transcribe in my testing.

Once you have your transcript, you can generate a detailed summary in just a few seconds, complete with subheads and timestamps. It’s not something I’d trust without double-checking the source material, but like call translation, it gives you a starting point — something useful when you’d otherwise have nothing. Samsung’s transcript summaries happen in the cloud, not on-device, so they do require an internet connection. Summarization seems like it’s too much to ask of a phone processor: the Pixel 8 Pro I tested at the same time tried to summarize its recording of the same text, produced one bullet point, and then gave up after chugging for a few minutes.

Then there’s Circle to Search, which is only really an AI feature by association, but spiritually, it feels at home in a discussion about the S24 Ultra’s AI features. It’s a Google feature that’s debuting on the Galaxy S24 and Pixel 8 series and will reportedly come to more high-end Android phones in the future. Basically, it’s Google Lens but everywhere on your phone — any app, anytime. You long-press the home button or navigation handle to engage it, and then a prompt appears to circle the thing you want to know more about. A page of Google results will follow, and you can tap around to learn more or dismiss the whole thing and go about your business. It’s simple, but it kind of feels like how our phones should have been working all along.

Multisearch will prompt you to “add to your search” after an initial query, and that’s where things get interesting.

Google’s improvements to multisearch really make this feature stand out. After you’ve circled something to search for it, you can clarify your search with additional questions using the image as a starting point. Previously, multisearch could only work with basic modifiers, like “blue” to search for a pair of shoes in a particular color. Now, you can ask more complicated questions, and the search results will offer up an answer using generative AI. That’s where I most often found the answer I was after.

There are some obvious situations where Circle to Search makes immediate sense — a friend texts the name of a restaurant, you highlight the name, and without ever leaving the messages app, you can check where it is. I had to sort of unlearn doing this the long way while using the S24 Ultra, and now that I’m used to Circle to Search, I don’t want to go back to the old way.

But when I was looking for more information about something, the first set of results didn’t always clear things up. Searching for a mural I photographed in San Jose brought up a page of similar-looking murals — helpful if you’re making a Pinterest mood board, less helpful if you want to know exactly what you’re looking at. But asking “Where is this?” in multisearch (shh, I knew where it was, I was just testing the computer) got me to the right answer quickly. 

There are more AI features — naturally there are more — but I won’t go into depth about every one of them. You can have generative AI spice up your text messages with emoji or summarize webpages. They work reasonably well but don’t strike me as being quite as useful as the others. Of course, there’s one more place you’ll find AI at work: the Galaxy S24 Ultra’s camera. 

The new 5x telephoto is good, but damn the 10x lens was a great party trick.

First, the numbers. Per usual, there are a boatload of cameras on this phone:

  • Main camera: 200 megapixels, f/1.7, OIS
  • 3x telephoto: 10 megapixels, f/2.4, OIS
  • 5x telephoto: 50 megapixels, f/3.4, OIS
  • Ultrawide: 12 megapixels, f/2.2
  • Selfie: 12 megapixels, f/2.2

They’re the same cameras that are on the S23 Ultra except for one notable substitution — the 10x lens is gone, swapped for a 5x lens coupled with a bigger, higher-res sensor. This is terrible news for me personally because I love that ridiculous 10x lens. You can take pictures of planes! In the sky! It’s such a great party trick. 

The new version uses crop zoom to get to 10x, and Samsung insists that the image quality is just as good as the 10x optical zoom on the previous version. As far as I can tell, that’s mostly true — detail rendering looks about the same. And the S24 Ultra definitely looks better at 5x since the S23 Ultra was using digital zoom at that focal length. I do see more chromatic aberration on some of the S24 Ultra’s 10x images compared to the S23 Ultra’s, which can make certain subjects appear a little fuzzier — this seems to be less of a problem with distant subjects, like the top of a skyscraper. The switch to a 5x zoom hasn’t been a completely victimless crime. But overall, it’s a move that makes sense. The 5x focal length has a lot more practical uses, and Samsung claims it’s used more often than 10x. Fair. 
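(A back-of-envelope check, using the spec-sheet numbers rather than anything Samsung has published about its processing: cropping the central half of the 50-megapixel 5x sensor in each dimension doubles the effective reach to 10x while leaving roughly 50 / 4 ≈ 12.5 megapixels — in the same ballpark as the dedicated 10-megapixel 10x sensor it replaces, which is why the detail can hold up.)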

Galaxy S24 Ultra (left) and S23 Ultra (right) at 10x zoom, both shown at 100 percent magnification. Chromatic aberration is more noticeable on the S24’s newer 5x camera. Tap the links above for the full images.

Otherwise, there aren’t any drastic changes year over year. Samsung is still leaning on the saturation slider, embracing those vivid reds and blues it’s known for. It usually looks nice and occasionally looks bananas. The company made a few tweaks to the tech behind its portrait mode, which is still excellent. Expert RAW now produces 24-megapixel images with data from 50- and 12-megapixel captures. More data, more better, as they say. 

And I’m thrilled to see Samsung fully embrace Ultra HDR — the high dynamic range image format supported in Android 14. You’ll see the Ultra HDR tone mapping in the live image preview as you take your photo, and the S24 series is the first set of devices that lets you upload Ultra HDR photos to Instagram. These are true HDR photos that look more vibrant than the washed-out “HDR” photos we’re used to seeing, which are really just attempts at showing a wider dynamic range on an SDR display. I’m already impatient for third-party app support to come to more phones.
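For the curious, here’s roughly what Ultra HDR looks like from an app’s point of view — a minimal Kotlin sketch, not Samsung’s or Instagram’s actual code, assuming Android 14 (API 34) and a photo path passed in a hypothetical intent extra. The HDR data rides along as a gain map attached to an otherwise ordinary JPEG, and an app opts into rendering it:

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo
import android.graphics.BitmapFactory
import android.os.Bundle
import android.widget.ImageView

// Minimal sketch: detect an Ultra HDR photo and opt the window into HDR rendering.
class UltraHdrViewerActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // "photo_path" is a hypothetical extra used for illustration only.
        val photoPath = intent.getStringExtra("photo_path") ?: return
        val bitmap = BitmapFactory.decodeFile(photoPath) ?: return

        if (bitmap.hasGainmap()) {
            // The gain map is only applied when the window is in HDR color mode;
            // otherwise the image simply renders as its SDR base layer.
            window.colorMode = ActivityInfo.COLOR_MODE_HDR
        }
        setContentView(ImageView(this).apply { setImageBitmap(bitmap) })
    }
}
```

The backward compatibility is the clever part: apps that do none of this still see a normal SDR JPEG, which is why the format can spread without breaking anything.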

Samsung is the company that gave us AI Moon, so naturally, there are a few AI photo and video editing features here. Generative AI edits are available behind a star icon in the native gallery editing interface. You can select objects to move around the frame or erase entirely, and you can adjust the horizon using generative fill to pad out the image rather than cropping in. The edits happen off-device, so you need an internet connection and a little patience.

This is very similar to the generative AI editing tools offered on the Pixel 8 Pro, which isn’t surprising — Samsung is using Google’s models to power just about every AI feature on these phones. But I actually find Samsung’s object selection much easier to use than the Pixel’s. On the S24, you just circle an object you want to select, and on-device AI makes the selection. It’s a little uncanny how good it is. Selecting objects on the Pixel feels a little more fiddly.

With an object selected, you can resize it, move it around the frame, or erase it completely. With a somewhat predictable background like grass or a gravel path, generative AI can fill in the blanks convincingly. Predictably, things get dicey with more complicated edits. I selected a lamp on a table and attempted to erase it from a photo; the AI replaced it with a different lamp. 

It’s a similar story with slow-motion AI videos — impressive if you don’t challenge it too much or look too closely. The S24 Ultra can turn any video into a 120fps slow-motion video — essentially using frame interpolation but leaning on generative AI to fill in the gaps. With the right subject and background, it’s totally convincing. But if you start introducing some complexity, it kind of falls apart. 
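To make “frame interpolation” concrete, here’s a deliberately naive Kotlin sketch — my own illustration, not Samsung’s pipeline — that fabricates an in-between frame by blending two neighboring frames. Plain blending has no concept of motion, so anything that moves just ghosts; that’s exactly the gap a generative model is being asked to fill:

```kotlin
import android.graphics.Bitmap
import android.graphics.Color

// Fabricate one in-between frame at blend factor t (0f..1f) by linearly mixing
// each pixel of two consecutive frames. Slow and motion-blind, by design.
fun blendFrames(prev: Bitmap, next: Bitmap, t: Float = 0.5f): Bitmap {
    require(prev.width == next.width && prev.height == next.height) { "frames must match" }
    fun mix(a: Int, b: Int): Int = (a + (b - a) * t).toInt()

    val out = Bitmap.createBitmap(prev.width, prev.height, Bitmap.Config.ARGB_8888)
    for (y in 0 until prev.height) {
        for (x in 0 until prev.width) {
            val p = prev.getPixel(x, y)
            val n = next.getPixel(x, y)
            out.setPixel(
                x, y,
                Color.argb(
                    mix(Color.alpha(p), Color.alpha(n)),
                    mix(Color.red(p), Color.red(n)),
                    mix(Color.green(p), Color.green(n)),
                    mix(Color.blue(p), Color.blue(n))
                )
            )
        }
    }
    return out
}
```

A real interpolator estimates motion between frames and warps pixels along it, then has to invent plausible detail for whatever gets uncovered — which is where busy textures like mulch and foliage cause trouble.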

I slowed down the above 30fps video of my son on a swing, and you can see that the mulch on the playground and greenery in the background gave the AI some trouble. Is it still an adorable video? Yes. Will his grandparents be delighted by it despite its flaws? Also yes. It feels like a feature right on the edge between “good enough” and “too weird” to be fun. 

The AI puzzle pieces don’t quite make a complete picture.

It’s hard to describe why a device that does so much so well feels like it falls flat. The Galaxy S23 Ultra felt like something truly special — a refinement of a well-balanced formula. But with the S24 Ultra, the math feels a little off.

It’s $100 more expensive, but it’s hard to see an extra $100 worth of value, especially considering the new AI features are shared across the whole S24 series. To be clear, I think it’s a good thing that these features are available on all three phones. But if the Ultra really is the fastest, bestest phone in all the land, shouldn’t it be able to do a little more?

The new anti-glare screen coating is impressive and truly helpful on a bright day. The flat screen is an improvement, and the new titanium exterior looks and feels great. But that’s more or less the extent of this year’s Ultra-only improvements. The AI features, Ultra HDR support, seven years of OS upgrades, updated always-on display — they’re all available on the less pricey S24 and S24 Plus.

That leaves the Ultra with a bigger screen, an S Pen, and a 5x zoom to distinguish it. They’re all great features and sure to please loyal Note / Ultra fans. But the feature-to-price ratio feels just a bit off-balance in a way that it didn’t in the S23 Ultra. 

I wish Samsung had spent a little more time on the less-flashy stuff — improving battery life or making it lighter and more comfortable to use. Heck, I’d be thrilled if Samsung spent time on a little housekeeping on the features that have piled up over the years. Bixby Vision, Google Lens, and Circle to Search all exist on this phone. What are we doing here? 

The Galaxy S24 Ultra remains the absolute most phone. Massive screen, S Pen, all of the cameras, performance out the wazoo — it really has no peer. I just wish that it felt a little more worthy of its price bump when the MSRP was already sky-high before. This is an Ultra phone, alright. I just wish it came with a little extra. 

Photography by Allison Johnson / The Verge

YouTube made its video player easier to navigate on TVs

The YouTube watch screen has been given a new look on TVs. The redesign aims to provide a “more intuitive experience with easier navigation,” according to YouTube’s announcement, relocating the video title and several controls, and adding a new “Description” button to access creator information and other video features.

I’m already seeing the update on my own Nvidia Shield Pro streaming box and on my Philips TV’s native OS, and I do think it makes it easier to find specific video features and controls. My colleague Thomas Ricker says he isn’t seeing the redesign in the Apple TV YouTube app, however, so the changes may still be rolling out. They’re also pretty delayed, considering YouTube announced in April that they would arrive “this summer.”

Videos on the YouTube app for TV will now show the title in the top left corner of the screen instead of just above the video scrubber at the bottom of the page, and the title can no longer be clicked to open comments, metadata, and information about the creator. Instead, those controls are now available by clicking the new “Description” button. The channel thumbnail and subscribe function have also been separated into two buttons, with the creator’s thumbnail now taking users directly to their channel.

Controls have been reorganized into distinct groups under the video scrubber: Channel, Description, and Subscribe on the left, Previous, Pause/Play, and Next in the center, and Like, Dislike, Comment, Save, Closed Captions, and Settings placed into two groups on the right. YouTube says the Subscribe button will remain visible to subscribers, adapting to flag pay-gated content or alert users to new live streams. A “Multiview” control has also been added for live sports content, while Music and Premium subscribers will see a new “Display Mode” control.

Android Emergency Live Video gives 911 eyes on the scene

Holiday travel and winter storms create risky moments for drivers and families. Stress rises fast during emergencies, and describing the scene to 911 can feel overwhelming. 

Now, a new Android feature closes that gap by providing live visual information that helps responders act with speed and accuracy.

If you use an iPhone, Apple offers a similar tool through its Emergency SOS Live Video feature.

Android Emergency Live Video gives 911 a secure live view of the scene, so responders understand what is happening right away. (Cyberguy.com)

What Android Emergency Live Video does

Google is rolling out Android Emergency Live Video to give dispatchers a secure view of the scene during an active call or text. A dispatcher can request a live video stream through your phone when it is safe for you to share it. With a single tap, you can stream real-time video that helps responders understand what is happening.

This can help during car accidents, medical emergencies or fast-moving hazards such as wildfire conditions. Live video can also help dispatchers guide you through steps that save lives, such as CPR, until responders arrive.

How the Android Emergency Live Video feature works

Android designed this tool to work with no setup. When you call or text 911, the dispatcher reviews the situation. If they decide video would help, they will send a request to your phone. You see a clear prompt that lets you choose whether to start the secure stream. The feature uses encryption and gives you full control. You can stop sharing at any moment.

The feature works on Android phones running Android 8 or newer with Google Play services. It is rolling out across the U.S. and select regions in Germany and Mexico. Google plans to expand coverage with more public safety partners.

How to use Emergency Live Video on Android

You cannot turn this feature on in advance. It appears only during an active 911 call or text.

1) Call or text 911 on your Android phone. The dispatcher reviews your situation.

2) Watch for a request on your screen. If the dispatcher decides live video will help, they send a prompt to your device.

3) Tap the notification that appears. You will see a clear message asking if you want to share live video.

4) Choose Share video to start streaming. This opens your camera and begins a secure live feed.

5) Tap Stop sharing at any time. You stay in control the entire time and can end the video at any time.

With one tap, you can choose to share real-time video during a 911 call or text, which gives dispatchers the clarity they need to guide you. (CyberGuy.com)

Why Emergency Live Video on Android matters now

Emergencies create confusion. Sharing details verbally takes time and can lead to miscommunication. Video removes guesswork. Responders gain clarity in seconds, which can speed up help and improve outcomes. This tool builds on Android’s safety features, including Satellite SOS, Fall Detection and Car Crash Detection.

Alastair Breeze, a Software Engineer for Android, tells CyberGuy that the team built this feature with one goal in mind. “Providing people peace of mind is at the core of Android’s safety mission. Android Emergency Live Video gives you the ability to securely share real-time video to provide dispatchers the critical eyes-on-scene context they need to assist in emergencies.”

What this means to you

If you carry an Android phone, this feature adds another layer of protection during moments that demand quick action. You stay in control of when the video is shared. You also get a simple way to show the situation when describing it feels impossible. Faster clarity can lead to faster help, which can shape how an emergency ends.

The feature works on Android phones running Android 8 or newer and helps responders act faster during emergencies when seconds matter. (Tony Giberson/tgiberson@pnj.com / USA TODAY)

Kurt’s key takeaways

Android Emergency Live Video brings real-time awareness to moments when every second matters. It gives responders a clear view, so they can guide you through urgent steps if necessary. Most of all, it adds peace of mind during situations no one plans for.

Would you feel comfortable sharing live video during an emergency if it helped responders reach you faster? Let us know by writing to us at Cyberguy.com.

The Game Awards 2025: all the news and announcements

The Game Awards are back once again to showcase a metric ton of commercials, provide the gaming public with their annual dose of Muppets, and validate gamers’ opinions on which title should be named the Game of the Year. I don’t wanna say it’s a foregone conclusion what this year’s GOTY will be — Silksong may surprise us — but it’s pretty obvious that Clair Obscur: Expedition 33 is the frontrunner, and for good reason. It’s netted 12 nominations, the most of this year’s contenders, including all five craft awards (Direction, Art, Music and Score, Narrative, and Audio Design).

On the announcements side, Crystal Dynamics and Amazon Games are planning something related to the Tomb Raider series. Geoff Keighley also probably had plans to reveal big news about Resident Evil: Requiem, but unfortunately it got spoiled early thanks to some leaked key art on the PlayStation Store. Here’s all the news, announcements, and trailers from The Game Awards 2025.
