Inside Netflix’s bet on advanced video encoding

Anne Aaron just can’t help herself.

Aaron, Netflix’s senior encoding technology director, was watching the company’s livestream of the Screen Actors Guild Awards earlier this year. And while the rest of the world marveled at all those celebrities and their glitzy outfits sparkling in a sea of flashing cameras, Aaron’s mind immediately started to analyze all the associated visual challenges Netflix’s encoding tech would have to tackle. “Oh my gosh, this content is going to be so hard to encode,” she recalled thinking when I recently interviewed her in Netflix’s office in Los Gatos, California.

Aaron has spent the past 13 years optimizing the way Netflix encodes its movies and TV shows. The work she and her team have done allows the company to deliver better-looking streams over slower connections and has resulted in 50 percent bandwidth savings for 4K streams alone, according to Aaron. Netflix’s encoding team has also contributed to industrywide efforts to improve streaming, including the development of the AV1 video codec and its eventual successor.

Now, Aaron is getting ready to tackle what’s next for Netflix: Not content with just being a service for binge-watching, the company ventured into cloud gaming and livestreaming last year. So far, Netflix has primarily dabbled in one-off live events like the SAG Awards. But starting next year, the company will stream WWE RAW live every Monday. The streamer nabbed the wrestling franchise from Comcast’s USA Network, where it has long been the No. 1 rated show, regularly drawing audiences of around 1.7 million viewers. Satisfying that audience week after week poses some very novel challenges.

“It’s a completely different encoding pipeline than what we’ve had for VOD,” Aaron said, using industry shorthand for on-demand video streaming. “My challenge to (my) team is to get to the same bandwidth requirements as VOD but do it in a faster, real-time way.”


To achieve that, Aaron and her team have to basically start all over and disregard almost everything they’ve learned during more than a decade of optimizing Netflix’s streams — a decade during which Netflix’s video engineers re-encoded the company’s entire catalog multiple times, began using machine learning to make sure Netflix’s streams look good, and were forced to tweak their approach when a show like Barbie Dreamhouse Adventures tripped up the company’s encoders.

When Aaron joined Netflix in 2011, the company was approaching streaming much like everyone else in the online video industry. “We have to support a huge variety of devices,” said Aaron. “Really old TVs, new TVs, mobile devices, set top boxes: each of those devices can have different bandwidth requirements.”

To address those needs, Netflix encoded each video with a bunch of different bitrates and resolutions according to a predefined list of encoding parameters, or recipes, as Aaron and her colleagues like to call them. Back in those days, a viewer on a very slow connection would automatically get a 240p stream with a bitrate of 235 kbps. Faster connections would receive a 1750 kbps 720p video; Netflix’s streaming quality topped out at 1080p with a 5800 kbps bitrate. 

The company’s content delivery servers would automatically choose the best version for each viewer based on their device and broadband speeds and adjust the streaming quality on the fly to account for network slow-downs.
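The fixed-ladder scheme described above is easy to sketch in code. The bitrate and resolution pairs below come straight from the article; the selection function is a simplified illustration of what an adaptive-streaming client does, not Netflix's actual logic.

```python
# Netflix's early fixed encoding ladder, as described above.
# (bitrate in kbps, resolution): one entry per pre-encoded "recipe".
FIXED_LADDER = [
    (235, "240p"),
    (1750, "720p"),
    (5800, "1080p"),
]

def pick_stream(measured_kbps, ladder=FIXED_LADDER):
    """Return the highest-bitrate rung that fits the measured bandwidth,
    falling back to the lowest rung on very slow connections."""
    viable = [rung for rung in ladder if rung[0] <= measured_kbps]
    return max(viable) if viable else ladder[0]

print(pick_stream(300))   # a slow connection gets the 240p stream
print(pick_stream(8000))  # a fast connection gets the 1080p stream
```

A real client re-runs this decision continuously as its throughput estimate changes, which is how quality adjusts on the fly during network slowdowns.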

To Aaron and her eagle-eyed awareness of encoding challenges, that approach seemed inadequate. Why spend the same bandwidth to stream something as visually complex as an action movie with car chases (lots of motion) and explosions (flashing lights and all that noisy smoke) as much simpler visual fare? “You need less bits for animation,” explained Aaron. 


My Little Pony, which was a hit on the service at the time, simply didn’t have the same visual complexity as live-action titles. It didn’t make sense to use the same encoding recipes for both. That’s why, in 2015, Netflix began re-encoding its entire catalog with settings fine-tuned per title. With this new, title-specific approach, animated fare could be streamed in 1080p with as little as 1.5 Mbps.
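The per-title idea can be sketched as a function from a title's measured complexity to a bitrate. The 1.5 Mbps figure for 1080p animation comes from the article; the complexity thresholds and the mid-tier bitrate are hypothetical, for illustration only.

```python
def bitrate_for_1080p(complexity):
    """Map a normalized complexity score (0 = flat animation,
    1 = high-motion live action) to a 1080p bitrate in kbps.
    Thresholds and the mid-tier value are invented for illustration."""
    if complexity < 0.3:
        return 1500   # simple animation: 1080p at ~1.5 Mbps, per the article
    if complexity < 0.7:
        return 3500   # mid-complexity fare (hypothetical)
    return 5800       # ceiling of the old one-size-fits-all ladder

print(bitrate_for_1080p(0.1))  # My Little Pony territory
print(bitrate_for_1080p(0.9))  # car chases and explosions
```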

She-Ra and the Princesses of Power is another good example of an animated show with fairly simple visuals compared to live-action fare.
Image: Netflix

Switching to per-title encoding resulted in bandwidth savings of around 20 percent on average — enough to make a notable difference for consumers in North America and Europe, but even more important as Netflix was eyeing its next chapter: in January of 2016, then-CEO Reed Hastings announced that the company was expanding into almost every country around the world — including markets with subpar broadband infrastructure and consumers who primarily accessed the internet from their mobile phones.

Per-title encoding has since been adopted by most commercial video technology vendors, including Amazon’s AWS, which used the approach to optimize PBS’s video library last year. But while Netflix’s encoding strategy has been wholeheartedly endorsed by streaming tech experts, it has been largely met with silence by Hollywood’s creative class.

Directors and actors like Judd Apatow and Aaron Paul were up in arms when Netflix began to let people change the playback speed of its videos in 2019. Changes to the way it encodes videos, on the other hand, never made the same kinds of headlines. That may be because encoding algorithms are a bit too geeky for that crowd, but there’s also a simpler explanation: the new encoding scheme was so successful at saving bandwidth without compromising on visual fidelity that no one noticed the difference. 


Make that almost no one: Aaron quickly realized that the company’s per-title encoding approach wasn’t without faults. One problem became apparent to her while watching Barbie Dreamhouse Adventures. It’s one of those animated Netflix shows that was supposed to benefit the most from a per-title approach.

However, Netflix’s new encoding struggled with one particular scene. “There’s this guy with a very sparkly suit and a sparkly water fountain behind him,” said Aaron. The scene looked pretty terrible with the new encoding rules, which made her realize that they needed to be more flexible. “At (other) parts of the title, you need less bits,” Aaron said. “But for this, you need to increase it.”

That’s a lot of glitter to properly encode.
Screenshot: Netflix

The solution to this problem was to get a lot more granular during the encoding process. Netflix began to break down videos by shots and apply different encoding settings to each individual segment in 2018. Two people talking in front of a plain white wall were encoded with lower bit rates than the same two people taking part in a car chase; Barbie hanging out with her friends at home required less data than the scene in which Mr. Sparklesuit shows up.

As Netflix adopted 4K and HDR, those differences became even more stark. “(In) The Crown, there’s an episode where it’s very smoky,” said Aaron. “There’s a lot of pollution. Those scenes are really hard to encode.” In other words: they require more data to look good, especially when shown on a big 4K TV in HDR, than less visually complex fare.


Aaron’s mind never stops looking for those kinds of visual challenges, no matter whether she watches Netflix after work or goes outside to take a walk. This has even caught on with her kids, with Aaron telling me that they occasionally point at things in the real world and shout: “Look, it’s a blur!”

It’s a habit that comes with the job and a bit of a curse, too — one of those things you just can’t turn off. During our conversation, she picked up her phone, only to pause and point at the rhinestone-bedazzled phone case. It reminded her of that hard-to-encode scene from Barbie Dreamhouse Adventures. Another visual challenge!

Still, even an obsessive mind can only get you so far. For one thing, Aaron can’t possibly watch thousands of Netflix videos and decide which encoding settings to apply to every single shot. Instead, her team compiled a few dozen short clips sourced from a variety of shows and movies on Netflix and encoded each clip with a range of different settings. They then let test subjects watch those clips and grade the visual imperfections from not noticeable to very annoying. “You have to do subjective testing,” Aaron said. “It’s all based on ground truth, subjective testing.”
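The grading procedure Aaron describes is standard subjective testing: each viewer rates a clip on an impairment scale running from "not noticeable" to "very annoying," and the ratings are averaged into a mean opinion score (MOS) per encoding setting. The clip names and ratings below are invented for illustration.

```python
# The five-point impairment scale used in subjective video testing,
# spanning the "not noticeable" to "very annoying" range from the article.
IMPAIRMENT_SCALE = {
    5: "imperceptible",
    4: "perceptible but not annoying",
    3: "slightly annoying",
    2: "annoying",
    1: "very annoying",
}

def mean_opinion_score(ratings):
    """Average a list of per-viewer ratings into one MOS."""
    return sum(ratings) / len(ratings)

# Invented ratings for one clip encoded at two different settings.
ratings = {
    "glitter_scene@4000kbps": [5, 4, 5, 4],
    "glitter_scene@1500kbps": [3, 2, 3, 2],
}

for setting, scores in ratings.items():
    print(setting, mean_opinion_score(scores))
```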

London’s thick smog of the early ’50s in The Crown made for another encoding challenge.
Screenshot: Netflix

The insights gained this way have been used by Netflix to train a machine learning model that can analyze the video quality of different encoding settings across the company’s entire catalog, which helps to figure out the optimal settings for each and every little slice of a show or movie. The company collaborated with the University of Southern California on developing these video quality assessment algorithms and open-sourced them in 2016. Since then, they have been adopted by much of the industry as a way to analyze streaming video quality and even earned Netflix an Emmy Award. All the while, Aaron and her team have worked to keep up with Netflix’s evolving needs, like HDR.
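The model the article describes is VMAF (Video Multimethod Assessment Fusion). Its core idea is to fuse several elementary quality features into a single 0-100 score using a regressor trained on exactly the kind of subjective scores gathered above. The toy below substitutes a plain weighted sum with invented feature values and weights; the real VMAF uses an SVM-based regressor over features such as multi-scale VIF and a motion measure.

```python
def toy_vmaf(features, weights):
    """Fuse per-frame feature values (each normalized to [0, 1]) into a
    0-100 quality score with a weighted average. Real VMAF replaces this
    weighted average with a trained SVM regressor."""
    fused = sum(weights[name] * value for name, value in features.items())
    return 100 * fused / sum(weights.values())

# Invented feature values for one frame, and invented fusion weights.
frame_features = {"vif": 0.92, "detail_loss": 0.88, "motion": 0.75}
weights = {"vif": 0.5, "detail_loss": 0.3, "motion": 0.2}

print(round(toy_vmaf(frame_features, weights), 1))
```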


“We had to develop yet another metric to measure the video quality for HDR,” Aaron said. “We had to run subjective tests and redo that work specifically for HDR.” This eventually allowed Netflix to encode HDR titles with per-shot-specific settings as well, which the company finally did last year. Now, her team is working on open-sourcing HDR-based video quality assessment.

Slicing up a movie by shot and then encoding every slice individually to make sure it looks great while also saving as much bandwidth as possible: all of this work happens independently of the video codecs Netflix uses to encode and compress these files. It’s kind of like how you might change the resolution or colors of a picture in Photoshop before deciding whether to save it as a JPEG or a PNG. However, Netflix’s video engineers have also actively been working on advancing video codecs to further optimize the company’s streams.

Netflix is a founding member of the Alliance for Open Media, whose other members include companies like Google, Intel, and Microsoft. Aaron sits on the board of the nonprofit, which has spearheaded the development of the open, royalty-free AV1 video codec. Netflix began streaming some videos in AV1 to Android phones in early 2020 and has since expanded to select smart TVs and streaming devices as well as iPhones. “We’ve encoded about two-thirds of our catalog in AV1,” Aaron said. The percentage of streaming hours transmitted in AV1 is “in the double digits,” she added.

And while the roll-out of AV1 continues, work is already underway on its successor. It might take a few more years before devices actually support that next-gen codec, but early results suggest that it will make a difference. “At this point, we see close to 30 percent bit rate reduction with the same quality compared to AV1,” Aaron explained. “I think that’s very, very promising.”
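The quoted gain translates directly into bandwidth: at the same quality, a stream that needs a given bitrate in AV1 would need roughly 30 percent less in the next-generation codec. The baseline bitrate below is hypothetical; only the 30 percent figure comes from the article.

```python
av1_kbps = 2000                   # hypothetical AV1 stream bitrate
reduction_percent = 30            # "close to 30 percent", per Aaron

# Integer arithmetic keeps the result exact for this back-of-the-envelope.
next_gen_kbps = av1_kbps * (100 - reduction_percent) // 100

print(next_gen_kbps)              # same quality at a lower bitrate
```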

Meridian was a short film made by Netflix specifically to test and train codecs and algorithms for streaming.
Screenshot: Netflix

While contributing to the development of new video codecs, Aaron and her team stumbled across another pitfall: video engineers across the industry have been relying on a relatively small corpus of freely available video clips to train and test their codecs and algorithms, and most of those clips didn’t look at all like your typical Netflix show. “The content that they were using that was open was not really tailored to the type of content we were streaming,” recalled Aaron. “So, we created content specifically for testing in the industry.”

In 2016, Netflix released a 12-minute 4K HDR short film called Meridian that was supposed to remedy this. Meridian looks like a film noir crime story, complete with shots in a dusty office with a fan in the background, a cloudy beach scene with glistening water, and a dark dream sequence that’s full of contrasts. Each of these shots has been crafted for video encoding challenges, and the entire film has been released under a Creative Commons license. The film has since been used by the Fraunhofer Institute and others to evaluate codecs, and its release has been hailed by the Creative Commons foundation as a prime example of “a spirit of cooperation that creates better technical standards.”

Cutting-edge encoding strategies, novel quality metrics, custom-produced video assets, and advanced codecs: in many ways, Netflix has been leading the industry when it comes to delivering the best-looking streams in the most efficient ways to consumers. That’s why the past 14 months have been especially humbling.

Netflix launched its very first livestream in March of 2023, successfully broadcasting a Chris Rock comedy special to its subscribers. A month later, it tried again with a live reunion event for its reality show Love Is Blind — and failed miserably, with viewers waiting for over an hour for the show to start.

The failed livestream was especially embarrassing because it tarnished the image of Netflix as a technology powerhouse that is light-years ahead of its competition. Netflix co-CEO Greg Peters issued a rare mea culpa later that month. “We’re really sorry to have disappointed so many people,” Peters told investors. “We didn’t meet the standard that we expect of ourselves to serve our members.”


Netflix wants to avoid further such failures, which is why the company is playing it safe and moving slowly to optimize encoding for live content. “We’re quite early into livestreaming,” Aaron said. “For now, the main goals are stability, resilience of the system, and being able to handle the scale of Netflix.” In practice, this means that Aaron’s team isn’t really tweaking encoding settings for those livestreams at all for the time being, even if it forces her to sit through the livestream of the SAG Awards show without being able to improve anything. “We’re starting with a bit more industry-standard ways to do it,” she told me. “And then from there, we’ll optimize.”

The same is true in many ways for cloud gaming. Netflix began to test games on TVs and desktop computers last summer and has since slowly expanded those efforts to include additional markets and titles. With games being rendered in the cloud as opposed to on-device, cloud gaming is essentially a specialized form of livestreaming, save for one crucial distinction. “They’re quite different,” said Aaron. “[With] cloud gaming, your latency is even more stringent than live.”

Monday Night RAW is coming to Netflix next year and will bring with it even more opportunities to challenge the streamer’s video encoding technology.
Photo: WWE/Getty Images

Aaron’s team is currently puzzling over different approaches to both problems, which requires them to ignore much of what they’ve learned over the past decade. “The lesson is not to think about it like VOD,” Aaron said. One example: slicing and dicing a video by shot and then applying the optimal encoding setting for every shot is a lot more difficult when you don’t know what happens next. “With live, it’s even harder to anticipate complex scenes,” she said.
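The look-ahead problem is easy to see in code. Offline, the encoder can normalize each segment's complexity against the whole title; live, it can only normalize against what it has already seen, so a modest scene can look like the hardest thing yet. The segment complexities and the windowing rule here are invented for illustration.

```python
# Per-segment complexity over time (invented values); the real spike,
# a car chase, arrives at index 3.
complexities = [0.2, 0.2, 0.3, 0.9, 0.8, 0.2]

def vod_budget(i, xs):
    """Offline: normalize segment i against the whole title's peak."""
    return xs[i] / max(xs)

def live_budget(i, xs, window=3):
    """Online: normalize segment i against a rolling window of the past."""
    seen = xs[max(0, i - window + 1): i + 1]
    return xs[i] / max(seen)

# Segment 2 is modest in the grand scheme (0.3 versus a later 0.9), but
# the live encoder, having seen nothing harder, treats it as a peak.
print(vod_budget(2, complexities), live_budget(2, complexities))
```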

Live is unpredictable: that’s not just true for encoding but also for Netflix’s business. The company just inked a deal to show two NFL games on Christmas Day and will begin streaming weekly WWE matches in January. This comes as sports, long the last bastion of cable TV, transitions to streaming. Apple is showing MLS games, Amazon is throwing tons of money at sports, and ESPN, Fox, and Warner Bros. are banding together to launch their own sports streaming service. Keeping up with these competitors doesn’t just require Netflix to spend heavily on sports rights but also to actually get good at livestreaming.


All of this means that Aaron and her team won’t be out of work any time soon — especially since the next challenge is always just around the corner. “There’s going to be more live events. There’s going to be, maybe, 8K, at some point,” she said. “There’s all these other experiences that would need more bandwidth.”

In light of all of those challenges, does Aaron ever fear running out of ways to optimize videos? In other words: how many times can Netflix re-encode its entire catalog with yet another novel encoding strategy, or new codec, before those efforts hit a wall and stop making much of a difference?

“In the codec space, people were saying that 20 years ago,” Aaron said. “In spite of that, we still find areas for improvement. So, I’m hopeful.”

And always eagle-eyed to spot the next visual challenge, whether it’s a sea of camera flashes or a surprise appearance by Mr. Sparklesuit.



The latest Instax printer is a pricey but worthy upgrade

Fujifilm’s Instax Mini Link 3 printer is a much-loved $100 accessory in my travel journal kit. I often tape a printed image next to my handwritten thoughts to preserve a moment in time. The prints produced by the instant film can, however, be soft and muddy — something the new $169.95 Instax Mini Link+ promises to improve.

The big upgrade is a new Design Print mode. It’s supposed to make text and intricate illustrations crisp and legible, but I didn’t see much of an improvement, despite that being a big selling point. I did, however, find that the improved processing inside the Mini Link+ enhanced contrast, colors, and sharpness to reveal more details in a wide variety of photos, and I think that’s more important to most people.

From my testing, the new Mini Link+ is definitely an upgrade, but don’t expect this, or any instant-film Instax printer, to perform miracles, especially for images measuring just 62 x 46mm (2.44 x 1.81 inches).

$170

The Good

  • Best Instax Mini printer yet
  • Improved colors, sharpness, and contrast on most photos
  • Fun for creatives

The Bad

  • Little improvement on text-heavy illustrations
  • Expensive
  • App is overwrought

Fujifilm’s Instax printers all use its Instax Mini instant film, which typically costs around $30 for 20 sheets, or about $1.50 per photo. To print, you need to download the Instax Mini Link app, available for both iOS and Android.

The app is overwrought with features that let you visualize your photos in real space with VR and use the printer as a remote camera shutter. It also helps you organize your images; imagine your prints in frames, on shelves, or as a collage taped to the wall; and prettify them with text, stickers, and filters. You can even connect your Pinterest account if you want. Fun, I suppose, but I’m not twelve years old; I’m a full-grown man, dammit, and I just want to print photos from my iPhone’s photo library, and do it quickly!

It comes with a lanyard.

The Mini Link+ (left) is only slightly larger than the Mini Link 3 (right).

It uses the same Instax Mini instant film. Each cartridge holds 10 sheets.

It can even be used as a remote shutter button for your phone.

To do that, I have to first import the image into the Instax Mini Link app, hit print, choose either the Simple or Design mode, then wait 20 seconds for the printout. Simple print promises “smooth color tones for everyday images” and produces softer images that, in general, are still an improvement over most anything the Mini Link 3 can print. Design mode is exclusive to the Mini Link+ and the reason you might want it.


I tested the different modes with a variety of images and generally found Design prints made on the Mini Link+ were superior for faces, landscapes, high contrast images, and macro shots of nature. Everything, really, other than text-heavy illustrations, where I saw no obvious improvement.

Link+ Design mode (left), Mini Link 3 (center), Link+ Simple mode (right).


For example, look at my stupid face. Photos with intense lighting were susceptible to blowout when printed on the older Mini Link 3. The Simple and Design prints from the Mini Link+ handled the lighting better, with improved contrast, more detail in the eye, and more accurate colors and skin texture.

Link+ Design mode (left), Mini Link 3 (center), Link+ Simple mode (right).


In the example above, everything in the Mini Link 3 print is super soft and blends together in a muddled mess. The Mini Link+ again offers improved contrast, with visible textures on the rock faces and tree branches, and improved colors throughout. The wooden slats on the barn, lines of individual trees, and wheel detail are more pronounced on the Design print, with less saturation on that big pine to the left.

Link+ Design mode (left), Mini Link 3 (right).


Here, the Mini Link 3 struggles to depict the snow as anything but a white smear, while you can make out individual snowflakes and depth on the Mini Link+ Design print.

Link+ Design mode (left), Mini Link 3 (center), Link+ Simple mode (right).


In this example, the Mini Link 3 really flattens the sky and removes the texture from the distant mountain. The greens and blues are more brilliant with the Simple and Design prints, while the separation between bits of gravel and blades of grass is more apparent in Design mode.


Instax Mini Link 3 (left) versus Link+ Design mode (right).

In this Spotify screenshot, Design mode sharpens the lettering and artificially enhances the white text with a black outline, most visible on the letters “a” and “s.” Simple mode doesn’t do this. The outlining does make the lettering pop.

Link+ Design mode (left), Mini Link 3 (center), Link+ Simple mode (right).


Link+ Design mode (bottom), Mini Link 3 (top).

Link+ Design mode (left), Mini Link 3 (right).

I find surprisingly little difference between these illustrations printed by the Mini Link 3 and the Mini Link+, even in Design mode. That’s strange, because this is where Fujifilm’s new printer is supposed to excel. Nevertheless, they all look good enough for hobbyists and anyone looking to spice up a journal or decorate a room.


USB-C charging with a user-replaceable battery if you live in Europe.

After printing 15 photos over the last few days, the battery on the Instax Mini Link+ is still at 80 percent. The battery charges over USB-C, and, if you’re in Europe, the FujiFilm NP-70S battery can be user-replaced when it no longer holds a charge.

From my testing, I think it’s clear that if you want the best photo quality available in an Instax printer, then the $169.95 Mini Link+ is the one to get. It also makes the case for being a worthy upgrade for some Mini Link 3 owners, so long as you’re not expecting improved prints of text-heavy illustrations.


But its price puts the Mini Link+ into direct competition with dye-sublimation printers like the Canon Selphy QX20, which yields prints that are sharp and accurate, with better resistance to water and fading. Otherwise, the Mini Link 3 is still a great printer for the price, and the soft, moody images it prints are a vibe worth $100.

Photography by Thomas Ricker / The Verge



Fox News AI Newsletter: Amazon cuts thousands of roles


Welcome to Fox News’ Artificial Intelligence newsletter with the latest AI technology advancements.

IN TODAY’S NEWSLETTER:

– Amazon to cut 16,000 roles as it looks to invest in AI, remove ‘bureaucracy’
– Uber unveils a new robotaxi with no driver behind the wheel 
– Ex-Google engineer found guilty of stealing AI secrets for Chinese companies

MASSIVE CUTS: Amazon said Wednesday it will cut approximately 16,000 roles across the company as part of an organizational overhaul aimed at “reducing layers, increasing ownership, and removing bureaucracy,” while continuing to invest heavily in areas such as artificial intelligence.


YOUR NEW RIDE: Uber is getting closer to offering rides with no one behind the wheel. The company recently unveiled a new robotaxi and confirmed that autonomous testing is already underway on public roads in the San Francisco Bay Area. While the vehicle first appeared earlier this month at the Consumer Electronics Show 2026, the bigger story now is what is happening after the show.

Lucid, Nuro and Uber unveil a robotaxi during Nvidia Live at CES 2026 ahead of the annual Consumer Electronics Show in Las Vegas, Jan. 5, 2026.  (Patrick T. Fallon / AFP via Getty Images)

TECH THEFT: A federal jury found a former Google engineer guilty of stealing artificial intelligence (AI) trade secrets and spying for Chinese tech companies, ending a high-profile Silicon Valley trial.

FIDO’S BIG BROTHER: Tuya Smart just introduced Aura, its first AI-powered companion robot made for pets. Aura is designed specifically for household cats and dogs, with AI trained to recognize their behaviors, movements and vocal cues. The idea behind Aura is simple. Pets need more than food bowls and cameras. They need attention, interaction and reassurance.

GOING BIG: What happens when artificial intelligence (AI) moves from painting portraits to designing homes? That question is no longer theoretical. At the Utzon Center in Denmark, Ai-Da Robot, the world’s first ultra-realistic robot artist, has made history as the first humanoid robot to design a building.


A man faces the realistic “artist” robot Ai-Da, which uses artificial intelligence, at a stand during the International Telecommunication Union (ITU) AI for Good Global Summit in Geneva on May 30, 2024. (FABRICE COFFRINI/AFP via Getty Images)



Sonos’ Super Bowl sale knocks hundreds off its audio gear

Sonos isn’t exactly synonymous with the Super Bowl, although the brand discounts its gear every year around this time like clockwork. It’s knocking 20 percent off many of its marquee products, including soundbars and standalone speakers — all of which can be paired together to improve sound quality and to put audio in more places at home.

Through February 16th, the discounted prices apply to the Era 100 and the larger Era 300 speakers, the Beam and Arc Ultra soundbars, as well as its selection of wireless subwoofers. To put the prices in context, some of these discounts match — or beat — the current costs of Sonos’ certified refurbished gear.

The Era 100 may very well be the best, most feature-packed smart speaker around in its price range. In our 2023 review, we praised its stereo sound playback and improved bass response over its predecessor, the Sonos One. Notably, it supports Bluetooth playback (in addition to Wi-Fi connectivity) as well as line-in audio via USB-C, in case you’d rather plug in a wired audio source. While this model typically sells for $219, it’s currently available for $179 through Sonos, as well as Amazon and Best Buy.

The Era 300 is Sonos’ modern spin on the Sonos Five, offering bigger sound than the Era 100. Its specialty is spatial audio, which sounds incredible when you find a song that’s been mixed just right (the thing is, not all Dolby Atmos tunes are mixed equally). Like the Era 100, this model offers Bluetooth and Wi-Fi wireless connections, as well as line-in via USB-C. Our review notes, however, that stereo playback is an area where the Era 300 actually falters compared to its predecessor. But given its improvements overall (and since the Five that launched in 2020 is no longer on sale), the 300 is a great speaker to consider if you really want to feel immersed in your music. It’s $379 during the sale period at Sonos and Best Buy, down from $479.

Jumping to soundbars, the second-gen Beam is down to $369 from its original $499 price. While it’s definitely not the most feature-packed soundbar you can get at around this price, its ability to tie in with other Sonos products, plus its improved soundstage over the first-gen model, might make it worth considering. The inclusion of Dolby Atmos is its marquee feature, although we noted in our review that it’s a virtualized effect, since the Beam lacks the upward-firing speakers that truly let vertical sound effects shine. Note that it’s lacking in physical connectivity compared to most other models, with just a power plug, an HDMI eARC port, and an ethernet jack. In the event that you wish to connect the Beam to your TV or receiver via optical audio, you’ll need to purchase a $25 HDMI-to-optical adapter.


The Arc Ultra is a much better soundbar than the Beam and carries a larger $899 price (down from $1,099). Our reviewer noted that the bass improvements in this model are such that it can stand on its own without a wireless subwoofer. It also boasts more immersive sound quality, plus Bluetooth connectivity, which was missing in the original Arc. The Arc Ultra’s sound can be further enhanced by connecting other Sonos speakers to the mix, although the older Play:1 and Play:3 speakers are ineligible to join the speaker family for surround sound.

If you’re considering either the Beam or the Arc Ultra (or if you already own one of Sonos’ soundbars), their performance will benefit greatly from the addition of a Sonos subwoofer, of which the company makes two models. The Sub 4 is its high-end option, which is $759 during the sale period (down from $899). Anything this close to $1,000 is extremely expensive for a subwoofer, especially considering that most companies include one with their surround sound systems. The Sub 4 can lie horizontally or sit vertically — however suits your room best.

For almost half the cost of the Sub 4, you can get the Sub Mini. It’s $399, down from $499. You may be thinking that even this one is still pretty costly, and I agree. Still, it’s a product that Sonos loyalists were begging for; before it, the only choice was to spring for the more expensive subwoofer. In our review, we deemed it unfit for filling large rooms with bass but totally sufficient in most other ways. Something cool about its design is the force-canceling effect that reduces floor vibrations, which could be great if you’re worried about disturbing neighbors or other people in the house.
