Technology

Apple’s fancy new CarPlay will only work wirelessly

Apple’s been talking about its next generation of CarPlay for two years now with very little to show for it — the system is designed to unify the interfaces on every screen in your car, including the instrument cluster, but so far only Aston Martin and Porsche have said they’ll ship cars with the system, without any specific dates in the mix.

And the public response from the rest of the industry towards next-gen CarPlay has been pretty cool overall. I talk to car CEOs on Decoder quite often, and most of them seem fairly skeptical about allowing Apple to get between them and their customers. “We have Apple CarPlay,” Mercedes-Benz CEO Ola Källenius told me in April. “If, for some of the functions, you feel more comfortable with that and will switch back and forth, be my guest. But to give up the whole cockpit head unit — in our case, a passenger screen and everything — to somebody else? The answer is no.”

That industry skepticism seems to have hit home for Apple, which posted two WWDC 2024 videos detailing the architecture and design of next-gen CarPlay. Both made it clear that automakers will have a lot of control over how things look and work, and even have the ability to just use their own interfaces for various features using something called “punch-through UI.” The result is an approach to CarPlay that’s much less “Apple runs your car” and much more “Apple built a design toolkit for automakers to use however they want.”

See, right now CarPlay is basically just a second monitor for your phone – you connect to your car, and your phone sends a video stream to the car. This is why those cheap wireless CarPlay dongles work – they’re just wireless display adapters, basically.

But if you want to integrate things like speedometers and climate controls, CarPlay needs to actually collect data from your car, display it in realtime, and be able to control various features like HVAC directly. So for next-gen CarPlay, Apple’s split things into what it calls “layers,” some of which run on your iPhone, while others run locally on the car so they don’t break if your phone disconnects. And phone disconnects are going to be an issue, because next-generation CarPlay only supports wireless connections. “The stability and performance of the wireless connection are essential,” Apple’s Tanya Kancheva says while talking about the next-gen architecture. Given that CarPlay connectivity problems remain among the most common complaints in new cars, and that going wireless made them worse, that’s something Apple needs to keep an eye on.
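To make that split concrete, here’s a minimal Python sketch of the idea. The layer names come from Apple’s WWDC videos, but the code and API shape are entirely illustrative, not anything Apple has published:

```python
# Toy model of the layer split: car-hosted layers keep rendering if the
# wireless link to the phone drops, while phone-hosted layers vanish.
# All names and structures here are illustrative, not Apple's actual API.

class Layer:
    def __init__(self, name, runs_on):
        self.name = name
        self.runs_on = runs_on  # "car" or "phone"

def visible_layers(layers, phone_connected):
    """Phone-hosted layers disappear on disconnect; car-hosted ones persist."""
    return [l.name for l in layers if l.runs_on == "car" or phone_connected]

layers = [
    Layer("overlay UI", "car"),    # turn signals, odometer
    Layer("local UI", "car"),      # speedometer, tachometer
    Layer("remote UI", "phone"),   # maps, music, trip info
]

print(visible_layers(layers, phone_connected=True))   # all three layers
print(visible_layers(layers, phone_connected=False))  # only gauges survive
```

The point of the split is visible in the second call: lose the phone and the map goes away, but the speedometer keeps drawing.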

There are two layers that run locally on the car, in slightly different ways. First is the “overlay UI,” which holds things like your turn signals and odometer. These elements can be styled, but the layer runs entirely on your car and is otherwise untouchable. Then there is the “local UI,” which has things like your speedometer and tachometer — things related to driving that need to update all the time, basically. Automakers can customize these in several ways: there are different gauge styles and layouts, from analog to digital, and they can include logos and so on. Interestingly, there’s only one font choice: Apple’s San Francisco, which can be modified in various ways but can’t be swapped out.

Apple’s goal for next-gen CarPlay is to have it start instantaneously — ideally when the driver opens the door — so the assets for these local UI elements are loaded onto the car from your phone during the pairing process. Carmakers can update how things look and send refreshed assets through the phone over time as well — exactly how and how often is still a bit unclear.
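That asset handoff can be sketched as a simple versioned cache. Since Apple hasn’t detailed the mechanism, everything below — the function name, the version tuples, the flow — is an assumption for illustration only:

```python
# Hypothetical sketch: the car caches local-UI assets delivered during
# pairing so gauges can draw instantly at startup, and accepts refreshed
# assets from the phone only when a newer version shows up.

def sync_assets(car_cache, phone_assets):
    """Copy any asset whose version on the phone is newer than the car's."""
    for name, (version, data) in phone_assets.items():
        cached = car_cache.get(name)
        if cached is None or version > cached[0]:
            car_cache[name] = (version, data)
    return car_cache

car_cache = {"gauge_theme": (1, "classic.assets")}
phone_assets = {
    "gauge_theme": (2, "sport.assets"),  # refreshed by the automaker
    "brand_logo": (1, "badge.png"),      # new since the last pairing
}
sync_assets(car_cache, phone_assets)
print(sorted(car_cache))  # both assets now cached on the car
```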

Then there’s what Apple calls “remote UI,” which is all stuff that runs on your phone: maps, music, trip info. This is the most like CarPlay today, except now it can run on any other screen in your car. 

The final layer is called “punch-through UI,” and it’s where Apple is ceding the most ground to automakers. Instead of coming up with its own interface ideas for things like backup cameras and advanced driver-assistance features, Apple’s allowing carmakers to simply feed their existing systems through to CarPlay. When you shift into reverse, for example, the interface simply shows your car’s native backup camera view.
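The punch-through behavior amounts to routing a screen to a different video source based on vehicle state. A toy dispatcher along those lines, with all names invented for illustration:

```python
# Toy dispatcher for the punch-through idea: shifting into reverse routes
# the automaker's native backup-camera feed to the screen instead of a
# CarPlay-rendered view. Names are illustrative, not a real API.

def select_screen_source(gear, native_feeds):
    """Punch the native camera feed through when reversing; else CarPlay."""
    if gear == "reverse" and "backup_camera" in native_feeds:
        return native_feeds["backup_camera"]
    return "carplay"

feeds = {"backup_camera": "native:rear-camera"}
print(select_screen_source("reverse", feeds))  # native feed punches through
print(select_screen_source("drive", feeds))    # back to CarPlay
```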

But carmakers can use punch-through UI for basically anything they want, and even deeplink CarPlay buttons to their own interfaces. Apple’s example here is a vision of multiple colliding interface ideas all at once: a button in CarPlay to control massage seats that can either show native CarPlay controls, or simply drop you into the car’s own interface.

A lot of carmakers are going to take the easy way out here, I think.

Or a hardware button to pick drive modes could open CarPlay’s settings, deeplink you into the automaker’s iPhone app, or just open the native car settings.

Apple’s approach to HVAC also amounts to a compromise: the company isn’t really rethinking how HVAC controls work. Instead, it’s letting carmakers customize controls from a toolkit to match the car’s systems and even display previews of a car interior that match trim and color options. If you’ve ever looked at a car with a weird SYNC button that keeps various climate zones paired up, well, the next generation of CarPlay has a weird SYNC button too.

All of this is kept running at 60fps (or higher, if the car system supports it) by a new dedicated UI timing channel, and a lot of the underlying compositing relies on OpenGL running on the car itself.

All in all, it’s a lot of info, and what feels like a lot of Apple realizing that carmakers aren’t going to just give up their interfaces — especially since they’ve already invested in designing these sorts of custom interfaces for their native systems, many of which now run on Unreal Engine with lots of fun animations, and have Google services like Maps integrated right in. Allowing automakers to punch those interfaces through CarPlay might finally speed up adoption – and it also might create a mix-and-match interface nightmare. 

All that said, it’s telling that no one has seen anything but renders of next-gen CarPlay yet. We’ll have to see what it’s like if those Porsche and Aston Martin models ever arrive, and if that tips anyone else into adopting it.

Technology

Rimac is shifting from electric supercars to robotaxis

A new robotaxi service is coming to Croatia, courtesy of the country’s leading supercar company, Rimac. The service will be called Verne, named for French novelist and poet Jules Verne, and will launch in Zagreb in 2026, the company said.

It’s an interesting pivot for a company that has been on a rocket-ship trajectory over the last few years. Founded by Mate Rimac in a garage as a one-man operation, Rimac has since become a highly desirable brand, with many legacy automakers calling upon the startup to help them build their own electric supercars. In addition to making the record-breaking Nevera hypercar, Rimac also took control of Bugatti from Volkswagen in 2021 in a surprise move that created a new company called Bugatti Rimac.

And now the company behind the 256 mph electric hypercar is getting ready to launch its own robotaxi. I assure you, this is less random than it seems on the surface. Rimac has been working on autonomous technology since 2017, and in 2021, the company received €200 million from the EU to develop robotaxis as part of a €6.3 billion recovery plan for Croatia. (The incentive package opened the company up to a lot of criticism, including one member of the Croatian parliament calling Mate Rimac a fraud and “the Balkan Elizabeth Holmes.”) The company has also received funding from Hyundai and Kia.

Today, Rimac is out to prove that the money isn’t going to waste. Previously dubbed Project 3 Mobility, the newly renamed Verne will be led by Rimac’s friend Marko Pejković as CEO and Adriano Mudri, the designer of Nevera, as chief designer. The company said it chose to honor the author of such classics as Twenty Thousand Leagues Under the Sea and Journey to the Center of the Earth because “he used the theme of travel as the driving force in his storytelling.”

The robotaxi will be fully electric and rely on autonomous technology from Mobileye, the Intel-owned company that supplies autonomous and advanced driver-assist technology to many automakers. Verne will use Mobileye Drive, a self-driving system built around the Israeli company’s EyeQ system-on-a-chip, as well as a data crowdsourcing program called Road Experience Management, or REM, which uses real-time data from Mobileye-equipped vehicles to build out a global 3D map.

The vehicle is designed for Level 4 autonomy, meaning it does without traditional controls like a steering wheel and pedals. Gone, too, are other familiar touchstones, like windshield wipers and side-view mirrors, in the interest of reducing drag.

Verne’s first vehicle looks radically different from most self-driving cars on the road today. Rather than opt for a retrofitted minivan or a toaster-shaped shuttle with protruding sensors, the Verne robotaxi is sleeker and much smaller, with the overall appearance of a two-door hatchback. The expansive greenhouse and sloping windshield enclose an interior that is more luxurious than your average robotaxi. And the vehicle’s two sliding doors are certainly eye-catching, with Rimac saying they were designed for ease of entry.

The decision to go with a two-seater may strike some as curious, considering many robotaxi operators use more high-capacity vehicles. After all, more seats equals more fares, which means more revenue. But Verne’s chief designer Mudri cites data that shows “9 out of 10 rides are used by 1 or 2 people. Therefore, we can satisfy most of all trips with a two-seater and create unmatched interior space in a compact-sized vehicle.”

Reducing the number of seats will make for a more spacious, luxurious ride, Verne says. But the company’s robotaxis won’t just be accessible to the superrich; in a statement, Mate Rimac promised that Verne’s autonomous ridehailing service will be “affordable for all.”

Without a steering wheel or other clunky controls, Rimac was free to go big on its interior screen. The 43-inch display nearly spans the width of the dashboard and includes widgets for media, cabin controls, and weather. The central widget is devoted to navigation, with a design that looks similar to Tesla’s or Waymo’s, with an illuminated line stretching out from the virtual vehicle to help the rider keep track of the trip.

Verne says riders will be able to listen to their own music or watch movies on the widescreen display, played through a Dolby Atmos sound system with 17 speakers located throughout the vehicle.

The robotaxi can be summoned via a mobile app, much like Uber or Waymo. Through the app, customers can customize certain settings, like temperature, lighting, and even scent, before their vehicle even shows up. On the backend, all the vehicles are connected, enabling Verne to optimize fleet management tasks.
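The pre-arrival customization described above is essentially a ride request that bundles cabin preferences with the pickup. A sketch of what such a payload might look like — Verne’s actual API is not public, so every field name here is an assumption:

```python
# Illustrative sketch of a Verne-style hail request: the app bundles cabin
# preferences (temperature, lighting, scent) with the pickup so the car
# can pre-set them before it arrives. Purely hypothetical structure.

def build_hail_request(pickup, dropoff, prefs=None):
    """Merge rider preferences over sensible cabin defaults."""
    defaults = {"temperature_c": 21, "lighting": "soft", "scent": "neutral"}
    return {
        "pickup": pickup,
        "dropoff": dropoff,
        "cabin": {**defaults, **(prefs or {})},  # rider overrides win
    }

req = build_hail_request("Ban Jelacic Square", "Zagreb Airport",
                         {"temperature_c": 19})
print(req["cabin"])  # override applied, other defaults kept
```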

Verne says it will build centrally located vehicle depots called “Motherships” in the cities in which it operates. These will be hubs for the robotaxis to be cleaned, charged, and maintained. The vehicles themselves will be produced at a factory in Croatia that has yet to be built.

After Zagreb, Verne says it will roll out its robotaxi service in other European cities — first in the UK and Germany, and then later in the Middle East. While some companies have been testing autonomous vehicles in Europe, any commercial service appears to be a long way off. Meanwhile, Alphabet’s Waymo is operating in several major cities in the US, and Baidu is similarly running hundreds of driverless cars in China.

Verne is working to become the first major robotaxi operator outside those two countries. The company has already signed agreements with 11 cities in the EU, UK, and the Middle East and is negotiating with more than 30 cities worldwide, it says. And it aims to “complement public transport, not compete against it.”

“In the longer term, Verne should help remove the need for a second or third car in the household that takes up parking spaces, is used rarely, and is a significant expense,” the company says.

Technology

iPhone’s little-known trick can hear better than some human ears

The iPhone is packed with features that you might take for granted. However, Apple has always prioritized accessibility, ensuring that people with physical challenges can enjoy the iPhone experience just like everyone else. One of the coolest accessibility features that can be used by anyone is Sound Recognition, and it’s something you might find incredibly useful.

Sound Recognition on iPhone (Apple) (Kurt “CyberGuy” Knutsson)

Understanding Sound Recognition

The idea behind Sound Recognition is to allow your iPhone to listen for sounds you might not be able to hear if you are hearing impaired and alert you when those sounds are detected. As Apple describes it, “Sound Recognition uses on-device intelligence to notify users who might otherwise miss audible environmental alerts around them.”

When Sound Recognition is enabled, your iPhone will send you a push notification alerting you to the detected event, even if you can’t hear it yourself. While designed for people who are deaf or hard of hearing, users without hearing problems can also benefit from this feature.

It’s important to note that Sound Recognition runs entirely locally on your iPhone. When the AI detects a sound, it identifies it right on your device – no uploading to the Internet is needed. This means Sound Recognition works without an Internet connection and keeps your alerts and Sound Recognition events completely private.
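The privacy-preserving shape of that pipeline — detect locally, notify locally, touch the network never — can be illustrated with a toy stand-in. Apple’s real feature uses trained on-device models; this simplified version only mimics the final matching-and-alerting step:

```python
# Toy stand-in for the on-device flow: match a detected sound label
# against the user's enabled sounds and produce a local notification,
# with no network call involved. The real feature classifies raw audio
# with machine learning; this sketch only compares labels.

def recognize(sound_event, enabled_sounds):
    """Return a local notification string if the event is enabled."""
    if sound_event in enabled_sounds:
        return f"Sound Recognition: {sound_event} detected"
    return None  # either unrecognized or not enabled by the user

enabled = {"doorbell", "smoke alarm", "dog"}
print(recognize("doorbell", enabled))  # fires a notification
print(recognize("kettle", enabled))    # not enabled, so no alert
```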

Sounds iPhone can recognize

As of iOS 16, iPhone Sound Recognition can identify the following types of sounds: fire alarms, sirens, smoke alarms, cats, dogs, appliances, car horns, doorbells, door knocks, glass breaking, kettles, water running, baby crying, coughing and shouting.

How to use Sound Recognition on iPhone

Sound Recognition is not enabled by default, but it’s simple to turn on. Keep in mind that you’ll need an iPhone running iOS 14 or later to use this feature. Here’s how to enable it and pick a sound, such as a doorbell:

  • Open the Settings app
  • Tap Accessibility
  • Under the Hearing header, tap Sound Recognition
  • On the Sound Recognition screen, toggle the switch to green (ON). Wait a moment for the required sound files to download
  • Once the files are downloaded, tap the Sounds button
  • On the Sounds screen, tap any sound you want your iPhone to recognize, such as Door Bell
  • On that sound’s screen, toggle the switch to green (ON) next to the types of sounds you want your iPhone to listen for

After following these steps, your iPhone will continuously listen for the selected sounds and notify you when it detects them.

A word of caution

While Sound Recognition is a cool feature, Apple warns against relying on it in situations where you could be injured or killed. As stated on their website, “Don’t rely on your iPhone to recognize sounds in circumstances where you may be harmed or injured, in high-risk or emergency situations, or for navigation.”

Kurt’s key takeaways

Apple’s commitment to accessibility is commendable, and the Sound Recognition feature is a prime example of how technology can be inclusive and empowering for all users. While designed with the hearing impaired in mind, Sound Recognition can be a handy tool for anyone who wants to stay alert to important sounds in their environment. By following the simple steps outlined in this article, you can unlock the power of this innovative feature and experience the iPhone in a whole new way.

Copyright 2024 CyberGuy.com. All rights reserved.

Technology

Meta tests Vision Pro-like freeform virtual screen placement for Quest headsets

Meta is testing a feature for its Quest headsets that allows you to place windows freely, similar to the Apple Vision Pro. Multitasking with multiple windows has been part of Meta Horizon OS (formerly Meta Quest OS) for a few years now, but currently, it only supports three virtual windows docked in a side-by-side layout.

The feature brings the Quest 3, in particular, a step closer to Apple’s spatial computing when used in mixed reality mode, though from the video, it doesn’t seem to work quite the same way. You can freely move up to three windows from 2D apps — such as the browser or OS windows like your library and settings — around your space and keep another three docked.

Other demos suggest that the windows will only remember their placement within a limited distance and return to their default positions should you switch orientation or reset the view. We haven’t tested it yet ourselves to know the full limitations here, but it looks promising.
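The snap-back behavior those demos suggest amounts to a distance check against where each window was pinned. A sketch of that logic — the three-meter threshold and all names here are assumptions, since Meta hasn’t documented the feature:

```python
# Sketch of the reported Quest behavior: a freeform window keeps its
# placement only while the user stays within some distance of it; beyond
# that (or after a view reset) it snaps back to its docked default.
# The threshold value and function names are assumptions.

import math

def window_position(placed_at, default_at, user_at, max_distance=3.0):
    """Keep the freeform placement while the user is nearby, else snap back."""
    if math.dist(user_at, placed_at) <= max_distance:
        return placed_at
    return default_at

placed = (1.0, 0.0, 2.0)   # where the user pinned the window (meters)
default = (0.0, 0.0, 1.0)  # docked layout position
print(window_position(placed, default, user_at=(1.0, 0.0, 2.5)))   # stays put
print(window_position(placed, default, user_at=(10.0, 0.0, 2.0)))  # snaps back
```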

The update also lets you switch between curved and flat windows and adds a dimmer that lowers the brightness of virtual environments while using 2D apps. (The dimmer doesn’t yet work in passthrough mode.)

The Apple Vision Pro allows you to move windows around whichever space you’re in and keep them locked in place even while you move around and after you take the headset off. That way, you can have a window sitting next to your refrigerator and another positioned alongside the TV in your living room, and then walk to and from the windows as if they’re actual objects.
