
Can I outsmart thieves with a hidden AirTag in RFID wallet?


Apple AirTags can be a great way to outsmart car thieves, but there are drawbacks to consider. A key limitation is that thieves who carry iPhones are usually alerted to a hidden AirTag traveling with a vehicle they are trying to steal, which makes hiding one tricky.

That’s why we were intrigued by Brian’s question about whether there are other ways to make an Apple AirTag undetectable to car thieves and beat them at their own game.

“If an AirTag is put in an RFID wallet and hidden in a car, can the AirTag be located by a thief using a locator? I know the AirTag will function in the wallet, but will it be undetectable by the thief?” — Brian, LaSalle, Illinois

Below are reasons why using an RFID wallet might end up leaving you outsmarted instead.


Illustration of a car thief checking to see if the vehicle has an AirTag. (Kurt “CyberGuy” Knutsson)

Why RFID wallets might not outsmart car thieves

Some thieves use RFID readers, which rely on radio waves, to scan and skim data from items such as credit cards inside wallets. RFID-blocking wallets and bags are designed to shield their contents from those radio waves.

Apple AirTags use different technology, so an AirTag placed in an RFID-blocking wallet or bag can still be detected by a thief with an iPhone or another locator. Instead of the short-range radio waves used to skim credit cards and other contactless data, AirTags broadcast over Bluetooth and Ultra Wideband (UWB), and the thin shielding in RFID-blocking wallets is generally not enough to block those signals.
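To see why the wallet does not help, consider that an AirTag's Bluetooth broadcasts are visible to any nearby Bluetooth LE radio, not just iPhones. The following is a minimal, hedged Python sketch (not part of the original article) that lists nearby Bluetooth LE advertisers using the third-party bleak library; the 0x12 "Find My" payload check is an assumption drawn from public reverse-engineering write-ups rather than Apple documentation.

```python
# Minimal sketch: list nearby Bluetooth LE advertisers with the third-party
# "bleak" library, to show that an AirTag's broadcasts are visible to any
# BLE radio, RFID-blocking wallet or not.
# Assumes bleak >= 0.19 and a machine with a Bluetooth LE adapter.
import asyncio
from bleak import BleakScanner

APPLE_COMPANY_ID = 0x004C  # Bluetooth SIG company identifier assigned to Apple

async def main() -> None:
    # Scan for ten seconds and keep the advertisement data for each device seen.
    results = await BleakScanner.discover(timeout=10.0, return_adv=True)
    for device, adv in results.values():
        payload = adv.manufacturer_data.get(APPLE_COMPANY_ID)
        if payload:
            # Public reverse-engineering write-ups describe Find My "offline
            # finding" broadcasts as starting with byte 0x12; treat this check
            # as an assumption, not an Apple-documented guarantee.
            tag = " (possible Find My beacon)" if payload[0] == 0x12 else ""
            print(f"{device.address}: Apple BLE advertisement{tag}")

asyncio.run(main())
```

The point of the sketch is simply that these 2.4 GHz broadcasts pass right through RFID-blocking material, so any commodity scanner can spot them.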


A person holding an RFID wallet. (Kurt “CyberGuy” Knutsson)


What can block AirTag technology?

Faraday bags, on the other hand, use electromagnetic shielding that does block the Bluetooth and UWB signals an AirTag emits, so a thief cannot detect it. The catch with hiding an AirTag inside a Faraday bag in your car is that if the thief cannot pick up its signal, neither can you.

While an AirTag in a Faraday bag may go undetected, you will likely not be able to use Find My or any other tracking method to locate it either, because tracking depends on the AirTag pinging nearby iPhones and other Apple devices, which relay its location.


An RFID wallet. (Kurt “CyberGuy” Knutsson)


Alternatives to AirTags for car security

While Apple AirTags offer some benefits, there are other technologies and methods to consider for enhancing car security. For example, GPS trackers provide real-time location data without the risk of being detected by a nearby iPhone. Additionally, physical deterrents like steering wheel locks and alarm systems can act as effective safeguards. Combining multiple layers of security can better protect your vehicle against theft. Check out how to prevent your car from being stolen.


Kurt’s key takeaways

While an Apple AirTag can be a useful device for keeping track of your vehicle if it is lost or stolen, it does not replace the reliability of a GPS system installed in your vehicle if you are trying to outsmart car thieves. Because car thieves may be able to locate an AirTag hidden in your vehicle, it may not provide the layer of protection you hope for. An RFID wallet will not block an AirTag's signals, so the tag remains detectable to locators and scanners; a Faraday bag will block them, but then you end up outsmarting yourself, because you can no longer track the AirTag either. If you are looking for the best way to outsmart car thieves, an AirTag is probably not the answer.

What personal experiences have you had with technology aiding in theft prevention? Let us know by writing us at Cyberguy.com/Contact.

For more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter.


Ask Kurt a question or let us know what stories you’d like us to cover.


Copyright 2025 CyberGuy.com. All rights reserved.


Sony is giving the PS5’s accessories an all-black makeover


You can finally make your PS5 gaming setup all black everything. Sony has announced that it will soon release the rest of the PS5 suite of accessories in the midnight black colorway.

Sony’s offering the DualSense Edge controller, the Pulse wireless headphones and earbuds, and the PlayStation Portal in sleek and sexy midnight black to match the PS5 cover and DualSense controller it released back in 2021.

PlayStation’s website has the details on pricing for the accessories, with pre-orders starting on January 16th.

If you’d rather wait, the accessories will hit retailers on February 20th.


FBI's new warning about AI-driven scams that are after your cash


The FBI is issuing a warning that criminals are increasingly using generative AI technologies, particularly deepfakes, to exploit unsuspecting individuals. This alert serves as a reminder of the growing sophistication and accessibility of these technologies and the urgent need for vigilance in protecting ourselves from potential scams. Let’s explore what deepfakes are, how they’re being used by criminals and what steps you can take to safeguard your personal information.


FBI building in D.C. (Kurt “CyberGuy” Knutsson)

The rise of deepfake technology

Deepfakes refer to AI-generated content that can convincingly mimic real people, including their voices, images and videos. Criminals are using these techniques to impersonate individuals, often in crisis situations. For instance, they might generate audio clips that sound like a loved one asking for urgent financial assistance or even create real-time video calls that appear to involve company executives or law enforcement officials. The FBI has identified 17 common techniques used by criminals to create these deceptive materials.


Image of AI illustration (Kurt “CyberGuy” Knutsson)


Key tactics used by criminals

The FBI has identified 17 common techniques that criminals are using to exploit generative AI technologies, particularly deepfakes, for fraudulent activities. Here is a comprehensive list of these techniques.

1) Voice cloning: Generating audio clips that mimic the voice of a family member or other trusted individuals to manipulate victims.

2) Real-time video calls: Creating fake video interactions that appear to involve authority figures, such as law enforcement or corporate executives.

3) Social engineering: Utilizing emotional appeals to manipulate victims into revealing personal information or transferring funds.

4) AI-generated text: Crafting realistic written messages for phishing attacks and social engineering schemes, making them appear credible.

5) AI-generated images: Using synthetic images to create believable profiles on social media or fraudulent websites.

6) AI-generated videos: Producing convincing videos that can be used in scams, including investment frauds or impersonation schemes.

7) Creating fake social media profiles: Establishing fraudulent accounts that use AI-generated content to deceive others.

8) Phishing emails: Sending emails that appear legitimate but are crafted using AI to trick recipients into providing sensitive information.

9) Impersonation of public figures: Using deepfake technology to create videos or audio clips that mimic well-known personalities for scams.

10) Fake identification documents: Generating fraudulent IDs, such as driver’s licenses or credentials, for identity fraud and impersonation.

11) Investment fraud schemes: Deploying AI-generated materials to convince victims to invest in non-existent opportunities.

12) Ransom demands: Impersonating loved ones in distress to solicit ransom payments from victims.

13) Manipulating voice recognition systems: Using cloned voices to bypass security measures that rely on voice authentication.

14) Fake charity appeals: Creating deepfake content that solicits donations under false pretenses, often during crises.

15) Business email compromise: Crafting emails that appear to come from executives or trusted contacts to authorize fraudulent transactions.

16) Creating misinformation campaigns: Utilizing deepfake videos as part of broader disinformation efforts, particularly around significant events like elections.

17) Exploiting crisis situations: Generating urgent requests for help or money during emergencies, leveraging emotional manipulation.

Image of AI illustration (Kurt “CyberGuy” Knutsson)

These tactics highlight the increasing sophistication of fraud schemes facilitated by generative AI and the importance of vigilance in protecting personal information.


Tips for protecting yourself from deepfakes

Implementing the following strategies can enhance your security and awareness against deepfake-related fraud.

1) Limit your online presence: Reduce the amount of personal information, especially high-quality images and videos, available on social media by adjusting privacy settings.

2) Invest in personal data removal services: The less information about you is out there, the harder it is for someone to create a deepfake of you. No service can promise to remove all your data from the internet, but a removal service is worthwhile if you want to continuously monitor and automate the process of removing your information from hundreds of sites over time. Check out my top picks for data removal services here.

3) Avoid sharing sensitive information: Never disclose personal details or financial information to strangers online or over the phone.

4) Stay vigilant with new connections: Be cautious when accepting new friends or connections on social media; verify their authenticity before engaging.

5) Check privacy settings on social media: Ensure that your profiles are set to private and that you only accept friend requests from trusted individuals. Here’s how to switch any social media accounts, including Facebook, Instagram, Twitter and any others you may use, to private.

6) Use two-factor authentication (2FA): Implement 2FA on your accounts to add an extra layer of security against unauthorized access.

7) Verify callers: If you receive a suspicious call, hang up and independently verify the caller’s identity by contacting their organization through official channels.

8) Watermark your media: When sharing photos or videos online, consider using digital watermarks to deter unauthorized use (see the sketch just after this list).

9) Monitor your accounts regularly: Keep an eye on your financial and online accounts for any unusual activity that could indicate fraud.

10) Use strong and unique passwords: Employ different passwords for various accounts to prevent a single breach from compromising multiple services. Consider using a password manager to generate and store complex passwords (see the sketch at the end of this section).

11) Regularly back up your data: Maintain backups of important data to protect against ransomware attacks and ensure recovery in case of data loss.

12) Create a secret verification phrase: Establish a unique word or phrase with family and friends to verify identities during unexpected communications.

13) Be aware of visual imperfections: Look for subtle flaws in images or videos, such as distorted features or unnatural movements, which may indicate manipulation.

14) Listen for anomalies in voice: Pay attention to the tone, pitch and choice of words in audio clips. AI-generated voices may sound unnatural or robotic.

15) Don’t click on links or download attachments from suspicious sources: Be cautious when receiving emails, direct messages, texts, phone calls or other digital communications from an unknown source. This is especially true if the message demands that you act fast, such as claiming your computer has been hacked or that you have won a prize. Deepfake creators try to manipulate your emotions so that you download malware or share personal information. Always think before you click.

The best way to safeguard yourself from malicious links that install malware, potentially accessing your private information, is to have antivirus software installed on all your devices. This protection can also alert you to phishing emails and ransomware scams, keeping your personal information and digital assets safe. Get my picks for the best 2025 antivirus protection winners for your Windows, Mac, Android and iOS devices.

16) Be cautious with money transfers: Do not send money, gift cards or cryptocurrencies to people you do not know or have met only online or over the phone.

17) Report suspicious activity: If you suspect that you have been targeted by scammers or have fallen victim to a fraud scheme, report it to the FBI's Internet Crime Complaint Center (IC3).
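As a small illustration of tip 8 above, here is a hedged Python sketch using the third-party Pillow imaging library to stamp a visible, semi-transparent watermark on a photo before sharing it. The file names and watermark text are placeholders; a dedicated watermarking tool or your phone's photo editor works just as well.

```python
# Minimal sketch for tip 8: stamping a semi-transparent text watermark on a
# photo before sharing it, using the third-party Pillow library (PIL fork).
# The file names and handle below are placeholders for illustration only.
from PIL import Image, ImageDraw

def add_watermark(in_path: str, out_path: str, text: str = "shared by @myhandle") -> None:
    base = Image.open(in_path).convert("RGBA")
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    # White text at roughly 50% opacity near the bottom-left corner.
    draw.text((10, base.height - 30), text, fill=(255, 255, 255, 128))
    Image.alpha_composite(base, overlay).convert("RGB").save(out_path)

add_watermark("vacation.jpg", "vacation_watermarked.jpg")
```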


A woman typing on her laptop (Kurt “CyberGuy” Knutsson)

By following these tips, individuals can better protect themselves from the risks associated with deepfake technology and related scams.
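And for tip 10, here is a small, hedged Python sketch using only the standard library's secrets module. It is not a replacement for a password manager; it simply shows the kind of long, random password a manager would generate and store for each of your accounts.

```python
# Minimal sketch for tip 10: generating a long, random password with Python's
# standard-library "secrets" module. A password manager does the equivalent
# (and stores the result) for every account automatically.
import secrets
import string

def generate_password(length: int = 20) -> str:
    # Draw each character independently from letters, digits and punctuation.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    print(generate_password())  # a different 20-character password on every run
```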


Kurt’s key takeaways

The increasing use of generative AI technologies, particularly deepfakes, by criminals highlights a pressing need for awareness and caution. As the FBI warns, these sophisticated tools enable fraudsters to impersonate individuals convincingly, making scams harder to detect and more believable than ever. It’s crucial for everyone to understand the tactics employed by these criminals and to take proactive steps to protect their personal information. By staying informed about the risks and implementing security measures, such as verifying identities and limiting online exposure, we can better safeguard ourselves against these emerging threats.

In what ways do you think businesses and governments should respond to the growing threat of AI-powered fraud? Let us know by writing us at Cyberguy.com/Contact.

For more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter.


Ask Kurt a question or let us know what stories you’d like us to cover.


Copyright 2024 CyberGuy.com. All rights reserved.


Nvidia is bringing a native GeForce Now app to Steam Deck


Nvidia plans to release a native GeForce Now app for Steam Deck “later this year,” according to a blog post. It’s already relatively straightforward to get Nvidia’s cloud gaming service set up on Steam Deck thanks to a special script from Nvidia, but a native app should be easier to install and will support up to 4K resolution and 60 fps with HDR when connected to a TV.

Nvidia also plans to bring GeForce Now to some major VR headsets later this month, including the Apple Vision Pro, Meta Quest 3 and 3S, and Pico “virtual- and mixed-reality devices.” When GeForce Now version 2.0.70 is available, people using those headsets will be able to access an “extensive library of games” they can stream by visiting play.geforcenow.com in their browser.

The company also says that two major titles from Microsoft will be available on GeForce Now when they come out this year: Avowed, which launches February 18th, and DOOM: The Dark Ages, which is set to be available sometime this year.
