Technology
Watch out for the new ‘ghost hackers’
Imagine if this happened to you. Your spouse passed away, and a few weeks after the funeral, you get a message from them that says, “Hi, hope you’re having a great day.” Other friends report they’re getting similar messages from your spouse. Some messages offer big returns on crypto investments.
“Ghost hackers” have taken over your spouse’s account. It’s a sick new scam. With account owners dead and families focused on grief, the hacking is more likely to go unnoticed. It’s awful, and I want to make sure this doesn’t happen to you or someone you love.
It’s not just trolling and ‘investments’
Ghost hackers monitor obituaries and death notices for potential targets. Then they use their arsenal (hacking weak passwords, guessing security questions and accessing previously leaked credentials) to break in. Often, hackers leapfrog into banking and retirement accounts, making it easy to steal directly from the deceased’s accounts.
The best offense is a good defense
I know firsthand there are a ton of administrative tasks to take care of when a close family member dies — everything from canceling cellphone plans to executing the will. This list now also needs to include memorializing or deleting their social media accounts.
Luckily, social networks have processes in place for this. On Facebook, you can ask to have the account memorialized; you’ll need a link to an obituary. You can also request that the profile be removed. Instagram’s process is similar, and the same goes for X.
Now, take time to protect yourself
On Facebook, you can designate a legacy contact to manage your account if you die. They won’t be able to log in, read your messages or delete friends.
- On mobile, select the three-line icon at the bottom right. Scroll and tap Settings & privacy > Settings. Under “Accounts Center,” tap Personal details > Account ownership and control > Memorialization.
- Click your name to select your legacy contact (and notify your contact they’re now in that role). You can also decide if you’d rather have your account deleted after you pass.
Apple’s Legacy Contact is a safe, secure way to give someone access to data stored in your Apple account after you die. You can add more than one Legacy Contact, and all of them can access the account to make decisions. The person must be 13 or older.
Here’s how to set it up on your iPhone:
- Open Settings and tap your name.
- Go to Sign-In & Security > Legacy Contact.
- Tap Add Legacy Contact. You may have to use Face ID, Touch ID or your passcode to authenticate.
- You can choose a group member if you’re in a Family Sharing group. Or you can tap Choose Someone Else to add someone from your Contacts.
- Select the person from your Contacts. Tap Continue.
- You’ll be asked how you want to share your access key. Select Print Access Key or Send Access Key.
- If you choose to send the key digitally, Apple will create a message letting your contact know you’ve added them as your legacy contact. Tap Send.
Finally, adjust your Google account. You probably have a few things you’d prefer to keep private in your search, watch and location history. By default, Google auto-deletes account records after 18 months. If you want to shorten that window, you can do so in a few steps.
- Go to your Google Activity controls and log in with your Google account.
- Under Web & App Activity, you’ll see Auto-delete. Be sure this is turned On.
- Click the arrow to choose your preferred timeframe: 3 months, 18 months or 36 months.
Really, you need a digital estate plan
It’s not a legal document but rather a rundown of all your accounts, passwords and online assets, with instructions on how to find them. My mom made one before she passed, and I can’t tell you how much time and stress it saved me during an incredibly emotional time.
Your list can be as formal or informal as you like. It could be an Excel spreadsheet or Word doc that includes websites, login details and anything else you want to leave behind. If you go this route, password-protect the file and leave the password in your will.
If you’re comfortable with it, I highly recommend you do this in a password manager. Most have the option to set up a contact who can access your logins when you pass. Use a password notebook if you’re more comfortable with pen and paper.
Here’s a checklist to get you started:
- Email, social media, financial and cloud storage accounts.
- Online shopping credentials.
- Streaming services and other recurring charges.
- Loyalty programs, including travel rewards.
- Domain names and website hosting.
I know it’s not fun to think about, but you’ll be helping your loved ones immensely if you do.
Copyright 2024, WestStar Multimedia Entertainment. All rights reserved.
Technology
Millions of AI chat messages exposed in app data leak
A popular mobile app called Chat & Ask AI has more than 50 million users across the Google Play Store and Apple App Store. Now, an independent security researcher says the app exposed hundreds of millions of private chatbot conversations online.
The exposed messages reportedly included deeply personal and disturbing requests. Users asked questions like how to painlessly kill themselves, how to write suicide notes, how to make meth and how to hack other apps.
These were not harmless prompts. They were full chat histories tied to real users.
Sign up for my FREE CyberGuy Report
Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide – free when you join my CYBERGUY.COM newsletter.
Security researchers say Chat & Ask AI exposed hundreds of millions of private chatbot messages, including complete conversation histories tied to real users. (Neil Godwin/Getty Images)
What exactly was exposed
The issue was discovered by a security researcher who goes by Harry. He found that Chat & Ask AI had a misconfigured backend using Google Firebase, a popular mobile app development platform. Because of that misconfiguration, it was easy for outsiders to gain authenticated access to the app’s database. Harry says he was able to access roughly 300 million messages tied to more than 25 million users. He analyzed a smaller sample of about 60,000 users and more than one million messages to confirm the scope.
The exposed data reportedly included:
- Full chat histories with the AI
- Timestamps for each conversation
- The custom name users gave the chatbot
- How users configured the AI model
- Which AI model was selected
That matters because many users treat AI chats like private journals, therapists or brainstorming partners.
How this AI app stores so much sensitive user data
Chat & Ask AI is not a standalone artificial intelligence model. It acts as a wrapper that lets users talk to large language models built by bigger companies. Users could choose between models from OpenAI, Anthropic and Google, including ChatGPT, Claude and Gemini. While those companies operate the underlying models, Chat & Ask AI handles the storage. That is where things went wrong. Cybersecurity experts say this type of Firebase misconfiguration is a well-known weakness. It is also easy to find if someone knows what to look for.
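The researcher has not published the app’s exact configuration, but the classic Firebase misconfiguration experts describe is a security rule that grants any authenticated user access to the entire database. A purely illustrative Firestore rules fragment (hypothetical, not Codeway’s actual rules) shows the pattern:

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // INSECURE: any signed-in user can read and write every document,
    // including other users' chat histories
    match /{document=**} {
      allow read, write: if request.auth != null;
    }
  }
}
```

A safer rule scopes access to the record’s owner, for example `allow read, write: if request.auth.uid == resource.data.ownerId;`, so one user’s token cannot pull another user’s conversations. Misconfigurations like the insecure version above are easy to find because anyone who signs up for the app gets a valid token.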
We reached out to Codeway, which publishes the Chat & Ask AI app, for comment, but did not receive a response before publication.
The exposed database reportedly included timestamps, model settings and the names users gave their chatbots, revealing far more than isolated prompts. (Elisa Schu/Getty Images)
Why this matters to everyday users
Many people assume their chats with AI tools are private. They type things they would never post publicly or even say out loud. When an app stores that data insecurely, it becomes a gold mine for attackers. Even without names attached, chat histories can reveal mental health struggles, illegal behavior, work secrets and personal relationships. Once exposed, that data can be copied, scraped and shared forever.
Because the app handled data storage itself, a simple Firebase misconfiguration made sensitive AI chats accessible to outsiders, according to the researcher. (Edward Berthelot/Getty)
Ways to stay safe when using AI apps
You do not need to stop using AI tools to protect yourself. A few informed choices can lower your risk while still letting you use these apps when they are helpful.
1) Be mindful of sensitive topics
AI chats can feel private, especially when you are stressed, curious or looking for answers. However, not all apps handle conversations securely. Before sharing deeply personal struggles, medical concerns, financial details or questions that could create legal risk if exposed, take time to understand how the app stores and protects your data. If those protections are unclear, consider safer alternatives such as trusted professionals or services with stronger privacy controls.
2) Research the app before installing
Look beyond download counts and star ratings. Check who operates the app, how long it has been available, and whether its privacy policy clearly explains how user data is stored and protected.
3) Assume conversations may be stored
Even when an app claims privacy, many AI tools log conversations for troubleshooting or model improvement. Treat chats as potentially permanent records rather than temporary messages.
4) Limit account linking and sign-ins
Some AI apps allow you to sign in with Google, Apple, or an email account. While convenient, this can directly connect chat histories to your real identity. When possible, avoid linking AI tools to primary accounts used for work, banking or personal communication.
5) Review app permissions and data controls
AI apps may request access beyond what is required to function. Review permissions carefully and disable anything that is not essential. If the app offers options to delete chat history, limit data retention or turn off syncing, enable those settings.
6) Use a data removal service
Your digital footprint extends beyond AI apps. Anyone can find personal details about you with a simple Google search, including your phone number, home address, date of birth and Social Security number. Marketers buy this information to target ads. In more serious cases, scammers and identity thieves breach data brokers, leaving personal data exposed or circulating on the dark web. Using a data removal service helps reduce what can be linked back to you if a breach occurs.
While no service can guarantee the complete removal of your data from the internet, a data removal service is a smart choice. They aren’t cheap, and neither is your privacy. These services do all the work for you by actively monitoring and systematically erasing your personal information from hundreds of websites. It’s what gives me peace of mind and has proven to be the most effective way to erase your personal data from the internet. By limiting the information available, you reduce the risk of scammers cross-referencing data from breaches with information they might find on the dark web, making it harder for them to target you.
Check out my top picks for data removal services and get a free scan to find out if your personal information is already out on the web by visiting Cyberguy.com.
Kurt’s key takeaways
AI chat apps are moving fast, but security is still lagging behind. This incident shows how a single configuration mistake can expose millions of deeply personal conversations. Until stronger protections become standard, you need to treat AI chats with caution and limit what you share. The convenience is real, but so is the risk.
Do you assume your AI chats are private, or has this story changed how much you are willing to share with these apps? Let us know your thoughts by writing to us at Cyberguy.com.
Copyright 2026 CyberGuy.com. All rights reserved.
Technology
Republicans attack ‘woke’ Netflix — and ignore YouTube
When Netflix co-CEO Ted Sarandos entered the Senate office building on Tuesday, he got thrown a curveball. What started as a standard antitrust hearing relating to the Warner Bros. merger quickly devolved into a performative Republican attack about the spread of “woke” ideology on the streaming service. At the same time, arguably a much more influential platform was completely ignored: YouTube.
After grilling Sarandos about residual payments, Sen. Josh Hawley (R-MO) launched into a completely different line of questioning: “Why is it that so much of Netflix content for children promotes a transgender ideology?” Hawley asked, making an unsubstantiated claim that “almost half” of the platform’s children’s content contains so-called “transgender ideology.” The statement echoed a pressure campaign launched by Elon Musk months ago in which he called on X users to unsubscribe from Netflix for having a “transgender woke agenda,” citing its few shows with trans characters — shows that were canceled years ago.
“Our business intent is to entertain the world,” Sarandos replied. “It is not to have a political agenda.” Still, other Republican lawmakers, including Sens. Ashley Moody (R-FL) and Eric Schmitt (R-MO), piled on, bringing up a post Netflix made following the murder of George Floyd, and the French film Cuties, which sparked a right-wing firestorm years ago. Sen. Ted Cruz (R-TX) even asked Sarandos what he thought about Billie Eilish’s “no one is illegal on stolen land” comment at the Grammys. It seemed like they were grasping at straws to support their narrative that Netflix’s acquisition of Warner Bros. could somehow poison the well of content for viewers.
“My concern is that you don’t share my values or those of many other American parents, and you want the United States government to allow you to become one of the largest — if not the largest — streaming monopolist in the world,” Hawley said. “I think we ought to be concerned about what content you’re promoting.”
While it’s true that Netflix will control a substantial portion of the streaming market when (or if) it acquires Warner Bros. and its streaming service HBO Max, it’s hard to criticize Netflix without bringing up YouTube.
For years now, Netflix has been trying to topple YouTube as the most-watched streaming service. Data from Nielsen says Netflix made up 9 percent of total TV and streaming viewing in the US in December 2025, while Warner Bros. Discovery’s services made up 1.4 percent. Combining the two doesn’t even stack up to YouTube, which held a 12.7 percent share of viewership during that time. “YouTube is not just cat videos anymore,” Sarandos told the subcommittee. “YouTube is TV.”
Unlike Netflix, YouTube is free and has an ever-growing library of user-created content that doesn’t require it to spend billions of dollars in production costs and licensing fees. YouTube doesn’t have to worry about maintaining subscribers, as anyone with access to a web browser or phone can open up and watch YouTube. The setup brings YouTube a constant stream of viewers that it can rope in with a slew of content it can recommend to watch next.
But not all creators on YouTube are striving for quality. As my colleague Mia Sato wrote, YouTube is home to creators who try to feed an algorithm that boosts inflammatory content and attempts to hook viewers, in addition to an array of videos that may be less than ideal for kids.
Like it or not, YouTube is the dominant streamer, with an endless supply of potentially offensive agendas for just about anyone. But for some reason, it’s not the target of this culture war. If these lawmakers actually cared about what their kids are watching, maybe they’d start looking more closely at how YouTube prioritizes content. Or, if they don’t like the shows and movies on Netflix, they could just do what Sarandos suggested during the hearing: unsubscribe.
Technology
Microsoft crosses privacy line few expected
For years, we’ve been told that encryption is the gold standard for digital privacy. If data is encrypted, it is supposed to be locked away from hackers, companies and governments alike. That assumption just took a hit.
In a federal investigation tied to alleged COVID-19 unemployment fraud in Guam, a U.S. territory where federal law applies, Microsoft confirmed it provided law enforcement with BitLocker recovery keys. Those keys allowed investigators to unlock encrypted data on multiple laptops.
This is one of the clearest public examples to date of Microsoft providing BitLocker recovery keys to authorities as part of a criminal investigation. While the warrant itself may have been lawful, the implications stretch far beyond one investigation. For everyday Americans, this is a clear signal that “encrypted” does not always mean “inaccessible.”
In the Guam investigation, Microsoft provided BitLocker recovery keys that allowed law enforcement to unlock encrypted laptops. (David Paul Morris/Bloomberg via Getty Images)
What happened in the Guam BitLocker case?
Federal investigators believed three Windows laptops held evidence tied to an alleged scheme involving pandemic unemployment funds. The devices were protected with BitLocker, Microsoft’s built-in disk encryption tool enabled by default on many modern Windows PCs. BitLocker works by scrambling all data on a hard drive so it cannot be read without a recovery key.
Users can store that key themselves, but Microsoft also encourages backing it up to a Microsoft account for convenience. In this case, that convenience mattered. When served with a valid search warrant, Microsoft provided the recovery keys to investigators. That allowed full access to the data stored on the devices. Microsoft says it receives roughly 20 such requests per year and can only comply when users have chosen to store their keys in the cloud.
We reached out to Microsoft for comment and did not hear back before our original deadline; the company later provided a statement.
How Microsoft was able to unlock encrypted data
According to John Ackerly, CEO and co-founder of Virtru and a former White House technology advisor, the problem is not encryption itself. The real issue is who controls the keys. He begins by explaining how convenience can quietly shift control. “Microsoft commonly recommends that users back up BitLocker recovery keys to a Microsoft account for convenience. That choice means Microsoft may retain the technical ability to unlock a customer’s device. When a third party holds both encrypted data and the keys required to decrypt it, control is no longer exclusive.”
Once a provider has the ability to unlock data, that power rarely stays theoretical. “When systems are built so that providers can be compelled to unlock customer data, lawful access becomes a standing feature. It is important to remember that encryption does not distinguish between authorized and unauthorized access. Any system designed to be unlocked on demand will eventually be unlocked by unintended parties.”
Ackerly then points out that this outcome is not inevitable. Other companies have made different architectural choices. “Other large technology companies have demonstrated that a different approach is possible. Apple has designed systems that limit its own ability to access customer data, even when doing so would ease compliance with government demands. Google offers client-side encryption models that allow users to retain exclusive control of encryption keys. These companies still comply with the law, but when they do not hold the keys, they cannot unlock the data. That is not obstruction. It is a design choice.”
Finally, he argues that Microsoft still has room to change course. “Microsoft has an opportunity to address this by making customer-controlled keys the default and by designing recovery mechanisms that do not place decryption authority in Microsoft’s hands. True personal data sovereignty requires systems that make compelled access technically impossible, not merely contractually discouraged.”
In short, Microsoft could comply because it had the technical ability to do so. That single design decision is what turned encrypted data into accessible data.
“With BitLocker, customers can choose to store their encryption keys locally, in a location inaccessible to Microsoft, or in Microsoft’s consumer cloud services,” a Microsoft spokesperson told CyberGuy in a statement. “We recognize that some customers prefer Microsoft’s cloud storage, so we can help recover their encryption key if needed. While key recovery offers convenience, it also carries a risk of unwanted access, so Microsoft believes customers are in the best position to decide whether to use key escrow and how to manage their keys.”
When companies hold encryption keys, lawful requests can unlock far more data than most people expect. (Kurt “CyberGuy” Knutsson)
Why this matters for data privacy
This case has reignited a long-running debate over lawful access versus systemic risk. Ackerly warns that centralized control has a long and troubling history. “We have seen the consequences of this design pattern for more than two decades. From the Equifax breach, which exposed the financial identities of nearly half the U.S. population, to repeated leaks of sensitive communications and health data during the COVID era, the pattern is consistent: centralized systems that retain control over customer data become systemic points of failure. These incidents are not anomalies. They reflect a persistent architectural flaw.”
When companies hold the keys, they become targets. That includes hackers, foreign governments and legal demands from agencies like the FBI. Once a capability exists, it rarely goes unused.
How other tech giants handle encryption differently
Apple has designed systems, such as Advanced Data Protection, where it cannot access certain encrypted user data even when served with government requests. Google offers client-side encryption for some services, primarily in enterprise environments, where encryption keys remain under the customer’s control. These companies still comply with the law, but in those cases, they do not possess the technical means to unlock the data. That distinction matters. As encryption experts often note, you cannot hand over what you do not have.
What we can do to protect our privacy
The good news is that personal privacy is not gone. The bad news is that it now requires intention. Small choices matter more than most people realize. Ackerly says the starting point is understanding control. “The main takeaway for everyday users is simple: if you don’t control your encryption keys, you don’t fully control your data.”
That control begins with knowing where your keys are stored. “The first step is understanding where your encryption keys live. If they’re stored in the cloud with your provider, your data can be accessed without your knowledge.”
Once keys live outside your control, access becomes possible without your consent. That is why the way data is encrypted matters just as much as whether it is encrypted. “Consumers should look for tools and services that encrypt data before it reaches the cloud — that way, it is impossible for your provider to hand over your data. They don’t have the keys.” Defaults are another hidden risk. Many people never change them. “Users should also look to avoid default settings designed for convenience. Default settings matter, and when convenience is the default, most individuals will unknowingly trade control for ease of use.”
When encryption is designed so that even the provider cannot access the data, the balance shifts back to the individual. “When data is encrypted in a way that even the provider can’t access, it stays private — even if a third party comes asking. By holding your own encryption keys, you’re eliminating the possibility of the provider sharing your data.” Ackerly says the lesson is simple but often ignored. “The lesson is straightforward: you cannot outsource responsibility for your sensitive data and assume that third parties will always act in your best interest. Encryption only fulfills its purpose when the data owner is the sole party capable of unlocking it.” Privacy still exists. It just no longer comes by default.
Reviewing default security and backup settings can help you keep control of your private data. (Kurt “CyberGuy” Knutsson)
Practical steps you can take today
You do not need to be a security expert to protect your data. A few practical checks can go a long way.
1) Start by checking where your encryption keys live
Many people do not realize that their devices quietly back up recovery keys to the cloud. On a Windows PC, sign in to your Microsoft account and look under device security or recovery key settings. Seeing a BitLocker recovery key listed online means it is stored with Microsoft.
For other encrypted services, such as Apple iCloud backups or Google Drive, open your account security dashboard and review encryption or recovery options. Focus on settings tied to recovery keys, backup encryption, or account-based access. When those keys are linked to an online account, your provider may be able to access them. The goal is simple. Know whether your keys live with you or with a company.
2) Avoid cloud-based key backups unless you truly need them
Cloud backups are designed for convenience, not privacy. If possible, store recovery keys offline. That can mean saving them to a USB drive, printing them and storing them in a safe place, or using encrypted hardware you control. The exact method matters less than who has access. If a company does not have your keys, it cannot be forced to turn them over.
3) Choose services that encrypt data before it reaches the cloud
Not all encryption works the same way, even if companies use similar language. Look for services that advertise end-to-end or client-side encryption, such as Signal for messages, or Apple’s Advanced Data Protection option for iCloud backups. These services encrypt your data on your device before it is uploaded, which means the provider cannot read it or unlock it later. Here is a simple rule of thumb. If a service can reset your password and restore all your data without your involvement, it likely holds the encryption keys. That also means it could be forced to hand over access. When encryption happens on your device first, providers cannot unlock your data because they never had the keys to begin with. That design choice blocks third-party access by default.
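The rule of thumb above can be sketched in a few lines of Python. This is an illustration of the client-side encryption idea only, not any vendor’s actual implementation: a real app should use a vetted library (for example, the third-party `cryptography` package) rather than this stdlib-only construction, and every function name here is hypothetical.

```python
# Client-side ("zero-knowledge") encryption sketch: data is encrypted
# on the device before upload, so the provider only ever stores
# ciphertext and has no key to hand over. Illustrative only; real
# apps should use a vetted crypto library, not this construction.
import hashlib
import hmac
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Slow key derivation so a stolen blob resists brute force
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # SHA-256 in counter mode as a pseudorandom stream (illustrative)
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passphrase: str, plaintext: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(passphrase, salt)
    cipher = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + cipher, "sha256").digest()  # integrity check
    return salt + nonce + tag + cipher  # this blob is all the cloud ever sees

def decrypt(passphrase: str, blob: bytes) -> bytes:
    salt, nonce, tag, cipher = blob[:16], blob[16:32], blob[32:64], blob[64:]
    key = derive_key(passphrase, salt)
    if not hmac.compare_digest(tag, hmac.new(key, nonce + cipher, "sha256").digest()):
        raise ValueError("wrong passphrase or tampered data")
    return bytes(a ^ b for a, b in zip(cipher, keystream(key, nonce, len(cipher))))
```

Because the passphrase never leaves the device, the provider stores only the blob and cannot be compelled to unlock it. The flip side is exactly the tradeoff described above: if you forget the passphrase, no password reset can bring the data back.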
4) Review default security settings on every new device
Default settings usually favor convenience. That can mean easier recovery, faster syncing and weaker privacy. Take five minutes after setup and lock down the basics.
iPhone: tighten iCloud and account recovery
Turn on Advanced Data Protection for iCloud (strongest iCloud protection)
- Open Settings
- Tap your name
- Tap iCloud
- Scroll down and tap Advanced Data Protection
- Tap Turn On Advanced Data Protection
- Follow the prompts to set up Account Recovery options, like a Recovery Contact or Recovery Key
Review iCloud Backup
- Open Settings
- Tap your name
- Tap iCloud
- Tap iCloud Backup
- Decide if you want it on or off, based on your privacy comfort level
Strengthen your Apple ID security
- Open Settings
- Tap your name
- Tap Sign-In & Security
- Make sure Two-Factor Authentication (2FA) is turned on
- Review trusted phone numbers and devices
Android: lock your Google account and backups
Review and control device backup
Settings may vary depending on your Android phone’s manufacturer.
- Open Settings
- Tap Google
- Tap Backup (or All services then Backup)
- Tap Manage backup
- Choose what backs up and confirm which Google account stores it
Strengthen your screen lock, since it protects the device itself
Settings may vary depending on your Android phone’s manufacturer.
- Open Settings
- Tap Security or Security & privacy
- Set a strong PIN or password
- Turn on biometrics if you want, but keep the PIN strong either way
Secure your Google account
Settings may vary depending on your Android phone’s manufacturer.
- Open Settings
- Tap Google
- Tap Manage your Google Account
- Go to Security
- Turn on 2-Step Verification and review recent security activity
Mac: enable FileVault and review iCloud settings
Turn on FileVault disk encryption
- Click the Apple menu
- Select System Settings
- Click Privacy & Security
- Scroll down and click FileVault
- Click Turn On
- Save your recovery method securely
Review iCloud syncing
- Open System Settings
- Click your name
- Click iCloud
- Review what apps and data types sync
- Turn off anything you do not want stored in the cloud
Windows PC: check BitLocker and where the recovery key is stored
Confirm BitLocker status and settings
- Open Settings
- Go to Privacy & security
- Select Device encryption or BitLocker (wording varies by device)
Check whether your BitLocker recovery key is stored in your Microsoft account
- Go to your Microsoft account page
- Open Devices
- Select your PC
- Look for Manage recovery keys or a BitLocker recovery key entry
- If you see a key listed online, it means the key is stored with Microsoft. That is why Microsoft was able to provide keys in the Guam case.
If your account can recover everything with a few clicks, a third party might be able to recover it too. Convenience can be helpful, but it can also widen access.
5) Treat convenience features as privacy tradeoffs
Every shortcut comes with a cost. Before enabling a feature that promises easy recovery or quick access, pause and ask one question. If I lose control of this account, who else gains access? If the answer includes a company or third party, decide whether the convenience is worth it.
These steps are not extreme or technical. They are everyday habits. In a world where lawful access can quietly become routine access, small choices now can protect your privacy later.
Strengthen protection beyond encryption
Encryption controls who can access your data, but it does not stop every real-world threat. Once data is exposed, different protections matter.
Strong antivirus software adds device-level protection
Strong antivirus software helps block malware, spyware and credential-stealing attacks that can bypass privacy settings altogether. Even encrypted devices are vulnerable if malicious software gains control before encryption comes into play.
The best way to safeguard yourself from malicious links that install malware, potentially accessing your private information, is to have strong antivirus software installed on all your devices. This protection can also alert you to phishing emails and ransomware scams, keeping your personal information and digital assets safe.
Get my picks for the best 2026 antivirus protection winners for your Windows, Mac, Android and iOS devices at Cyberguy.com.
An identity theft protection service helps when exposure turns into fraud
If personal data is accessed, sold or misused, identity protection services can monitor for suspicious activity, alert you early and help lock down accounts before damage spreads. Identity theft protection companies can monitor personal information like your Social Security number (SSN), phone number and email address, and alert you if it is being sold on the dark web or used to open an account. They can also help you freeze your bank and credit card accounts to prevent further unauthorized use by criminals.
See my tips and best picks on how to protect yourself from identity theft at Cyberguy.com.
Kurt’s key takeaways
Microsoft’s decision to comply with the BitLocker warrant may have been legal. That doesn’t make it harmless. This case exposes a hard truth about modern encryption. Privacy depends less on the math and more on how systems are built. When companies hold the keys, the risk falls on the rest of us.
Do you trust tech companies to protect your encrypted data, or do you think that responsibility should fall entirely on you? Let us know by writing to us at Cyberguy.com.
Copyright 2026 CyberGuy.com. All rights reserved.