Technology
Scammers can exploit your data from just 1 ChatGPT search
ChatGPT and other large language models (LLMs) have become amazing helpers for everyday tasks. Whether it’s summarizing complex ideas, designing a birthday card or even planning your apartment’s layout, you can get impressive results with just a simple prompt. But as helpful as these AI tools are, their convenience comes with hidden risks, especially when it comes to your personal privacy.
Join the FREE “CyberGuy Report”: Get my expert tech tips, critical security alerts and exclusive deals, plus instant access to my free “Ultimate Scam Survival Guide” when you sign up!
A man using ChatGPT on his laptop (Kurt “CyberGuy” Knutsson)
How these tools work and why that matters
If you haven’t tried an LLM like ChatGPT before, here’s the gist: They’re advanced language processors that chat with you through text. No special commands or coding needed, just type what you want to know or do, and they respond. For example, asking “Why is the conclave kept secret?” will get you a detailed explanation in seconds.
This simplicity is what makes LLMs so useful, but it also opens the door to risks. Instead of harmless questions, someone could ask for a detailed profile on a person, and the model might generate a surprisingly thorough report. While these tools have safeguards and often refuse certain requests, clever phrasing can sometimes bypass those limits.
Unfortunately, it doesn’t take much effort for someone to use ChatGPT to gather personal information about you. But don’t worry, there are ways to protect yourself from this kind of digital snooping.
A person using ChatGPT on their phone (Kurt “CyberGuy” Knutsson)
WHAT HACKERS CAN LEARN ABOUT YOU FROM A DATA BROKER FILE
How to stop it
These AI tools don’t just pull information out of thin air. They need to access real online sources to work. In other words, your data is already out there on the internet; AI tools just make it easier to find. And if you look at the sources, most of the information you wouldn’t want shared online, like your address, relatives and so on, is made public by people-search sites. Other sources include social media, like LinkedIn and Facebook, as well as public databases. But none of them are as invasive as people-search sites.
Let’s see what you can do to limit how much of your information is exposed online.
A woman using ChatGPT on her laptop (Kurt “CyberGuy” Knutsson)
THINK YOU CAN DELETE YOUR OWN DATA? WHY IT’S HARDER THAN YOU THINK
Essential steps and precautions to protect your privacy
To effectively safeguard your personal information from being exposed or misused, it’s important to follow these steps and adopt key precautions.
1) Opt out of people-search sites one by one
Although not all people-search sites are required to offer one, most provide an option to request an opt-out. But that comes with a few challenges.
Where to start: Identifying people-search sites that expose your personal information
There are hundreds of people-search sites registered in the U.S. Going through each and every one is, realistically speaking, impossible. You’ll need to narrow your search somehow.
Using AI tools: How to find and list data broker sites with your personal data
Use AI tools and ask them to run a deep search on yourself. It’s not a perfect or complete solution; LLMs tend to shorten their responses to save resources. But it will give you a good starting point, and if you keep asking for more results, you should be able to put together a decent list of people-search sites that might have your profile.
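If you're comfortable running a short script, you can do the same kind of self-search through an LLM's API instead of the chat window. Here's a minimal sketch, assuming you have an OpenAI API key and the official openai Python package installed; the model name, prompt wording and example name are placeholders you'd swap for your own.

```python
# pip install openai
# A sketch of using an LLM API to surface people-search sites that may list you.
# Assumes OPENAI_API_KEY is set in your environment; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()

name = "Jane Q. Example"          # swap in your own name
city = "Springfield, Illinois"    # and general location

prompt = (
    f"Which people-search and data broker websites commonly publish profiles for "
    f"someone named {name} living near {city}? "
    f"List one site per line, with its opt-out page if you know it."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whichever model you have access to
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

Whatever the model returns, treat it as a starting list to verify by hand; it can miss sites or name ones that don't actually have your profile.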
Submitting opt-out requests: How to remove your information from people-search sites
Now, you’ll have to go through each of these people-search sites and submit opt-out requests. These usually aren’t complicated, but they’re definitely time-consuming. The opt-out forms are typically located at the bottom of each site, in the footer. The naming can vary from “Do Not Sell My Info” to “Opt-Out” or something similar. Each people-search site is a little different. Opting out of every people-search site that exposes your personal information is a mammoth task. I’ve discussed it in more detail here. Alternatively, you can automate this process.
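Because you may end up filing dozens of these requests, it also helps to keep a simple log of where and when you opted out so you can follow up. Here's a small, purely illustrative tracker using only Python's standard library; the file name, columns and example site are just suggestions.

```python
# A purely illustrative opt-out tracker; file name and columns are just suggestions.
import csv
from datetime import date
from pathlib import Path

LOG = Path("optout_log.csv")

def log_opt_out(site: str, opt_out_url: str, notes: str = "") -> None:
    """Append one opt-out request, dated today, to the CSV log."""
    first_write = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if first_write:
            writer.writerow(["date", "site", "opt_out_url", "notes"])
        writer.writerow([date.today().isoformat(), site, opt_out_url, notes])

log_opt_out(
    "examplepeoplesearch.com",                  # hypothetical site
    "https://examplepeoplesearch.com/opt-out",  # hypothetical opt-out page
    "submitted; confirmation email expected",
)
```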
A woman using ChatGPT on her laptop (Kurt “CyberGuy” Knutsson)
DATA REMOVAL DOES WHAT VPNS DON’T: HERE’S WHY YOU NEED BOTH
2) Opt out using data removal services
Data removal services save real time and energy when it comes to protecting your personal information online. The way these services work is simple. They send hundreds of data removal requests on your behalf to people-search sites you might not even know exist but are still exposing your data. And with some services, the process goes even further than that.
People-search sites aren’t the only places exposing your personal information without your knowledge. In fact, they’re just a small part of the larger data broker industry.
There are marketing, health, financial, risk and many other types of data brokers trading your information. Your data is a commodity they use to make a profit, often without you even realizing it.
Data removal services have taken on the challenge of fighting this threat to your privacy. They continuously scour the web, looking for your profiles. This way, you can just sign up and let them handle the work in the background. And here’s the best part: They take about 10 minutes to set up, roughly the same time it takes to opt out of a single people-search site.
- Go to a data removal service that fits your needs
- Choose a subscription plan
- Provide the minimal information needed for them to effectively locate your profiles on people-search sites
And that’s it. The removal process is entirely automated and requires little to no effort on your part. With this small initial effort, you may save yourself from privacy-related risks, including scams and even identity theft. But what if your data is exposed on a people-search site not covered by any data removal service?
Every removal service out there has limitations on the number of data brokers it supports. It’s not about a lack of effort; it’s mostly because brokers are generally unwilling to cooperate, to put it mildly. But there’s a way to address this issue without going back to manual opt-outs. The top names in the data removal industry now offer custom removals. In simple terms, this means you can ask them to remove your personal information from websites not currently covered by their standard plans.
The catch is that you’ll need to do the research yourself and point out which sites are exposing your data. It’s not as convenient as having everything done automatically, but it’s a relatively minor inconvenience for the sake of your online privacy.
Check out my top picks for data removal services here.
3) Be careful what you share with AI tools
Being mindful of the information you provide to AI tools is the first and most crucial step in protecting your privacy. Don’t share sensitive details such as your full name, home address, financial information, passwords or any other personal data that could be used to identify or harm you or others.
4) Secure your AI accounts
Protecting your AI accounts from unauthorized access helps keep your interactions and data safe. Always use strong, unique passwords and consider using a password manager to generate and store those complex passwords. Enable multifactor authentication whenever possible to add an extra layer of security. Regularly review your account permissions and remove access for any devices or applications you no longer use. Get more details about my best expert-reviewed password managers of 2025 here.
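If you want to see what "strong and unique" looks like in practice, a few lines of Python can generate a random password locally. This is just a quick sketch; a dedicated password manager is still the better long-term tool for creating and storing them.

```python
# Generate a strong, random password locally with Python's standard library.
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a cryptographically secure random password."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```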
5) Review and tighten social media privacy
Adjusting your social media privacy settings can greatly reduce the amount of personal information available to data brokers. Make your profiles private, limit who can see your posts and be selective about accepting friend or follower requests. Periodically audit your privacy settings and remove any unnecessary third-party app connections to further minimize your exposure.
6) Use strong antivirus software
Protecting your devices with strong antivirus software adds an essential layer of security against digital threats. Antivirus programs defend against malware, phishing and identity theft. Be sure to choose reputable software and regularly update it to stay protected against the latest threats. Get my picks for the best 2025 antivirus protection winners for your Windows, Mac, Android and iOS devices.
7) Use alias emails for opt-outs and online forms
Using a dedicated email address for opt-outs and online sign-ups helps reduce spam and protects your primary email. This practice also makes it easier to track which sites and services have your contact information. If your alias email becomes compromised, you can quickly change it without disrupting your main accounts. See my review of the best secure and private email services here.
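One lightweight way to do this, if your email provider supports plus addressing (Gmail and several other providers do), is to create a unique alias for each site so you can see exactly who shared or leaked your address. A minimal sketch, assuming plus-addressed mail reaches your main inbox:

```python
# Build a per-site email alias using plus addressing (you+sitename@example.com).
# Assumes your provider delivers plus-addressed mail to your main inbox.
def make_alias(base_email: str, site: str) -> str:
    user, domain = base_email.split("@", 1)
    tag = "".join(ch for ch in site.lower() if ch.isalnum())  # drop dots and dashes
    return f"{user}+{tag}@{domain}"

print(make_alias("you@example.com", "examplepeoplesearch.com"))
# Prints: you+examplepeoplesearchcom@example.com
```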
Get a free scan to find out if your personal information is already out on the web.
Kurt’s key takeaways
Large language models like ChatGPT are transforming how we work, create and solve problems, but they also introduce new privacy and security risks that can’t be ignored. As these tools become more powerful and accessible, it’s up to each of us to take proactive steps to safeguard our personal information and understand where our data might be exposed. By staying alert and making use of available privacy tools, we can enjoy the benefits of AI while minimizing the risks.
Should OpenAI be held legally accountable when its tools are used to collect or expose private data without consent? Let us know your experience or questions by writing us at Cyberguy.com/Contact. Your story could help someone else stay safe.
For more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter.
Ask Kurt a question or let us know what stories you’d like us to cover.
Copyright 2025 CyberGuy.com. All rights reserved.
Technology
Two more xAI co-founders are among those leaving after the SpaceX merger
Since the xAI-SpaceX merger was announced last week — a deal that combined the two companies (as well as social media platform X) at a reported $1.25 trillion valuation, the biggest merger of all time — a handful of xAI employees and two of its co-founders have abruptly exited the company, penning long departure announcements online. Some also announced that they were starting their own AI companies.
Co-founder Yuhuai (Tony) Wu announced his departure on X, writing that it was “time for [his] next chapter.” Jimmy Ba, another co-founder, posted something similar later that day, saying it was “time to recalibrate [his] gradient on the big picture.” The departures mean that xAI is now left with only half of its original 12 co-founders on staff.
It all comes after changing plans for the future of the combined companies, which Elon Musk recently announced would involve “space-based AI” data centers and vertical integration involving “AI, rockets, space-based internet, direct-to-mobile device communications and the world’s foremost real-time information and free speech platform.” Musk reportedly also talked of plans to build an AI satellite factory and city on the moon in an internal xAI meeting.
Musk wrote on X Wednesday that “xAI was reorganized a few days ago to improve speed of execution” and claimed that the process “unfortunately required parting ways with some people,” then put out a call for more people to apply to the company. He also posted a recording of xAI’s 45-minute internal all-hands meeting that announced the changes.
“We’re organizing the company to be more effective at this scale,” Musk said during the meeting. He added that the company will now be organized in four main application areas: Grok Main and Voice, Coding, Imagine (image and video), and Macrohard (“which is intended to do full digital emulation of entire companies,” Musk said).
Technology
2026 Valentine’s romance scams and how to avoid them
Valentine’s Day should be about connection. However, February is also the busiest season of the year for romance scammers. In 2026, that risk is higher than ever.
These scams are no longer simple “lonely hearts” schemes. Instead, modern romance fraud relies on artificial intelligence, data brokers and stolen personal profiles. Rather than sending random messages and hoping for a response, scammers carefully select victims using detailed personal data. From there, they use AI to impersonate real people, create convincing conversations and build trust at scale.
As a result, if you are divorced, widowed or returning to online dating after the holidays, this is often the exact moment scammers target you.
Sign up for my FREE CyberGuy Report
Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide – free when you join my CYBERGUY.COM newsletter.
WHEN DATING APPS GET HACKED, YOUR PRIVATE LIFE GOES PUBLIC
Romance scams surge around Valentine’s Day as criminals use artificial intelligence and stolen data to target widowed, divorced and older adults returning to online dating. (Omar Karim/Middle East Images/AFP via Getty Images)
The new face of romance scams in 2026
Romance scams are no longer slow, one-on-one cons. They’re now high-tech operations designed to target hundreds of people at once. Here’s what’s changed:
1) AI-generated personas that look and sound real
In the past, fake profiles used stolen photos and broken English. Today, scammers use AI-generated faces, voices and videos that don’t belong to any real person, making them almost impossible to reverse search.
You may be interacting with a profile that:
- Has years of realistic-looking social media posts
- Shares daily photos that match the story they tell
- Sends customized voice notes that sound natural
- Appears on “video calls” using AI face-mapping software.
Some scam networks even create entire fake families and friend groups online, so the person appears to have a real life, real friends and real history. To the victim, it feels like a genuine connection because the “person” behaves like one in every way.
2) Automated relationship scripts that adapt to you
Behind the scenes, many scammers now use software platforms that manage dozens of conversations at once. This is known as “scamware” and is incredibly hard to flag.
These systems:
- Track your replies
- Flag emotional triggers (grief, loneliness, fear, trust)
- Suggest responses based on your mood and history.
When you mention that you are widowed, the tone quickly becomes more comforting. Meanwhile, if you say you are financially stable, the story shifts toward so-called “business opportunities.” And if you hesitate, the system responds by introducing urgency or guilt. It feels personal, but in reality, you’re being guided through a pre-written emotional funnel designed to lead to one outcome: money.
3) Crypto and “investment romance” scams
One of the fastest-growing versions of romance fraud now blends love and money. A BBC World Service investigation recently revealed that many romance scams are now run by organized criminal networks across Southeast Asia, using what insiders call the “pig butchering” model, where victims are slowly “fattened up” with trust before being financially destroyed.
These operations use call center-style setups, data broker profiles, scripted conversations and AI tools to target thousands of people at once. This is not accidental fraud. It’s an industry.
And the reason you were selected is simple. Your personal data made you easy to find, easy to profile and easy to target.
After weeks of trust-building, the scammer introduces:
- A “private” crypto platform
- A fake trading app
- A business or investment opportunity they claim to use themselves
They may show fake dashboards, fake profits and even let you “withdraw” small amounts at first to build trust. But once larger sums are sent, the site disappears and so does the person. There is no investment. There is no account. And there is no way to recover the funds.
AI DEEPFAKE ROMANCE SCAM STEALS WOMAN’S HOME AND LIFE SAVINGS
Data brokers selling personal details fuel a new wave of romance fraud by helping scammers select financially stable, older victims before contact is made. (Jens Büttner/picture alliance via Getty Images)
How scammers find you before you ever match
The biggest misconception is that romance scams begin on dating apps. They don’t. They begin long before that, inside massive databases run by data brokers. These companies collect and sell profiles that include:
- Your age and marital status
- Whether you’re widowed or divorced
- Your home address history
- Your phone number and email
- Your family members and relatives
- Your income range and retirement status.
Scammers buy this data to build shortlists of ideal victims.
The data brokers behind romance scams
Using these data broker profiles, scammers filter for:
- Age 55-plus
- Widowed or divorced
- Living alone
- Financially stable
- Not active on social media.
That’s how they know who to target before the first message is ever sent.
Why are widowed and retired adults targeted first?
Scammers aren’t cruel by accident. They target people who are statistically more likely to respond. If you’ve lost a spouse, moved recently or reentered the dating world, your personal data often shows that. That makes you a priority target. And once your name lands on a scammer’s list, it can be sold again and again. That’s why many victims say, “I blocked them, but new ones keep showing up.” It’s not a coincidence. It’s data recycling.
How the scam usually unfolds
Most romance scams follow the same pattern:
- Friendly introduction: A warm message. No pressure. Often references something personal about you.
- Fast emotional bonding: They mirror your values, your experiences, even your grief.
- Distance and excuses: They can’t meet. There’s always a reason: military deployment, overseas job, business travel.
- A sudden “crisis”: Medical bills, business losses, frozen accounts, investment opportunities.
- Money requests: Wire transfers, gift cards, crypto or “temporary help.”
By the time money is involved, the emotional connection is already strong. Many victims send thousands before realizing it’s a scam.
The Valentine’s Day cleanup that stops scams at the source
If you want fewer scam messages this year, you need to remove your personal information from the places scammers buy it. That’s where a data removal service comes in. While no service can guarantee the complete removal of your data from the internet, a data removal service is still a smart choice. These services aren’t cheap, but neither is your privacy.
These services do all the work for you by actively monitoring and systematically erasing your personal information from hundreds of websites. It’s what gives me peace of mind and has proven to be the most effective way to erase your personal data from the internet. By limiting the information available, you reduce the risk of scammers cross-referencing data from breaches with information they might find on the dark web, making it harder for them to target you.
Check out my top picks for data removal services and get a free scan to find out if your personal information is already out on the web by visiting Cyberguy.com.
Practical steps to protect yourself this February
Here’s what you can do right now:
- Never send money to someone you haven’t met in person
- Be skeptical of fast emotional bonding
- Verify profiles with reverse image searches
- Don’t share personal details early
- Remove your data from broker sites.
- Use strong antivirus software to block malicious links and fake login pages. Get my picks for the best 2026 antivirus protection winners for your Windows, Mac, Android and iOS devices at Cyberguy.com.
When you combine these steps, you remove the access, urgency and leverage scammers rely on.
SUPER BOWL SCAMS SURGE IN FEBRUARY AND TARGET YOUR DATA
Cybercriminals now deploy AI-generated faces, voices and scripted conversations to impersonate real people and build trust at scale in modern romance scams. (Martin Bertrand/Hans Lucas/AFP via Getty Images)
Kurt’s key takeaways
Romance scams are no longer random. They are targeted, data-driven and emotionally engineered. This Valentine’s Day, the best gift you can give yourself is privacy. By removing your personal data from broker databases, you make it harder for scammers to find you, profile you and exploit your trust. And that’s how you protect not just your heart, but your identity, your savings and your peace of mind.
Have you or someone you love been contacted by a Valentine’s Day romance scam that felt real or unsettling? Let us know your thoughts by writing to us at Cyberguy.com.
Copyright 2026 CyberGuy.com. All rights reserved.
Technology
Uber Eats adds AI assistant to help with grocery shopping
Uber announced a new AI feature called “Cart Assistant” for grocery shopping in its Uber Eats app.
The new feature works a couple of different ways. You can use text prompts, as you would with any other AI chatbot, to ask it to build a grocery list for you. Or you can upload a picture of your shopping list and ask it to populate your cart with all your favorite items, based on your order history. You can be as generic as you like — “milk, eggs, cereal” — and the bot will make a list with all your preferred brands.
And that’s just to start out. Uber says in the coming months, Cart Assistant will add more features, including “full recipe inspiration, meal plans, and the ability to ask follow up questions, and expand to retail partners.”
But Uber acknowledges that, like all chatbots, Cart Assistant may make mistakes, and it urges users to double-check and confirm the results before placing any orders.
It will also only work at certain grocery stores, with Uber announcing interoperability at launch with Albertsons, Aldi, CVS, Kroger, Safeway, Sprouts, Walgreens, and Wegmans. More stores will be added in the future, the company says.
Uber has a partnership with OpenAI to integrate Uber Eats into its own suite of apps. But Uber spokesperson Richard Foord declined to say whether the AI company’s technology was powering the new chatbot in Uber Eats. “Cart Assistant draws on publicly available LLM models as well as Uber’s own AI stack,” Foord said in an email.
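Uber hasn't detailed how Cart Assistant is built, but conceptually, turning a generic list into brand-specific cart items is a routine job for an LLM. The sketch below is purely illustrative (it is not Uber's code) and assumes the openai Python package, a placeholder model name and a mocked-up order history.

```python
# Purely illustrative; this is not Uber's implementation.
# Maps a generic grocery list to preferred brands using an LLM and a mock order history.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

order_history = ["Fairlife 2% Milk", "Eggland's Best Large Eggs", "Cheerios Original"]
generic_list = "milk, eggs, cereal"

prompt = (
    f"A shopper's recent grocery purchases: {order_history}. "
    f"They just asked for: {generic_list}. "
    f"Return the specific branded items to add to their cart, one per line."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```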
Uber has been racing to add more AI-driven features to its apps, including robotaxis with Waymo and sidewalk delivery robots in several cities. The company also recently revived its AI Labs to collaborate with its partners on building better products using delivery and customer data.