
Thousands of iPhone apps expose data inside Apple App Store


Apple often promotes the App Store as a secure place to download apps. The company highlights strict reviews and a closed system as key protections for iPhone users. That reputation now faces serious questions.

New research shows that thousands of iOS apps approved by Apple contain hidden security flaws. These flaws can expose user data, cloud storage and even payment systems. 

The issue is not malware; it’s poor security practices baked directly into the app code.

Sign up for my FREE CyberGuy Report
Get my best tech tips, urgent security alerts, and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide – free when you join my CYBERGUY.COM newsletter.


Cybernews researchers found that many iOS apps store sensitive secrets directly inside app files, where they can be easily extracted. (Kurt “CyberGuy” Knutsson)

What researchers discovered inside iOS apps

Researchers at Cybernews, a cybersecurity research firm, analyzed the code of more than 156,000 iPhone apps. That represents about 8% of all apps available worldwide.

Here is what they found:

  • Over 815,000 hidden secrets inside app code
  • An average of five secrets per app
  • 71% of apps leaked at least one secret

These secrets include passwords, API keys and access tokens. Developers place them directly inside apps, where anyone can extract them. According to Cybernews researcher Aras Nazarovas, this makes attackers’ jobs much easier than most users realize.

What are hardcoded secrets in simple terms?

A hardcoded secret is sensitive information saved directly inside an app instead of being protected on a secure server. Think of it like writing your bank PIN on the back of your debit card. Once someone downloads the app, they can inspect its files and pull out those secrets. Attackers do not need special access or advanced hacking tools. Both the Cybersecurity and Infrastructure Security Agency and the Federal Bureau of Investigation warn developers not to do this. Yet it is happening at a massive scale.


Cloud storage leaks exposed huge amounts of data

One of the most serious problems involves cloud storage. More than 78,000 iOS apps contained direct links to cloud storage buckets. These buckets store files such as photos, documents, receipts and backups. In some cases, no password was required at all. Researchers found:

  • 836 storage buckets fully open to the public
  • Over 76 billion exposed files
  • More than 406 terabytes of leaked data

This data included user uploads, registration details, app logs and private records. Anyone who knew where to look could view or download it.


This chart shows the most common types of hardcoded secrets found inside iOS apps, with Google-related keys appearing most often, according to Cybernews research. (Cybernews)

Firebase databases were also left open

Many iOS apps rely on Google Firebase to store user data. Cybernews found more than 51,000 Firebase database links hidden in app code. While some were protected, over 2,200 had no authentication. That exposed:

  • Nearly 20 million user records
  • Messages, profiles, and activity logs
  • Databases mostly hosted in the U.S.

If a Firebase database is not locked down, attackers can browse user data like a public website.
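For developers auditing their own apps, the check is straightforward, because the Firebase Realtime Database exposes a REST API: appending "/.json" to a database URL attempts an unauthenticated read, which a locked-down database rejects with a 401 or 403 and an open one answers with data. A minimal sketch of the two pieces of that check (the "example-project" URL below is hypothetical):

```python
def rest_read_url(db_url: str) -> str:
    """Build the unauthenticated REST read URL for a Realtime Database root."""
    return db_url.rstrip("/") + "/.json"

def is_locked_down(status_code: int) -> bool:
    """Interpret the HTTP status of an unauthenticated read:
    200 means the database served data with no credentials (open to anyone);
    401 or 403 mean the security rules rejected the read (locked down)."""
    return status_code in (401, 403)
```

For example, fetching rest_read_url("https://example-project.firebaseio.com") with no credentials and passing the response status to is_locked_down tells a developer whether their rules actually require authentication.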

Payment and login systems were at risk too

Some of the leaked secrets were far more dangerous than analytics or ads. Researchers discovered secret keys for:

  • Stripe, which handles payments and refunds
  • JWT authentication systems that control logins
  • Order management tools used by shopping apps

A leaked Stripe secret key can allow attackers to issue refunds, move money or access billing details. Leaked login keys can let attackers impersonate users or take over accounts.

AI and social apps were among the worst offenders

Some of the apps with the largest leaks were related to artificial intelligence. According to VX Underground, security firm CovertLabs identified 198 iOS apps leaking user data. The worst known case was Chat & Ask AI by Codeway. Researchers say it exposed chat histories, phone numbers and email addresses tied to millions of users. Another app, YPT – Study Group, reportedly leaked messages, user IDs and access tokens. CovertLabs tracks these incidents in a restricted repository called Firehound. The full list of affected apps has not been publicly released, and researchers say the data is limited to prevent further exposure and to give developers time to fix security flaws.


This example shows how sensitive keys like Google API credentials and Stripe payment secrets can be stored directly inside an iOS app’s files, where they are easy to extract. (Cybernews)

Why Apple’s App Review can miss hidden security risks

Apple reviews apps before they appear in the App Store. However, the review process does not scan app code for hidden secrets. If an app behaves normally during testing, it can pass review even if sensitive keys are buried inside its files. This creates a gap between Apple’s security claims and real-world risks.

Removing leaked secrets is not simple for developers. They must revoke old keys, create new ones and rebuild parts of their apps. That can break features and delay updates. Even though Apple says most app updates are reviewed within 24 hours, some updates take weeks. During that time, vulnerable apps can remain available.

CyberGuy contacted Apple for comment, but did not receive a response before publication.


Ways to stay safe right now

You cannot easily inspect an app for hidden secrets. Apple does not provide tools for that. Still, you can reduce your risk and limit exposure by being selective and cautious. These steps help reduce the risk if an app leaks data behind the scenes.

1) Stick to established app developers

Well-known developers tend to have stronger security teams and better update practices. Smaller or unknown apps may rush features to market and overlook security basics. Before downloading, check how long the developer has been active and how often the app is updated.

2) Review and limit app permissions

Many apps ask for more access than they need. Location, contacts, photos and microphone access all increase the risk of data leaks. Go into your iPhone settings and remove permissions that are not essential for the app to work.

3) Delete apps you no longer use

Unused apps still retain access to data you shared in the past. They may also store information on remote servers long after you stop opening them. If you have not used an app in months, remove it. Here’s how: Open Settings, tap General, select iPhone Storage, and scroll through the list of apps to see when each one was last used. Tap any app you no longer need and select Delete App to remove it and reduce ongoing data exposure.

4) Be cautious with personal and financial details

Avoid entering sensitive information unless it is absolutely necessary. This includes full names, addresses, payment details and private conversations. AI apps are especially risky if you share deeply personal content.


5) Use a password manager for every account

A password manager creates strong, unique passwords for each app and service. This prevents attackers from accessing multiple accounts if one app leaks data. Never reuse passwords tied to your email address.
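What a password manager does when it creates those credentials is simple randomness from a cryptographic source. A minimal sketch of the idea, using Python's standard "secrets" module:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a password from a cryptographically secure random source.

    Each character is drawn independently from letters, digits and
    punctuation, so every account can get its own unguessable credential.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

Because each call produces an independent random string, a leak from one app never hands attackers a password that works anywhere else.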

Next, see if your email has been exposed in past breaches. Our No. 1 password manager pick includes a built-in breach scanner that checks whether your email address or passwords have appeared in known leaks. If you discover a match, immediately change any reused passwords and secure those accounts with new, unique credentials.

Check out the best expert-reviewed password managers of 2026 at Cyberguy.com.

6) Change passwords tied to exposed apps

If an app uses your email address for login, change that password immediately. Do this even if there is no confirmation of a breach. Attackers often test leaked credentials across other services.

7) Consider using a data removal service

Some leaked data ends up with data brokers that sell personal information online. A data removal service can help find and remove your details from these databases. This reduces the chance that exposed app data gets reused for scams or identity theft.


While no service can guarantee the complete removal of your data from the internet, a data removal service is a smart choice. These services aren’t cheap, but neither is your privacy. They do all the work for you by actively monitoring and systematically erasing your personal information from hundreds of websites. It’s what gives me peace of mind and has proven to be the most effective way to erase your personal data from the internet. By limiting the information available, you reduce the risk of scammers cross-referencing data from breaches with information they might find on the dark web, making it harder for them to target you.

Check out my top picks for data removal services and get a free scan to find out if your personal information is already out on the web by visiting Cyberguy.com.


8) Monitor your accounts for unusual activity

Watch for unexpected emails, password reset notices, login alerts, or payment confirmations. These can signal that leaked data is already being abused. Act quickly if something looks off.

9) Pause use of risky AI and chat apps

If you use AI apps for private conversations, consider stopping until the developer confirms security fixes. Once data is exposed, it cannot be pulled back. Avoid sharing sensitive details with apps that store conversations remotely.


Kurt’s key takeaways

Apple’s App Store still offers important protections, but this research shows it is not foolproof. Many trusted iPhone apps quietly expose data due to basic security mistakes. Until app reviews improve, you need to stay alert and limit how much data you share.

How many apps on your iPhone have access to information you would not want exposed? Let us know by writing to us at Cyberguy.com.



Copyright 2026 CyberGuy.com. All rights reserved.



Defense secretary Pete Hegseth designates Anthropic a supply chain risk


This week, Anthropic delivered a master class in arrogance and betrayal as well as a textbook case of how not to do business with the United States Government or the Pentagon.

Our position has never wavered and will never waver: the Department of War must have full, unrestricted access to Anthropic’s models for every LAWFUL purpose in defense of the Republic.

Instead, @AnthropicAI and its CEO, @DarioAmodei, have chosen duplicity. Cloaked in the sanctimonious rhetoric of “effective altruism,” they have attempted to strong-arm the United States military into submission – a cowardly act of corporate virtue-signaling that places Silicon Valley ideology above American lives.

The Terms of Service of Anthropic’s defective altruism will never outweigh the safety, the readiness, or the lives of American troops on the battlefield.

Their true objective is unmistakable: to seize veto power over the operational decisions of the United States military. That is unacceptable.


As President Trump stated on Truth Social, the Commander-in-Chief and the American people alone will determine the destiny of our armed forces, not unelected tech executives.

Anthropic’s stance is fundamentally incompatible with American principles. Their relationship with the United States Armed Forces and the Federal Government has therefore been permanently altered.

In conjunction with the President’s directive for the Federal Government to cease all use of Anthropic’s technology, I am directing the Department of War to designate Anthropic a Supply-Chain Risk to National Security. Effective immediately, no contractor, supplier, or partner that does business with the United States military may conduct any commercial activity with Anthropic. Anthropic will continue to provide the Department of War its services for a period of no more than six months to allow for a seamless transition to a better and more patriotic service.

America’s warfighters will never be held hostage by the ideological whims of Big Tech. This decision is final.



What Trump’s ‘ratepayer protection pledge’ means for you


When you open a chatbot, stream a show or back up photos to the cloud, you are tapping into a vast network of data centers. These facilities power artificial intelligence, search engines and online services we use every day. Now there is a growing debate over who should pay for the electricity those data centers consume.

During President Trump’s State of the Union address this week, he introduced a new initiative called the “ratepayer protection pledge” to shift AI-driven electricity costs away from consumers. The core idea is simple: tech companies that run energy-intensive AI data centers should cover the cost of the extra electricity they require rather than passing those costs on to everyday customers through higher utility rates.

It sounds simple. The hard part is what happens next.



At the State of the Union address Feb. 24, 2026, President Trump unveiled the “ratepayer protection pledge” aimed at shielding consumers from rising electricity costs tied to AI data centers. (Nathan Posner/Anadolu via Getty Images)

Why AI is driving a surge in electricity demand

AI systems require enormous computing power. That computing power requires enormous electricity. Today’s data centers can consume as much power as a small city. As AI tools expand across business, healthcare, finance and consumer apps, energy demand has risen sharply in certain regions.

Utilities have warned that the current grid in many parts of the country was not built for this level of concentrated demand. Upgrading substations, transmission lines and generation capacity costs money. Traditionally, those costs can influence rates paid by homes and small businesses. That is where the pledge comes in.

What the ratepayer protection pledge is designed to do

Under the ratepayer protection pledge, large technology companies would:

  • Cover the full cost of additional electricity tied to their data centers
  • Build their own on-site power generation to reduce strain on the public grid

Supporters say this approach separates residential energy costs from large-scale AI expansion. In other words, your household bill should not rise simply because a new AI data center opens nearby. So far, Anthropic is the clearest public backer. CyberGuy reached out to Anthropic for comment on its role in the pledge. A company spokesperson referred us to a post on X from Anthropic Head of External Affairs Sarah Heck.

“American families shouldn’t pick up the tab for AI,” Heck wrote in a post on X. “In support of the White House ratepayer protection pledge, Anthropic has committed to covering 100% of electricity price increases that consumers face from our data centers.”

That makes Anthropic one of the first major AI companies to publicly state it will absorb consumer electricity price increases tied to its data center operations. Other major firms may be close behind. The White House reportedly plans to host Microsoft, Meta and Anthropic in early March to discuss formalizing a broader deal, though attendance and final terms have not been confirmed publicly.

Microsoft also expressed support for the initiative. 

“The ratepayer protection pledge is an important step,” Brad Smith, Microsoft vice chair and president, said in a statement to CyberGuy. “We appreciate the administration’s work to ensure that data centers don’t contribute to higher electricity prices for consumers.”  

Industry groups also point to companies such as Google and utilities including Duke Energy and Georgia Power as making consumer-focused commitments tied to data center growth. However, enforcement mechanisms and long-term regulatory details remain unclear.


The White House plans talks with Microsoft, Meta and Anthropic about shifting AI energy costs away from consumers. (Eli Hiller/For The Washington Post via Getty Images)

How this could change the economics of AI

AI infrastructure is already one of the most expensive technology buildouts in history. Companies are investing billions in chips, servers and real estate. If firms must also finance dedicated power plants or pay premium rates for grid upgrades, the cost of running AI systems increases further. That could lead to:

  • Slower expansion in some markets
  • Greater investment in renewable energy and storage
  • More partnerships between tech firms and utilities

Energy strategy may become just as important as computing strategy. For consumers, this shift signals that electricity is now a central part of the AI conversation. AI is no longer only about software. It is also about infrastructure.

The bigger consumer tech picture

AI is becoming embedded in smartphones, search engines, office software and home devices. As adoption grows, so does the hidden infrastructure supporting it. Energy is now part of the conversation around everyday technology. Every AI-generated image, voice command or cloud backup depends on a power-hungry network of servers.

By asking companies to account more directly for their electricity use, policymakers are acknowledging a new reality. The digital world runs on very physical resources. For you, that shift could mean more transparency. It also raises new questions about sustainability, local impact and long-term costs.


As AI expansion strains the grid, a new proposal would require tech firms to fund their own power needs. (Sameer Al-Doumy/AFP via Getty Images)

What this means for you

If you are a homeowner or renter, the practical question is simple. Will this protect my electric bill? In theory, separating data center energy costs from residential rates could reduce the risk of price spikes tied to AI growth. If companies fund their own generation or grid upgrades, utilities may have less reason to spread those costs among all customers.

That said, utility pricing is complex. It depends on state regulators, long-term planning and local energy markets.

Here is what you can watch for in your area:

  • New data center construction announcements
  • Utility filings that mention large commercial load growth
  • Public service commission decisions on rate adjustments

Even if you rarely use AI tools, your community could feel the effects of a nearby data center. The pledge is intended to keep those large-scale power demands from showing up in your monthly bill.

Take my quiz: How safe is your online security?

Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing right and what needs improvement. Take my Quiz here: Cyberguy.com.

Kurt’s key takeaways

The ratepayer protection pledge highlights an important turning point. AI is no longer only about innovation and speed. It is also about energy and accountability. If tech companies truly absorb the cost of their expanding power needs, households may avoid some of the financial strain tied to rapid AI growth. If not, utility bills could become an unexpected front line in the AI era.

As AI tools become part of daily life, how much extra power are you willing to support to keep them running? Let us know by writing to us at Cyberguy.com.







Here’s your first look at Kratos in Amazon’s God of War show


Amazon has slowly been teasing out casting details for its live-action adaptation of God of War, and now we have our first look at the show. It’s a single image but a notable one, showing protagonist Kratos and his son Atreus. The characters are played by Ryan Hurst and Callum Vinson, respectively, and they look relatively close to their video game counterparts.

There aren’t a lot of other details about the show just yet, but this is Amazon’s official description:

The God of War series storyline follows father and son Kratos and Atreus as they embark on a journey to spread the ashes of their wife and mother, Faye. Through their adventures, Kratos tries to teach his son to be a better god, while Atreus tries to teach his father how to be a better human.

That sounds a lot like the recent soft reboot of the franchise, which started with 2018’s God of War and continued through Ragnarök in 2022. For the Amazon series, Ronald D. Moore, best known for his work on For All Mankind and Battlestar Galactica, will serve as showrunner. The rest of the cast includes Mandy Patinkin (Odin), Ed Skrein (Baldur), Max Parker (Heimdall), Ólafur Darri Ólafsson (Thor), Teresa Palmer (Sif), Alastair Duncan (Mimir), Jeff Gulka (Sindri), and Danny Woodburn (Brok).

While production is underway on the God of War series, there’s no word on when it might start streaming.
