Meta employee accused of accessing private images


When you upload a photo to Facebook, you expect it to stay private unless you decide otherwise. That expectation just took a hit after a former employee of Meta was accused of accessing thousands of private images.

According to details confirmed by the company, the London-based employee allegedly created a program to bypass internal safeguards. Investigators say this may have allowed access to about 30,000 private Facebook images that were not meant to be viewed.

The individual is now under criminal investigation and is out on bail as authorities continue to review the case. Here’s how investigators say the access may have happened.

Sign up for my FREE CyberGuy Report

  • Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox.
  • For simple, real-world ways to spot scams early and stay protected, visit CyberGuy.com, trusted by millions who watch CyberGuy on TV daily.
  • Plus, you’ll get instant access to my Ultimate Scam Survival Guide free when you join.

A former Meta employee is accused of accessing thousands of private Facebook images, raising new concerns about how user data is protected. (Fabian Sommer/picture alliance via Getty Images)

How the Meta employee allegedly accessed private images

Authorities believe the employee may have written a script to get around Meta’s internal detection systems. In simple terms, the system that should flag unusual behavior may not have caught the activity right away. This detail matters because large tech platforms rely on monitoring tools to detect suspicious access patterns. When those checks are bypassed, it raises questions about how internal access is controlled. 

The investigation is being handled by the cybercrime unit of the Metropolitan Police in London. At the same time, security experts often point out that insider threats are difficult to eliminate. Even strong systems can be tested when someone inside the company misuses access.

What Meta says about the employee investigation

Meta says it discovered the improper access more than a year ago and took action after identifying the issue. 

“Protecting user data is our top priority,” a Meta spokesperson told CyberGuy. “After discovering improper access by an employee over a year ago, we immediately terminated the individual, notified users, referred the matter to law enforcement and enhanced our security measures. We are cooperating with the ongoing investigation.”


Legal risks in the Meta private images case

Data protection experts say cases like this often come down to both intent and safeguards. If an employee accesses personal data without authorization, that can lead to criminal charges under data protection and computer misuse laws. However, the company’s responsibility depends on the protections it had in place. If proper safeguards existed, the focus usually remains on the individual. 

If not, regulators may consider penalties or legal claims against the company. The Information Commissioner’s Office, the U.K.’s data privacy watchdog, has acknowledged the incident. The agency stressed that social media users should be able to trust how their personal information is handled. 

Why the Meta investigation is drawing attention now

This case is unfolding at a time when scrutiny of major tech platforms is already high. Recent legal challenges have raised broader concerns about how companies protect users and manage risk. That context adds weight to this investigation. It reflects a larger conversation about privacy and accountability in the tech industry. As more people rely on digital platforms, expectations of data protection continue to rise. Incidents like this tend to reinforce those concerns.

META REPORTEDLY BUILDING AN AI VERSION OF MARK ZUCKERBERG TO INTERACT WITH COMPANY EMPLOYEES

Mark Zuckerberg walks through the U.S. Capitol after a meeting on March 26, 2026. Investigators in London say a former Meta employee may have used a script to bypass safeguards and view about 30,000 private Facebook images. (Tom Williams/CQ-Roll Call, Inc via Getty Images)


Simple ways to protect your private photos

Even though this case involves an insider, there are still simple steps you can take to better protect your photos and limit who can see them.

1) Check your Facebook privacy settings

You cannot control what happens inside a company, but you can limit how much of your personal content is exposed. Start by reviewing your Facebook privacy settings.

(Settings may vary depending on device and app version)

Mobile (iPhone/Android):
Facebook: Menu > Settings & privacy > Settings > Audience and visibility > Posts > Who can see your future posts > select Friends (or a custom audience) > Save

Desktop (Mac/PC):
Facebook: Profile picture (top right) > Settings & privacy > Settings > Audience and visibility section > Posts > Who can see your future posts > select Friends (or a custom audience) > Done


2) Review older photos and albums

Next, go through older photos and albums. Many people forget that photos shared years ago may still be visible under outdated settings.

(Settings may vary depending on device and app version)

Mobile (iPhone/Android):
Facebook: Menu > Settings & privacy > Settings > Audience and visibility > Posts > Limit who can see past posts > Limit past posts > confirm

Desktop (Mac/PC):
Facebook: Profile picture > Settings & privacy > Settings > Audience and visibility section > Posts > Limit who can see past posts > Limit past posts > confirm

And check individual albums:


Mobile (iPhone/Android):
Facebook: Go to your profile > Photos > Albums > select an album > tap Edit (top right) > Who can see this? > choose who can see it > Done

Desktop (Mac/PC):
Facebook: click your name on the left > Photos > Albums > select an album > click the three dots > Edit album > choose who can see it > Done

Not all albums can be changed, and some system albums have limited privacy options. 

3) Be careful what you upload

It also helps to limit what you upload in the first place. Sensitive images, documents or anything you would not want widely seen may be better kept off social platforms entirely.


Authorities are investigating whether a former Meta employee improperly accessed private Facebook photos that users never intended to share. (Gabby Jones/Bloomberg via Getty Images)

4) Turn on account activity alerts and two-factor authentication

You can also enable alerts for unusual account activity. While this case involves an insider, account alerts still help you spot unauthorized access to your own profile. You can also turn on two-factor authentication (2FA) to add another layer of protection to your account.

How to turn on account activity alerts

(Settings may vary depending on device and app version)

Mobile (iPhone/Android):
Facebook: Menu > Settings & privacy > Settings > Accounts Center > Password and security > Security Checkup > review and complete recommended security steps

Desktop (Mac/PC):
Facebook: Profile picture (top right) > Settings & privacy > Settings > Accounts Center > Password and security > Security Checkup > review and complete recommended security steps


How to turn on two-factor authentication

(Settings may vary depending on device and app version)

Mobile (iPhone/Android):
Facebook: Menu > Settings & privacy > Settings > Password and security > Two-factor authentication > choose text message or authentication app > follow prompts

Desktop (Mac/PC):
Facebook: Profile picture > Settings & privacy > Settings > Password and security > Two-factor authentication > choose text message or authentication app > follow prompts

5) Check third-party app access

Take a few minutes to review which apps have access to your Facebook account. Third-party apps can sometimes hold more access than you expect.

(Settings may vary depending on device and app version)


Mobile (iPhone/Android):
Facebook: Menu > Settings & privacy > Settings > Apps and websites > Active > tap an app > Remove

Desktop (Mac/PC):
Facebook: Profile picture (top right) > Settings & privacy > Settings > Apps and websites > Active > click an app > Remove

If you don’t see any apps listed or options like “Active,” it likely means you don’t have any connected apps to review.

What this means to you

If you use Facebook or similar platforms, this situation highlights something many people overlook. Even with strong safeguards, insider access still exists. Employees often need certain permissions to keep systems running. That creates a level of trust between users and the company. 

When that trust is broken, it can feel personal. At the same time, there are still steps you can take on your end. Reviewing your privacy settings, limiting what you share and enabling security features can reduce how much of your content is exposed. It also shows why detection and response matter. 


In this case, Meta says it identified the issue, removed the employee and notified users. Those steps can limit damage, but they do not erase the concern. The bigger takeaway is that privacy depends on both technology and human behavior. Systems can reduce risk, but they cannot remove it completely.

Take my quiz: How safe is your online security?

Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing right and what needs improvement. Take my Quiz here: Cyberguy.com    

Kurt’s key takeaways

This case is still under investigation, and no final legal outcome has been announced. Even so, it highlights a risk many people rarely think about. Most privacy conversations focus on hackers. This situation is different. It shows how access from inside a company can create its own set of risks. Meta says it acted quickly by removing the employee, notifying users and strengthening its systems. Those steps matter, but they also show how much trust users place in the platforms they use every day. The reality is simple. Once you upload something online, you are trusting more than just the technology behind it.

If someone inside a company can access private data, how much control do you really have over what you share online? Let us know by writing to us at Cyberguy.com.



Copyright 2026 CyberGuy.com. All rights reserved. 


YouTube’s mobile app finally lets you share timestamped videos


YouTube is making some changes that might affect how you share videos from the mobile app. From the app, you can finally share videos from a specific timestamp, which will make it easier to point someone to a part of a video you might want them to see while you’re on your phone. However, this change will replace the Clips feature that lets you make a shareable clip from a video.

You’ll still be able to watch any Clips that you’ve already made. But moving forward, “the ability to set an end time or include a custom description when sharing will no longer be available,” YouTube says. The company notes that while clipping is “an important way for creators to reach new audiences,” “a number of third-party tools with advanced clipping features and authorized creator programs are now available to do this across different video platforms.”

The company originally introduced the Clips feature in 2021.


Govee’s new LED Lightwall comes with its own self-standing frame


Govee has announced an upgraded version of its hanging Curtain Lights Pro that can instead be used nearly anywhere you have access to an outlet or large battery. At $449.99, Govee’s new Lightwall is more than twice as expensive as the $199.99 Curtain Lights Pro, but comes with more LEDs in a denser array and a self-standing aluminum frame that can be assembled in 10 to 15 minutes without the need for any tools.

When hung from its stand the Lightwall measures 7.9 feet wide and 5.3 feet tall and features 1,536 color-changing LEDs spaced about 1.96 inches apart in a 48 x 32 grid. It’s water-resistant, and with the ability to refresh at up to 35fps the Lightwall almost sounds like it could be used as a personal backyard Jumbotron, but it’s not designed for watching TV or movies.

The Lightwall instead connects to Govee’s Home app where you can select from over 200 preset scenes and simple animations, choose from 10 different music modes that generate lighting patterns matched to beats, or synchronize its colors to other Govee lighting products to create a cohesive mood.

The app can also use AI to create custom animated GIFs from simple text prompts, or you can take matters into your own hands and create custom designs by sketching in the app with your finger and stacking up to 30 layers of doodles. The Lightwall is smart home compatible and supports Matter, too, so in addition to managing it through Govee’s app you can control it using voice commands through smart devices with Google Assistant or Amazon Alexa.


Roblox adds age-based accounts for kids and teens


If your child plays Roblox, they are part of a massive global audience. Roblox has reported more than 144 million daily active users, with a large share made up of kids and teens who log in to play games, create content and connect with friends. That reach is exactly why a new change rolling out in early June matters.


Roblox is introducing two new account types designed to better match what kids play and who they can talk to based on age. The shift centers on structure. Instead of one shared experience with layered controls, Roblox is building separate environments for different age groups. As a result, content, chat and parental controls will adjust automatically as a child grows.



Roblox rolls out a new AI system that analyzes entire scenes in real time to detect harmful content across its platform. (Brent Lewin/Bloomberg via Getty Images)

 

What are Roblox Kids and Roblox Select accounts?

Roblox is dividing younger users into two groups, each with its own rules and experience.

Roblox Kids (ages 5 to 8)

This is the most restricted environment. It is designed for younger children who need tighter guardrails.

  • Access limited to games rated Minimal or Mild
  • Only games that pass a three-step review process
  • Chat is turned off by default
  • A distinct visual design so parents can easily recognize the account

The idea here is simple. Kids see a limited version of Roblox that removes riskier content and disables communication.

Roblox Select (ages 9 to 15)


This group gets more flexibility, but still within limits.

  • Access to games rated up to Moderate
  • Same multi-step game screening process
  • Chat settings remain on by default in most regions
  • Visual indicators show the account type

At this stage, Roblox assumes users can handle a broader range of experiences, but still keeps filters in place.

How Roblox decides what games kids can play

Not every game makes the cut. Roblox is adding a continuous evaluation system that runs behind the scenes. Here’s how it works:

1) Developer verification

Creators must verify their identity, enable two-step security and maintain a Roblox Plus subscription.

2) Real-time evaluation

Older users, age 16 and up, effectively test new games first. Roblox studies how they interact and reviews reports before exposing those games to younger players.


3) Content eligibility check

Games receive maturity ratings such as Minimal, Mild or Moderate. Certain categories, like social hangouts or free-form drawing, are excluded by default for younger users. This layered approach combines AI moderation, human review and real-world gameplay signals.

Age checks now control the entire experience

Roblox is expanding the same age-check system it introduced earlier this year for chat.

  • Users under 9: Roblox Kids
  • Users 9 to 15: Roblox Select
  • Users 16 and older: standard Roblox account

If a user does not complete an age check, they face stricter limits. They can only access lower-rated games and cannot use chat. Once verified, the system automatically moves them into the correct account type.

Roblox officials say the new system aims to proactively protect children while maintaining gameplay for compliant users. (Riccardo Milani/Hans Lucas/AFP via Getty Images)

 

Accounts evolve as kids grow

There is no need to manually switch settings over time.

  • At age 9, users move from Kids to Select
  • At age 16, they move to a standard account

This automatic progression is designed to simplify things for families while keeping protections in place at each stage.

Parental controls get more precise

Roblox is also expanding what parents can do.

  • Block specific games through age 15
  • Manage direct chat settings until age 15
  • Approve access to individual games outside default limits
  • View what games kids play and who they interact with

These tools give parents more direct control instead of relying only on broad content filters.

A move toward global content ratings

Later this year, Roblox plans to align with the International Age Rating Coalition framework. That includes familiar systems like ESRB in the U.S. and PEGI in Europe. The goal is to make ratings clearer and more consistent across regions. 

Why this matters to families

This update changes how Roblox works at a fundamental level. Instead of asking parents to constantly adjust settings, the platform builds age-appropriate experiences from the start. It also reflects a broader shift in tech. Platforms are under pressure to design safety into the product, not tack it on later.

As Larry Magid, CEO of ConnectSafely, an organization focused on helping families navigate digital safety, put it:

“By combining age assurance, stronger creator accountability, and parental controls, Roblox is helping set a higher standard for how platforms can better protect younger users while preserving positive online experiences.”


Kurt’s key takeaways

Roblox targets nuanced rule-breaking by analyzing avatars, text and environments together instead of in isolation. (JasonDoiy/Getty Images)

Roblox is not removing risk entirely. No platform can. What it is doing is tightening the structure around how kids interact with content and other players. For parents, this could make things simpler. For kids, the experience will feel more tailored to where they are in life. The bigger question is whether this becomes the norm across gaming and social platforms.

If platforms start shaping experiences based on age by default, does that improve safety or limit how kids explore and learn online? Let us know by writing to us at Cyberguy.com.

