Technology

Microsoft’s Edge Copilot update uses AI to pull information from across your tabs


Microsoft Edge is adding a new feature that will allow its Copilot AI chatbot to gather information from all of your open tabs. When you start a conversation with Copilot, you can ask the chatbot questions about what’s in your tabs, compare the products you’re looking at, summarize your open articles, and more.

In its announcement, Microsoft says you can “select which experiences you want or leave off the ones you don’t.” The company is retiring Copilot Mode as well, which could similarly draw information from your tabs but offered some agentic features, like the ability to book a reservation on your behalf. Microsoft has since folded these agentic capabilities into its “Browse with Copilot” tool.

Several other AI features are coming to Edge, including an AI-powered “Study and Learn” mode that can turn the article you’re looking at into a study session or interactive quiz. There’s a new tool that turns your tabs into AI-powered podcasts as well, similar to what you’d find on NotebookLM, and an AI writing assistant that will pop up when you start entering text on a webpage.

You can also give Copilot permission to access your browsing history to provide more “relevant, high-quality answers,” according to Microsoft. Copilot in Edge on desktop and mobile will come with “long-term memory” as well, which can tailor its responses based on your previous conversations. And, when you open up a new tab, you’ll see a redesigned page that combines chat, search, and web navigation, along with the Journeys feature, which uses AI to organize your browsing history into categories that you can revisit.

Meanwhile, an update to Edge’s mobile app will allow you to share your screen with Copilot and talk through questions about what you’re seeing. Microsoft says you’ll see “clear visual cues” when Copilot is active, “so you know when it’s taking an action, helping, listening, or viewing.”

Technology

Apple’s $250M Siri settlement: Are you owed cash?


If you bought a newer iPhone because Apple made Siri sound like it was about to become your personal artificial intelligence sidekick, you may want to pay attention.

Apple has agreed to pay $250 million to settle a class-action lawsuit over claims that it misled customers about new Apple Intelligence and Siri features. The case centers on the iPhone 16 launch and certain iPhone 15 models that were marketed as ready for Apple’s next wave of AI. The settlement still needs court approval, and Apple denies wrongdoing.

The lawsuit argues that Apple promoted a smarter, more personal Siri before those features were actually available. For some buyers, that was a big deal. A new iPhone can cost hundreds of dollars, and many people upgrade only when they think they are getting something meaningfully new.

Sign up for my FREE CyberGuy Report

  • Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox.
  • For simple, real-world ways to spot scams early and stay protected, visit CyberGuy.com, trusted by millions who watch CyberGuy on TV daily.
  • Plus, you’ll get instant access to my Ultimate Scam Survival Guide free when you join.

U.S. buyers of certain iPhone 16 and iPhone 15 Pro models may qualify for payments if a judge approves Apple’s proposed settlement. (Getty Images)

What Apple is accused of promising

Apple introduced Apple Intelligence in June 2024 and promoted it as a major step forward for iPhone, iPad and Mac. A key part of that pitch was a more personalized Siri that could understand context, work across apps and help with everyday tasks in a more useful way.

The lawsuit claims Apple’s marketing made consumers believe those advanced Siri features would arrive with the iPhone 16 or soon after. Instead, buyers received phones that had some Apple Intelligence tools, but not the full Siri overhaul that many expected.

That gap is the heart of the case. Plaintiffs say customers bought or upgraded devices based on AI features that were not ready. Apple says it has rolled out many Apple Intelligence features and settled the case, so it can stay focused on its products. 

How much money could iPhone owners get?

The proposed settlement creates a $250 million fund. Eligible customers who file approved claims are expected to receive at least $25 per eligible device. That amount could rise to as much as $95 per device, depending on how many people file claims and other settlement factors.

That means this will not be a huge payday for most people. Still, if you bought one of the covered phones, it may be worth watching for a claim notice. A few minutes of paperwork could put some money back in your pocket.

Which iPhones may qualify?

The proposed settlement covers U.S. buyers who purchased any iPhone 16 model, iPhone 15 Pro or iPhone 15 Pro Max between June 10, 2024, and March 29, 2025.

Covered iPhone 16 models include the iPhone 16, iPhone 16 Plus, iPhone 16 Pro, iPhone 16 Pro Max and iPhone 16e. The settlement also includes the iPhone 15 Pro and iPhone 15 Pro Max, but not every iPhone 15 model.

The key details are the device model, the purchase date and whether the phone was bought in the United States.

Apple has agreed to pay $250 million to settle claims it misled customers about Apple Intelligence and Siri features on newer iPhones. (Michael Nagle/Bloomberg)

How will you file a claim?

You do not need to do anything immediately. The settlement still needs a judge’s approval. Once the claims process opens, eligible customers are expected to receive a notice by email or mail with instructions on how to file through a settlement website.

That notice matters because scammers love moments like this. A real settlement notice should not ask for your Apple ID password, bank login or payment to claim your money. If you receive a message about this settlement, do not click blindly. Go slowly, check the sender and look for the official settlement administrator details once they are available.

Why this case matters beyond one Siri feature

This case hits a bigger nerve. Tech companies are racing to sell AI as the next must-have feature. That creates a problem for shoppers. You are often asked to buy now based on what a company says will arrive later.

That can be frustrating when the feature is the reason you upgraded. A smarter Siri sounds useful. A phone that can understand your personal context, search across apps and help with daily tasks could save time. But if those tools are delayed, limited or missing, the value of the upgrade changes.

This settlement also sends a message about AI marketing. Companies can talk about future features, but consumers need clear timing and plain explanations. “Coming soon” can mean very different things when you are spending $800, $1,000 or more.

We reached out to Apple for comment, but did not hear back before our deadline.

Apple denies wrongdoing but agreed to settle claims tied to its marketing of Apple Intelligence and Siri features. (Qilai Shen/Bloomberg)

What this means to you

If you bought a covered iPhone during the settlement period, keep an eye on your email and regular mail. You may qualify for a payment if the court approves the deal.

You should also keep your receipt or proof of purchase if you have it. Your Apple purchase history, carrier account or retailer receipt may help if the claim process asks for details.

More broadly, this is a reminder to treat AI features like any other big tech promise. Before you upgrade, ask one simple question: Can the feature do what is being advertised today, or is the company asking me to wait?

That question can save you from buying a device for a future feature that may arrive much later than expected.

Kurt’s key takeaways

Apple has built its brand on making technology feel polished, personal and easy to use. That is why this Siri settlement hits a nerve. People were buying phones they use every day for texts, photos, directions, reminders and everything in between. Many expected AI to make those everyday tasks easier, which is why the delay felt frustrating. The proposed payout may be modest, but the bigger issue is trust. When a company sells AI as a reason to upgrade, customers deserve to know what actually works now and what is still coming later.

Would you still buy a new phone for promised AI features, or would you wait until they actually show up? Let us know by writing to us at CyberGuy.com.


Copyright 2026 CyberGuy.com. All rights reserved.

Technology

Instagram hits the copy button again with new disappearing Instants photos


Instagram is once again cribbing from competitors like Snapchat and BeReal with a new photo-sharing format it calls “Instants”: ephemeral photos that you can’t edit and can only share with your close friends or with followers who follow you back. Instants are available globally beginning on Wednesday as a feature in the Instagram app’s inbox and as a separate app that’s now in testing in select countries.

To access Instants from the Instagram app, go to your DM inbox and look in the bottom-right corner for an icon of a stack of photos. After you post a photo, your friends can react with an emoji and send a reply to your DMs, but once they’ve seen it, the photo disappears for them. Instants also disappear after 24 hours, and they can’t be captured in screenshots or screen recordings.

However, your Instants will remain in an archive for you for up to a year, and you can reshare them as a recap to your Instagram Stories if you’d like. You can also undo sending an Instant right after you post it or delete it from your archive.

The Instants mobile app, which popped up in Italy and Spain in April, gives you “immediate access to the camera” and only requires an Instagram account, Instagram says. “Instants you share on the separate app will show up for friends on Instagram and vice versa. We’re trying this separate app out to see how our community uses it, and we’ll continue to evolve it as we learn more.”

Instagram, in its testing, has seen that people “tend to use Instants to share much more casual, much more authentic moments about their day,” according to Instagram boss Adam Mosseri. “And we know that this type of sharing of personal moments with friends is a core part of what makes Instagram Instagram, but we also know that a lot of people don’t really share a lot to their profile grids anymore.”

Technology

Facial recognition jails innocent grandmother, attorney says


Angela Lipps says she has never been to North Dakota. She says she had never even been on an airplane. That didn’t stop the U.S. marshals from showing up at her home in Tennessee and arresting her.

Lipps, a 50-year-old grandmother of five from Elizabethton, Tennessee, was taken into custody in July 2025 in connection with a bank fraud case more than 1,000 miles away in Fargo, North Dakota. She was not released until around Christmas Eve, meaning she spent more than five months in custody before the case was dismissed.

Investigators had used facial recognition software to compare surveillance images from the bank fraud case with photos of Lipps from her driver’s license and social media. The result, according to her defense attorney, was a case that never should have gone this far.  


A Tennessee grandmother says a facial recognition match helped lead to her arrest in a North Dakota bank fraud case. (Fargo Police Department)

How facial recognition led police to Angela Lipps

The case began with bank fraud reports in Fargo and nearby West Fargo. Police were looking for a suspect who allegedly used a false military ID to take money from accounts.

Detectives reviewed surveillance footage and used facial recognition technology to search for a possible match. Then-Fargo Police Chief Dave Zibolski has described the tool as “an AI function through the North Dakota State Intelligence Center.”

Jay Greenwood, the Fargo defense attorney appointed to represent Lipps, joined our “CyberGuy Report” podcast at CyberGuyPodcast.com to explain how a facial recognition lead helped set the case in motion. His warning was simple: police can use facial recognition as a tool, but they still need to verify what the technology claims. Greenwood said the images used in the case were not exactly crystal clear.

“They had security footage of some terribly placed security cameras from above,” Greenwood said. “And they had a couple of still images, poor still images from these cameras that they sent to a company to do facial recognition.”

That search pointed investigators to Lipps. Greenwood said detectives then looked at her social media pages and moved forward with the case. “They did not do any other investigation prior to her arrest in bringing her to North Dakota,” Greenwood said. Police then sought an arrest warrant. Lipps was arrested in Carter County, Tennessee, and held as a fugitive from justice.

Grandmother arrested at gunpoint while babysitting

Lipps says U.S. marshals arrested her at gunpoint while she was babysitting young children. She was taken to a local jail in Elizabethton, Tennessee, while she waited to be extradited to North Dakota.

Greenwood said Lipps told authorities from the beginning that she had never been to North Dakota. “She told them I’d never been to North Dakota. I’ve never been on an airplane,” Greenwood said. “She really doesn’t leave the 100- to 200-mile radius of Elizabethton ever.”

Still, Lipps remained in jail for months. Fargo Police Chief Travis Stefonowicz told CyberGuy that the department’s review found Lipps was arrested in Tennessee on July 14, 2025, and held on a probation violation.

“Tennessee authorities notified the Cass County Sheriff’s Office on October 20, 2025, that Ms. Lipps had a waiver of extradition to North Dakota and was available for transport to the Cass County Jail,” Stefonowicz said.

Stefonowicz said Fargo Police could not determine from available information why Lipps remained in Tennessee custody for as long as she did before being transported to North Dakota.

“We have been unable to determine based on available information if the length of time Ms. Lipps was in jail in Tennessee before being transported to North Dakota was due to serving time for a probation violation or if it was because she fought extradition,” Stefonowicz said.

Greenwood said she fought extradition and waited in Tennessee before she was taken to North Dakota around Halloween. “Gave her her first ever plane ticket, ever plane ride,” Greenwood said. “And she spent it in custody, flying to North Dakota.”

A woman who says she had never flown before got her first plane ride in custody, headed to fight charges in a state she says she had never visited.

Fargo police respond to facial recognition concerns

Stefonowicz was appointed interim chief on March 30 after former Chief Dave Zibolski retired on March 27. Fargo Police said Zibolski’s retirement was family-related and unrelated to this case. Stefonowicz was officially selected as Fargo’s next police chief during the Fargo City Commission meeting on May 11.

In a statement to CyberGuy, Stefonowicz said the arrest warrant reflected that prosecutors and a judge had found probable cause.

“The Fargo Police Department takes the civil rights and due process of all individuals involved in our investigations very seriously. Regarding the case of Ms. Lipps, the issuance of an arrest warrant indicated that the Cass County State’s Attorney and a judge determined probable cause existed for the charges,” Stefonowicz said.

He said the charges were dismissed without prejudice, meaning they could be refiled if additional investigation supports doing so.

“This remains an ongoing investigation, and we are still working to verify and corroborate information to determine, definitively, who was and was not involved in this home equity loan bank fraud scheme,” Stefonowicz said.

Fargo Police also clarified that the department does not own facial recognition technology or contract directly with vendors that provide it.

“However, there are state and national law enforcement intelligence centers that incorporate facial recognition technology and are used by agencies across the country, including in our state,” Stefonowicz said. “On occasion, FPD investigators may submit inquiries to those intelligence centers, in order to help generate leads through facial recognition for potential suspects or persons of interest in local investigations.”

That distinction matters. Fargo Police says it does not run facial recognition in-house, but investigators may still use outside intelligence centers to generate leads. That puts the focus back on what guardrails exist before those leads support an arrest. 

The department’s response adds an important caveat. Lipps’ defense says she was wrongly accused and later cleared by basic records, but Fargo Police say the case remains open and investigators are still trying to determine who was involved.

Lipps says she lost her home, car, reputation and dog after spending months behind bars in a case tied to facial recognition technology. (Antranik Tavitian/Bloomberg via Getty Images)

Basic records helped clear Angela Lipps

Once Greenwood got involved, he started looking for proof of where Lipps had been during the alleged bank fraud. The answer came from everyday records. Her family sent bank records showing activity near her home in Tennessee during the same period Fargo authorities claimed she was in North Dakota.

The records showed her depositing Social Security checks and making local purchases. “She was in Elizabethton and the surrounding communities depositing her Social Security checks,” Greenwood said. “Buying Ubers, cigarettes, gas, all that stuff.”

Greenwood said he forwarded the records to the state’s attorney. After a police interview, the case was dismissed. Lipps was released on Christmas Eve.

Fargo Police gave CyberGuy a more detailed timeline. Stefonowicz said Lipps made her first court appearance in North Dakota on Oct. 31, 2025, but the detective assigned to the case did not learn she was in custody in North Dakota until Dec. 5.

“Because she had legal representation, attorney consent was required before our detectives could interview her,” Stefonowicz said. “An interview was first granted by Ms. Lipps’s defense attorney on December 19, 2025.”

After that interview, Stefonowicz said Fargo Police determined that further investigation was needed.

“On December 23, 2025, the FPD detective, the Cass County State’s Attorney and the presiding judge mutually agreed to dismiss the charges without prejudice to allow for additional investigation,” Stefonowicz said. “Ms. Lipps was subsequently released from the Cass County Jail on December 24, 2025.”

By then, she says the damage was already done. She says she lost her home, her car, her reputation and her dog while she was locked up.

Fargo police adopted a facial recognition policy

Fargo Police said it conducted a comprehensive internal review after the case. Stefonowicz said former Chief Dave Zibolski addressed the investigation at a March 24 news conference.

“With respect to this case, we have conducted a comprehensive internal review,” Stefonowicz said. “During a news conference on March 24, former Chief of Police Dave Zibolski addressed areas where our initial investigation could have been more complete and emphasized that further work is required to fully understand who was and was not involved in this scheme.”

Fargo Police has since adopted a formal facial recognition technology policy. Stefonowicz said the department did not previously have a standalone policy because Fargo Police does not conduct facial recognition analysis, provide that service to other agencies or maintain in-house facial recognition technology.

“We have since adopted a formal policy for facial recognition technology (FRT) use for our agency,” Stefonowicz said.

He said the case prompted Fargo Police leadership to revisit that approach.

“This case has prompted FPD leadership to re-evaluate that approach related to having a specific FRT policy,” Stefonowicz said. “FPD Policy 610, which formally establishes parameters and expectations for the use of FRT, was published as of Wednesday, March 25.”

That policy change matters because it shows the case prompted Fargo Police to formalize how investigators may use facial recognition leads, even when the department does not run the technology itself.

Why facial recognition mistakes are so dangerous

Facial recognition can help generate leads. However, critics warn that it can also produce false matches, especially when image quality is poor or the system compares faces against massive databases.

Some systems pull from public photos online, including social media images and other public-facing photos. That means many people may appear in search databases without realizing it.

Greenwood said police need to treat the technology as one investigative tool, not a shortcut around basic detective work.

“I’ve told numerous people, like, it’s a tool,” Greenwood said. “It should be one of the tools that law enforcement can use.”

Then he explained what needs to happen next. “They’ve got to learn to use the other tools to verify what they’re being told by this machine,” Greenwood said.

That is the key issue. A facial recognition hit should push investigators to ask more questions. It should never end the conversation.

Other facial recognition wrongful arrest cases

Angela Lipps is not the first person to say facial recognition helped put them in handcuffs. Other cases have involved people wrongfully arrested after software produced a mistaken match. Civil liberties groups have also warned that facial recognition systems can perform worse on some groups, including darker-skinned men and women. 

That raises a serious question for every police department using this technology. What safeguards exist before a person gets arrested? A bad match on a screen can turn into a search warrant and jail time. For Lipps, that risk became painfully real.

The Federal Building and U.S. Courthouse in Fargo, North Dakota, is pictured next to Angela Lipps’ mugshot. Lipps, a Tennessee grandmother, says she spent more than five months in custody after facial recognition linked her to a North Dakota bank fraud case. (Carol M. Highsmith/Buyenlarge via Getty Images / Fargo Police Department)

Ways to stay safe from facial recognition mistakes

Most people will never face anything like this. Still, the Lipps case shows how your digital footprint can follow you in ways you may never expect.

1) Say nothing until you speak with a lawyer

If law enforcement contacts you about something you did not do, do not try to talk your way out of it. Stay calm and ask for an attorney. Even innocent people can say something that gets misunderstood.

2) Keep records that show where you were

Bank transactions, receipts, phone location records, work schedules and medical appointments can help establish where you were on a certain date. You do not need to track every moment of your life. However, basic digital records can help if a serious mistake ever happens.

3) Review your public photos online

Check what photos you post publicly. Also, look at tagged photos from friends and family. Your face can appear online even when you did not post the picture yourself.

4) Remove personal information from data broker sites

Data broker sites collect and sell personal details. A data removal service can help remove your information from these databases. You can also do it manually, but it takes time, and the information can reappear. Check out my top picks for data removal services and get a free scan to find out if your personal information is already out on the web by visiting CyberGuy.com.

5) Ask police and lawmakers about police AI rules

Your city or state may already use facial recognition tools. Ask what rules police must follow before they use an AI match in a criminal case. At a minimum, departments should require independent evidence before an arrest.

What facial recognition mistakes mean for you

AI can help investigators move faster, but speed creates risk when people skip basic steps. Police still need records, timelines and common sense. Facial recognition can make mistakes. It can misread poor images. It can point to the wrong person. And when that happens, the consequences do not stay on a screen. They show up at someone’s front door.

Kurt’s key takeaways

This case should make every police department pause. Facial recognition may help find leads, but it should never be enough to upend someone’s life. Angela Lipps says she lost months behind bars for a crime she did not commit in a state she had never visited. Her attorney says basic records later helped prove she was in Tennessee. That check should have happened before she spent months in jail. Greenwood summed up the case this way: “Ridiculous case never should have happened.” Technology can help police solve crimes. But when a computer match replaces real detective work, innocent people can pay the price. For the full conversation with Angela Lipps’ defense attorney and more on how this case unfolded, listen to the “CyberGuy Report” podcast at CyberGuyPodcast.com.

If a facial recognition match can help send a grandmother to jail, what guardrails should every police department be forced to follow before someone loses their freedom? Let us know by writing to us at CyberGuy.com.

