In No Other Choice, the real job killer is this guy


Park Chan-wook’s 12th feature-length movie, No Other Choice, begins with Man-su (Lee Byung-hun) as a proud patriarch at the barbecue, a vision of the Platonic ideal of domestic life he will spend most of the movie defending. In the long middle where life is lived, the movie offers its audience mirth and pathos and deep social critique. Also: murders. After being laid off from a paper company, Man-su realizes that his best chance at getting hired for his next job is to knock off the three other qualified candidates.

Adapted from Donald Westlake’s novel The Ax, No Other Choice captures — most delightfully and cathartically — the perpetual and unsolvable anxiety of living under an economic system built around extracting surplus value from its workers. It also captures the dark irony that if a corporation makes a person redundant, it is strategy; if a human does the same, it’s a crime.

With this film, not to mention his earlier works like Oldboy and The Handmaiden, Park establishes himself as a director who understands intimately that tragedy and comedy cannot be separated. Here, it’s the tragedy that life must be lived, that we ought to work at all, that so much in this life in fact depends on this work, set against the comedy of how somebody like Man-su sets about solving this impossible riddle for himself.

The Verge spoke with Park about his relationship to his source material, artificial intelligence, and how he recovers after wrapping a picture.

Director Park Chan-wook
Courtesy of Neon

This interview has been edited and condensed.

The Verge: Have you ever been fired from a job?

Park Chan-wook: That’s never happened to me, mercifully. Those kinds of things actually happen quite often in our industry. I’ve been fortunate enough to avoid that fate, but there have been many times when I’ve been afraid of being let go. While working on any project, there invariably comes a time when differences of opinion form with the studio or the producers. In that instance, whenever I stubbornly stick to my original position, I do so knowing I am exposing myself to that kind of danger.

And when a movie comes out and it doesn’t do well, then comes the fear that I won’t be able to find a job again, or that I won’t be able to raise funds for my next project.

But that fear isn’t something that arrives only after you get your report card from the box office. All throughout the filmmaking process, it stays with you, that fear. It stays with you from the initial planning stages of a movie. And then if the movie doesn’t do well, that fear sharpens, and it never goes away. It is near to you always.


At the screening I attended, you said you first encountered the source material, the Donald Westlake novel The Ax, via your love of the movie Point Blank, which you cite as your favorite noir. Do you remember how you discovered the movie, and are there other Westlake novels you are curious about?

Point Blank is a film directed by John Boorman, a British director, and I watched it for two reasons. The first is that I’ve always liked John Boorman. The first Boorman film I ever saw was Excalibur.

Second, I’m a fan of the actor Lee Marvin. Because Point Blank was a collaboration between a director I like and an actor I also like, I had always wanted to see it. But accessing the movie was difficult in Korea for a long time, so it was only later that I got to watch it.

As for Westlake, surprisingly few of his books are in translation. That The Ax was translated into Korean was itself an anomaly. And so I’ve only read a few of his books.

You’ve been trying to make No Other Choice for 16 years. You also said you tried going through Hollywood first. How come?


Since the novel was written with an American setting, I naturally thought making it into an American film would be the best option. At that time, I had already made Oldboy, Thirst, Lady Vengeance, and Stoker, and so making a movie in America was not intimidating.

What was the most common feedback you received in these early years?

In 2010, we secured the rights and began actively pursuing the project. Initially, we met with French investors. Although it was to be an American movie filmed in America, we met with French investors thanks to Michèle Ray-Gavras, wife of [director] Costa-Gavras, who was among our producers, and through her we contacted various studios, from France to the United States.

Starting then, I continued receiving offers that were slightly less than what I wanted, which is why I could not possibly accept them.

As for notes from the studios, above all else, they doubted whether the audience would believe that Man-su would resort to murder because he lost his job. They wanted to know how I was going to bring the audience along.


Other than that, people’s senses of humor varied slightly. Some said this part isn’t funny. Others said that part isn’t funny. We faced some challenges.

You mentioned there are Easter eggs strewn about the movie, and I am curious about them. For example, the oven mitt Man-su uses during his attempted murder can be seen later back in his kitchen, and a Christmas stocking from the same scene can be seen in a family photo in the background. What other such details are there to look out for?

I can’t guarantee that the framed photo with the Santa Claus costume can be seen properly. We did place it on set during filming. In fact, we gathered the entire family, dressed them up and took pictures specifically for that framed photo. But I don’t know if it is actually visible in the final movie. It will definitely, however, be in the extended cut that I’m preparing for the Blu-ray release.

And rather than considering it an Easter egg, it might be more accurate to consider it part of creating a believable world for the actors. So that once the actors enter that world, they feel like they can more easily become their characters. And to build that trust and sense of a stable reality, it helps to attend to the props and everything else in the space. The more consideration, the better.

AI shows up at the end of the movie, which I imagine was not part of the original idea you had when you began the project. When did you know to add AI to the film?


Had this been made into an American film, such a plot point would not have been available. It was only because the process took so long that the issue could be incorporated.

Any director making a movie about employment, or unemployment rather, would be remiss to not mention AI. Moreover — and this was important for me — by the end, Man-su’s family catches on to what he has done in the name of the family. Of course, Man-su isn’t entirely sure if they know, but the audience knows. The very thing he does for his family will be the thing that leads to its collapse. All of his efforts are for naught, which echoes the situation with AI.

He painstakingly eliminated his human competitors to secure a job. But what he confronts at his new workplace is a competitor more formidable than any mortal. Meaning Man-su likely won’t last long before AI takes over. He will lose his job, yet again, at which point, what was it all for? What were the murders for? This too can be seen as a colossal wasted effort.

Therefore, the introduction of AI technology from a creative perspective was a great addition to the movie.

How do you feel about the use of AI in film? Would you use it in your own work? I am sensing the answer is “no.”


I hope that never happens.

It’s not easy for young film students out there. And if there were a technology that allowed them to make their own movies at a reduced cost, in a way that could not have been possible before, who could stop them? It would not be possible to tell them not to.

A still from the film No Other Choice

Man-su (Lee Byung-hun) is a hapless killer.
Courtesy of Neon

What is the question No Other Choice is asking?

Those who have arrived at the middle class, those who have become accustomed to a certain way of life that wasn’t inherited, that they obtained of their own accord — for that class of people, giving all that up would be very difficult. Slipping from that station would be challenging to accept. I would certainly find it difficult to accept.

Of course, that doesn’t mean I am going to commit murder — three, no less — but it’s an impossible situation.


“My child desperately needs private cello lessons. Not only that, it’s a vital part of them becoming an independent adult.” Giving that up would be staggeringly hard. I am imagining what I might be capable of in such a scenario.

I wanted to create a space in which people might ask themselves that question. Not to simply criticize Man-su, but to ask themselves, what if, what might happen, if there was such a person in such a situation? It’s an exercise in imagination.

What was the most difficult time in your career and how did you recover from it?

When my first two films failed at the box office. Before I made JSA, the period between the first film and the second film, and between the second film and the third film, was most difficult. I had no choice but to make the rounds with my screenplay — not unlike how Man-su does with his resume — looking for producers and studio executives. Often I was rejected. That was a tough time.

By then I had married and had dependents and so I resorted to film criticism to make a living. Being a film critic is a great profession, but it was not what I wanted, so I suffered. What’s more, I wanted to be making my own movie, but instead I was reduced to analyzing other people’s movies. If I watched an excellent movie, I would be filled with envy. The reality that demanded I live like that seemed to also be mocking my pain, a kind of taunting. But I had no other means of surviving.


What will you work on next?

Actually, I have two projects that are already prepared. I have a script for a Western that has been written and revised several times. There is also a sci-fi action film for which I haven’t written the script yet, but I have put together a fairly involved treatment.

A photo of director Park Chan-wook on set

Park giving notes on set.
Courtesy of Neon

How do you recover after filming a movie?

Luckily, I am traveling with Lee Byung-hun at the moment. I might drink a glass of wine with him. He is rather serious about wine, and so if I drink with him, I am bound to drink something good.

Have you any deep and profound advice for young filmmakers?


In film school, you might learn certain lessons from your instructors. You might also learn from directors who are already successful. If you are a fan of genre, you might study the conventions of your chosen genre.

That is all very well, but before anything, the first order is to really have your own voice. And to examine yourself honestly. And to tell the story that comes spontaneously from within. In my opinion, spontaneity is the most important thing. Not to say “this is popular,” or “people like this,” but what is the true thing that comes from your own inner self? Follow that thread with sincerity.

Of course it’s easy for me to say this — anybody can say it — but putting it into practice is another thing entirely.

No Other Choice is in select theaters December 25, 2025, with a wider release planned in January.


Anthropic upgrades Claude’s memory to attract AI switchers

Anthropic is making it easier to switch to its Claude AI from other chatbots with an update that brings Claude’s memory feature to users on the free plan, along with a new prompt and dedicated tool for importing data from other chatbots. These upgrades could allow users who have been using rivals like OpenAI’s ChatGPT or Google’s Gemini to quickly copy the data their preferred AI has collected on them and bring it over to Anthropic’s chatbot. That way, they don’t have to “start over” teaching Claude the context and history their previous chatbot already knows.

The option to import and export memories from Claude has been available since October, when Anthropic also rolled out the option for users to turn on Claude’s memory. Until now, the memory feature was only available to users on paid Claude subscriptions, but now all Claude users can turn it on by going into “settings,” then “capabilities.” This menu is also where users can find the new memory importing tool, which has users copy a pre-written prompt into their previous AI, then paste the output from that prompt back into Claude’s importing tool.

Anthropic is introducing the upgraded memory importing tool as Claude is seeing a rise in popularity, driven by tools like Claude Code and Claude Cowork. Last month, Anthropic launched its new Opus 4.6 and Sonnet 4.6 models, which the company says are better at coding and completing complex tasks like working through a spreadsheet or filling out forms.

Anthropic has also been experiencing a spike in attention recently after pushing back against demands from the Pentagon to loosen the guardrails on its AI models, with the company stating publicly that it drew “red lines” around mass surveillance and fully autonomous lethal weapons.


Why the Microsoft 365 Copilot bug matters for data security


You trust your email security settings for a reason. So when an AI assistant quietly reads and summarizes messages marked confidential, that trust takes a hit.

Microsoft says a bug in Microsoft 365 Copilot allowed its AI chat feature to process sensitive emails since late January.

The issue bypassed Data Loss Prevention policies that organizations rely on to protect private information. Put simply, emails that were supposed to stay locked down were being summarized anyway.

Sign up for my FREE CyberGuy Report

Get my best tech tips, urgent security alerts, and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide – free when you join my CYBERGUY.COM newsletter.

Microsoft 365 Copilot’s work chat interface sits at the center of the issue after a bug allowed it to summarize confidential emails. (Microsoft)

Microsoft 365 Copilot bug summarized confidential emails

Microsoft says a coding error impacted Microsoft 365 Copilot Chat, specifically the “work tab” feature. The AI assistant helps business users summarize content, draft responses and analyze information across Word, Excel, PowerPoint, Outlook and OneNote.

Beginning Jan. 21, an internal bug labeled CW1226324 caused Copilot to read and summarize emails stored in Sent Items and Drafts folders.

The real concern runs deeper. Several of those messages carried confidentiality or sensitivity labels.

Advertisement

Companies apply those labels along with DLP policies to block automated systems from accessing restricted content. Despite those safeguards, Copilot still generated summaries. 

We reached out to Microsoft, and a spokesperson provided CyberGuy with the following statement:

“We identified and addressed an issue where Microsoft 365 Copilot Chat could return content from emails labeled confidential authored by a user and stored within their Draft and Sent Items in Outlook desktop. This did not provide anyone access to information they weren’t already authorized to see. While our access controls and data protection policies remained intact, this behavior did not meet our intended Copilot experience, which is designed to exclude protected content from Copilot access. A configuration update has been deployed worldwide for enterprise customers.” 

Why the Microsoft 365 Copilot bug matters for data security

AI tools feel helpful. They save time and reduce busy work. But they also rely on deep access to your data. When safeguards fail, even temporarily, sensitive content can move in ways you did not expect.



For businesses, that could mean:

Legal discussions summarized outside intended controls

Financial projections processed despite restrictions

HR communications exposed to automated analysis

Even if no data leaves the organization, the bypass itself raises concerns about how AI integrates with enterprise security systems.


Business users rely on Copilot to streamline work, but a recent bug raised concerns about how it handles sensitive email content. (Microsoft)

How Microsoft is fixing the Microsoft 365 Copilot bug

Microsoft says it began rolling out a fix in early February. The company continues to monitor deployment and is contacting some affected users to verify the fix works.

However, Microsoft has not provided a final timeline for full remediation. It has also not disclosed how many organizations were affected.

The issue is tagged as an advisory, which usually signals limited scope or impact. Still, many security professionals will want deeper clarity before feeling comfortable.

What this Microsoft 365 Copilot issue reveals about AI security

This incident highlights something many companies are wrestling with right now. AI assistants sit inside productivity platforms. They need access to email, documents and collaboration tools to work well.



At the same time, those platforms contain your most sensitive information. When AI features expand quickly, security policies must evolve just as fast. Otherwise, even a small code mistake can create unexpected exposure.

The Copilot chat feature was designed to boost productivity, yet a code error let it process emails labeled confidential. (Microsoft)

Ways to stay safe after the Microsoft 365 Copilot bug

If your organization uses Microsoft 365 Copilot, here are practical steps to reduce risk:

1) Review Copilot access settings

Work with your IT team to confirm which folders and data sources Copilot can access.


2) Revalidate DLP policies

Test sensitivity labels and DLP (Data Loss Prevention) rules to ensure they block AI processing as intended.

3) Monitor advisory updates

Stay current on Microsoft service alerts and verify that the fix is fully deployed in your tenant.

4) Limit AI scope during investigations

If you have concerns, consider temporarily restricting Copilot features until verification is complete.

5) Train employees on AI boundaries

Remind staff that AI assistants can process drafts and sent messages. Encourage careful handling of sensitive content.

6) Audit Copilot activity logs

Review audit logs to see whether Copilot accessed or summarized labeled emails. This helps determine actual exposure rather than assumed risk.


7) Review sensitivity label configuration

Confirm that confidential labels are configured to block AI processing where required. Misconfigured labels can create gaps even after a bug is fixed.

8) Reassess retention and draft policies

Because the issue involved Sent Items and Drafts, evaluate whether sensitive drafts should be stored long-term or deleted after sending.

9) Limit Copilot to specific user groups

Instead of enabling Copilot organization-wide, consider a phased deployment to departments with lower sensitivity exposure.

10) Conduct a post-incident security review

Use this moment to reassess how AI tools integrate with compliance controls. Treat it as a learning opportunity rather than a one-time glitch.

Pro Tip: This Copilot bug centers on enterprise controls. Even so, AI tools operate on your devices and accounts, so keeping software up to date and using strong antivirus software adds an important layer of defense. Get my picks for the best 2026 antivirus protection winners for your Windows, Mac, Android & iOS devices at Cyberguy.com


Considering a more private email provider

Enterprise AI bugs raise a bigger question: how much access should email platforms have to your data in the first place? If you want an added layer of privacy beyond mainstream providers, privacy-focused email services are worth exploring.

Some offer end-to-end encryption, support for PGP encryption and a strict no-ads business model that avoids scanning messages for marketing purposes.


Many also allow you to create disposable email aliases, which can reduce spam and limit exposure if one address is compromised.

While no provider is immune to software bugs, choosing an email service built around privacy rather than data monetization can limit how much of your information is accessible to automated systems in the first place.


For individuals, journalists and small businesses especially, that added control can make a meaningful difference.

For recommendations on private and secure email providers that offer alias addresses, visit Cyberguy.com

Kurt’s key takeaways

AI assistants are becoming part of daily work life. They promise speed, efficiency and smarter workflows. But convenience should never outrun security.

This Copilot bug may have a limited impact. Still, it serves as a reminder that AI tools are only as strong as the guardrails behind them.

When those guardrails slip, even briefly, sensitive information can move in unexpected ways. As AI becomes more embedded in business software, trust will depend on transparency, fast fixes and clear communication.


Here is the real question: If your AI assistant can see everything you write, are you fully confident it respects every boundary you set? Let us know by writing to us at Cyberguy.com



Copyright 2026 CyberGuy.com. All rights reserved.



Samsung’s Digital Home Key lets you use your phone as your key

Just days after showing off the Galaxy S26, Samsung is finally rolling out the ability for users to unlock their home with a tap of their phone or by simply approaching their door. The new feature, called Digital Home Key, will live inside Samsung Wallet and is powered by the Aliro smart home standard.

Samsung first teased its Digital Home Key feature in 2024 and said the feature would be available in 2025. That didn’t pan out, as the CSA’s Aliro standard — which will let users unlock smart locks with any phone — only arrived in February of this year. The new standard uses near-field communication (NFC) for its tap-to-unlock technology. It also supports ultra-wideband (UWB), giving users the ability to unlock their door as they approach and without pulling out their phone.

To add a Digital Home Key to your wallet, you’ll need to set up a compatible smart lock through SmartThings using Matter. Only some Galaxy smartphones support both NFC and UWB, including the Galaxy Z Fold 4 and up, as well as the Galaxy S22 Ultra and up. You can view the full list of compatible devices on Samsung’s website.
