News

Intel venture arm’s China tech stakes raise alarm in Washington

Intel’s venture capital arm has emerged as one of the most active foreign investors in Chinese artificial intelligence and semiconductor start-ups, at a time when the $147bn chipmaker is receiving billions of dollars from Washington to fund a technological arms race with Beijing.

Intel Capital owns stakes in 43 China-based technology start-ups, according to an FT analysis of its portfolio. Since the venture fund was launched in the early 1990s, it has invested in more than 120 Chinese groups, according to data provider Crunchbase.

The fund, which invests off the chipmaker’s balance sheet, has continued to back fledgling Chinese companies in the past year, even as many of its American peers exited the market under pressure from US authorities.

In February Intel Capital invested in a $20mn fundraising round by Shenzhen-based AI-Link, a 5G and cloud infrastructure platform, and last year led a $91mn round for Shanghai-headquartered North Ocean Photonics, a maker of micro-optics hardware.

Rising geopolitical tensions between Washington and Beijing have led to greater scrutiny of private investment flows between the two economic powers as they jostle for technological and military supremacy.

In June, the Biden administration unveiled rules to curb US financing for Chinese technology that could have military purposes, such as AI, quantum computing and semiconductors. The regulations are expected to be finalised this year.

Intel Capital’s “investments were poster children that helped build consensus for the outbound restrictions”, according to one person familiar with the Biden administration’s thinking on the new rules.

Its current investments in China include around 16 AI start-ups and 15 in the semiconductor industry, as well as companies developing cloud services, electric vehicles, telecoms, virtual reality systems and batteries.

Intel Capital may be forced to divest from some companies once the US regulations take effect, though the US Treasury is examining whether to include exemptions for some venture capital transactions.

However, the US group has slowed down its dealmaking in China over the past 18 months, according to data provider ITjuzi, completing just three deals since the start of 2023. Investment controls and a slowdown in the Chinese economy, as well as lasting repercussions from Beijing’s crackdown on tech companies, have hit start-up valuations and viability.

A report in February by the US House select committee on the Chinese Communist party said that American venture capital firms had invested billions of dollars into companies fuelling China’s “military, surveillance state and Uyghur genocide”. This included funnelling $1.9bn into AI companies and a further $1.2bn into semiconductors.

The report singled out five US venture firms — Sequoia, GGV, GSR Ventures, Qualcomm Ventures and Walden International — but did not mention Intel Capital, despite the fund becoming one of the largest US investors in China after the departure of some of its rivals.

Intel Capital is “way more active” than Qualcomm’s venture arm in China, said the head of a large US fund with a long history of doing business in China. “Intel is active in everything.”

John Moolenaar, Republican head of the House China committee, said the case highlighted the need for tighter regulation.

“The Chinese Communist party remembers the old communist slogan that ‘the capitalists will sell us the rope with which we will hang them’,” said Moolenaar. “We need strong outbound capital restrictions to prevent American firms from investing in companies closely tied to the CCP’s armed forces.”

Intel Capital declined to comment.

Sequoia Capital and GGV Capital, two of the largest US venture investors in China, spun out their Chinese businesses last year amid the mounting political pressure. Qualcomm, Walden and GSR also continue to invest in Chinese start-ups.

In March Intel received about $20bn in grants and loans from the US to fund an expansion of its semiconductor factories, the largest award from the government’s 2022 Chips and Science Act designed to enhance the domestic chip industry. The package will support more than $100bn in US investments from Intel for advanced chipmaking facilities, including building mega-plants in Ohio and Arizona.

Nasdaq-listed Intel has a large China business, which employs around 12,000 people and accounted for 27 per cent of global revenue in 2023.

Chinese multinational Lenovo is one of the three largest customers of its chips, alongside Dell and HP, generating 11 per cent of global revenue. Last month, Intel’s China arm acquired a 3 per cent stake in Shenzhen telecoms equipment maker Luxshare.

Intel Capital’s China business is run by Tianlin Wang, a lifelong Intel employee and head of the unit since 2017. It has six other investment directors in the country. Globally, Intel Capital has invested more than $20bn since the early 1990s and is led by Anthony Lin in San Francisco.

Intel Capital has participated in Chinese start-up deals worth a total $1.4bn since 2015, according to data from PitchBook. That figure relates to the total value of the deals rather than Intel Capital’s individual contribution, which the firm does not make public.

As early as 2014, Intel Capital announced it had invested $670mn in more than 110 Chinese technology companies, and in 2015 alone it gave $67mn to eight Chinese tech companies. Since then, Intel Capital has not publicly revealed the scale of its investments in China.

A report in February 2023 by the US Center for Security and Emerging Technology, a DC think-tank, on the national security risks associated with US investment in Chinese AI companies found that Intel Capital participated in 11 deals for such companies between 2015 and 2021. A person close to Intel said there were only four AI deals during this time.

In some cases, the US fund obtained a board seat, such as at Horizon Robotics, a chipmaker, and Eeasy Tech, which designs AI chips for facial recognition and was also backed by the Zhuhai provincial government.

“Intel Capital’s investments in Chinese AI firms have led to the formation of strategic collaborations that could benefit the Chinese companies in a way that complements Chinese government strategies,” the report said.

In one case, Intel Capital helped fund the creation of a Chinese company that was later sanctioned by the US. The fund was one of the earliest investors in AI voice recognition group iFlytek, acquiring a 3 per cent stake in 2002 before selling the shareholding two years later. The company was one of six Chinese companies banned by the US in 2019 for their roles in alleged human rights abuses in Xinjiang.

“The fear of missing out in the AI era has created a sense of urgency for Intel Capital,” said the head of a rival Chinese venture firm that has co-invested alongside them. “Intel is under such fierce competition in AI in the US, they can’t afford to be left behind, so they have to look around the world for where to deploy money into AI and China is one of the very few options.”

News

Mars’ defensive move in snacking isn’t a light bite

It is snack time in the packaged food industry. Confectionery giant Mars’ $36bn swoop on Pringles maker Kellanova could put dealmaking back on the table for other food and drink multinationals.

Privately held Mars, home of Snickers and Skittles, will pay $83.50 a share in cash for the maker of Cheez-It and Eggo waffles. The price represents a 42 per cent premium over Kellanova’s undisturbed three-month average.

With few big US snacks groups left, a deal was never going to come cheap. Including Kellanova, there are just seven companies in the US packaged food sector with market values of over $20bn. 

Mars is paying the equivalent of 16 times EV to forward ebitda for Kellanova. The median ratio for recent deals in the sector was around 15 times, according to JPMorgan. And the deal looks even pricier considering the difficult outlook for snacking — particularly the less healthy varieties that are in Kellanova’s portfolio. 

Salty snacks have been the fastest-growing category in the packaged food sector over the past 14 years, with a compound annual growth rate of around 5.8 per cent between 2010 and 2023, according to Citi. 

But that growth has slowed sharply this year. Inflation-wary consumers — particularly lower-income ones — are cutting back. At the same time, the rise of GLP-1 weight loss drugs like Ozempic, Wegovy and Zepbound is reshaping America’s waistlines. In a study by Morgan Stanley earlier this year, about two-thirds of GLP-1 drug users surveyed said they have cut back on snacking by over 50 per cent. Half of those surveyed also said they have cut back on sweets by more than 75 per cent or have stopped scoffing them altogether.

The strain is starting to show. Kellanova’s organic net sales were up 5 per cent in the first six months of the year. But that was largely driven by price increases. That strategy isn’t sustainable: consumers will buy less or choose private label brands.

Mars, as a private company with more than $50bn in sales, opted not to provide cost savings targets to justify its deal, but overlap between the two looks limited. Mars is clearly prepared to pay up to diversify away from its chocolate-heavy snack portfolio. Kellanova, which makes about half its $13bn annual sales from savoury chips and crackers, would do that. Still, $36bn is a big mouthful for what looks like a dubious defensive move.

pan.yuk@ft.com

News

AI is changing video games — and striking performers want their due

Actor and stunt performer Andi Norris wears a full body suit covered in sensors — part of the behind-the-scenes process that makes video game characters come to life. Norris is part of the negotiating team for SAG-AFTRA, which is on strike against major gaming companies. The future of AI in game development has become a central issue.

Andi Norris

Jasiri Booker’s parkour and breaking movements are used to animate the title character in Marvel’s Spider-Man: Miles Morales video game. 

“I stick to walls. I beat people up. I get beaten up constantly, get electrocuted and turn invisible,” the 26-year-old says.

He and other performers act out action sequences that make video games come to life.

But earlier this month, Booker picketed outside Warner Bros. Studios in Burbank, Calif., along with hundreds of other video game performers and members of the union SAG-AFTRA. They plan to picket again outside Disney Character Voices in Burbank on Thursday.

After 18 months of contract negotiations, they began their work stoppage in late July against video game companies such as Disney, WB Games, Microsoft’s Activision, and Electronic Arts. Members of the union have paused voice acting, stunts, and other work they do for video games.

The bargaining talks stalled over language about protections from the use of artificial intelligence in video game production. Booker says he’s not completely against the use of AI, but “we’re saying at the very least, please inform us and allow us to consent to the performances that you are generating with our AI doubles.” He and other members of SAG-AFTRA are upset over the idea that video game companies could eventually replace him and now may see his very human stunts as simply digital reference points for animation.

In a statement, a spokesperson for the companies, Audrey Cooling, wrote, “Under our AI proposal, if we want to use a digital replica of an actor to generate a new performance of them in a game, we have to seek consent and pay them fairly for its use. These are robust protections, which are entirely consistent with or better than other entertainment industry agreements the union has signed.”  

But video game doubles say those protections don’t extend to all of them – and that’s part of why they’re on strike.

Andi Norris, a performer on the union’s negotiating team, says that under the gaming companies’ proposal, performers whose body movements are captured for video games wouldn’t be granted the same AI protections as those whose faces and voices are captured for games.

Norris says the companies are trying to get around paying the body movement performers at the same rate as others, “because essentially at that point they just consider us data.” She says, “I can crawl all over the floor and the walls as such-and-such creature, and they will argue that is not performance, and so that is not subject to their AI protections.”

It’s a nuanced distinction: the companies’ proposal covers “performance capture”, meaning recordings of voice and face performers, but not the behind-the-scenes “motion capture” work of body doubles and other movement performers that is used to render motion.

But Norris and others like her consider themselves “performance capture artists” – “because if all you were capturing is motion, then why are you hiring a performer?”

Andi Norris (left) and Jasiri Booker (right) picketing outside Warner Bros. Studios in early August.

Mandalit del Barco/NPR

How motion capture works

Another Spider-Man double, Seth Allyn Austin, says video game performance artists work in studio spaces known as “volumes,” surrounded by digital cameras. They wear full body suits – a bit like wetsuits – dotted with reflective sensors captured by cameras, “So the computer can have our skeletons and they can put whatever they want on us.”

Those digitized moving skeletons are fed into video software and then rendered into animated video game characters, says mechanical engineer Alberto Menache, cofounder of NPCx, which develops AI tools to capture human motion data for video games and movies. “Motion capture,” he says. “They call it mocap for short.”
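For a rough sense of the data this process produces, here is a minimal, purely illustrative sketch in Python. The marker names, rig and numbers are invented for illustration; this is not any studio’s or NPCx’s actual pipeline. It reduces one captured frame of reflective-marker positions to a handful of skeleton joints of the kind animation software would then drive.

```python
# Toy example only: turn one frame of tracked marker positions into joint
# positions for a simple "skeleton". All names and values are hypothetical.
from statistics import fmean

# One captured frame: marker name -> (x, y, z) position in metres.
frame = {
    "head_front": (0.02, 1.72, 0.05), "head_back": (-0.03, 1.71, -0.06),
    "left_shoulder": (-0.20, 1.45, 0.00), "right_shoulder": (0.20, 1.45, 0.00),
    "left_hand": (-0.35, 1.05, 0.10), "right_hand": (0.38, 1.02, 0.12),
    "left_foot": (-0.12, 0.05, 0.02), "right_foot": (0.12, 0.04, 0.03),
}

# A toy rig: which markers contribute to which skeleton joint.
rig = {
    "head": ["head_front", "head_back"],
    "chest": ["left_shoulder", "right_shoulder"],
    "left_hand": ["left_hand"], "right_hand": ["right_hand"],
    "left_foot": ["left_foot"], "right_foot": ["right_foot"],
}

def solve_joint(markers):
    """Estimate a joint position as the centroid of its markers."""
    xs, ys, zs = zip(*markers)
    return (fmean(xs), fmean(ys), fmean(zs))

skeleton = {joint: solve_joint([frame[m] for m in names]) for joint, names in rig.items()}
for joint, pos in skeleton.items():
    print(joint, tuple(round(c, 2) for c in pos))
```

In a real studio, thousands of such frames per session are solved onto a far more detailed skeleton, which digital artists then dress with a modelled, textured and lit character.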

Menache is a pioneer in the field, and has consulted or supervised the visual effects for films including The Polar Express, Spider-Man, Superman Returns, and Avatar: The Way of Water. He’s also worked at PDI (which became DreamWorks Animation before shuttering), Sony Pictures, Microsoft, Lucasfilm and Electronic Arts. (Electronic Arts and Microsoft-owned Activision are both involved in the negotiations with SAG-AFTRA and in the current work stoppage.)

Performers are outfitted with suits covered in sensors. Behind the scenes, visual effects crews use these sensors to construct a digital version of performers’ bodies.

Alberto Menache

It takes an entire crew of digital artists, he says, to animate the motions created by human performers. “You need a modeler to build the character, then you need a person doing the texture mapping, as it’s called, which is painting the body or painting the Spider-Man suit,” he says. “Then you need a rigger, which is the person that draws the skeleton, and then you need an animator to move the skeleton. And then you need someone to light the character.”

From hand-drawn animation to motion capture

During the silent picture era more than a century ago, animators working by hand began using live-action footage of humans. They created sequences by tracing over projected images, frame by frame – a time-consuming process that became known as “rotoscoping.” Filmmaker Max Fleischer patented the first Rotoscope in 1915, creating short films by hand-drawing over hand-cranked footage of his brother as the character Koko the Clown. According to Fleischer Studios, one minute of film time initially required almost 2,500 individual drawings. Fleischer went on to animate Popeye the Sailor and Betty Boop this way, as well as characters in Gulliver’s Travels, Mr. Bug Goes to Town, Superman and his version of Snow White.

Later, Walt Disney animators used rotoscope techniques, beginning with the 1937 film Snow White and the Seven Dwarfs.

By the 1980s, animation techniques advanced with computer generated images. During the 1985 Super Bowl, viewers watched an innovative 30-second commercial made by visual effects pioneer Robert Abel and his team. To create the ad for the Canned Food Information Council, they painted dots onto a real woman performer as the basis for a “sexy” robot character that was then rendered on a computer.

Canned Food Information Council, “Sexy Robot,” Super Bowl 1985

Menache says similar technology had been used by the military to track aircraft, and in the medical field to diagnose conditions such as cerebral palsy. In the early 1990s, he innovated the technique by developing an animation software for an arcade video game called Soul Edge.

“It was a Japanese ninja fighting game. And they brought a ninja from Japan,” he says.  “We put markers on the ninja and we only had a seven by seven foot area where he could act because we only had four cameras. So the ninja spent maybe two weeks doing motions inside that little square. It was amazing to see. And then it took us maybe a month to process all that data.”

Will human performers be needed in the future?

Besides Spider-Man, Seth Allyn Austin has portrayed heroes, villains and creatures in such games as The Last of Us and Star Wars Jedi: Fallen Order. He says the technology has evolved even since he started a decade ago. He remembers wearing a suit with LED lights powered by a battery pack. “Whenever I did a flip, the battery pack would fly off,” he recalls. “I’ve had engineers have to try to solder the wires back on while I’m wearing the suit because it would save time. Luckily we’ve moved away from that technology.”

Seth Allyn Austin has performed stunts and voice work on various Marvel Spider-Man video games. He picketed at the Warner Bros. Studios in Burbank, Calif., in early August.

Mandalit del Barco/NPR

These days, he says, some new technologies allow performers to watch themselves performing on screen as fully animated characters in 3-D, reacting to animated settings and other characters.

“We can adjust our performance in real time to make it look even more creepy or cool or realistic or heroic,” he says. “That’s the thing with AI, the tool is pretty cool. The tool can help us a lot. But if the tool is used to replace us, then it’s not the tool, it’s who’s wielding it.”

Menache says replacing human performers for video games or films is unlikely any time soon.

“If you want it to look real, you can’t animate,” he says. “There’s a lot of very good animators, but their expertise is mostly for stylized motion. But real human motion: Some people get close, but the closer you get to that look, the weirder it looks. Your brain knows.”

He likens it to the phenomenon of the uncanny valley – as he describes it, “that one percent that is missing, that tells your brain something’s wrong.”

How AI is changing video game development

Menache is now developing AI technology that doesn’t require people to wear sensors or markers. “To train the AI, you need data from people,” he says. “We don’t just grab people’s motions, we get their permission.”

For example, he says he could hire and film team players from LA Galaxy, like he once did in the 1990s. Their moves could be stored to train the AI model to develop new soccer video games. “With our new system,” he says, “They won’t even need to go to the studio… You just need footage. And the more angles, the better.”

Menache has also developed technology for face tracking and “de-aging” actors, and to create “deep fakes” where actors’ faces can be scanned and altered. All of this, he says, still requires the consent of human performers.

Even AI still needs humans to train the models, says Menache. “I built a system for face tracking, and I trained it with maybe 2,000 hours of footage of different faces. And now it doesn’t need to be trained anymore. But a face is a lot less complex than a full body,” he says, adding that it would need footage of “thousands and thousands of hours of people of different proportions.”

“Maybe you wouldn’t need people to do that anymore,” he says, “but the people that were used to train it should get their piece of whatever this is useful. That’s what the strike is all about. And I agree with that. We don’t use any data that is not under permission from the performers.”

Editor’s note: Many NPR employees are members of SAG-AFTRA, but are under a different contract and are not on strike.

News

Tracking the Swing States for Harris and Trump

The presidential race will most likely come down to voters in 10 states that remain competitive, according to the most recent ratings by the Cook Political Report.

Note: Nebraska and Maine each award two electoral votes to the statewide winner and one to the top vote-getter in each congressional district. Nebraska’s Second District is rated as “Lean Democratic” by Cook Political Report.

There are many combinations of states that could put either Vice President Kamala Harris or former President Donald J. Trump over the threshold of the 270 electoral votes needed to win.

If both candidates win all of the states in their solid, likely and lean categories, the race would come down to the six tossup states in yellow. Ms. Harris would need 44 electoral votes from the tossup states to win. Mr. Trump would need just 35.

Tossups

State           Elec. votes   2020 margin   2024 polling
Michigan        15            D +2.8        Harris +1.5
Pennsylvania    19            D +1.2        Trump +0.1
Wisconsin       10            D +0.6        Harris +1.8
Georgia         16            D +0.2        Not enough polls
Arizona         11            D +0.3        Not enough polls
Nevada          6             D +2.4        Not enough polls

The six states rated as tossups were all won narrowly by Joseph R. Biden Jr. in 2020. If the race comes down to these places, Mr. Trump has an advantage in the electoral math. He could win with just two electorally valuable tossup states — Pennsylvania and Georgia — while all of Ms. Harris’s paths to victory include at least three. If Mr. Trump loses Georgia but wins Pennsylvania, Ms. Harris would still need at least three other tossups (in addition to Georgia) to win.
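To make that arithmetic concrete, here is a short illustrative Python sketch (not part of the original graphic) that enumerates combinations of the six tossup states in the table above, using their electoral votes and the 44/35 targets mentioned earlier.

```python
# Illustrative only: which combinations of the six tossup states reach each
# candidate's target (Harris needs 44 tossup electoral votes, Trump 35)?
from itertools import combinations

TOSSUPS = {"Michigan": 15, "Pennsylvania": 19, "Wisconsin": 10,
           "Georgia": 16, "Arizona": 11, "Nevada": 6}

def winning_combos(target):
    """Return every combination of tossup states whose electoral votes reach the target."""
    wins = []
    for r in range(1, len(TOSSUPS) + 1):
        for combo in combinations(TOSSUPS, r):
            if sum(TOSSUPS[state] for state in combo) >= target:
                wins.append(combo)
    return wins

trump_paths = winning_combos(35)
harris_paths = winning_combos(44)
print(min(map(len, trump_paths)))   # 2, e.g. Pennsylvania + Georgia = 35
print(min(map(len, harris_paths)))  # 3, e.g. Pennsylvania + Michigan + Wisconsin = 44
```

Running it confirms the claim in the text: the smallest winning set of tossups for Mr. Trump has two states, while every winning set for Ms. Harris has at least three.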

Of course, some of the states currently rated as leaning or likely Democrat or Republican could also come into play.

Lean Democrat

State                   E.V.   2020 margin
Minnesota               10     D +7.1
Nebraska 2nd District   1      D +6.5
New Hampshire           4      D +0.4

Lean Republican

State                   E.V.   2020 margin
North Carolina          16     R +1.3

Likely Democrat

State                   E.V.   2020 margin
New Mexico              5      D +10.8
Virginia                13     D +10.1
Maine                   2      D +9.1

Likely Republican

State                   E.V.   2020 margin
Florida                 30     R +3.4
Maine 2nd District      1      R +7.9
Texas                   40     R +5.6
