
Quest for AI Leadership Pushes Microsoft Further Into Chip Development

July 24, 2017
Tech companies are keen to bring cool artificial intelligence features to phones and augmented reality goggles—the ability to show mechanics how to fix an engine, say, or tell tourists what they are seeing and hearing in their own language. But there's one big challenge: how to manage the vast quantities of data that make such feats possible without making the devices too slow or draining the battery in minutes and wrecking the user experience.

Microsoft Corp. says it has the answer with a chip design for its HoloLens goggles—an extra AI processor that analyzes what the user sees and hears right there on the device rather than wasting precious microseconds sending the data back to the cloud. The new processor, a version of the company's existing Holographic Processing Unit, is being unveiled at an event in Honolulu, Hawaii, today. The chip is under development and will be included in the next version of HoloLens; the company didn't provide a date.
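
To make the latency argument concrete, here is a rough sketch in Python comparing an on-device inference budget against a cloud round trip. The numbers and names are illustrative assumptions, not HoloLens measurements or Microsoft code.

```python
# Hypothetical latency-budget comparison: on-device inference vs. a cloud round trip.
# All timings are illustrative assumptions, not measurements from any real device.

ON_DEVICE_INFERENCE_MS = 8.0   # assumed time for a small model on a dedicated AI coprocessor
NETWORK_ROUND_TRIP_MS = 60.0   # assumed wireless latency to a cloud endpoint and back
CLOUD_INFERENCE_MS = 4.0       # assumed time for a larger model on server hardware

def within_frame_budget(latency_ms: float, target_fps: float = 60.0) -> bool:
    """An AR result is only useful if it arrives within one display frame."""
    return latency_ms <= 1000.0 / target_fps

on_device_total = ON_DEVICE_INFERENCE_MS
via_cloud_total = NETWORK_ROUND_TRIP_MS + CLOUD_INFERENCE_MS

print(f"on-device: {on_device_total:.1f} ms, fits frame budget: {within_frame_budget(on_device_total)}")
print(f"via cloud: {via_cloud_total:.1f} ms, fits frame budget: {within_frame_budget(via_cloud_total)}")
```

Under these assumed numbers, the cloud path blows past a 60-frames-per-second display budget even when the server-side model itself is fast, which is the tradeoff driving the on-device processor.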

This is one of the few times Microsoft is playing all roles (except manufacturing) in developing a new processor. The company says this is the first chip of its kind designed for a mobile device.

Bringing chipmaking in-house is increasingly in vogue as companies conclude that off-the-shelf processors aren't capable of fully unleashing the potential of AI. Apple is testing iPhone prototypes that include a chip designed to process AI, a person familiar with the work said in May. Google is on the second version of its own AI chips. To persuade people to buy the next generation of gadgets—phones, VR headsets, even cars—companies will have to make the experience lightning fast and seamless.

"The consumer is going to expect to have almost no lag and to do real-time processing," says Jim McGregor, an analyst at Tirias Research. "For an autonomous car, you can't afford the time to send it back to the cloud to make the decisions to avoid the crash, to avoid hitting a person. The amount of data coming out of autonomous vehicles is tremendous you can't send all of that to the cloud." By 2025, he says, every device people interact with will have AI built in.

For years, the central processing units built by Intel Corp. and others have provided enough oomph and smarts to power the world's gadgets and servers. But the rapid development of artificial intelligence has left some traditional chip makers facing real competition for the first time in over a decade. The accelerating abilities of AI owe much to neural networks that mimic the human brain by analyzing patterns and learning from them. The general-purpose chips used in PCs and servers aren't designed to perform the enormous numbers of simultaneous calculations that AI software requires.
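
As a minimal illustration of why that parallelism matters, the sketch below (NumPy, with arbitrary assumed sizes; not Microsoft code) shows that a single neural-network layer boils down to one large matrix multiplication, i.e. millions of independent multiply-accumulate operations that dedicated AI chips are built to run at once.

```python
# Minimal sketch: one fully connected neural-network layer is a large matrix multiply,
# i.e. millions of independent multiply-accumulate operations that can run in parallel.
# Sizes are arbitrary assumptions for illustration only.
import numpy as np

batch, in_features, out_features = 32, 1024, 1024
x = np.random.rand(batch, in_features).astype(np.float32)         # input activations
w = np.random.rand(in_features, out_features).astype(np.float32)  # learned weights

y = np.maximum(x @ w, 0.0)  # matrix multiply + ReLU: roughly 32 * 1024 * 1024 ≈ 33M multiply-adds

print(y.shape)  # (32, 1024); specialized AI hardware parallelizes exactly this kind of work
```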

Microsoft has been working on its own chips for a few years now. It built a motion-tracking processor for its Xbox Kinect video-game system. More recently, in an effort to take on Google and Amazon.com Inc. in cloud services, the company used customizable chips known as field programmable gate arrays to unleash its AI prowess on real-world challenges. Microsoft buys the chips from Altera, a subsidiary of Intel, and adapts them for its own purposes using software, an ability that's unique to that type of chip.

In a show of strength last year, Microsoft used thousands of these chips at once to translate all of English Wikipedia into Spanish—three billion words across five million articles—in less than a tenth of a second. Next, Microsoft will let its cloud customers use these chips to speed up their own AI tasks—a service the company will make available sometime next year. Customers could use it to do things like recognize images from huge sets of data or use machine learning algorithms to predict customer purchasing patterns. "We're taking this very seriously," says Doug Burger, a distinguished engineer at Microsoft Research who works on the company's chip development strategy for the cloud. "Our aspiration is to be the No. 1 AI cloud."

Microsoft has plenty of competition. Amazon also uses field programmable gate arrays and plans to adopt Volta, a new state-of-the-art AI chip design from Nvidia Corp., now the leading maker of graphics processors used to train AI systems. Meanwhile, Google has built its own AI semiconductors, called Tensor Processing Units, and is already letting customers use them. Creating chips in-house is expensive, but Microsoft says it has no choice because the technology is changing so fast it's easy to get left behind.

Moving this expertise from the cloud down to the device in a person's hand or on their face is a key priority for Microsoft's AI-focused CEO Satya Nadella. In a May speech he touted the idea of using AI to track industrial equipment, telling the user things like where to find a jackhammer and how to use it, and generating a warning in case of unauthorized use or a chemical spill. The new HoloLens chip will make that and much more possible. Says Microsoft Chief Technology Officer Kevin Scott: "We really do need custom silicon to help power some of the scenarios and applications that we are building."

By Dina Bass and Ian King
