Coming Soon: Gadgets That Learn From You

Sept. 16, 2010
Intel is developing context-aware technology that can anticipate your next move -- and make suggestions accordingly.

Think of all the private information that is collected on mobile gadgets: locations visited, restaurants searched, appointments with doctors. Now imagine if that mobile device wasn't just collecting data, but could actually learn from it and make intelligent suggestions based on a user's previous behavior.

Users already interact with the first generation of predictive software every day on Google, Netflix and online music services like Pandora. But according to Intel researchers, "context-aware computing" is growing ever more sophisticated, and in the very near future devices will be able to anticipate users' needs or wants and guide them accordingly.

Intel's research chief Justin Rattner, during a keynote speech on Sept. 15 at Intel's annual developer conference in San Francisco, gave several examples, among them a prototype application Intel is testing with Fodor's Travel.

By learning what types of food a user eats and what attractions he or she has visited -- drawing on searches typed into the phone and locations identified via GPS -- the software makes similar recommendations when the user visits a new city.
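Intel has not published how the Fodor's prototype scores recommendations, but the behavior described -- matching places in a new city against tags learned from a user's history -- can be sketched as simple content-based filtering. All venue names and tags below are made up for illustration.

```python
from collections import Counter

def recommend(visit_history, candidates, top_n=3):
    """Score candidate venues in a new city by how often their tags
    (cuisine, attraction type, etc.) appear in the user's history,
    and return the best-matching names."""
    # Tally tags from places the user has visited or searched for.
    prefs = Counter(tag for place in visit_history for tag in place["tags"])
    # Score each candidate by the user's total affinity for its tags.
    scored = sorted(candidates,
                    key=lambda c: sum(prefs[t] for t in c["tags"]),
                    reverse=True)
    return [c["name"] for c in scored[:top_n]]

history = [
    {"name": "Noodle Bar", "tags": ["ramen", "casual"]},
    {"name": "Sushi Den", "tags": ["sushi", "casual"]},
    {"name": "City Art Museum", "tags": ["museum"]},
]
candidates = [
    {"name": "Izakaya West", "tags": ["sushi", "casual"]},
    {"name": "Steak Palace", "tags": ["steakhouse", "formal"]},
    {"name": "Modern Art Gallery", "tags": ["museum"]},
]
print(recommend(history, candidates, top_n=2))
```

A production system would weight recency and location as well, but the core idea -- learned preferences scoring unseen options -- is the same.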

"Context-aware computing is poised to fundamentally change how we interact with our devices," said Rattner. "Future devices will learn about you, your day, where you are and where you are going to know what you want. They know your likes and dislikes."

Lama Nachman, a researcher at Intel, showed a prototype TV remote that can collect data on how the device is held by different users, build profiles based on that information, and then make personalized recommendations of shows each user may want to watch.
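Intel did not detail the remote's recognition method; one plausible sketch is nearest-centroid matching, where each household member's averaged grip readings form a stored signature and a new reading is assigned to the closest one. The feature values and user names here are invented for illustration.

```python
import math

# Hypothetical grip "signatures": averaged sensor readings per user,
# e.g. (tilt angle, grip pressure), each normalized to 0..1.
profiles = {
    "alice": (0.2, 0.8),
    "bob":   (0.9, 0.3),
}

def identify_user(reading):
    """Match a fresh sensor reading to the nearest stored profile
    by Euclidean distance (nearest-centroid classification)."""
    return min(profiles, key=lambda u: math.dist(profiles[u], reading))

print(identify_user((0.85, 0.35)))
```

Once the holder is identified, the remote can look up that person's viewing history to drive its show recommendations.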

Developers have been trying to build computer systems that work closely in tune with their users for more than 20 years. What has changed today is the prevalence of smartphones and GPS devices.

Context-aware devices will have to use a combination of "hard-sensing" -- which is raw physical data, such as cameras that detect movement and GPS-based locations -- and "soft sensing," such as calendar information or the data a user inputs into the device. Combined, that information provides a cognitive framework and context, so it can then offer intelligent suggestions.
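The fusion Rattner describes can be sketched as a simple rule over both signal types: a hard-sensed GPS venue combined with a soft-sensed calendar entry yields a suggestion that neither signal supports alone. The field names and the suggestion text are assumptions, not Intel's design.

```python
def suggest(hard, soft):
    """Fuse hard-sensor data (raw physical signals such as GPS)
    with soft-sensor data (calendar entries the user typed in)
    to build context and pick a suggestion."""
    # Hard sensing: what the physical sensors report right now.
    near_restaurant = hard["gps_venue_type"] == "restaurant"
    # Soft sensing: what the user has put into the device.
    dinner_planned = any("dinner" in entry.lower()
                         for entry in soft["calendar"])
    # Fusion: the combined context is more useful than either input.
    if near_restaurant and dinner_planned:
        return "Show tonight's menu and directions to your reservation"
    if near_restaurant:
        return "Offer nearby restaurant reviews"
    return "No suggestion"

print(suggest({"gps_venue_type": "restaurant"},
              {"calendar": ["Dinner with Sam at 7pm"]}))
```

Real systems would replace the hand-written rules with learned models, but the hard-plus-soft structure is the point Rattner makes below.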

"Things don't get really interesting until you fuse that hard sensor data with soft sensor data," Rattner said. "It gives devices almost this sixth sense of anticipating what a user will need in the future, whether that's the next few moments or at dinner later in the day."

Thus far, context-aware computing hasn't taken off commercially. But according to Intel, as mobile devices grow smarter and become intertwined in mass culture, apps will inevitably disappear and become part of a device's intelligence.
