Did This Experimental Smartphone Just Solve One Of Tech’s Big Problems?

By Katharine Schwab

Smart devices continue to infiltrate our homes, but they’re often dependent on slow, clunky smartphone apps. Manually pulling up a different app just to turn on a light, turn up the AC, or reboot your Wi-Fi isn’t just annoying; it’s bad design. The smart home market is projected to grow from $46.97 billion in 2015 to $121.73 billion by 2022, yet actually living in a smart home can be incredibly frustrating, an example of how poor UX could have serious business implications as the industry continues to grow.

[Photo: courtesy Chris Harrison]

A new prototype smartphone called the EM-Sensing phone from the Future Interfaces Group at Carnegie Mellon University has the potential to transform how people use appliances in their homes. Presented this week at ACM CHI, the largest human-computer interaction conference of the year, it could change not only how we interact with our smartphones, but how smart home appliances are designed and manufactured.

Any product that contains electronics or electromechanical parts emits an electromagnetic signature. CMU’s EM-Sensing phone is equipped with a sensor that can detect these electromagnetic signals, and a chip that uses machine learning to determine the likeliest match for each signal. In short, the smartphone “listens” to any appliance’s frequency signature to identify it.
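To give a rough sense of how that kind of matching might work (the team’s actual sensor access, features, and classifier aren’t described here, so the names and parameters below are illustrative assumptions, not their implementation), here is a minimal sketch: reduce a window of raw sensor samples to a frequency-domain signature, then pick the closest match from a catalog of known appliance signatures.

```python
# Illustrative sketch only: sensor access, feature choice, and the
# nearest-match classifier are hypothetical stand-ins, not CMU's pipeline.
import numpy as np

def em_signature(samples: np.ndarray, n_bins: int = 256) -> np.ndarray:
    """Reduce a window of raw EM sensor samples to a normalized spectral signature."""
    spectrum = np.abs(np.fft.rfft(samples))          # frequency-domain magnitudes
    binned = spectrum[:n_bins]                       # keep a fixed number of low-frequency bins
    return binned / (np.linalg.norm(binned) + 1e-9)  # normalize so amplitude doesn't matter

def classify(signature: np.ndarray, catalog: dict[str, np.ndarray]) -> str:
    """Return the appliance whose stored signature is most similar (cosine similarity)."""
    return max(catalog, key=lambda name: float(signature @ catalog[name]))

if __name__ == "__main__":
    # Synthetic demo data; in practice the catalog would come from labeled
    # recordings of each appliance, and the window from the phone's EM sensor.
    rng = np.random.default_rng(0)
    catalog = {
        "fridge": em_signature(rng.normal(size=4096)),
        "printer": em_signature(rng.normal(size=4096)),
    }
    window = rng.normal(size=4096)                   # stand-in for a real sensor read
    print(classify(em_signature(window), catalog))
```

In the real system, a machine-learning model trained on many such recordings would do the matching; the point of the sketch is simply that each appliance’s emissions form a repeatable fingerprint the phone can look up.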

“Everything acts like its own tiny radio station, only it’s broadcasting centimeters,” says assistant professor of human-computer interaction Chris Harrison, whose collaboration with researchers Robert Xiao, Gierad Laput, and Yang Zhang led to the EM-Sensing phone. By detecting and cataloging the differences between, say, a laptop’s signature and a vacuum’s, Harrison and his team were able to build a device that actually knows what’s around it, with greater than 98% accuracy.

When a user taps the phone to whatever appliance they want to control, whether that’s a refrigerator or a printer, the phone automatically pulls up the appliance’s dedicated application. The phone can also use other context clues to make controlling your home appliances easier: if you have a document open on your phone and you’re standing by the printer, tapping your phone to the printer pulls up an on-screen “print” option for the document.

Tapping the phone to a thermostat or a television, for instance, instantly brings up the corresponding app, letting you adjust the temperature or change the channel without having to hunt through your phone for the right app. Instead of logging in to your router’s controls from a desktop, tapping your phone to the device could present the same interface. The phone can also offer more context-dependent actions: tap it to your laptop, and it can send files from the phone to the computer’s desktop.

[Photo: courtesy Chris Harrison]

Such technology could dramatically change what our devices look like, how much they cost, and how they’re designed. The prototype smartphone effectively replaces the buttons on your smart thermostat, fridge, router, or other home appliances, and could conceivably eliminate the need for mechanical buttons altogether. It would also make pricey touch screens on individual appliances irrelevant, effectively reducing e-waste. Why use buttons or a dedicated touch screen when you can get more seamless functionality from your smartphone? It’s like 2017’s version of a universal remote: one screen to rule them all.

“Instead of having 50 touch screens in your house for every appliance, you could use the smartphone as this gateway,” Harrison says. “That’s a really powerful notion.”

The EM-Sensing phone is an extension of a project that began in 2015, when Harrison and his team worked with researchers from Disney to create EM-Sense, a smartwatch that could detect what you were holding by using your arm as an antenna to read electromagnetic signals. My colleague Mark Wilson named it one of the best user interfaces of 2016.

The new EM-Sensing phone is essentially version 2.0, since a smartphone can do far more than a smartwatch. The approach is similar to radio-frequency identification (RFID), in which objects are tagged and tracked using radio waves, but with some major benefits: it doesn’t require external tags, coordination between manufacturers, or third-party apps to manage a host of smart appliances.

“It’s bootstrapping a smart environment,” Harrison says.

[Photo: courtesy Chris Harrison]

It’s a significant step toward a world where the things around us are context-aware enough to create an entirely new paradigm in user experience design. And while Harrison won’t comment on companies that have approached the team about its technology, he made it clear that this development is around the corner. “This isn’t a 10-years-in-the-future kind of thing,” Harrison says. “This is something that conceivably could come to market pretty fast.”

Still, he believes that there’s more research to be done to make the user experience more seamless. For the technology to work, manufacturers would also have to get on board and start offering smartphone apps even for products that aren’t yet smart. And then there’s the barrier of consumer adoption.

“It’s not a magic bullet, but it gets us closer to the smartphone knowing the context around me,” Harrison says. “That’s a much more magical, powerful experience than what we’re seeing right now.”
