It first appeared on March 9 as a tweet on Andrew Bosworth’s timeline, the little corner of the internet that offers a rare glimpse into the mind of a Facebook executive today. Bosworth, who leads Facebook’s augmented and virtual reality research labs, had just shared a blog post outlining the company’s ten-year vision for the future of human-computer interaction. Then, in a follow-up tweet, he shared a photo of an as yet unseen wearable device. Facebook’s vision for the future of interacting with computers would involve strapping something that looks like an iPod Mini around your wrist.
Facebook already owns our social experience and some of the world’s most popular messaging apps, for better or for worse. Whenever the company digs into hardware, whether that’s a very good VR headset or a video chat device that tracks your every move, it gets noticed. And it raises not only intrigue, but also questions: Why does Facebook want to own this new computing paradigm?
In this case, the unanswered questions are less about the hardware itself and more about the research behind it — and whether the new interactions Facebook envisions will only deepen our ties to Facebook. (Answer: probably.) In a media briefing earlier this week, Facebook executives and researchers offered an overview of this technology. In the simplest terms, Facebook has been testing new computer inputs using a sensor-filled wrist wearable.
It is an electromyography device, which means it converts electrical motor nerve signals into digital commands. When it’s on your wrist, you can simply move your fingers in space to control virtual input, whether you’re wearing a VR headset or interacting with the real world. You can also “train” it to sense the intention of your fingers so that actions take place even when your hands are completely still.
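Facebook hasn’t published its decoding pipeline, but the general idea behind an EMG input device can be sketched in a few lines: measure the amplitude of electrical activity at the wrist over a short window, then map it to a discrete command. The feature, threshold, and gesture names below are all invented for illustration; a real system would use a trained, per-user model rather than a fixed rule.

```python
import math

def rms(window):
    """Root-mean-square amplitude, a standard feature for muscle activity."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def classify(window, threshold=0.5):
    """Map a window of wrist-sensor samples to a discrete command.
    A toy threshold rule: a burst of motor-nerve activity above the
    threshold counts as an intentional "click"."""
    return "click" if rms(window) > threshold else "idle"

# Simulated signals: a quiet resting baseline vs. a burst of activity.
baseline = [0.05, -0.04, 0.06, -0.05]
burst = [0.9, -1.1, 1.0, -0.8]
print(classify(baseline))  # idle
print(classify(burst))     # click
```

This also shows why “completely still” hands can still produce input: the decoder reacts to the electrical signal itself, not to any resulting motion.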
This wearable wristband has no name. It’s just a concept and there are several versions of it, some of which include haptic feedback. Bosworth says it could take five to 10 years for the technology to become widely available.
All of this is linked to Facebook’s plans for virtual and augmented reality, technologies that currently give users awkward options for their hands. Put on a VR headset and your hands disappear completely. Picking up a pair of hand controllers lets you play games or grab virtual objects, but then you lose the ability to take notes or draw accurately. Some AR or “mixed reality” headsets, such as Microsoft’s HoloLens, have cameras that track spatial gestures so you can use certain hand signals and the headset will interpret those signals… which sometimes works. So Facebook is using this EMG wearable in its virtual reality lab to see if such a device allows for more accurate hand-computer interactions.
But Facebook has visions for this wrist technology that extend beyond AR and VR, Bosworth says. “If you really had access to an interface that lets you type or use a mouse — without having to physically type or use a mouse — you could use this anywhere.” The keyboard is a good example, he says; this wrist computer is just another means of intentional input, except you can take it with you wherever you go.
Bosworth also suggested the kitchen microwave as a use case while clarifying that Facebook isn’t actually building a microwave. The interfaces of home appliances are all different, so why not program such a device to easily understand when you want to cook something for 10 minutes on medium power?
The virtual demo Facebook gave earlier this week showed a gamer wearing the wrist device and controlling a character in a rudimentary video game on a flat screen, all without moving his fingers. These kinds of demos tend (pardon the pun) to gesture toward mind-reading technology, which Bosworth says isn’t the case. Here, he said, the mind generates signals identical to those that would make the thumb move, but the thumb doesn’t move; the device acts on the expressed intention to move the thumb. “We don’t know what’s going on in the brain, which is full of thoughts, ideas and notions. We don’t know what happens until someone sends a signal through the wire.”
Bosworth also emphasized that this wearable wristband is different from the invasive implants used in a 2019 brain-computer interface study Facebook conducted with the University of California, San Francisco; and it’s different from Elon Musk’s Neuralink, a wireless implant that theoretically allows people to send neuroelectric signals from their brains directly to digital devices. In other words, Facebook doesn’t read our minds, even though it already knows a lot about what’s going on in our heads.
Researchers say there is still a lot of work to be done in using EMG sensors as virtual input devices. Precision is a big challenge. Chris Harrison, the director of the Future Interfaces Group in the Human-Computer Interaction Lab at Carnegie Mellon University, points out that every individual person’s nerves are a little different, as are the shape of our arms and wrists. “There is always a calibration process that has to take place with any muscle detection system or BCI system. It really depends on where the computer intelligence is,” Harrison says.
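The calibration Harrison describes can be pictured as learning each user’s personal signal range before mapping it onto a common input scale. The sketch below is a hypothetical illustration of that step, not any real product’s procedure: it samples a user at rest and at maximum effort, then normalizes new readings into [0, 1] for that specific user.

```python
def calibrate(rest_samples, max_effort_samples):
    """Per-user calibration: learn the range between resting activity
    and maximum voluntary contraction, since every user's nerves and
    anatomy differ. Returns a normalizer tuned to this user."""
    lo = sum(abs(s) for s in rest_samples) / len(rest_samples)
    hi = sum(abs(s) for s in max_effort_samples) / len(max_effort_samples)

    def normalize(sample):
        # Map a raw amplitude into [0, 1] relative to this user's range,
        # clamping values outside the observed span.
        return min(1.0, max(0.0, (abs(sample) - lo) / (hi - lo)))

    return normalize

# A user whose resting level is 0.1 and maximum effort averages 2.0:
normalize = calibrate([0.1, 0.1, 0.1, 0.1], [2.0, 2.0, 2.0, 2.0])
print(round(normalize(1.05), 2))  # 0.5, the midpoint of this user's range
```

The open question Harrison raises — where the “computer intelligence” lives — is essentially whether this kind of per-user adaptation runs on the wearable itself or on a remote model.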
And even with haptic feedback built into these devices, as Facebook has done with some of its prototypes, there is a risk of visuo-haptic mismatches, where the user’s visual experience — whether in AR, VR or real space — does not match the haptic response. These friction points can make such human-computer interactions feel frustratingly unreal.
Even if Facebook can overcome these hurdles in its research labs, there’s still the question of why Facebook, largely a software company, wants to own this new computing paradigm. And should we trust it, a hugely powerful tech company with a track record of sharing user data in “exchange for other equally or more valuable things,” as WIRED’s Fred Vogelstein wrote in 2018? A more recent report in MIT Technology Review shows how a team at Facebook assembled to tackle “responsible AI” was undermined by leadership’s relentless pursuit of growth.
Facebook executives said this week that these new human-computer interaction devices will perform as much computing as possible “on the device,” meaning the information isn’t sent to the cloud; but Bosworth wouldn’t say how much data might eventually be shared with Facebook or how that data would be used. The whole thing is a prototype, so there’s nothing substantial to pick apart yet, he says.
“Sometimes these companies have stacks of money big enough to invest in these massive R&D projects, and they’ll take a loss on things like this if it means they can be frontrunners in the future,” said Michelle Richardson, director of the Data and Privacy Project at the nonprofit Center for Democracy and Technology. “But with companies of all sizes, any product, it’s so hard to overhaul it once it’s built. So anything that can start the conversation about this before the devices are built is a good thing.”
Bosworth says Facebook wants to lead this next paradigm shift in computing because the company sees such technology as fundamental to connecting people. The past year, after all, has shown us how important it is to connect, and how far from personal that connection can feel, Bosworth says. He also seems to believe the company can earn the required trust by not ‘surprising’ customers. “You say what you do, you set expectations, and you meet those expectations over time,” he says. “Trust comes on foot and leaves on horseback.” Rose-colored AR glasses, activated.
This story originally appeared on wired.com.