Facebook Reality Labs (FRL) Research is building an interface for AR that will not force people to choose between interacting with their devices and the world around them. The team is developing natural, intuitive ways to interact with always-available AR glasses, believing this will transform the way we connect with people near and far. Their approach is a neural interface that lets people control machines directly, using the output of the peripheral nervous system — specifically the nerves outside the brain that animate the hand and finger muscles. A wrist-based wearable has the additional benefit of easily serving as a platform for compute, battery, and antennas while supporting a broad array of sensors.
The missing piece was a clear path to rich input, and a potentially ideal solution was electromyography (EMG). EMG uses sensors to translate the electrical motor nerve signals that travel through the wrist to the hand into digital commands for controlling a device. These signals let you communicate crisp one-bit commands to your device, a degree of control that is highly personalizable and adaptable to many situations. The signals through the wrist are so clear that EMG can detect finger motion of just a millimeter, which means input can be effortless. Ultimately, it may even be possible to sense just the intention to move a finger.
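To make the "crisp one-bit command" idea concrete, here is a minimal, purely illustrative sketch of how a wrist-worn sensor stream might be turned into discrete click events. This is not FRL's actual pipeline; the class name, window size, and thresholds are all hypothetical. It rectifies raw EMG samples, smooths them into an envelope, and uses hysteresis thresholding so a single muscle burst produces exactly one command.

```python
from collections import deque


class EmgClickDetector:
    """Hypothetical sketch: turn a raw EMG sample stream into one-bit clicks."""

    def __init__(self, window=8, on_threshold=0.6, off_threshold=0.3):
        self.samples = deque(maxlen=window)  # recent rectified samples
        self.on_threshold = on_threshold     # envelope level that fires a click
        self.off_threshold = off_threshold   # lower level that re-arms the detector
        self.active = False                  # True while a "press" is in progress

    def process(self, sample):
        """Feed one raw EMG sample; return True exactly when a click fires."""
        self.samples.append(abs(sample))                   # rectify
        envelope = sum(self.samples) / len(self.samples)   # moving-average smoothing
        if not self.active and envelope > self.on_threshold:
            self.active = True                             # rising edge -> one command
            return True
        if self.active and envelope < self.off_threshold:
            self.active = False                            # hysteresis avoids chatter
        return False


# Example: quiet baseline, one burst of muscle activity, then quiet again.
stream = [0.05, -0.04, 0.03] + [0.9, -0.8, 0.95, -0.85] * 4 + [0.02, -0.01] * 8
detector = EmgClickDetector()
clicks = [i for i, s in enumerate(stream) if detector.process(s)]
```

The hysteresis gap between the "on" and "off" thresholds is the key design choice: it keeps a noisy envelope hovering near the trigger level from registering as a burst of repeated commands, which matters when the input gesture is a millimeter-scale finger motion.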
More information: