"When neural interfaces work properly, they feel like magic," says Thomas Reardon, director of motor neuron interfaces at Facebook's Reality Labs.
Facebook is investing massive sums in research and development for the next generation of human-machine interfaces.
The social media giant still sees virtual reality and augmented reality as key areas of growth, and wants to be at the forefront of building the enabling technology that will shape how people use computing platforms over the next decade and beyond.
The Reality Labs team is also working on developing a context-aware interface powered by artificial intelligence for augmented reality glasses.
And at the Facebook Connect conference last year, the company announced a new line of smart augmented reality glasses, starting with Ray-Ban models due to launch sometime in 2021.
"AR glasses enable us to exist and connect, and how we communicate with this new device will be critical," Andrew Bosworth, president of Reality Labs, said in a tweet.
Facebook says it plans later this year to unveil its soft robotics work to build comfortable, all-day wearable devices, as well as provide an update to its tactile glove research.
Facebook confirmed that the approach it uses with the wrist-based augmented reality controller is not the same as mind reading. Instead, the controller uses electromyography, or EMG, which uses sensors to interpret motor nerve signals. The electrical signals traveling from the wrist to the hand are translated into digital commands that you can use to control the functions of the device.
"What we are trying to do by using neural interfaces is to allow you to directly control the device, using the output of the peripheral nervous system ... specifically the nerves outside the brain that move the muscles of the hand and fingers," Facebook said.
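At a high level, an EMG pipeline of the kind described above rectifies the raw electrical signal, smooths it into an amplitude envelope, and maps bursts of muscle activity to discrete commands. The sketch below illustrates that general idea only; the function names, thresholds, and synthetic signal are all illustrative assumptions, not Facebook's actual implementation.

```python
# Hypothetical sketch of turning motor-nerve electrical activity into a
# digital command. Thresholds and signal values are invented for illustration.

def envelope(samples, window=8):
    """Rectify the raw EMG signal and smooth it with a moving average."""
    rectified = [abs(s) for s in samples]
    smoothed = []
    for i in range(len(rectified)):
        start = max(0, i - window + 1)
        smoothed.append(sum(rectified[start:i + 1]) / (i + 1 - start))
    return smoothed

def decode_command(samples, threshold=0.5):
    """Map sustained muscle activity above a threshold to a 'click' command."""
    return "click" if max(envelope(samples)) > threshold else "idle"

# Synthetic signal: quiet baseline, then a burst of muscle activity.
quiet = [0.02, -0.03, 0.01, -0.02] * 4
burst = [0.9, -0.8, 1.0, -0.7] * 4
print(decode_command(quiet))          # idle
print(decode_command(quiet + burst))  # click
```

A real wristband decoder would work on multi-channel sensor data and use learned classifiers rather than a fixed threshold, but the rectify-smooth-threshold structure is a common baseline for gesture detection from EMG.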