Apple’s New Gestural Controls
During 2021’s Global Accessibility Awareness Day, Apple showed off a new accessibility feature for the Apple Watch that might be mistaken for Facebook’s neural-click idea. It isn’t as subtle or as seemingly cutting-edge, but there is clear overlap between the two. The watch’s new AssistiveTouch feature, coming later this year, allows for no-touch operation of the device: with the feature switched on, users will be able to answer a call with a double clench of the fingers or move a cursor by tilting and twisting the wrist and forearm. These inputs are detected by the watch’s gyroscope, accelerometer, and heart rate monitor, with Apple’s machine-learning software interpreting the data from all three.

Facebook’s forthcoming wrist-input technology (acquired along with its maker, CTRL-labs) instead uses electromyography (EMG) sensors, which pick up on neural impulses. Users can learn to send inputs without moving at all, and, granted, most people would rather control a device invisibly than theatrically clench and unclench their hands. That said, the EMG approach likewise relies on machine learning to sift through neural noise in order to accurately detect input.

Apple Watch’s AssistiveTouch looks promising for its primary intended use: helping users who have difficulty operating the watch’s touchscreen. Apple will develop the technology to read increasingly subtle hand movements as its machine-learning models improve.
Barry Young