Late last month, news broke that Elon Musk’s neurotechnology company, Neuralink, successfully implanted its first brain-computer interface (BCI) in a human subject.
Dubbed Telepathy by Musk, the implant is part of the company’s effort to develop a new method for people – particularly those rendered immobile due to medical conditions such as paralysis – to interact with devices like smartphones and computers.
According to Neuralink, the implant works by capturing neural activity in the brain and wirelessly transmitting it to an app, which decodes it into specific actions to be performed on a device.
This is just one example of the alternative control mechanisms being developed for differently-abled individuals, aimed at enhancing autonomy through assistive technology as the world becomes increasingly digitalised.
Another example is the eye tracker, which can afford those with limited mobility the ability to use a computer with just their eyes.
Popular examples include trackers provided by Tobii Dynavox, myGaze and Zyteq.
In Sweden, a study found that among adults with severe physical and communication impairments, 96% use eye tracking technology when interacting with a computer.
The trackers use cameras mounted on top of a computer screen to track what a user is looking at by emitting an invisible infrared light that bounces off the human eye.
They work in tandem with software to interpret the eye movements into interactions such as clicks and keyboard inputs.
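Conceptually, that software pipeline can be sketched in a few lines: map a normalised gaze point onto screen pixels, then treat a sustained "dwell" on one spot as a click. The sketch below is illustrative only; the screen resolution, dwell time and jitter threshold are assumptions, not values from any particular product.

```python
SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution (illustrative)
DWELL_SECONDS = 1.0               # gaze must rest this long to register a click
DWELL_RADIUS = 40                 # pixels of jitter tolerated while dwelling

def gaze_to_screen(nx, ny):
    """Map a normalised gaze point (0..1 on each axis) to pixel coordinates."""
    return int(nx * SCREEN_W), int(ny * SCREEN_H)

class DwellClicker:
    """Turn a stream of timestamped gaze points into click events: if the gaze
    stays within DWELL_RADIUS pixels for DWELL_SECONDS, emit one click."""

    def __init__(self):
        self.anchor = None  # (x, y, start_time) of the current dwell, if any

    def update(self, x, y, t):
        if self.anchor is None:
            self.anchor = (x, y, t)
            return None
        ax, ay, t0 = self.anchor
        if (x - ax) ** 2 + (y - ay) ** 2 > DWELL_RADIUS ** 2:
            self.anchor = (x, y, t)  # gaze moved away; restart the dwell timer
            return None
        if t - t0 >= DWELL_SECONDS:
            self.anchor = None       # fire one click, then reset
            return (ax, ay)
        return None
```

Real eye-tracking software adds smoothing, calibration and on-screen keyboards on top of this basic loop, but dwell-to-click remains a common core interaction.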
One notable user is former American National Football League (NFL) player Tim Green, who was diagnosed with amyotrophic lateral sclerosis (ALS), a terminal neurodegenerative condition that causes progressive paralysis.
Despite his condition, Green went on to write a 304-page novel titled Final Season in 2021, which became a New York Times bestseller. He wrote the novel entirely with an eye tracker and a specialised tablet keyboard.
The late theoretical physicist Stephen Hawking, who also had ALS, relied on a related approach: an infrared sensor that detected his cheek movements, used together with a speech synthesiser.
More recent advancements include head-mounted trackers for greater precision, while some developers have created software that gives ordinary webcams similar capabilities, albeit with more basic interactions and lower tracking accuracy.
Microsoft has even included native support for eye trackers in both Windows 10 and 11.
Those with limited mobility also have the option of using its native speech-to-text dictation and voice commands.
Windows also includes accessibility features for users with low or impaired vision, such as built-in screen readers, high-contrast themes, and colour-blindness filters.
There are also third-party options that offer a variety of additional features. Non-Visual Desktop Access (NVDA) is an example of a screen reader that provides support for language add-ons and external, refreshable Braille displays.
Similar accessibility options are also available on macOS, iOS, and Android.
Notably, Apple announced last May that iOS 17 would expand its accessibility tools to include Point and Speak, which lets the device read aloud text captured by the camera.
Apple also introduced Live Speech in iOS 17, iPadOS 17, macOS Sonoma, and watchOS 10, which lets users type out messages during a FaceTime call or even an in-person conversation and have the device vocalise them.
Users can choose from existing voices provided by Apple or create a custom personal voice by submitting a 15-minute recording of themselves.
Where limited upper-body mobility is an issue, foot pedals may offer a solution: complex key combinations that would otherwise be impossible to perform can be bound to a single pedal.
Microsoft also offers a selection of peripherals it calls Adaptive Accessories, tailored for customisation with officially supported 3D-printed parts that can be swapped in depending on a specific user’s needs.
Meanwhile, for those with mutism, there are a number of text-to-speech options that work on PCs, both paid and free.
A good free option is Izabela, which was designed to allow those with speech disabilities to communicate with others in voice chat programs and games.
On the gaming front, Microsoft and Sony have both introduced highly customisable controllers, the Xbox Adaptive Controller and PS5 Access Controller, respectively, for those with limited mobility.
Artificial intelligence (AI) is also set to transform the field of assistive technology, with the potential to open up new avenues for those with disabilities.
In August last year, the story of Ann Johnson, who had been left paralysed after a stroke in 2005, made headlines after she received a brain implant that allowed her to communicate verbally again via a digital avatar made in her likeness.
The implant captured signals in the part of the brain responsible for speech and sent them through an attached cable to a computer, where an AI algorithm turned them into speech, using a voice trained to sound like Johnson's.
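The actual system uses deep neural networks trained on cortical recordings, but the underlying idea of decoding, matching incoming neural features against known speech units, can be illustrated with a toy nearest-prototype classifier. Every vector and word below is invented purely for illustration; none of it reflects the real decoder.

```python
import math

# Toy "codebook" of neural-feature prototypes: each word is paired with a
# made-up three-dimensional feature vector. A real decoder learns far
# higher-dimensional representations from brain recordings.
TEMPLATES = {
    "hello": [0.9, 0.1, 0.2],
    "thanks": [0.1, 0.8, 0.3],
    "yes": [0.2, 0.2, 0.9],
}

def decode(features):
    """Nearest-prototype decoding: return the word whose template vector is
    closest (in Euclidean distance) to the incoming feature vector."""
    return min(TEMPLATES, key=lambda word: math.dist(features, TEMPLATES[word]))
```

The decoded words would then be fed to a speech synthesiser, closing the loop from neural activity to audible speech.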