Artificial Intelligence and Special Needs

By John Paul Mueller, Luca Massaron

At one time, losing a limb or having another special need meant years of doctor visits, reduced capability, and a shorter and less happy life. However, better prosthetics and other devices, many of them AI-enabled, have made this scenario a thing of the past for many people. For example, check out this couple dancing. The woman has a prosthetic leg. These days, some people can run a marathon or go rock climbing, even if they’ve lost their original legs.

Many people view the term special needs as being equivalent to physically or mentally deficient or even disabled. However, just about everyone has some special need. At the end of a long day, someone with perfectly normal vision might benefit from magnifying software that makes text or graphic elements larger. Color translation software can reveal details that even someone with typical color vision wouldn't otherwise notice. As people get older, they tend to need more assistance to hear, see, touch, or otherwise interact with common objects. Likewise, assistance with tasks such as walking could keep someone out of a nursing home and in their own home for their entire life. The point is that various kinds of AI-enabled technologies can significantly help everyone have a better life.

Considering the software-based solutions

Many people using computers today rely on some type of software-based solution to meet specific needs. One of the most famous of these solutions is a screen reader called Job Access With Speech (JAWS), which describes display content to you using sophisticated methods. As you might imagine, nearly every technique that data science and AI rely on to condition data, interpret it, and then provide a result likely occurs within the JAWS software, making it a good way for anyone to understand the capabilities and limits of software-based solutions. The best way to see how this works is to download and install the software, and then use it while blindfolded to perform specific tasks on your system. (Avoid any task where mistakes matter, though, because you'll make them.)
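To make the condition-interpret-result idea concrete, here's a toy sketch of what a screen-reader pipeline does conceptually. This is not how JAWS is actually implemented; the widget-tree layout and labels are invented for illustration. The code walks a hypothetical user-interface tree, conditions the raw labels, and emits a spoken-style description of each element in reading order.

```python
# Toy screen-reader sketch (NOT JAWS's real internals): walk a
# hypothetical widget tree, clean up the raw data, and produce the
# text a speech engine would read aloud.

def describe(widget):
    """Return spoken-style lines for a widget and its children."""
    role = widget.get("role", "unknown")
    # Condition the raw label: strip stray whitespace, flag missing text.
    label = widget.get("label", "").strip() or "unlabeled"
    lines = [f"{role}: {label}"]
    for child in widget.get("children", []):
        lines.extend(describe(child))
    return lines

# A made-up screen layout for demonstration.
screen = {
    "role": "window", "label": "Mail",
    "children": [
        {"role": "button", "label": "Send"},
        {"role": "textbox", "label": "  Subject "},  # messy label gets cleaned
    ],
}

for line in describe(screen):
    print(line)
```

A real screen reader layers far more interpretation on top of this, such as inferring reading order from visual layout and summarizing repetitive content, but the basic shape of the pipeline is the same.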

Accessibility software helps people with special needs perform incredible tasks. It can also help others understand what it would be like to have a special need. A considerable number of such applications are available; check out Vischeck as one example. Vischeck lets you see graphics the way people with specific kinds of color blindness see them. Of course, the first thing you'll discover is that the term color blind is actually incorrect; people with these conditions see color just fine. The colors they perceive are simply shifted, so color shifted is likely a better term.
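You can get a feel for this color shifting with a few lines of code. The sketch below applies a 3x3 matrix that approximates protanopia (one form of red-green color deficiency) to a single RGB color. The matrix values are an approximation loosely based on published simulation work; Vischeck's actual algorithm is more sophisticated, and applying the matrix directly to ordinary RGB values is a simplification.

```python
# Illustrative protanopia-style "color shift" for one RGB color.
# The matrix is an approximation applied straight to RGB values for
# simplicity; real simulators work in a linearized color space.

PROTAN = [
    [0.11238, 0.88762, 0.0],
    [0.11238, 0.88762, 0.0],
    [0.00401, -0.00401, 1.0],
]

def simulate_protanopia(rgb):
    """Shift an (r, g, b) triple (components 0.0-1.0)."""
    return tuple(
        sum(m * c for m, c in zip(row, rgb))
        for row in PROTAN
    )

red = simulate_protanopia((1.0, 0.0, 0.0))
green = simulate_protanopia((0.0, 1.0, 0.0))
# In both results, the red and green channels come out identical --
# the hues shift together, which is why pure reds and greens become
# hard to tell apart.
```

Notice that the color doesn't disappear; it lands somewhere else on the spectrum, which is exactly why color shifted describes the condition better than color blind.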

Relying on hardware augmentation

Many kinds of special needs require more than just software to address adequately. The “Considering the use of exoskeletons” section, earlier in this chapter, tells you about the various ways in which exoskeletons see use today in preventing injury, augmenting natural human capabilities, or addressing special needs (such as allowing a paraplegic to walk). However, many other kinds of hardware augmentation address other needs, and the vast majority require some level of AI to work properly.

Consider, for example, the use of eye-gaze systems. The early systems relied on a template mounted on top of the monitor. A quadriplegic could look at individual letters, which would be picked up by two cameras (one on each side of the monitor) and then typed into the computer. By typing commands this way, the quadriplegic could perform basic tasks at the computer.
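The core idea of those early template systems can be sketched in a few lines: once the cameras triangulate where the user is looking, the software just maps that gaze point to a cell in a letter grid. The grid layout and cell sizes below are invented for illustration, and the camera triangulation step is omitted entirely.

```python
# Minimal sketch of mapping a gaze point to a letter on an eye-gaze
# template. Real systems first compute the gaze point from two camera
# images; here we assume that (x, y) is already known.

GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
]
CELL_W, CELL_H = 40, 30  # assumed pixel size of each template cell

def letter_at(x, y):
    """Return the letter under gaze coordinates (x, y), or None."""
    col, row = x // CELL_W, y // CELL_H
    if 0 <= row < len(GRID) and 0 <= col < len(GRID[0]):
        return GRID[row][col]
    return None

# Gazing at pixel (85, 35) falls in column 2, row 1 of the grid.
```

Modern eye-gaze systems replace the fixed template with AI-driven gaze estimation and word prediction, but the letter-selection loop works on the same principle.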

Some of the early eye-gaze systems connected to a robotic arm through the computer. The robotic arm could perform extremely simple but important actions, such as helping users get a drink or scratch their nose. Modern systems connect a user's brain directly to the robotic arm, making it possible to perform tasks such as eating without help.

Seeing AI in prosthetics

You can find many examples of AI used in prosthetics. Yes, some passive examples exist, but most newer prosthetic designs rely on dynamic approaches that require an AI to work. One of the more amazing examples of AI-enabled prosthetics is the fully dynamic foot and ankle created by Hugh Herr. They work so well that Herr can perform tasks such as rock climbing. You can see a presentation he gave at TED.

A moral dilemma that we might have to consider sometime in the future (thankfully not today) is what happens when prosthetics allow their wearers to substantially surpass native human capability. For example, in the movie Æon Flux, Sithandra has hands for feet. The hands are essentially a kind of prosthetic grafted onto someone who used to have normal feet. The question arises as to whether this kind of prosthetic implementation is valid, useful, or even desirable. At some point, a group of people will need to sit down and decide where prosthetic use should end to maintain humans as humans (assuming that we decide to remain human and not evolve into some next phase). Obviously, you won't see anyone with hands for feet today.