Methods for AI to Interact with the Environment

By John Paul Mueller, Luca Massaron

An AI that is self-contained and never interacts with the environment is useless. Of course, that interaction takes the form of inputs and outputs. The traditional method of providing inputs and outputs is directly through data streams that the computer can understand, such as datasets, text queries, and the like. However, these approaches are hardly human-friendly and require special skills to use.

Interacting with an AI increasingly occurs in ways that humans understand better than direct computer contact. For example, when you ask Alexa a question, input occurs through a series of microphones. The AI turns the keywords in the question into tokens it can understand. These tokens then initiate computations that form an output. The AI then converts the output tokens into a human-understandable form: a spoken sentence, which you hear as Alexa speaks to you through a speaker. In short, to provide useful functionality, Alexa must interact with the environment in two different ways that appeal to humans, but which Alexa doesn't actually understand.
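The question-to-answer cycle described above can be sketched in a few lines of toy code. Everything here is invented for illustration (the stop-word list, the tiny lookup table, the function names); a real assistant uses speech recognition and a large model, but the input → tokens → computation → output shape is the same.

```python
# Toy sketch of the voice-assistant cycle: question -> tokens ->
# computation -> output tokens -> spoken sentence. All names and the
# tiny "knowledge base" are illustrative, not any real assistant's API.

def tokenize(question: str) -> list[str]:
    """Turn a (transcribed) question into keyword tokens the AI can use."""
    stop_words = {"what", "is", "the", "a", "an", "in", "please"}
    words = question.lower().strip("?!. ").split()
    return [w for w in words if w not in stop_words]

def compute_answer(tokens: list[str]) -> list[str]:
    """Map input tokens to output tokens via a toy lookup table,
    standing in for the real computation an assistant performs."""
    knowledge = {("weather",): ["sunny", "today"], ("time",): ["noon"]}
    for key, answer in knowledge.items():
        if all(k in tokens for k in key):
            return answer
    return ["sorry", "unknown"]

def detokenize(tokens: list[str]) -> str:
    """Convert output tokens back into a human-understandable sentence."""
    return ("It is " + " ".join(tokens) + ".").capitalize()

print(detokenize(compute_answer(tokenize("What is the weather today?"))))
# prints "It is sunny today."
```

Note that the AI never "understands" either end of this exchange: the microphone input and the spoken output are just encodings that bracket the token computation in the middle.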

Interactions can take many forms. In fact, the number and forms of interaction are increasing continually. For example, an AI can now smell. However, the computer doesn't actually smell anything. Sensors provide a means to turn chemical detection into data that the AI can then use in the same way that it does all other data. The capability to detect chemicals isn't new; the ability to turn the analysis of those chemicals into data isn't new; nor are the algorithms used to interact with the resulting data new. What is new is the datasets used to interpret the incoming data as a smell, and those datasets come from human studies. An AI's nose has all sorts of possible uses. For example, think about the AI's capability to use a nose when working in dangerous environments, such as smelling a gas leak before other sensors could see it.
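To make the smell example concrete, here is a minimal sketch of how sensor readings might be matched against a human-labeled dataset of chemical signatures. The sensor values, labels, and nearest-neighbor approach are all assumptions made for illustration; a production "electronic nose" would use far richer data and models.

```python
# Illustrative sketch: classify a chemical sensor reading by finding the
# closest signature in a human-labeled dataset. All values are invented.
import math

# Hypothetical training data: each smell maps to a sensor response vector
# derived (in this sketch) from human studies.
SMELL_DATASET = {
    "methane":   [0.9, 0.1, 0.2],
    "coffee":    [0.2, 0.8, 0.5],
    "clean air": [0.1, 0.1, 0.1],
}

def classify_smell(reading: list[float]) -> str:
    """Nearest-neighbor match of a sensor reading to a known smell label."""
    def distance(a: list[float], b: list[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(SMELL_DATASET, key=lambda label: distance(reading, SMELL_DATASET[label]))

# A reading close to the methane signature can trigger a gas-leak warning
# before any visual sensor would notice anything.
if classify_smell([0.85, 0.15, 0.25]) == "methane":
    print("Warning: possible gas leak detected")
```

The point of the sketch is the division of labor: the sensor hardware and the distance computation are old technology; the labeled dataset is the new ingredient that turns raw numbers into a "smell."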

Physical interactions are also on the rise. Robots that work in assembly lines are old hat, but consider the effects of robots that can drive. These are large-scale uses of physical interaction. Consider also that an AI can react in smaller ways. Hugh Herr, for example, uses an AI to provide interaction with an intelligent foot. This dynamic foot provides a superior replacement for people who have lost a real foot. Instead of the static sort of feedback that a human gets from a standard prosthetic, this dynamic foot actually provides the sort of active feedback that humans are used to obtaining from a real foot. For example, the amount of pushback from the foot differs when walking uphill than when walking downhill. Likewise, navigating a curb requires a different amount of pushback than navigating a step.
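The kind of terrain-dependent feedback described above can be caricatured in a few lines. The formula and constants below are pure invention, meant only to show the idea of output that adapts to input rather than staying static, as a conventional prosthetic does.

```python
# Toy model of dynamic pushback: force adapts to terrain slope.
# The scaling rule and constants are invented for illustration only.

def pushback_force(slope_deg: float, base_force: float = 100.0) -> float:
    """Return ankle pushback in arbitrary units: more force when walking
    uphill (positive slope), less when walking downhill (negative slope)."""
    # Assume +2% force per degree uphill, -2% per degree downhill.
    return base_force * (1.0 + 0.02 * slope_deg)

print(pushback_force(10))   # walking uphill: more pushback
print(pushback_force(-10))  # walking downhill: less pushback
print(pushback_force(0))    # level ground: the static baseline
```

A static prosthetic behaves like `pushback_force` with the slope term removed; the whole benefit of the dynamic foot lies in that extra, sensor-driven term.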

The point is that as AI becomes more able to perform complex calculations in smaller packages with ever-larger datasets, the capability of an AI to perform interesting tasks increases. However, the tasks that the AI performs may not fit any category that humans currently recognize. You may never truly interact with an AI that understands your speech, but you may come to rely on an AI that helps you maintain life, or at least make it more livable.