Neurobiology For Dummies
Neurobiology has all kinds of real-world (and not-so-real-world) applications. From curing paralysis to the possibility of cyborgs, this Cheat Sheet addresses some of the fascinating questions neurobiology can help answer.

How can paralysis be cured?

Paralysis has multiple causes. The part of the brain that controls movement can be damaged, such as from a stroke. Injuries and diseases can interrupt the message transmitted from the brain to the muscles. But treatment and research are helping, and rapid scientific and technological advances are making these science-fiction–sounding approaches feasible in the near future:

  • Rehabilitation and training help in all types of paralysis by strengthening specific neural pathways and recruiting alternate ones to bypass the injury.
  • Considerable current research is being done on the causes of paralysis. Curing the actual disease is almost always the treatment of first choice.
  • One promising treatment for strokes and tumors involves regrowing neurons: cells derived from the person’s own tissue are reprogrammed into neural stem cells and then transplanted (an autologous transplant) into the damaged region.
  • Other research efforts involve the use of growth factors to stimulate regeneration in damaged brain areas.
  • Prosthetics using brain-computer interfaces (BCIs) may alleviate paralysis by capturing the high-level neural signals for movement, bypassing the injury, and stimulating movement closer to the muscle. Another BCI approach is to capture the high-level motor command signals and carry out the body movements with mechanical devices, such as prosthetic limbs, or exoskeletons whose motors amplify the force exerted by a person’s limbs or respond to brain commands for movement recorded by electrodes (a toy sketch of this decoding step follows).
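
At its core, the "capture motor command signals" step is a decoding problem: turning a vector of recorded firing rates into a movement command. Here is a minimal sketch of that idea, assuming a linear decoder fit by least squares; the channel count, the simulated Poisson firing rates, and the linear model are all illustrative assumptions, not the algorithm of any particular device.

    # Toy sketch of the decoding step in a motor BCI: map recorded firing
    # rates to a 2-D movement command with a linear model fit by least squares.
    # All numbers and the linear model itself are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    n_channels = 96      # assumed number of recording electrodes
    n_samples = 500      # calibration time bins

    # Simulated calibration data: firing rates and the intended 2-D velocity
    true_weights = rng.normal(size=(n_channels, 2))
    firing_rates = rng.poisson(lam=10, size=(n_samples, n_channels)).astype(float)
    intended_velocity = firing_rates @ true_weights + rng.normal(scale=5.0, size=(n_samples, 2))

    # Fit the decoder: find weights W such that firing_rates @ W approximates the velocity
    W, *_ = np.linalg.lstsq(firing_rates, intended_velocity, rcond=None)

    # At run time, each new vector of firing rates becomes a movement command
    new_rates = rng.poisson(lam=10, size=(1, n_channels)).astype(float)
    cursor_velocity = new_rates @ W
    print("decoded (vx, vy):", cursor_velocity.round(2))

Real systems add filtering over time and periodic recalibration, but the shape of the problem, neural activity in and movement command out, is the same.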

Can the mind be downloaded?

As computers become more powerful, there’s increasing speculation about whether they could equal or surpass human intelligence. One thread in this discussion is the idea of downloading the human mind into an artificial substrate such as a computer. Most neuroscientists are very skeptical about this idea for a few reasons:

  • Neurobiologists know that the brain is extremely complex. It consists of on the order of 86 billion neurons, each with roughly a thousand connections to other neurons; no present-day substrate comes close to this complexity (a back-of-the-envelope calculation follows this list).
  • Neurobiologists think that the essential function of the brain is carried out by neural computations, in which neurons generate action potentials and send them to other neurons.
  • Even if neurobiologists could measure the firing of every neuron in the brain and the response of every synapse activated by that firing, and then download or model all of it in silicon, no one knows whether the result would actually work like a human brain.
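
To get a feel for the scale in the first bullet, multiply the numbers out: 86 billion neurons times roughly a thousand connections each is on the order of 86 trillion synapses. The snippet below does the arithmetic; the bytes-per-synapse figure is an illustrative assumption, chosen only to show how much storage even a frozen snapshot of the wiring would need.

    # Back-of-the-envelope scale of the brain's wiring, using the figures above.
    # The bytes-per-synapse value is an illustrative assumption.
    neurons = 86e9                  # roughly 86 billion neurons
    connections_per_neuron = 1e3    # on the order of a thousand synapses each
    synapses = neurons * connections_per_neuron

    bytes_per_synapse = 4           # assume one 32-bit number per synapse
    storage_bytes = synapses * bytes_per_synapse

    print(f"synapses: {synapses:.1e}")                       # ~8.6e+13, or 86 trillion
    print(f"static storage: {storage_bytes / 1e12:.0f} TB")  # ~344 TB, before any dynamics

And that static snapshot says nothing about the moment-to-moment firing and synaptic changes described in the second and third bullets, which is where the real difficulty lies.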

Bottom line: Your thoughts are your own, for the foreseeable future.

Can imaging systems read our minds?

Since functional magnetic resonance imaging (fMRI) machines became widespread at the end of the 20th century, claims about this technology's ability to extract the content of mental processing have multiplied. Many aspects of the claims and counterclaims parallel those associated with so-called lie-detector tests during their heyday in the late 20th century.

fMRI scanners detect blood oxygenation levels and blood flow changes associated with metabolic changes in brain areas, at a resolution of one to several cubic millimeters depending on the magnet strength. This measurement is a one-dimensional index of the overall level of neural activity in that tissue volume, which is a complex circuit composed of millions of neurons.

What can be deduced from fMRI scans, practically and theoretically? The gross anatomy of the brain is characterized by localization of function, with distinct motor and sensory areas and maps within those areas. For example, neurobiologists know exactly where the brain area is that controls the left hand, and, if a person in an fMRI magnet moved their left hand, that activity would easily be detected.

In sensory systems, visual space is laid out in a complex topographic map on the brain. If a person imagines a specific shape directly in front of them, some of the same brain areas are activated as would be activated by actually seeing that shape. This brain activity associated with imagery can also be detected in a scanner. Brain areas whose activity is necessary to conjure up images, or to lie, are different from those involved in retrieving content from actual memory, and this too can be detected.

Eventually, though, current research methods run out of resolution. One cubic millimeter of brain tissue has trillions of different states, and no one-dimensional measure of the overall activity in that volume can distinguish among them all. Scanners may well be able to distinguish among a limited set of alternatives characterized by large differences in brain activity over many millimeters (such as images causing arousal), but with foreseeable non-invasive technology they can't distinguish the complex, subtle differences between similar thought patterns.
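
To make that state-counting argument concrete, treat each neuron in the imaged volume as simply "active" or "silent" during one measurement window. The neuron counts below are toy numbers chosen for illustration, not measured figures.

    # Illustrative state counting: treat each neuron in an imaged volume as
    # binary ("active" or "silent") during one measurement window.
    # The neuron counts are toy numbers chosen for illustration.
    for n_neurons in (40, 1_000, 50_000):
        n_states = 2 ** n_neurons
        print(f"{n_neurons:>6} binary neurons -> about 10^{len(str(n_states)) - 1} possible patterns")

Even 40 binary neurons already allow more than a trillion distinct patterns (2^40 is about 1.1 x 10^12), yet the scanner reports essentially one number for the whole volume. That mismatch is why subtle differences between similar thoughts stay out of reach.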

Are cyborgs possible?

Cyborgs (cybernetic organisms) already exist! Any one of the more than 100,000 people worldwide who have a cochlear implant to restore hearing is technically a cyborg, a functional combination of organic and machine parts. Your Great-Aunt Betty suddenly seems much cooler, doesn't she? The real question is which additional brain functions will be carried out with brain-computer interfaces (BCIs), and how quickly those devices will be developed.

Here are some interesting ideas about cyborgs:

  • The human cyborg era began with the need to restore lost function, particularly hearing, where the required BCI is relatively straightforward. Current devices such as Musk's Neuralink aim to translate some thought processes into motor output, such as direct control of a computer cursor.
  • Research is underway to use a cyborg approach to repair some kinds of blindness, using miniature cameras and arrays of stimulators to inject the camera signal into the retina (a toy image-to-electrode sketch follows this list).
  • A long-standing project is aimed at replacing some memory functions in a portion of the medial temporal lobe of the brain, called the hippocampus, with silicon circuitry.
  • Just as the nervous system adapts to novel inputs, such as those encountered when learning to ride a bicycle or drive a car, it can likely adapt to signals injected directly into it.
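
The camera-to-retina idea in the second bullet can be pictured as a simple image pipeline: average the camera frame down to the much coarser electrode grid and turn brightness into stimulation strength. The grid size, current range, and linear brightness-to-current mapping below are illustrative assumptions, not the design of any actual implant.

    # Toy sketch: map a camera frame onto a coarse electrode array.
    # Grid size and current range are illustrative assumptions.
    import numpy as np

    camera_frame = np.random.default_rng(1).random((480, 640))  # stand-in grayscale image, values 0..1

    grid_rows, grid_cols = 10, 6        # assumed electrode grid, far coarser than the camera
    block_h = camera_frame.shape[0] // grid_rows
    block_w = camera_frame.shape[1] // grid_cols

    # Average the brightness over the patch of image each electrode "covers"
    patches = camera_frame[:grid_rows * block_h, :grid_cols * block_w]
    electrode_image = patches.reshape(grid_rows, block_h, grid_cols, block_w).mean(axis=(1, 3))

    # Scale brightness to a stimulation amplitude within an assumed safe range
    max_current_uA = 100.0
    stim_amplitudes = electrode_image * max_current_uA
    print(stim_amplitudes.round(1))

The hard part is the last bullet: the visual system has to learn to make sense of these crude patterns, much as cochlear-implant users learn to interpret a small number of stimulation channels as speech.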

About This Article

This article is from the book Neurobiology For Dummies.

About the book author:

Frank Amthor, PhD, is a professor of psychology at the University of Alabama and holds a secondary appointment in the UAB Medical School Department of Neurobiology. He has been an NIH-supported researcher for over 20 years and has published over 100 journal articles and conference abstracts.

