A Celebration of Neurons: An Educator's Guide to the Human Brain
by Robert Sylwester

Preface

Recent dramatic developments in the cognitive sciences are moving us closer to an understanding of our brain's development, organization, and operation. Increased understanding of the brain should lead to widespread discussions of the important issues that will arise out of these advances, and to the development of appropriate and effective educational applications of this knowledge.

As an educational leader, you need a functional understanding of these significant developments to be able to comprehend the growing scientific and professional writing in this field; discuss, develop, and evaluate proposed educational applications; and effectively teach students about brain mechanisms and processes. Without such knowledge, our profession will become prey to educational hucksters who will propose expensive programs they claim to be compatible with current cognitive theory and research.

This nontechnical book provides a functional introduction to that professionally important information. It's directed especially to educational leaders who have a limited background in the cognitive sciences, but who will be expected to make the educational decisions sparked by developments in the scientific world. It's an introduction to our brain, not a comprehensive treatment. You'll need to read much more to become truly informed, so the text and especially the chapter notes recommend excellent nontechnical books and articles written by respected scientists and science writers who have the ability to explain complex biochemical processes in layman's terms. Your continued study will spark imaginative thoughts that will lead to new ways of looking at your profession, to the design of biologically based educational applications, to the excitement of curricular experimentation.

The book's title, A Celebration of Neurons, uses the poetic collective noun celebration (as in a pride of lions, a swarm of bees) to communicate the celebratory nature of the magnificent network of neurons we humans have. It's the only mass of matter in the known universe that can contemplate itself—a true celebration of neurons.

The book begins with the cognitive sciences' recent and rapid rise to importance in educational thought and practice. Subsequent chapters examine specific educationally significant elements of our brain and its processes. Although the book focuses on our current scientific understanding of our brain and its processes, it also suggests some broad educational applications—all based on current theory and research—that can be introduced in schools now.

The suggested applications probably won't surprise you because the cognitive sciences are discovering all sorts of things that good teachers have always intuitively known. What's important, however, is that our profession is now getting strong scientific support for many practices that our critics have decried.

Our profession is at the edge of a major transformation. The scientific discoveries will continue at an increasing pace whether or not we inform ourselves and make the effort to discover appropriate educational applications. What a tragedy it would be, though, if we were to choose to remain professionally uninformed and uninvolved in this historic revolution.

This book had its beginnings in 20 syntheses of cognitive science theory and research that I've published over the past 15 years, many in Educational Leadership. I'm grateful to Ron Brandt of ASCD for his constant support in this enterprise, and to the Education Press Association of America for the four encouraging awards they've given my published syntheses. The cognitive sciences are developing so rapidly, however, that I practically had to start from scratch in writing this book.

I'm especially appreciative of my wife Ruth's constant unconditional support—and since I've come to understand cognitive development in ways that I couldn't have imagined when our own children were young, I'm thankful for the second chance our children have provided us to observe it in our dozen grandchildren.

Robert Sylwester


1. At the Edge of a Major Transformation

The human brain is the best organized, most functional three pounds of matter in the known universe. It's responsible for Beethoven's Ninth Symphony, computers, the Sistine Chapel, automobiles, the Second World War, Hamlet, apple pie, and a whole lot more. Our brain has always defined the education profession, yet educators haven't really understood it or paid much attention to it.

For a long time, scientists didn't understand the brain either. Our skull hides a bewildering array of electrochemical activity, so our brain's awesome complexity is its own major barrier to understanding itself. Our brain's cellular units are tiny, their numbers are immense, and everything is connected.

At the cellular level, our brain's three-pint, three-pound mass is divided somewhat evenly between tens of billions of nerve cells, or neurons, that regulate cognitive activity, and the much smaller and ten-times-more-numerous glial cells that support, insulate, and nourish the neurons.

Brain cells are very small and highly interconnected. Thirty thousand neurons (or 300,000 glial cells) can fit into a space the size of a pinhead. Neurons connect to other neurons, muscles, or glands via sending and receiving extensions; and although most sending connections are in the millimeter range, the extensions connecting some motor neurons to muscles can reach a meter in length. A neuron may connect to thousands of other cells, so the chemical information in a neuron is only a few neurons away from any other neuron. If you think that's implausible, consider our world's one billion telephones—and the relatively simple coding system of about a dozen digits that can rapidly connect any two of them.
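
To make the arithmetic behind that analogy concrete, here is a back-of-the-envelope sketch in Python (mine, not the author's). The fan-out figure and neuron count are assumed round numbers, and the calculation ignores overlapping connections, so it shows only why a handful of synaptic steps is enough in principle:

```python
# Rough illustration (assumed figures): if each neuron connects to about
# 1,000 others, the number of cells reachable grows multiplicatively with
# each synaptic hop, much as roughly a dozen telephone digits can address
# any of a billion telephones. Overlapping connections are ignored, so this
# is an in-principle upper bound, not an anatomical claim.

CONNECTIONS_PER_NEURON = 1_000      # assumed average fan-out
TOTAL_NEURONS = 50_000_000_000      # a rough "tens of billions" figure

reachable = 1
hops = 0
while reachable < TOTAL_NEURONS:
    reachable *= CONNECTIONS_PER_NEURON
    hops += 1

print(f"About {hops} synaptic hops could span {TOTAL_NEURONS:,} neurons.")
# Prints: About 4 synaptic hops could span 50,000,000,000 neurons.
```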

Enter into a single neuron and the complexity increases. For example, a cell's nucleus contains DNA (deoxyribonucleic acid), a relatively large molecule that is the cell's recipe book for manufacturing cellular materials and regulating cellular processes. In human neurons, the unraveled ladder-shaped DNA molecule is a meter long—in a cell 1/30,000 the size of a pinhead!

Pioneer brain researchers obviously had problems when they tried to study individual neurons or interconnected assemblies of neurons. They simply didn't have the technology for such investigation, and so their research progress was slow and tentative. Still, these researchers had the freedom to patiently explore what they could of the brain, and to slowly develop more sophisticated chemical assay and monitoring technology to study the remainder. This technology has evolved rapidly during the current computer age, as have the research fields that study the brain—with greater wonders yet to come. Our brain is at the edge of understanding itself!

A Behaviorist Profession

Educators have never had the scientist's freedom to patiently wait for the research technology to catch up with their curiosity. Every year, a new batch of students arrives at the school door, whether we understand how their brains develop or not. We have had to find a way to bypass students' brains in order to carry out our professional assignment.

Our solution has been to focus on the visible, measurable, pliable manifestations of cognition, rather than on cognitive mechanisms and processes. The human brain uses sensory/perceptual processes to take in objects and events in the environment. It then draws on memory and various problem-solving strategies to process the information, and it eventually translates thought and decision into behavior. If our profession couldn't comprehend internal brain processes, it could focus on knowable external objects or events in the environment (stimulus) and the behavior (response) that emerged out of the unknowable cognitive processes. Thus, we became a profession of behaviorists, whether we liked it or not. We learned how to manipulate the student's environment to achieve the behavior we desired.

We didn't do all that badly with this approach. Millions of intelligent and sharing teachers, each observing the behavior of about thirty students for a thousand hours every year, eventually learned many practical and effective things about teaching and learning. When the findings of formal educational and psychological research were added to this experience, our informal knowledge evolved into a solid base of normative, practical professional knowledge.

To be honest, though, the practical base of our profession was probably closer to folklore knowledge than scientific knowledge. We could predict what would probably occur in a classroom, but we generally didn't know why it occurred. For example, we knew that more boys than girls had serious reading problems, and that hyperactive students tended to be thin, blond, blue-eyed boys, but we didn't know why. The problem with partial knowledge that focuses only on outward behavior is that it can lead to inappropriate, generalized conclusions, such as that boys are inherently stupid and ill-mannered.

We also didn't understand the underlying mechanisms that govern other significant teaching and learning concerns, such as emotion, interest, attention, thinking, memory, and skill development—even though we did learn how to deal with the outward behavior. Thus, studying student behavior was professionally useful, but we knew intuitively that behavior was only part of a much larger picture. Deep down, we could never be sure if students learned because of our efforts, or despite them. We accepted this blind spot as a limitation of our profession.

Perhaps a more serious issue is that the study of behavior can lead us to only a partial diagnosis and treatment of many complex learning problems that we've handled rather ineffectively. These include dyslexia, attention disorders, motivation, and forgetting. Schools tend to be most successful with motivated students of at least average ability who come from secure homes and can function reasonably well without much teacher assistance. They are less successful with students who don't fit this profile.

A Medical Analogy

The medical profession, too, operated at the level of professional folklore for most of its long history. Doctors weren't very effective at treating health problems that were much more serious than those the body's own recuperative powers could heal with time and rest. The romantic vision of the pioneer family doctor is of a caring and sometimes helpful person who could correctly diagnose an illness, but only sit helpless at the bedside as the patient died. Worse, earlier well-meaning doctors actually hastened their patients' deaths because they didn't realize that they should wash their hands and sterilize their instruments before treating a patient.

As medical knowledge improved, doctors reached the point where they could explain the cause of an illness to a patient, but still not be able to offer a cure. Later they discovered a cure, but it didn't always work. Finally, they could nearly always successfully treat the illness that had resulted in almost certain death a few decades earlier.

The medical profession's move from the careful but often ineffectual observation of the patient's body and behavior to the successful diagnosis and treatment of complex health problems began when it developed sound theories and the research skills and technologies needed to study the structure and biochemistry of the body and its organs.

If the medical profession had waited for the cure to suddenly appear before it did anything about understanding the illness, patients would still be waiting for miracle cures. Each stage of the progression of diagnosis and treatment, including the errors made along the way, was legitimate for its time and necessary for our advancement to the next stage.

Our Professional Challenge

The education profession is now approaching a crossroads. We can continue to focus our energies on the careful observation of external behavior—a course that may be adequate for managing relatively mild learning disorders—or we can join the search for a scientific understanding of the brain mechanisms, processes, and malfunctions that affect the successful completion of complex learning tasks.

Getting involved in the exciting developments in the brain sciences is an important step for educators, even though the educational applications of much of this research aren't yet clear (e.g., to know why language and attention problems are more prevalent in boys than girls doesn't really solve the problem of how to teach students). Understanding brain mechanisms and processes adds an exciting dimension to our thoughts about our profession. Few things fascinate us more than our own cognitive processes.

It's true that you don't have to know how an internal combustion engine is put together in order to drive a car, but you ought to have a functional understanding of the engine if you're going to sell cars, and you must have a technical understanding of it if you're going to repair cars. By analogy, the education profession will have to decide if its knowledge of the human brain ought to be closer to the level of those who merely use their brain, those who give advice on how best to use it, or those who repair malfunctions of the brain. Only through our knowledge of the research and our profession's own experimental fumblings will we begin to discover useful applications of brain theory and research. Knowing why generally leads to knowing how to.

Many educators lack the natural science background to understand cognitive science research, let alone to participate in it, or to create a curriculum around it. Because our profession's orientation has long been in the social and behavioral sciences, teacher education students rarely do much academic work in biology, chemistry, and cognitive psychology. This preservice focus was perhaps appropriate within the traditional view that classroom teaching focuses on negotiated activity in a group setting. The educationally significant new developments in brain theory and research suggest, however, that the amount of natural science in our professional preparation must increase—at both the preservice and inservice levels.

Can a profession whose charge is defined by the development of an effective and efficient human brain continue to remain uninformed about that brain? If we do remain uninformed, we will become vulnerable to the pseudoscientific fads, generalizations, and programs that will surely arise from the pool of brain research. We've already demonstrated our vulnerability with the educational spillover of the split-brain research: the right brain/left brain books, workshops, and curricular programs whose recommendations often went far beyond the research findings. If we can't offer informed leadership on the complex educational issues arising from current brain theory and research, we can expect that other people—perhaps just as uninformed as we are—will soon make decisions for us.

Our profession is at the edge of a major transformation. We can expect a marked increase in scientific knowledge about our brain and its processes. We can expect a similar increase in our patrons' awareness of such developments, because the mass media generally report and discuss such developments. Think about what we knew about our brain 20 years ago and compare it with what we know today; then project our current level of understanding 20 years into the future, when today's kindergarten student might be a beginning teacher.

Imagine the shape of our nation's health if the medical profession had, by default, been content to remain at the level of folklore and home remedies when it was at a similar point of decision.1

How Our Brain Studies Itself

Our brain has long contemplated itself, and it is rapidly moving toward understanding itself. This chapter began with the suggestion that our brain is awesomely complex, but our brain is also elegantly simple. Let's turn now to the search for that elegant simplicity.

The Scientists Who Study the Brain

The scientists who study brain mechanisms and processes approach their task from one of two general directions: from the bottom up or from the top down.

In studying our brain from the bottom up, the researcher focuses on the workings of small units—individual cells, or small systems of cells within more complex systems. This research perspective argues that understanding the basic units of a system is essential to understanding the entire system. Scientists who study brains from the bottom up are generally called neuroscientists, and many of them specialize in the study of a single cellular brain mechanism or process. Neuroscience has become a major research field during the past 25 years because researchers have developed the technology needed to study the brain's tiny and highly interconnected cells.

Studying our brain from the top down means that the researcher/scholar focuses on complex cognitive mechanisms, functions, or behaviors, such as movement, language, and abstract analysis. Cognitive psychology, linguistics, physical anthropology, philosophy, and artificial intelligence are some of the fields that use this broader approach. The top-down approach to brain study developed before the bottom-up approach because it was initially more tolerant of logical inference and speculations that weren't strongly supported by experimental evidence. Without the research technology to monitor cellular activity, top-down researchers/scholars had to infer brain activity from external behavior and brain malfunctions.

Because a brain is such a complex organ, most brain researchers/scholars focus their study at one level. Klivington (1986) uses a computer analogy to suggest three legitimate levels for understanding the development and operation of complex systems. In his analogy, the top level is software, and understanding a computer at this level means knowing how to write computer programs. The intermediate level is logic circuitry, the electrical hardware of information processing; understanding a computer at this level means knowing how to design logic circuitry and computer hardware. The bottom level is the solid-state physics of semiconductors in the component transistors, and a good background in physics is crucial to understanding the computer at this level.

This analogy suggests that a computer programmer, a circuit designer, and a solid-state physicist can each claim to know how a computer works without knowing much of what the other two specialists know. Similarly, a philosopher, a psychologist, and a neuroscientist all understand how a brain works, but at three different levels of understanding (one is reminded of John Saxe's 19th century poem of the six blind men and the elephant, each defining the elephant on the basis of his limited tactile exploration of it).

We thus have an organ and its functions that can be studied and understood at multiple levels. But it's difficult to know at what level something like conscious thought emerges from the movement of molecules within our brain. Think of the two atoms of hydrogen and one atom of oxygen that make up a water molecule, and then ask yourself how many molecules you would have to combine to achieve wetness, an important property of water. The abstract concept of consciousness is perhaps similar to wetness, in that it emerges in the system only when enough related molecular activity occurs in relevant neural networks. But at this point, scientists don't know for sure how much molecular activity in our brain is enough to create conscious thought, or where it occurs.

How Scientists Study a Brain

If our brain's awesome complexity hindered neuroscience research, its elegant simplicity enhanced it. A human brain has to be simple, adaptable, and predictable to function continuously for upwards of a hundred years. To learn more about the brain, scientists had to discover how to perform intricate studies that would provide solid information on our brain's most basic operations: the normal and abnormal actions of a single neuron, the synchronized actions of networks of neurons, and the factors that trigger neuronal activity.

Thus, scientists developed laboratory procedures and brain-monitoring technology that could (1) collect electrochemical data from individual neurons and widespread neural networks, (2) summarize and interpret the relevant data and ignore the rest, and (3) graphically report neural activity in a form that researchers/scholars could understand. This search for useful information led scientists to study the brains and behaviors of animals with simple neural systems. They also studied people and primates with and without brain damage or mental illness, and developed brain-imaging technologies that could take them beyond observable behavior.

Animal Research

Our brain's complexity and general inaccessibility limit its direct use in the study of basic neural functions. Fortunately, basic neural mechanisms and processes are similar in all animals, so neuroscientists searched for animals with simple nervous systems that would be relatively easy to study with the microelectrodes they had developed to record the activity of single neurons. Some of the invertebrates proved especially useful, because they have only a few thousand neurons that are all much larger than human neurons. Invertebrates also have a limited behavioral repertoire and simple neural networks that are identical in all animals in the species. The marine snails Aplysia and Hermissenda have proved especially useful in this research, providing much of our knowledge about changes that occur in connecting neurons during learning and memory formation.2

Neuroscientists have also used a wide variety of other animals, such as squid, rats, cats, rabbits, and monkeys, that have brain mechanisms especially suited to a specific research problem. For example, major discoveries emerged when researchers used rabbits to study the role of the cerebellum in procedural memory processes, cats to study the structure of the visual cortex, and rats to study the effect of the environment on brain development.

The animal rights movement is critical of the use of animals in such research. Animal studies, however, have provided most of what we know about basic brain mechanisms and processes. This information has helped improve the lives of both humans and animals (albeit not the animals used in the research), and brain researchers argue that they currently have no other avenue to this kind of information. Animal rights activists argue that a perceived human need doesn't morally justify the killing of animals. So it's a real social dilemma. Each side argues its case persuasively, and the issue doesn't lend itself to simple compromise.

People with Brain Damage or Mental Illness

The dramatic bottom-up discoveries of cellular changes that occur during learning in a marine snail didn't directly help us to understand how children learn the multiplication tables. Although the individual neurons of snails and humans are remarkably similar, a snail's nervous system isn't similar to a human brain. Thus, researchers with a top-down interest in brain mechanisms and processes needed to discover a research approach that would allow them to study our brain directly, in all its complexity.

The obvious experimental limitations they faced forced them to focus their studies on available research subjects, generally people with brain damage or mental illness. War and accident injuries suffered by otherwise healthy young people provide the best subjects for this kind of research. People with such problems tend to allow researchers to study them in the hope that the discoveries will improve their plight.

The basic research design is straightforward: (1) identify the nature and general location of the subject's brain malfunction, (2) compare the subject's behavior to that of people without brain damage, in an attempt to link any abnormal behavior to the malfunctioning section of the brain, and (3) if possible, do a postmortem examination of the subject's brain to test the inference.

This approach has research design problems. The unpredictable availability of subjects means that researchers can only rarely conduct the controlled experiments critical to scientific research. Further, it's difficult to precisely locate a specific function in our brain because neurons are synaptically connected to thousands of other neurons in very complex, interacting networks. Still, researchers using this approach made some remarkable discoveries, although they often waited years to get enough subjects with the same problem to adequately study a function (or malfunction). Educators today are perhaps most acquainted with Roger Sperry's split-brain research (for which he won a 1981 Nobel Prize). The learning styles movement and many right-brain/left-brain books and workshops emerged from this research.

The split-brain research on humans had its beginnings in related research on the two hemispheres of cat brains. The two cerebral hemispheres that make up most of the mass of our brain are connected by the corpus callosum, a large group of neural fibers that allow the two hemispheres to communicate with each other and to collaborate on many complex cognitive functions. The success of the cat research encouraged doctors to cut the corpus callosum of patients suffering almost continuous epileptic seizures, in the hope that this radical surgical procedure would reduce the effects of the seizures. The procedure was effective in that it reduced the seizures, and it didn't seem to negatively affect the patient's mental or emotional life.

Brain researchers saw the split-brain patients as a rich source of valuable experimental data. The severed corpus callosum left each patient with practically no communication between the brain hemispheres, so for the first time in the history of brain research, scientists had an opportunity to discover the hemispheric location of specific cognitive processes.

The two hemispheres divide the visual information coming into each eye. The information from the left side of the visual field is sent to the right hemisphere, and vice versa. The researchers developed imaginative techniques that allowed them to send visual information into one hemisphere while blocking the information that would go into the other hemisphere. By asking probing questions and observing the subject's behaviors, the researchers believed they could discover which brain hemisphere normally processes specific cognitive functions. They also created related tests for hearing and touch.

Over the years, several dozen people with split brains have been extensively studied in increasingly sophisticated tests, and much of our early understanding of the division of cognitive functions between the hemispheres came from the study of these subjects.3 The literature on learning and memory often discusses an interesting case that came out of this type of research. The unfortunate man is known as H.M., and his story is another example of one person's tragedy resulting in a major increase in our understanding of an important brain mechanism.

In 1953, a surgeon removed the entire hippocampus of H.M. in a radical attempt to treat his epilepsy (the hippocampus is a wishbone-shaped structure that straddles the brainstem). The procedure has never been repeated, but it left H.M. with a permanent inability to form new factual memories, although he can remember things that occurred before his operation. He can hold information in his short-term memory for up to ten minutes, but he can't transfer it into long-term memory. For example, he has to be reintroduced to people who have left the room for 15 minutes or so. He has been extensively studied since the operation, and much that we know about the important role the hippocampus plays in long-term memory formation comes from studies of H.M.

Laboratory Experiments with Normal Primates and Humans

As cognitive scientists have learned more about our brain, they've developed increasingly sophisticated laboratory studies that allow them to study specific brain functions in normal humans and primates. For example, by carefully observing eye movements during the execution of a task, they can infer how the brain processes various spatial elements in our environment, and by timing response rates, they can determine how a brain processes various temporal elements, such as sequences.

The Stroop Test is an interesting example of the imaginative tests that cognitive scientists have developed—in this case to discover how rapidly and effectively our brain responds to conflicting information. The researcher asks the subject to read aloud a list of color names that are printed in a different color (e.g., the word red is printed in blue ink) or to say the color the word is printed in rather than the word itself.
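
For readers who want to feel the effect themselves, here is a minimal, hypothetical sketch of a Stroop-style task in Python (an illustration, not a research instrument). It assumes a terminal that understands ANSI color codes, and it times typed rather than spoken answers, so the measurements are only suggestive:

```python
# A toy Stroop-style task (illustrative only; real studies use calibrated
# presentation and timing equipment). Each trial prints a color word in a
# randomly chosen ink color and asks the subject to name the ink.
# Incongruent trials, where word and ink differ, typically slow responses.
import random
import time

ANSI = {"red": "\033[31m", "blue": "\033[34m", "green": "\033[32m", "yellow": "\033[33m"}
RESET = "\033[0m"
COLORS = list(ANSI)

def run_trials(n=8):
    times = {"congruent": [], "incongruent": []}
    for _ in range(n):
        word, ink = random.choice(COLORS), random.choice(COLORS)
        kind = "congruent" if word == ink else "incongruent"
        start = time.monotonic()
        answer = input(f"Name the ink color: {ANSI[ink]}{word.upper()}{RESET}  ").strip().lower()
        if answer == ink:  # keep only correct responses
            times[kind].append(time.monotonic() - start)
    return times

if __name__ == "__main__":
    for kind, values in run_trials().items():
        if values:
            print(f"{kind}: mean {sum(values) / len(values):.2f} s over {len(values)} correct trials")
```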

Brain-Imaging Technology

As productive as these early studies were, it was obvious that brain researchers would eventually have to develop technologies and procedures that could directly represent (image) the activity of a normal, active human brain. The rapid development of computer technology during the past two decades made brain-imaging machines possible and revolutionized brain research.

Brain-imaging machines gather and rapidly process the vast amounts of electrochemical data continuously generated by our brain, and so take researchers well beyond observable behavior, two-dimensional black-and-white x-rays, and EEG reports—and into the world of three-dimensional color TV graphics with high spatial and temporal resolution. Imaging machines can now focus to within one millimeter of a specific slice of brain tissue, much as an optical camera can focus on a specific plane in the photographed scene. Using imaging machines, researchers need only a few hours to gather from the brain the same type of data that formerly took 20 years of inferential laboratory work with nonhuman primates (Blakeslee 1993). New technological wonders and brain properties to explore will continue to emerge in this field.4

The current brain-imaging technology focuses on three elements of the organization and operation of our brain: (1) the chemical composition of cells and neurotransmitters, (2) the electrical transmission of information along neuronal fibers and the magnetic fields that brain activity generates, and (3) the distribution of blood through the brain as it replenishes energy used in electrochemical activity.

Chemical Composition

The CAT scan (short for computerized axial tomography) and MRI (magnetic resonance imaging) are examples of technologies that create graphic, even three-dimensional, images of the anatomical structures of our body/brain, thus locating features and malfunctions. The CAT scan uses multiple x-rays to provide the depth-of-field and clear cross-sectional views of features that simple x-rays don't provide. These multiple x-rays respond to the density of the tissue being scanned, showing dark shadows for denser elements, such as bones or tumors, and various shades of gray for the soft tissue that constitutes our brain. The MRI, with its focus on soft tissue, provides the reverse image. It responds to chemical differences in the composition of various brain and body tissues (while ignoring bones, moving blood, etc.), and thus it provides a clear image of the chemical composition of our brain. Fast MRI is a remarkable new development that allows researchers to observe brain activity on a TV monitor while the subject is carrying out a cognitive action.

Electrical Transmission

The EEG (electroencephalogram) has been in use for over half a century, reporting patterns in the electrical transmission of information within an active brain. Translating a score of squiggly lines on a moving sheet of paper into an accurate vision of brain activity is difficult, however, and the convolutions in the cerebral cortex make it difficult to pinpoint the exact source of the electrical activity. The SQUID (superconducting quantum interference device) is a major advance in this technology that uses the more easily located small magnetic fields produced by the brain. The BEAM machine (brain electrical activity mapping) represents another major advance in this technology. It records the electrical activity from more precisely defined areas and then uses color gradations to represent positive and negative levels in the analogous location on a TV screen's easily interpreted graphic representation of the cerebral cortex.

Blood Flow Patterns

PET (positron emission tomography) uses radioactive materials to monitor the unequally distributed patterns of the pint and a half of blood that flows through our brain every minute. It thus traces sequential changes in brain energy use as various parts of our brain are activated. PET research has provided some recent dramatic advances in our knowledge of how and where our brain processes a series of events.

Brain-imaging machines are expensive, so their use thus far has been limited to medical research and diagnosis facilities and university science and psychology departments. Computer technology tends to become cheaper and more powerful over time, however. Researchers in university education departments have just begun to use electrical imaging technologies, and we can expect this use to increase in the years ahead, as the technology advances. Eventually, K-12 schools will probably use adaptations of these technologies in the diagnosis and treatment of learning problems, as the graduate students involved in such university research move into jobs in school districts.

The use of this technology in schools would certainly mark another giant step over the professional folklore line, but with it will come many of the problems the medical profession faced after it crossed that line: increased expectations from patrons, new ethical issues, and the threat of malpractice suits.

But then, we can't really go back either. We will have to adapt our profession to the inevitable increase in our understanding of the brain mechanisms and processes that define our profession.

New Brain Hypotheses and Theories

Successful research requires a strong theoretic base that explains the relationships among the various elements in a researched phenomenon. In the past few years, the cognitive sciences have seen a flurry of activity in the development of theories that use biological processes to explain complex cognitive functions—theories that scientists can test using the research technologies described earlier. This theoretical work, along with developments in genetics, may spark a Century of Biology, just as Albert Einstein's theories sparked advances in physics that have dominated the 20th century.

The new biologically based brain theories focus on the developmental relationship between a brain's ancestors and its current environment: the "nature versus nurture" issue. Our profession has tended to think of the nurture side as dominant, but these new theories argue that nature plays a far more important role than previously believed—or that the dichotomy itself is now an irrelevant issue. They also suggest that many current beliefs about instruction, learning, and memory are wrong. These theories will become controversial because they will require reconceptualizations of such concepts as parenting, teaching, learning, intelligence, identity, free will, and human potential. Further, some people may misuse the theories to support racist, sexist, and elitist beliefs. Certainly, those who reject Darwinian evolution will be disturbed by the evolutionary base of the new theories.

When these brain theories and their strong supporting evidence shortly reach the awareness of the general public, educational leaders will be asked to comment on them. The thrust of these theories raises fundamental issues about our professional assignment, so we had better understand them.

This discussion will focus on the work of two Nobel laureates whose work in brain theory has attracted much attention in the scientific community—and will probably attract much controversy as it moves into the educational community and general public awareness.

Francis Crick: The Astonishing Hypothesis

Francis Crick and James Watson collaborated in the 1953 discovery of the molecular structure of DNA, a form of memory that passes genetic information about how to construct and maintain a body from one generation to the next (Crick, Watson, and Maurice H. F. Wilkins received a Nobel Prize in 1962 for their work with DNA). Crick subsequently joined the staff of the Salk Institute for Biological Studies, where he shifted his focus to understanding such things as the memories that a brain processes in its lifetime. He was especially interested in the related issues of consciousness and free will—how and where we are aware of what we know and what we do. He felt that consciousness would have to be an essential element of any global brain theory. Thus, in his scientific career he has moved from identifying the DNA molecule in our cells that directs life to identifying the networks of cells in our brain that give conscious meaning to life.

What has emerged from Crick's work is not a theory, but a hypothesis that he hopes will guide scientists in the development of sound brain theories that will spark further research. Crick has published his work in a fascinating but controversial book for general readers who have an interest in the cognitive sciences, The Astonishing Hypothesis: The Scientific Search for the Soul (Crick 1994). His astonishing hypothesis is that everything that constitutes who each of us is as a human being involves nothing more than the behavior of a vast assembly of nerve cells and their related molecules. Everything includes all of our interior states—our joys and sorrows, memories and ambitions, our loves and hatreds, our sense of personal identity and free will. It's an astonishing hypothesis because it goes against the feeling that many, if not most, people have: that we're certainly more than a pack of functioning neurons, that we also have a disembodied mind, spirit, self. And it certainly goes against many religious beliefs.

Crick used this hypothesis to guide his biological search for the soul (or consciousness) within neural networks. He focused his initial efforts on our visual system, the window to our soul. It's the brain system that scientists understand best, and Crick believes that if scientists can identify the neural systems that collaborate to create visual awareness (or consciousness), they can then move on to other cognitive processes, such as hearing and touch, in the development of a global, biologically based theory of consciousness.

Over the course of his book, Crick proposes what he considers to be a plausible model of the visual system that could explain visual awareness—how and where our brain knows (and attends to) what it sees. The thalamus, in the center of our brain, appears to play an especially important role. It's the relay center between our sense organs and the cortex, the large, folded sheet of neurons on the top of our brain that processes and remembers the objects and events we experience. Innate and learned biases toward certain kinds of visual information, and reverberating circuits between the thalamus and cortex, help to identify the important elements in the current visual field and to activate a synchronized firing pattern among the various networks that process the elements. This process holds the important information within our attentional and short-term memory systems, ignores the less important information, and thus seems to create the visual awareness we experience.

Suppose that I've gone to the airport to meet someone. I know what flight he's on and that he's a 40-year-old man who is six feet tall and will be wearing glasses and a red sweater. This descriptive information will focus my attentional mechanisms to those properties, and I'll attend only to incoming passengers who fit at least some of them. What occurs in my brain is thus a mix of (1) my direct perception of the man himself and (2) internal cognitive processes that prime certain networks to fire more easily than they normally would in a crowd of strangers.
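
A toy sketch can make the priming idea concrete (this is an illustration of the general notion, not Crick's model); the feature names, priming boost, and attention threshold are all invented for the example:

```python
# Hypothetical priming sketch: features in the expected description get an
# extra weight, so a passenger who matches several of them reaches the
# attention threshold while the surrounding crowd does not.

EXPECTED = {"male", "about_40", "six_feet", "glasses", "red_sweater"}
PRIMING_BOOST = 2.0     # assumed weight for a primed (expected) feature
BASE_WEIGHT = 1.0       # weight for any other observed feature
THRESHOLD = 6.5         # assumed activation needed to capture attention

def activation(observed_features):
    """Sum the weights of the features observed on one passenger."""
    return sum(PRIMING_BOOST if f in EXPECTED else BASE_WEIGHT
               for f in observed_features)

passengers = {
    "passenger A": {"female", "about_30", "blue_coat"},
    "passenger B": {"male", "about_40", "glasses", "red_sweater"},
    "passenger C": {"male", "about_60", "six_feet", "hat"},
}

for name, features in passengers.items():
    score = activation(features)
    print(f"{name}: activation {score:.1f} -> {'ATTEND' if score >= THRESHOLD else 'ignore'}")
# Passenger B (8.0) crosses the threshold; A (3.0) and C (6.0) do not.
```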

Gerald Edelman: The Theory of Neuronal Group Selection

Gerald Edelman shared with Rodney Porter the 1972 Nobel Prize in physiology or medicine for a major discovery about how our immune system operates. Like Francis Crick, he then turned his attention to our brain. He is currently the Director of the Neuroscience Institute and Chairman of the Department of Neurobiology at the Scripps Research Institute.

Edelman's move from our immune system to the brain isn't as strange as it may seem. Our immune system is a sort of loose brain, in that most immune cells float free in our body, while our brain's neurons function within a highly interconnected web. Both systems are functionally similar, however, for both are highly integrated systems that recognize and respond to a wide variety of potentially helpful and hurtful stimuli. From the sensory information that reaches our skin's surface, our brain creates an internal mental model of external objects and events, and then responds appropriately to friend and foe. Our immune system examines the shapes of antigens that invade our body, and then destroys those that pose dangers to our body.

Edelman won the Nobel Prize for his discovery that the immune system doesn't operate through an instruction/memory model, as had been thought, but rather through evolutionary natural-selection procedures. The earlier belief was that generic antibody cells learned to recognize harmful antigen invaders, such as bacteria and viruses. The immune system then destroyed the antigens, and the system remembered the shape of the invader in the event of subsequent invasions. Edelman, however, found that through natural-selection processes occurring over eons of time, we are born with a vast number of specific antibodies, each of which recognizes and responds to a specific type of harmful invader that shares our environment. If we lack such a natural immunity to a specific invader (such as the AIDS virus), we may die if infected. Our immune system can't learn how to destroy the invader; it simply has or hasn't the capacity at birth.

Edelman then studied our functionally similar brain to see if it also operates principally on natural selection, rather than on instruction and learning. His controversial theory, the theory of neuronal group selection (or neural Darwinism, as it's more commonly called), argues that our brain does operate on the basis of natural selection—or at least that natural selection is the process that explains instruction and learning.

Edelman's theory currently appears to be the most completely developed biological brain theory, so the remainder of this chapter will focus on it. Edelman developed his theory through four books published since 1987. The latest, Bright Air, Brilliant Fire (Edelman 1992), presents the most complete and informal explanation of his complex theory, so it is the best resource for educators with a limited background in science (though it is challenging reading material).5

A New Brain Model

We tend to use simple models to help us understand complex phenomena, but the model we choose can sometimes hinder our understanding. The computer is the prevailing model of our brain, and an appealing one, but Edelman (along with other brain theorists) argues that it's an inappropriate model because a computer is developed, programmed, and run by an external force, and our brain isn't. (Terms such as teacher and parent come to mind as the programmers for our brain.) A computer model biases our thoughts toward filing and operating systems that differ significantly from the way our brain processes information. For example, most brain memories appear to be stored in the same locations that carry out current operations. Further, the powerful role that emotion plays in regulating brain activity, and the preponderance of parallel (rather than linear) processing in our brain, suggest to Edelman that a useful model for our brain must come out of biology, not technology.

Edelman suggests a better model: that the electrochemical dynamics of our brain's development and operation resemble the rich, layered ecology of a jungle environment. A jungle has no external developer, no predetermined goals. Indeed, it's a messy place characterized more by organic excess than by goal-directed economy and efficiency. No one organism or group runs the jungle. All plants and animals participate in the process, each carrying out a variety of ecological functions. A tree is a single organism, but it also participates in many symbiotic activities with other organisms (e.g., insects, birds, vines, and moss). It doesn't develop its limbs as a nesting site for birds, but birds use the limbs for that purpose.

Further, the jungle environment doesn't instruct organisms how to behave in an ecologically appropriate manner, for example, by teaching trees how to position their limbs and roots to get sunlight and soil nutrients. It's more a matter of natural selection, in an evolutionary sense. All trees have the innate capacity to reach the sun and soil nutrients, and those that succeed in doing so will thrive and reproduce. The others die, and other organisms take their place. An environment doesn't tell its organisms how to change so that they will increase their ability to survive. Evolution works by selection, not by instruction. The environment selects from among the built-in options available to it; it doesn't modify (instruct) the competing organisms.6

From Model to Brain

So it is with our brain, Edelman argues. Think of the vast number of highly interconnected neural networks that make up our brain as the neural equivalent of the complex set of jungle organisms that respond variously to environmental challenges. The natural selection processes that shape a jungle over long periods of time have also shaped our brain over an extensive period, and they shape our brain's neural networks over our lifetime.

Our brain is made up of tens of millions of relatively small basic neural networks, and just as each type of immune antibody responds to a specific environmental antigen, so each sensory network processes a specific element of the external world—a single sound, a diagonal line. Various interconnected combinations of these basic neural networks process more complicated, related phenomena—from sounds to phonemes to words, from lines to triangles to pyramids.

Thus, we have a modular brain, in that a relatively small number of standard, nonthinking components combine their information to create an amazingly complex cognitive environment. For example, when we observe a red ball rolling along a table, our brain processes the color, shape, movement, and location of the ball in four separate brain areas. It's not yet clear how the complex communications among four such areas result in our brain's creation of a unified picture of a rolling red ball—but then, it's also not clear how the members of a jazz quartet communicate with one another as each improvises on a simple theme, blending individual efforts into unified song.

The theory of neural Darwinism argues that genetic processes that evolved over eons of time create a generic human brain that is fully equipped at birth with the basic sensory and motor components a human needs to function successfully in the normal physical world. Our species needs to hardwire its basic survival networks (e.g., circulation, respiration, reflexes), but individuals also need the flexibility of adaptable or "softwired" networks to be able to respond to specific environmental challenges (e.g., to learn French, to drive a car).

An infant brain doesn't have to learn how to recognize specific sounds and line segments; such basic neural networks are operational at birth. We don't teach a child to walk or talk; we simply provide opportunities for adaptations to an already operational process.

Gazzaniga (1992) argues that all we do in life is discover what's already built into our brain. What we see as learning is actually a search through our brain's existing library of operating basic networks for the combinations of those that best allow us to respond to the immediate challenge (much like college students in a library select and synthesize materials from various existing sources to write their term papers).

On the other hand, our DNA couldn't possibly encode our brain's networks for every possible combination of sights, sounds, smells, textures, tastes, and movements that our brain can process, so instead it encodes a basic developmental program that regulates how neurons will differentiate and interconnect. The fetal brain thus develops general areas dedicated to various basic human capabilities within a certain range of variation, such as our ability to process language. Infant brains are born capable of speaking any of the 3,000+ human languages, but they're not born proficient in any of them.

When infants begin to interact with the local language, their brain can already recognize the sounds of the language. The larger neural networks that process the specific language(s) they'll speak form as the various combinations of sounds in the language(s) occur frequently. The amount of use selectively strengthens and weakens specific language networks. The networks for sounds that aren't in the local language may atrophy over time due to lack of use, or they may be used for other language purposes. Scientists call this process neural pruning. We can see its results in the difficulty that most older Japanese adults have with the English l and r sounds, which aren't in the Japanese language. A Japanese adult who learned English as a child would have no trouble with the two sounds.
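
A toy simulation can make this selection-by-use idea concrete (an illustration of the general notion, not Edelman's model); the sound inventory, strengthening rate, decay rate, and pruning floor are all invented for the example:

```python
# Hypothetical pruning sketch: every sound network starts at the same
# strength, networks the local language uses are strengthened with each
# exposure, unused networks slowly weaken, and any network that falls
# below a floor is treated as pruned.

STRENGTHEN = 0.02    # assumed gain per exposure to a sound in use
DECAY = 0.001        # assumed loss per step for an unused sound
PRUNE_FLOOR = 0.5

sounds = ["pa", "ka", "ma", "ra", "la"]          # invented inventory
local_language = {"pa", "ka", "ma", "ra"}        # "la" never occurs locally
strength = {s: 1.0 for s in sounds}

for step in range(1000):                         # early childhood exposure
    for s in sounds:
        if s in local_language:
            strength[s] += STRENGTHEN
        else:
            strength[s] = max(0.0, strength[s] - DECAY)

for s in sounds:
    status = "pruned" if strength[s] < PRUNE_FLOOR else f"strength {strength[s]:.1f}"
    print(f"{s}: {status}")
# The four local sounds end up strong; the unused "la" network has decayed
# below the floor, a rough analogue of the pruned l/r distinction.
```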

To those who argue that they taught their child to speak a language, the theories ask, in effect, "And when and how did you teach your child your native accent, prepositional phrases, and the rules for forming the past tense?" Children master most of the complexities of grammar with practically no explicit instruction from their parents, although extensive parent-child verbal interactions obviously provide an important environment for the effective development of a language.

Thus, learning becomes a delicate but powerful dialogue between genetics and the environment: the experience of our species from eons past interacts with the experiences we have during our lifetime. Our brain is powerfully shaped by genetics, development, and experience—but it also then actively shapes the nature of our own experiences and of the culture in which we live. Stimulating experiences create complex reciprocal connections among neural networks. A limited sensory input can thus trigger a wide range of memories, but such memories can also trigger internal fantasies and external explorations.

Parenting and teaching are probably something like facilitating agents, but how the new brain theories will eventually reconceptualize such concepts is not yet clear. Hubel (1988) certainly underscored the important role that facilitating agents play in early life experiences when he studied the development of the visual cortex in kittens. Kittens reared in a research environment that lacked certain line orientations (such as vertical or horizontal lines) suffered a dramatic decline in the viability of the neural networks that normally process the missing orientation—and so kittens reared without vertical lines in their early experience tended to walk into chair legs.

Technology as a Solution to Biological Problems

Unfortunately, biological evolution proceeds at a very much slower pace than cultural evolution, so we're forced to grapple with current social and environmental issues using a brain that biological evolution has tuned to the far different cognitive challenges of 30,000 years ago, when physical dangers were signaled by rapid changes in the environment, not by gradually developing problems (e.g., pollution, overpopulation, acid rain).

Part of the difficulty is that evolutionary modifications occur within the existing biological system. Evolutionary processes don't dismantle an existing mechanism, such as our brain, and start again from scratch. Evolutionary modifications may therefore differ considerably from what intelligent engineers might have developed had they redesigned our brain from scratch to meet our current needs (Churchland 1992).

We've compensated by seeking technological solutions to our problems. In effect, we've added a layer of technological brain (e.g., autos, books, computers, drugs) outside our skull—a layer that continually interacts with our internal biological brain. But each technological advance also creates new human problems. Our profession will be challenged to reconceptualize formal education as new brain theories evolve, and then to discover how best to reset our brain during its development, so that humans might one day develop sound biological solutions to many technological problems that now seem to defy solution. Chapter 6 explores this issue.

The Biological Nature of Consciousness

Francis Crick focused his attention on consciousness, and neural Darwinism also seeks to define the biological nature of consciousness, an important but formidable challenge for any brain theory.

Edelman divides consciousness into two levels. Primary consciousness is a state of being mentally aware of objects and events currently in the immediate environment. But these mental images aren't accompanied by any sense of being an organism with a past or future. An animal with primary consciousness sees a room the way a beam of light illuminates it—with an awareness of only the illuminated areas, and with no ability to connect what it sees to other areas. Edelman calls this level of consciousness "the remembered present." Primary consciousness permits the brain to create a complex mental scene that connects the immediate perceptions of a situation to the parts of the brain that process such survival values as food, light, and warmth—and so it takes a subjective (i.e., eat-or-be-eaten) view of everything it confronts.

Higher order consciousness is perhaps a distinctly human condition that allows us to build on primary consciousness, to go beyond it to recognize our own personal actions and values. It uses language and other symbols in processes such as reflection and generalization that can emotionally detach us from the here and now, and lead us into purely imaginative mental scenes. Higher order consciousness suggests a linking of the brain areas that process primary consciousness with the areas of symbolic memory and conceptualization: it adds past and future to the present, and a sense of the inner self to the world out there.

Thus, memory combines a built-in species bias for such values as food, warmth, and survival with current short-term events. Long-term memory is an adaptive (but currently ill-understood) cognitive technique that operates within a single lifetime. It is a necessary capability for directing conscious behavior from within, for moving beyond pure stimulus-response behavior. Further, laws and traditions become cultural memories that can last beyond a single lifetime—perhaps an early first step in genetic change.

Seeking Educational Applications

Finding practical educational applications in Crick's hypothesis and Edelman's theory is difficult. At this point, they are basically science research agendas. Applications to educational policy and practice will come later, after the kind of study that leads to greater understanding.

Edelman's model of our brain as a rich, layered, messy, unplanned jungle ecosystem is especially intriguing, however, because it suggests that a junglelike brain might thrive best in a junglelike classroom that includes many sensory, cultural, and problem layers that are closely related to the real-world environment in which we live—the environment that best stimulates the neural networks that are genetically tuned to it.

The classroom of the future might focus more on drawing out existing abilities than on precisely measuring one's success with imposed skills, encourage the personal construction of categories rather than impose existing categorical systems, and emphasize individual, personal solutions to an environmental challenge (even if inefficient) over the efficient group manipulation of the symbols that merely represent the solution. Educators might then view classroom misbehavior as an ecological problem to be solved within the curriculum, rather than aberrant behavior to be quashed. The curriculum might increase the importance of such subjects as the arts and humanities, which expand and integrate complex environmental stimuli, and reduce the importance of basic skills and forms of evaluation that merely compress complexity.

Such a brain-based curriculum might resemble some current practices, but it might differ considerably from what schools are now doing. It's interesting to muse on such widely acclaimed developments as thematic curricula, cooperative learning, and portfolio assessment. All require more effort from teachers than do traditional forms of curriculum, instruction, and evaluation. Is the appeal to educators that these approaches seem to be inherently right for a developing, junglelike brain, even though they require more professional effort and aren't nearly as economical and efficient as traditional forms?

For now, Crick and Edelman and their growing band of fellow brain theorists provide us with rich (and at times junglelike) book environments for professional reading and contemplation. The theories will continue to develop, and educational leaders must enter into the process now, or else biologists may well redefine our profession for us.

Endnotes

1 Patricia Smith Churchland emphasizes the need for brain researchers and philosophers (and educators) to broaden their perspective of brain/mind in her book Neurophilosophy: Toward a Unified Science of the Mind/Brain (Cambridge, Mass.: MIT Press, 1986). It's a fine introduction to neuroscience for philosophers, and to philosophy for neuroscientists. Howard Gardner's The Mind's New Science: A History of the Cognitive Revolution (New York: Basic Books, 1985) is an excellent, comprehensive account of the development of the cognitive sciences.

2 Susan Allport's Explorers of the Black Box: The Search for the Cellular Basis of Memory (New York: W.W. Norton, 1986) is a fascinating account that focuses on the human side of this extended research, the two principal researchers being Daniel Alkon and Eric Kandel. Memory's Voice: Deciphering the Brain-Mind Code (New York: HarperCollins, 1992) is Alkon's autobiographical account of the research.

3 Michael Gazzaniga, one of Roger Sperry's coworkers and a distinguished researcher in his own right, has written an informative, witty, and fascinating account of the 30-year period encompassing the split-brain research in The Social Brain: Discovering the Networks of the Mind (New York: Basic Books, 1985). "The Two Brains," the sixth segment of the PBS television series The Brain (available on videocassette through PBS Video, 1-800-344-3337), contains an extended discussion with Gazzaniga and others involved in this research.

William Calvin and George Ojemann's Conversations with Neil's Brain: The Neural Nature of Thought and Language (Reading, Mass.: Addison-Wesley, 1994) is a fascinating description of how neurosurgeons now treat serious epileptic cases, and what their work has taught us about language and memory.

4 Michael Posner and Marcus Raichle's Images of Mind (New York: Scientific American Library, 1994) is an informative, well-written, and nicely illustrated account of how brain scientists currently use imaging technology. It's written for general readers. Many high school and public libraries have the entire Scientific American Library.

5 The other three books Edelman wrote as he developed his theory are: The Remembered Present: A Biological Theory of Consciousness (New York: Basic Books, 1989), Topobiology: An Introduction to Molecular Embryology (New York: Basic Books, 1988), and Neural Darwinism: The Theory of Neuronal Group Selection (New York: Basic Books, 1987).

See Oliver Sacks, "Making Up The Mind," The New York Review of Books (April 8, 1993), pp. 42-49, for an excellent review of Bright Air, Brilliant Fire. Steven Levy's "Dr. Edelman's Brain," The New Yorker (May 2, 1994), pp. 62-73, is an informative profile of Edelman and his theory. Michael Gazzaniga's Nature's Mind: The Biological Roots of Thinking, Emotions, Sexuality, Language, and Intelligence (New York: Basic Books, 1992) and Robert Ornstein's The Evolution of Consciousness: The Origins of the Way We Think (New York: Prentice Hall Press, 1991) are informative nontechnical discussions of the general thrust of the current theoretical work.

6 William Calvin, "The Emergence of Intelligence," Scientific American (October 1994): 101-107, describes six processes that define Darwinian evolution and suggests how they relate to genetics, immunology, and brain development: (1) The biological process operates on patterns, such as DNA. (2) These patterns are copied. (3) Patterns must occasionally vary, such as through mutations or copying errors. (4) Variant patterns must compete to occupy some limited space. (5) The environment influences the relative reproductive success of the variants, and this is called natural selection. (6) The makeup of the next generation of patterns depends on which variants survive to be copied.
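For readers who find a computational analogy helpful, the short Python sketch below maps Calvin's six processes onto a bare-bones selection loop. The numbers, the fitness rule, and the pattern representation are all invented for illustration; the sketch shows the general Darwinian logic Calvin describes, not anything specific to genetics, immunology, or brain development.

```python
import random

random.seed(0)

def fitness(pattern):
    # (5) Natural selection: the "environment" favors some variants over
    # others; here, patterns whose numbers sum closer to an arbitrary target.
    return -abs(sum(pattern) - 50)

# (1) The process operates on patterns; start with a small random population.
population = [[random.randint(0, 10) for _ in range(10)] for _ in range(20)]

for generation in range(30):
    offspring = []
    for pattern in population:
        # (2) Patterns are copied...
        copy = list(pattern)
        # (3) ...and copies occasionally vary (a "mutation" or copying error).
        if random.random() < 0.3:
            copy[random.randrange(len(copy))] = random.randint(0, 10)
        offspring.append(copy)
    # (4) Parents and offspring compete for limited space, and (6) the makeup
    # of the next generation depends on which variants survive to be copied.
    population = sorted(population + offspring, key=fitness, reverse=True)[:20]

print(sum(population[0]), population[0])
```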

7. How Our Brain Adapts Itself to Its Environment
Our brain's maximum capabilities were genetically determined by its need to respond quickly and effectively to crisis conditions rather than by its need to respond to normal life challenges. Furnaces are designed on the same principle: models are guaranteed to function well during the coldest weather in a specific area, not just the average weather.

The basic genetic developmental pattern for our brain is thus quite simple and straightforward: (1) create an initial excess of cells and connections among related areas—in effect, temporarily wire up everything to everything, (2) use emotion, experience, and learning to strengthen the useful connections, and then prune away the unused and inefficient, and (3) maintain enough synaptic flexibility (commonly called plasticity) to allow neural network connections to shift about throughout life as conditions change and new problem-solving challenges emerge.
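A rough computational analogy may help here. The Python sketch below, with entirely made-up numbers and a toy "experience" function, walks through the same three-step sequence: overproduce connections, strengthen the ones that get used, prune the rest, and leave room for later change. It illustrates the logic only; it is not a model of actual neural development.

```python
import random

random.seed(1)

# Step 1: overproduce -- temporarily wire every unit to every other unit,
# each connection starting with a weak random strength.
units = range(20)
connections = {(a, b): random.uniform(0.0, 0.1)
               for a in units for b in units if a != b}

def experience(pairs, boost=0.1):
    """Step 2a: connections that actually get used grow stronger."""
    for pair in pairs:
        connections[pair] += boost

def prune(threshold=0.12):
    """Step 2b: connections that were never strengthened are pruned away."""
    for pair in [p for p, strength in connections.items() if strength < threshold]:
        del connections[pair]

# Repeated "experiences," each exercising a handful of connections.
for _ in range(100):
    experience(random.sample(list(connections), 5))
prune()
print(f"{len(connections)} of {20 * 19} initial connections survive pruning")

# Step 3: plasticity -- later demands can still re-create or strengthen
# a connection, even one that was pruned earlier.
connections[(0, 1)] = connections.get((0, 1), 0.0) + 0.1
```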

Consequently, redundancy and alternate systems abound in our brain: two hemispheres, paired amygdalas and hippocampi, several sensory systems with paired organs that respond to overlapping properties of the physical world, and complex neural systems that define and process multiple intelligences. Each such structure and system is itself powerful in the normal challenges it confronts, but these structures and systems can combine marvelously to solve very serious problems; to create new explanations, artifacts, and strategies; and to overcome terrible assaults.

The emerging brain theories discussed in Chapter 1 argue that innate factors play a more important role in determining our brain's capabilities than was previously believed. Conversely, our profession has historically and optimistically focused on nurturing factors that can increase our brain's capabilities. Both positions strongly suggest that children should choose their parents carefully—the former because of the genes that parents pass on, the latter because of the cultural environment that parents create. But both positions also recognize that neither nature nor nurture can exist without the other. It's like trying to determine which hand contributed most to the sound of hands clapping.

The major educational question to emerge out of recent brain theory and research is this: How much effect do environmental challenge and stimulation have on the general and specific capabilities and limitations of our students' brains? This chapter will first report on key animal and human research studies that focused on this issue. It will then begin the discussion of the broad educational implications of this research that I hope you will continue with your colleagues and patrons as you explore applications. To put it simply, should we fine-tune, overhaul, or revolutionize our current classroom pattern of instruction?

Brain Plasticity Studies with Animals
Marian Diamond (1988) is one of a number of researchers who have used rats in carefully controlled studies of the effects of environmental stimulation and deprivation on the development of the brain's cortex, the large sheet of neural tissue at the top of the brain that processes environmental interactions. Although most of a brain's lifetime supply of neurons are in place shortly after birth, many of the axon-dendrite connections that process cognitive information develop after birth, as a brain gradually adapts to its environment and makes itself the unique result of its own experience.

In the human brain, this postbirth development results in a weight increase from about one pound at birth, to two pounds at age one, to three pounds at late adolescent maturation. Herman Epstein (1978) found a postbirth pattern of brain growth spurts and plateaus that relates to the stages of cognitive development Piaget had identified earlier, without knowledge of the biological correlate that Epstein later found (Flavell 1963).1

Our brain continues to adapt its networks throughout adult life as it adds, edits, and erases memories and problem-solving strategies, but these processes don't result in a weight increase. Think of an analogous pattern in the making and shifting of interpersonal connections in our lives. When we first move into a neighborhood, we tend to check out area businesses and facilities before settling into the basic group we then normally patronize. If we live in the same residence for years, we may change many of these initial connections—job, stores, friends, memberships. Despite these shifts, our total number of relationships may remain relatively constant over the years—a favorite grocery store and service station, a dry cleaner, a couple of dozen friends, and so on.

Brain plasticity researchers study rats, whose overall mammalian brain development pattern resembles that of humans. The basic research design (with variations) compares the brains of rats that have lived in different environments for differing periods of time: (1) rats living alone in a small, unfurnished cage, (2) a group of 12 to 36 rats living together in a large laboratory cage that contains a regularly changed and stimulating collection of toys and other objects to explore, and (3) a group of rats living in a much larger outdoor, seminatural rat habitat. Most of the research has focused on conditions 1 and 2.

As one might expect, the researchers found that the best cortex development emerged from the social and environmental stimulation of the rats' seminatural habitat, followed by the enriched social cage, followed at a significantly lower level by the impoverished solitary environment.

The socially oriented seminatural and enriched laboratory settings produce a thicker and heavier cortex: larger neurons, more and better interneural connections, and a greater supply of glial support cells. These elements create a potentially better brain for learning and remembering, defined in rats by their ability to run mazes. Researchers consistently found the greatest effects in the occipital lobes (vision), but all cortical regions respond positively to enriched environments. The effects are similar whether 12 or 36 rats live together in the 3' x 3' enriched laboratory environment. Although a brain's plasticity is greater during the early developmental period, researchers obtained enhanced effects throughout the rats' lifetime. Indeed, they found significant general cortical improvements after only a few days when they moved adult rats from an isolated cage to an enriched social environment.

Although the cortex has a remarkable ability to adapt successfully to different environments, it does have its limits. A brain may be unable to recover from the effects of serious environmental deprivation during a critical brain development period. For example, Hubel (1988) discovered that an otherwise normal cat was blinded for life in an eye that was covered for only a few days during a critical period of visual development. He also discovered that adult cats were unable to effectively process vertical or horizontal line segments if they were reared in an environment devoid of such lines.

It's probably safer to generalize from rats to mice than from rats to human beings. For example, a rat brain fills a thimble, while a human brain fills a three-pint container. The forebrain occupies 45 percent of a rat brain's mass, compared with 85 percent in humans; frontal lobes occupy about 5 percent of a rat's brain compared with 30 percent of the human brain; the cortex matures in about a month in a rat compared with about 18 years in a human brain.

Still, researchers expect to find related patterns of plasticity in humans when they develop the technology to monitor growth in specific areas of the human brain—with the differences between rats and humans probably occurring in location and degree of plasticity. Diamond and her colleagues (1985) compared Albert Einstein's preserved brain with the brains of normal people and discovered significantly more glial support cells in the angular gyrus, an important cortical area that integrates sensory data and processes conceptual and symbolic thought.

The research involving an enriched environment is important for educators, even with the caveats suggested above. All mammalian brains process information similarly, and the enrichment research indicates that the basic networks regulating a brain's interactions with its environment can maintain their plasticity and vigor throughout life if stimulated to do so. Because neurons thrive only in an environment that stimulates them to receive, store, and transmit information, the challenge to educators is simple: define, create, and maintain an emotionally and intellectually stimulating school environment and curriculum.

As we begin this process of exploring what to do and when to do it, it might be useful to pause briefly to consider how our culture normally apportions the approximately 150,000 hours of living we expend between ages 1 and 18.

We sleep about 50,000 hours of this time, and we dream about two of the eight hours we sleep each night. As reported in Chapter 5, sleeping and dreaming appear to be positively related to the development and maintenance of survival memories and other long-term memories.

We spend about 65,000 of our waking hours involved in solitary activities and direct, informal relationships with family and friends, and these play a major role in the maintenance of personal memories.

We spend about 35,000 of our waking hours with our larger culture in formal and informal metaphoric and symbolic activities—about 12,000 hours in school and nearly twice that amount with various forms of mass media (e.g., TV, films, music, sports, print media unrelated to school). Mass media and school thus play major roles in the development of important cultural memories.

From this information we can see that on an average developmental day between the ages of 1 and 18, a young person sleeps eight hours; spends ten waking hours with self, family, and friends; four hours with mass media; and only two hours in a classroom-oriented school. Our society has incredible expectations for those two hours!
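The arithmetic behind these estimates is easy to verify. The short Python sketch below converts the rounded per-day figures from the preceding paragraphs back into approximate totals for the roughly 17 years between ages 1 and 18 (the plain 365-day year is my simplifying assumption).

```python
# Rounded hours in an average developmental day between ages 1 and 18,
# taken from the estimates in the paragraphs above.
daily_hours = {
    "sleep": 8,
    "self, family, and friends": 10,
    "mass media": 4,
    "school": 2,
}

days = 17 * 365  # about 6,200 days between the first and eighteenth birthdays

for activity, hours in daily_hours.items():
    print(f"{activity}: roughly {hours * days:,} hours")
print(f"total: roughly {24 * days:,} hours")

# Output is close to the chapter's figures: about 50,000 hours of sleep,
# 62,000 with self, family, and friends, 25,000 with mass media,
# 12,000 in school, and a 150,000-hour total.
```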

We can think of the traditional classroom as an artificial environment, somewhat analogous to the laboratory environments in Diamond's plasticity studies. Although critics argue that a school lacks the direct, stimulating challenge of the natural world, our society considers school a flawed but efficient way to deal with complex cultural information that doesn't generally come up in family life or the mass media. Further, the time apportionment reported above suggests that more than 80 percent of the waking hours of a child and adolescent are spent outside of school in family, peer, and electronic environments that range from stimulating to impoverished, from social to solitary. Thus, the research design of the brain plasticity studies presents us with a set of three interacting models of educational environments to contemplate in our profession's search for the best use of the limited time we have in our students' lives. The brief introductory discussion below should stimulate your thoughts on the issue.

The Natural Environment. Many educational theorists have proposed over the years that we should move students out of the classroom and into the natural world that students inhabit during their hours outside the school. If that's not possible, they argue that we should at least organize the curriculum around classroom simulations, role playing, field trips, and other activities that more closely parallel the experiences and problem-solving challenges of the natural world.

When done correctly, the much-maligned extracurricular program probably gets as close to real problem solving as anything else we do in school. It uses metaphor, play, and limited adult domination in a nonthreatening, informal setting to explore the dimensions, tactics, and strategies of problem solving. The Duke of Wellington once suggested that the Battle of Waterloo was won on the playing fields of Eton. Play is an important element in a brain's development.

The Laboratory/Classroom Environment. When mature rats were placed in an enriched environment with a group of younger rats, the mature rats played with the toys and dominated the environment. They were stimulated by the environment and developed thicker cortexes. The less-involved younger rats, however, did not experience positive brain development in this potentially stimulating environment (Diamond 1985). These experimental results could find their human representation in classrooms where the teacher dominates the curricular, instructional, and evaluative decisions and activities. It isn't enough for students to be in a stimulating environment—they have to help create it and directly interact with it. They have to have many opportunities to tell their stories, not just to listen to the teacher's stories.

What role should teachers play in a classroom that purports to stimulate? John Dewey (1938) commented on the folly of mature adults who work in classrooms with immature students, but don't use their maturity to enhance the students' experience. To use one's maturity, though, doesn't mean to dominate. If we define the most mature person in a social setting as the person in the group who is the most able to adapt to the needs and interests of others, then teachers ought to adapt to their students whenever possible, and not always expect their students to adapt to them.

Such activities as student projects, cooperative learning, and portfolio assessments place students at the center of the educative process, and thus stimulate learning.

The Solitary Environment. It isn't enough to create an environment that merely keeps students busy. Rats placed into a small, solitary cage furnished only with a running wheel stayed active by using the wheel, but experienced no increase in cortical thickness: shades of continual drudgery with workbooks and long division problems. Years of research have found patterns of positive cortical effects only in changing, stimulating, social environments. Rats need to interact with other rats to learn how to solve rat problems. Running a solitary wheel doesn't do the job. The situation is similar with students: a stimulating social setting provides the only appropriate environment for mastering social skills.

Perhaps the most complex educational issue to come out of the human projections of the rat studies is the problem of trying to define a normal human environment—beyond the basic properties of being social, changing, and stimulating. Rats flourished best in their normal outdoor habitat. What is the normal habitat of contemporary children and adolescents, the environment that probably has the best potential for developing their brain to its maximum?

It may well be that the limited contemporary classroom and the out-of-school world that many students experience are closer to a natural human habitat than we care to admit. Many families already live in a human version of the enriched social rat cage, with a daily rearrangement of toys sent via TV and consumer technology. It sounds terribly depressing at first thought, and we tend to flog away at our indoor civilization and its electronic artifacts. But I can't think of anything from my childhood that did more to develop eye-hand coordination than the controls on a video game do for my grandchildren. TV and interactive computers have turned our world into a global village, and nobody knows what tomorrow will bring.

We thus need to keep an open mind about our urban electronic culture. We can romanticize the stimulation of a simple life in the great outdoors, but most people live regulated lives that occur primarily indoors—and most of us seem to make the best of it, enriching our lives in imaginative ways that nourish the human spirit.

It's important to remember that the enriched social rat cage did result in significant growth over the impoverished solitary environment. Schools have a responsibility to help students to adapt to the realities of our culture—to enjoy what is good, to resist what is evil. Our profession's major challenge is to create solid enrichment in a social school environment that admittedly has a high potential for impoverishment—to turn an artificial classroom environment into a respectable approximation of a natural human environment.

When pressed to draw practical classroom applications from her years of research with mammalian brains, Marian Diamond smiled and replied that teachers ought to approach their assignment with a commitment to provide their students with tender loving care. Tender loving care in the rat studies means that researchers handle rats gently when they work with them. Researchers have discovered that this simple tactile act in itself extends the life span of the animals and, in turn, positively affects their cortical development. Diamond leaves it to educators to discover the human equivalent to her rat care, but she believes that each student should be treated as an individual, with every effort made to bring forth the best in that student.

The discussion above has extrapolated from a rat research environment to a human classroom environment, and such reasoning can introduce problems. At some point we've got to get solid data from humans, so let's turn now to three such research studies. These studies couldn't measure such indicators of brain capability as cortical thickness, as in the rat studies, because they occurred prior to the recent advances in brain-imaging technology that could provide such useful information, but the researchers were able to talk with their research subjects—something the animal researchers couldn't do.

Studies with Hardy and Resilient Human Beings
It's obviously impossible to create a controlled study on the effects of the environment on human brain maturation because such a study would have to rear some of the child subjects in an impoverished environment. Occasionally and tragically, however, a child is reared in such an environment, and so provides useful information on our brain's resilient potential. Genie was one such girl (Rymer 1993).2

The Tragic Case of Genie
Authorities discovered 13-year-old Genie in 1970 in the Los Angeles area house where her disturbed father had raised her strapped naked to a potty chair in a back bedroom devoid of sensory stimulation. At night she was placed in the equivalent of a straitjacket in an infant's crib. Her parents rarely spoke to her, and so she had no language skills when she was discovered.

As tragic as her situation was, it created an opportunity for researchers at the University of California at Los Angeles to place her in the caring and stimulating home environment of one of the researchers, and to try to compensate for her years of terrible deprivation with a responsible instructional program. The controversial research studies that emerged out of this process were flawed, but Genie did progress regularly in her ability to walk, eat, talk, and function socially during her five years with the family. Her vocabulary development was quite good, but she was very deficient in sentence structure.

Genie's mother, who had also been victimized by Genie's father, regained custody of Genie at 18, and this legal development unfortunately stopped the research efforts that had attempted to develop her intellectual and language abilities as much as possible. Genie now lives in a home for retarded adults.

We'll never know how much Genie would have developed intellectually and socially with continuous and caring stimulation—and that's too bad for Genie and for those who wonder how much the school and home can do to overcome innate deficiencies and the effects of a traumatic early childhood. But even if the vigorous program of stimulation and instruction had continued, researchers still would never have known what innate potential Genie had. Her father had considered her retarded, and that view helped lead to his inappropriate rearing practices. Considering her terrible childhood, Genie did make astounding and encouraging progress, especially in language development, which linguists had believed was impossible when begun at such a late age.

Genie thus provided us with a rare and tantalizing glimpse into the human equivalent of the plight of the solitary rat in a laboratory cage devoid of any stimulation. As with the rat, she had to leave her solitary, impoverished environment for any hope of adequate cognitive development.

The Hardy Adults
What can our adult brain do when it is seriously challenged by a major life change that it can't control? Hans Selye (1956) and Holmes and Rahe (1967) were pioneering researchers who discovered that chronic stress and major life changes can adversely affect a person's health and consequent ability to cope with life's challenges. Other researchers followed, and one intriguing study examined people who suffered few ill effects in a very stressful situation.

Maddi and Kobasa (1984) studied several hundred middle- and upper-level male managers at AT&T during a very stressful two-year period when the company was being reorganized and their jobs were in jeopardy. About two-thirds of the group suffered stress-related illnesses, but the other third were psychologically hardy, seeming to thrive on the stressful challenges they faced. They experienced less than half as much illness as the other high-stress executives.

Maddi and Kobasa discovered that hardy executives who did not suffer the debilitating effects of stress, even though they worked and lived in potentially high-stress environments, demonstrated high levels of challenge, commitment, and control in their lives. They had learned how to effectively use their brain's problem-solving capabilities:

They viewed change as a constant in their life, and welcomed it as a challenge to grow. They approached potentially stressful problems with a clear sense of the importance of their own personal goals, values, and abilities. Realizing that they couldn't do everything, they focused their energies on what they must and could do, and ignored or else sought help for the things they couldn't or shouldn't do. In this, their supervisors respected and supported them.

They had a strong commitment to the significant relationships in their life. They related major problems to their own clearly defined life and career plan and to its established personal, family, and job priorities. They could identify the foreground/background and subjective/objective elements of a problem, and then keep them psychologically separate. For example, they didn't take personally things that weren't meant to be personal slights, and they didn't take work problems home or bring personal problems to work.

They had an internal rather than an external locus of control. Although others may have caused their problem, they assumed responsibility for developing the solution that best met their needs. They didn't consider themselves to be mere victims of circumstance, but rather took personal control of their life, with all its successes and failures.

What we have in this hardy group are people who would make excellent teachers and role models. They would be take-charge teachers with a strong and accepting sense of who they are and what they do, caring teachers who can separate their subjective feelings for their students from the objective demands of their assignment.

What we don't know from this research is whether the ability and personality that allowed these people to function effectively in a very stressful situation came from innate body/brain factors or from a childhood environment that had developed these qualities in them. Therefore, let's look at a study that examined the childhood of high-risk children.

The Resilient Children
In 1955, Emmy Werner and her colleagues began to study about 200 children on the Hawaiian island of Kauai who were considered to be seriously at-risk at birth because of such factors as illness, family poverty, parental discord, and parental mental or medical problems. She has studied them for almost 40 years in an incredible longitudinal study (Werner and Smith 1992). Approximately 700 children were born on the island in 1955, and about 420 of these were born healthy and grew up in supportive environments. The 200 who were at risk because of their health, family, or social environment became the focus of the study.3

About two-thirds of these children (129) did not sufficiently overcome their circumstances to create a successful adult life. For example, they developed learning and behavior problems and had delinquency records, mental health problems, and early pregnancies. About one-third (72) of the study group became resilient, however, and adapted successfully to the problems they faced during their growing-up years. The 30 males had a more difficult time adapting to life during their first decade, and the 42 females during their second decade, but today these 72 resilient children have grown up to become successful adults who are living nurturant, responsible, achievement-oriented lives.

Werner and her colleagues identified several personal and environmental factors that they believe played important roles in developing the resilience that the 72 most successful subjects exhibited. She calls these protective factors or buffers; they protected the young people from their negative environment by providing support, skills, and hope when things looked bleak:

The 72 children were of at least average intelligence, and they were healthy, active, sociable children—with a pleasant personality that elicited positive responses from family members and strangers. They were "cuddly" in infancy, interacting easily with others, and this behavior encouraged adults to interact with them in ways that would enhance their intellectual development.

They were curious and interacted physically with their environment. As a result of their explorations, they developed interests and hobbies that weren't sex-typed and that they shared with friends.

They had both family and nonfamily mentors who provided them with unconditional love. This is an important protective factor, for it provided the resilient children with available positive adult models during a period in which their parents often did not provide such models. One can sense that these mentors encouraged them in their curiosities and hobbies—told them they could become whatever they wanted to be.

They were assigned responsibilities in a home environment that was reasonably well-structured. Although some such childhood tasks might be considered exploitative, adults looking back on their childhood often view such assigned tasks as evidence that their parents considered them to be capable and trustworthy. In a study of 500 at-risk students in grades 4 and 5, Kays (1990) found that the alienation from school that many at-risk students were beginning to feel came in part from their noninvolvement in routine classroom tasks that the other students were asked to do. Teachers didn't ask the at-risk students to get the projector, water the plants, or take things to the office. How do children learn to be responsible when they are never asked to be responsible in tasks that others depend on?

They developed a positive self-concept and an internal locus of control. Like the Hardy Executives, the Resilient Children were hopeful that when they confronted problems, everything would work out positively.

It's difficult to separate the impact of innate and environmental factors in the Kobasa and Werner studies. Werner noted that the negative effects of problems surrounding pregnancy and birth diminished over time, and the effects of the environment itself became more important. What we don't know, for example, is where the Hardy Executives and Resilient Children would have scored on each of Gardner's forms of intelligence. Were they born with intellectual abilities that tended to put them at the high end of the scales, so that their innate protective factors were strong enough to withstand and solve the problems they faced in their life? We simply don't know.

What we do know is that the middle-aged Hardy Executives were studied during a period of high stress. At that point in their lives they were stimulated by change and challenge, they were committed to themselves and to the significant others in their life, and they had an internal locus of control—a belief that they were responsible for their own life. Educators who possessed these same qualities would be fine role models for students to observe day after day in all sorts of challenging situations.

What we also know is that the Resilient Children were stimulated by other people and functioned effectively around them. They were successful enough in their activities to develop confidence in their interests and abilities, and they also developed an internal locus of control. Such children would profit from interactions with the educators described above. Imagine a classroom full of Resilient Children taught by a Hardy Adult!

The story is not an entirely happy one, however. This discussion focused on the one-third of the executives and at-risk children who were successful and became hardy and resilient. Two-thirds of both groups were not successful. What was the problem? Were their innate protective factors not strong enough, or did their environmental support (e.g., the school) fail them during their developmental years? What a challenge for our profession!

Chapter 1 reported that brain theorists now suggest that we forget the nature versus nurture dichotomy. Rather, we should view the phenomenon as a kind of dynamic interaction of innate and environmental factors. What we have is a classroom full of students who come with genes and nonschool experiences. We're not responsible for the genes, and we usually can't directly do anything about the experiences that students bring to school—but we are responsible for the quality of their school experiences. It's our task to make sure that school experiences enhance the development of a student's brain.

From Brain Theory and Research to School Policy and Practice
As you begin to think through and discuss the information and issues developed in this book—en route to developing school policy and practice attuned to what we're learning about our brain—recall that brain theorists insist that we must now think of our brain as a biological and ecological entity, not as an externally developed and controlled machine, such as a computer.

Thinking about our brain as a computer engenders thoughts of an efficient economical tool, something that exists solely to serve others. We do strive to assist and cooperate, but we are also biological entities with our own intrinsic value. We are both a part of and apart from the others who share our environment.

Gerald Edelman suggests that we think of our brain and its processes as being something like the current ecology of a rich jungle environment—in which natural selection and ecological principles operating both over eons of time and within our lifetime have created a magnificent human mind out of a basic human brain. The neural networks we're born with adapt marvelously to a continuously changing and challenging environment. Thus, teachers and parents become facilitators, who help to shape a stimulating social environment that helps students to work alone and together to solve the problems they confront.

Ecological principles that enhance the health of the larger environment would then also enhance the development and maintenance of our brain. In The Closing Circle, Barry Commoner (1974) proposes four laws of ecology that govern properly operating ecosystems. Since brain theorists view our brain as an ecological system, it might be useful (or at least intriguing) for you and your colleagues to examine Commoner's laws of ecology in the context of considering how to develop an ecologically oriented learning environment for an ecologically evolved brain. The four laws follow, with only enough (sometimes speculative) commentary to get your own curricular and instructional thoughts going:

Everything is connected to everything else. Our brain is a dense web of interconnected neurons. Any neuron is only a few neurons away from any other neuron, and all the organisms that inhabit our global village are now also highly interconnected (at least electronically). The naturalist John Muir suggested that when he carefully studied anything in nature, he discovered that it was connected to everything else in the universe. Thus, such things as the language arts, thematic curricula, and multicultural and environmental education programs are central to any curriculum that hopes to help students discover who they are, where they live, and how things are connected.

Everything must go somewhere. Everything that occurs within an environment (including a brain environment) leaves a trace. Just as toxic wastes will foul the subsequent life in an environment's ecological chains, so an abusive childhood will be remembered and will affect the child's subsequent life. Just as adding water, sun, and nutrients to an environment enhances the life of organisms in it, so such things as encouragement, help, and praise enhance the learning of students in a school environment.

Although the effects of what we do on a given day in school may not be immediately apparent, they do become part of the rich ecology of the student's life. I don't specifically know when and where I learned about the ecological beauty of the water cycle—except that I learned it in school. I also know that I learned a lot of obscene words and phrases in school.

Nature knows best. Complex environments function best through effective processes that have evolved over eons of time. Whether living in an environment or educating students, we must discover and follow the ecological principles that define our brain's capabilities and limitations. We ignore them at our peril: our students misbehave, and our patrons harass us.

The cognitive sciences are now providing much useful information on our brain and its processes, and this book has provided an introduction to that information. But for all practical purposes, that information doesn't exist if we educators don't become aware of it and don't use it in our explorations of how to improve the educative process.

There's no such thing as a free lunch. It takes effort to force a system to operate unnaturally (e.g., water flows downhill naturally, but uphill only with great effort). Educational procedures should seek to enhance our brain's strengths and to minimize the negative effects of its weaknesses. For example, we're generally good at such things as cooperating and conceptualizing, at defining moral and ethical issues, at storytelling. We're generally not good at things that require solitary sustained attention and precision.

Suggesting that we might begin to think about curriculum and instruction in the context of Commoner's four laws of ecology (or some other such metaphor) is a simple beginning to the solution of the problem of how best to fit about 100 pounds of student brain into a 1,000-square-foot classroom over a 1,000-hour school year—with at least one adult representative of our culture available to facilitate the operation.

But a simple solution isn't necessarily easy. The challenge of discovering new ways of thinking about what formal education is—and what it can be—is what will make teaching a creative, optimistic, and stimulating profession in the years ahead. It's the continual search for deeper meanings within simple systems that will stimulate imaginative educators to create new forms of enriched social environments within electronic classroom walls.

Current brain theory and research now provide only the broad, tantalizing outlines of what the school of the future might be—but we can anticipate that the rate of new discoveries will escalate. Educators who are willing to study the new cognitive science developments, and then to imaginatively explore and experiment in their search for appropriate educational applications, will have to work out the specifics in the years ahead. If our profession doesn't do it, nothing will happen. Things will remain as they are.

1 Both Piaget and Epstein did their research during periods in which it was very difficult to measure the subtle intellectual and brain differences they were studying, and some critics felt they went beyond their data. Epstein refuted his critics in a 1986 article, "Stages in Human Growth Development," published in a very respected scientific journal, Developmental Brain Research 30: 114-119. Although they were essentially biologists and not K-12 educators, Piaget and Epstein made major contributions to education by getting educators to think about their students in biological terms.

2 Russ Rymer's Genie: An Abused Child's Flight from Silence (New York: HarperCollins, 1993) and the PBS NOVA episode "The Secrets of the Wild Child" (with extended videotape of Genie) provide haunting, fascinating, and thought-provoking perspectives on Genie and her tragic situation. (Videocassette available through WGBH; call 1-800-255-9424.)

3 Emmy Werner and Ruth Smith's most recent report on their study, Overcoming the Odds: High Risk Children from Birth to Adulthood (Ithaca, N.Y.: Cornell University Press, 1992) is fascinating and informative—a marvelous source for educators who work with at-risk students, but also for those who want a positive boost on a day when everything looks bleak.