Psychology: The Cognitive Movement
Dr. C. George Boeree
In the latter half of the twentieth century, the advent of the computer and the way of thinking associated with it led to a new approach or orientation to psychology called the cognitive movement. Many are hoping that it will prove to be the paradigm -- the unifying theory -- we have been waiting for. It is still way too early to tell, but the significance of cognitive psychology is impossible to deny.
The roots of the cognitive movement are extremely varied: It includes gestalt psychology, behaviorism, even humanism; it has absorbed the ideas of E. C. Tolman, Albert Bandura, and George Kelly; it includes thinkers from linguistics, neuroscience, philosophy, and engineering; and it especially involves specialists in computer technology and the field of artificial intelligence. Let’s start by looking at three of the greatest information processing theorists: Norbert Wiener, Alan Turing, and Ludwig von Bertalanffy.
Norbert Wiener
Norbert Wiener was born November 26, 1894 in Columbia, Missouri. His father was a professor of Slavic languages who wanted more than anything for his son to be a genius. Fortunately, Norbert was up to the task. He was reading by age three, started high school at nine, graduated at 11, got his bachelor's at 14, and his master's -- from Harvard! -- at 17. He received his PhD a year later, in 1913, with a dissertation on mathematical logic.
(If it is any consolation, Norbert was near-sighted, very nervous, clumsy, insecure, and socially inept. However, people liked him anyway!)
After graduation, he went to Cambridge to study under Bertrand Russell, and then to the University of Göttingen to study under the famous mathematician David Hilbert. When he returned, he taught at Columbia, Harvard, and the University of Maine, spent a year as a staff writer for the Encyclopedia Americana, another year as a journalist for the Boston Herald, and (though a pacifist) worked as a mathematician for the army.
Finally, in 1919, he became a professor of mathematics at MIT, where he would stay put until 1960. He married Margaret Engemann in 1926, and they had two daughters.
He began by studying the movement of particles and quantum physics, which led him to develop an interest in information transmission and control mechanisms. While working on the latter, he coined the term cybernetics, from the Greek word for steersman, to refer to any system that has built-in correction mechanisms, i.e. is self-steering. Appropriately, he worked on control mechanisms for the military during World War II.
In 1948, he published Cybernetics: or Control and Communication in the Animal and the Machine. In this book, he popularized such terms as input, output, and feedback!
Later, in 1964, he published the book God and Golem, Inc., which he subtitled "a comment on certain points where cybernetics impinges on religion." He was concerned that someday machines might overtake us, their creators. That same year, he won the National Medal of Science. A few weeks later, on March 18, he died in Stockholm, Sweden.
The idea of feedback is very old, and is hinted at in the works of Aristotle. It began to gain some currency in the 1700's, in the form of "the invisible hand," an idea introduced in Adam Smith's The Wealth of Nations, which some see as the roots of both control theory and game theory. Feedback is a simple idea: Take the output of some system, and feed it back as an input, in order to in some way alter the process. For example, homeostasis or the thermostat principle is a form of negative feedback: It gets cold in the house, which triggers the thermostat, which turns on the furnace. It gets warmer, which triggers the thermostat, this time to turn off the furnace. Then it gets colder, and the cycle begins again. The goal of such a system is equilibrium (say, 70° F in the house), but it is actually an oscillating or "hunting" process.
Positive feedback occurs when the output tells the system to produce even more of something. Although the "positive" in positive feedback makes it sound like a good thing, if it isn't checked by negative feedback, it tends to run out of control. A common example of positive feedback is the economic bubble: Something increases in value (such as tulips in 17th century Holland, or "dot-coms" in the recent past), everyone buys into the product, driving up the prices, which attracts still more investors, until finally the whole thing collapses.
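To make the two mechanisms concrete, here is a minimal sketch in Python. The temperatures, heating rates, and prices are made-up illustrative numbers, not anything from Wiener's own work:

    def thermostat_step(temp, target=70.0):
        """Negative feedback: the output (room temperature) is compared
        to the goal and fed back to switch the furnace on or off."""
        furnace_on = temp < target           # the feedback decision
        temp += 1.5 if furnace_on else -1.0  # heat gain vs. heat loss
        return temp, furnace_on

    def bubble_step(price):
        """Positive feedback: each price rise attracts more buyers,
        which raises the price further -- runaway, not equilibrium."""
        return price * 1.10

    temp = 65.0
    for _ in range(12):
        temp, furnace_on = thermostat_step(temp)
        print(f"room: {temp:.1f} F, furnace {'on' if furnace_on else 'off'}")
    # The temperature "hunts": it oscillates around 70 F rather than
    # settling exactly on it.

    price = 1.0
    for _ in range(5):
        price = bubble_step(price)
    print(f"tulip price after 5 rounds: {price:.2f}")  # grows without limit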
What Wiener did was recognize the larger significance of this feedback idea!
Alan M. Turing
Alan Turing was born June 23, 1912 in Paddington, London, the second child of Julius Mathison Turing and Ethel Sara Stoney. His parents met while his father and his mother’s father were serving in Madras, India, as part of the Civil Service. He and his brother were raised in other people’s homes while his parents continued their life in India.
A turning point in his life came when his best friend at Sherborne School, Christopher Morcom, died in 1930. This led him to think about the nature of existence and whether or not it ends at death.
He went to King's College, Cambridge, in 1931, where he read books by von Neumann, Russell and Whitehead, Gödel, and so on. He also became involved in the pacifist movement at Cambridge, as well as coming to terms with his homosexuality. He received his degree in 1934, and stayed on for a fellowship in 1935.
The Turing Machine -- the first description of what would become the modern computer -- was introduced in a 1936 paper, after which he left for Princeton in the US. There, he received his PhD in 1938, and returned to King’s College, living on his fellowship.
He began working with British Intelligence on breaking the famous Enigma Code by constructing code-breaking machines. In 1944, he made his first mention of “building a brain.”
It should be noted that Turing was also an amateur cross-country runner, and just missed representing the UK in the 1948 Olympics!
In 1948, he became the deputy director of the computing lab at Manchester University, where they were attempting to build the first true computer. In 1950, he published a paper, "Computing Machinery and Intelligence," in Mind. Turing saw the human brain as an "unorganized machine" that learned through experience.
Unfortunately, he was arrested and tried in 1952 -- for homosexuality! He made no defense, but took an offer to stay out of jail if he would take estrogen injections to lower his supposedly overactive libido. He also lost his security clearance.
He began working on pattern formation in biology -- what we would now call the mathematics of reaction-diffusion systems -- and on quantum mechanics. But on June 7, 1954, he committed suicide by ingesting cyanide -- making it look like an accident to spare his mother's feelings. He was 41.
Today, he is considered the father of Computer Science. Let me let his biographer, Andrew Hodges, describe the famous Turing Machine:
His work introduced a concept of immense practical significance: the idea of the Universal Turing Machine. The concept of 'the Turing machine' is like that of 'the formula' or 'the equation'; there is an infinity of possible Turing machines, each corresponding to a different 'definite method' or algorithm. But imagine, as Turing did, each particular algorithm written out as a set of instructions in a standard form. Then the work of interpreting the instructions and carrying them out is itself a mechanical process, and so can itself be embodied in a particular Turing machine, namely the Universal Turing machine. A Universal Turing machine can be made to do what any other particular Turing machine would do, by supplying it with the standard form describing that Turing machine. One machine, for all possible tasks.
It is hard now not to think of a Turing machine as a computer program, and the mechanical task of interpreting and obeying the program as what the computer itself does. Thus, the Universal Turing Machine embodies the essential principle of the computer: a single machine which can be turned to any well-defined task by being supplied with the appropriate program.
Additionally, the abstract Universal Turing Machine naturally exploits what was later seen as the 'stored program' concept essential to the modern computer: it embodies the crucial twentieth-century insight that symbols representing instructions are no different in kind from symbols representing numbers. But computers, in this modern sense, did not exist in 1936. Turing created these concepts out of his mathematical imagination. Only nine years later would electronic technology be tried and tested sufficiently to make it practical to transfer the logic of his ideas into actual engineering. In the meanwhile the idea lived only in his mind.
(Quoted from Andrew Hodges' "Alan Turing -- a short biography," at http://www.turing.org.uk/turing/bio/part3.html)
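To make the idea concrete, here is a minimal sketch of such an interpreter in Python. The generic run_turing_machine function plays the role of the universal machine; the transition table handed to it is the "standard form" describing one particular machine. The example table -- a unary incrementer -- is my own illustration, not Turing's:

    def run_turing_machine(rules, tape, state="start"):
        """A universal interpreter: executes whatever particular machine
        is described by 'rules', a table mapping (state, symbol) to
        (new_state, symbol_to_write, head_move)."""
        cells = dict(enumerate(tape))  # sparse tape; blank cells read "_"
        head = 0
        while state != "halt":
            symbol = cells.get(head, "_")
            state, write, move = rules[(state, symbol)]
            cells[head] = write
            head += move
        return "".join(cells[i] for i in sorted(cells)).strip("_")

    # One particular machine: append a "1" to a block of 1s (unary n+1).
    increment = {
        ("start", "1"): ("start", "1", +1),  # scan right over the 1s
        ("start", "_"): ("halt",  "1",  0),  # write a 1 at the end, halt
    }

    print(run_turing_machine(increment, "111"))  # prints 1111

One interpreter, any number of machines: exactly the relationship between a computer and its programs.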
Ludwig von Bertalanffy
Ludwig von Bertalanffy was born near Vienna on September 19, 1901. In 1918, he went to the University of Innsbruck, and later transferred to the University of Vienna, where he studied the history of art, philosophy, and biology. He received his doctorate in 1926, with a dissertation on Gustav Fechner.
In 1928, he published Modern Theories of Development, where he introduced the question of whether we could explain biology in purely physical terms. He suggested we could, if we see living things as endowed with self-organizational dynamics.
In 1937, he went to the University of Chicago, where he gave his first lecture on General Systems Theory, which he saw as a methodology for all sciences. In 1939, he became a professor at the University of Vienna and continued his research on the comparative physiology of growth. He summarized his work in Problems of Life, published in 1940.
In 1949, he emigrated to Canada, where he began research on cancer. Soon, he branched into cognitive psychology, where he introduced a holistic epistemology that he contrasted with behaviorism.
In 1960, he became professor of theoretical biology in the department of zoology and psychology at the University of Alberta. In 1967, he wrote Robots, Men, and Minds, and in 1968, he wrote General System Theory.
Ludwig von Bertalanffy died of a heart attack on June 12, 1972.
Once upon a time, it was possible for one bright individual -- say an Aristotle or a da Vinci -- to know everything that his or her culture had to offer. We still sometimes refer to people who have a particularly broad knowledge base as a Renaissance man or woman. But this isn't really possible anymore, because there is simply too much information in the world. Everyone winds up a specialist. That isn't, of course, entirely bad; but it does mean that the various sciences (and arts and humanities as well) tend to become isolated. A new idea in one field stays in that field, even when it might mean a revolution for another field. The last time we saw a truly significant transfer of ideas from one science to others was when Darwin introduced the theory of evolution!
General Systems Theory was a proposal for a mathematical and logical means of expressing ideas in what we nowadays comfortably call systems. Bertalanffy believed that this was the way we could unify the sciences, including biology, history, sociology, and even psychology, and open the door to a new kind of scientist who is a generalist rather than a specialist. These generalists, by making use of these common systems models, would be able to transfer insights from one field to another.
Bertalanffy took concepts from cybernetics, information theory, game theory, decision theory, topology, factor analysis, systems engineering, operations research, and human engineering, and perfected the "flow diagram" idea that we all take for granted today. His most significant innovation, however, was the idea of the open system -- a system in the context of a larger system. This allowed systems theory to be applied to animals within ecosystems, for example, or to people within their socio-cultural contexts. In particular, the idea of the open system gave the age-old metaphor of societies-as-organisms scientific legitimacy and a new lease on life.
Noam Chomsky
In addition to the input (no pun intended) from the "artificial intelligence" people, there was the input from a group of scientists in a variety of fields who thought of themselves as structuralists -- not allying themselves with Wundt, but interested in the structure of their various topics. I'll call them neo-structuralists, just to keep them straight. For example, there's Claude Levi-Strauss, the famous French anthropologist. But the one everyone knows about is the linguist Noam Chomsky.
Avram Noam Chomsky was born December 7, 1928, in Philadelphia, the son of William Chomsky and Elsie Simonofsky. His father was a Hebrew scholar, and young Noam became so proficient that he was proofreading his father's manuscripts by the time he was in high school. Noam was also passionate about politics, especially when it concerned the potential for a state of Israel.
He received his BA from the University of Pennsylvania in 1949, whereupon he married a fellow linguist, Carol Schatz. They would go on to have three children. He received his PhD in 1955, also from the U of Penn.
That same year, he started teaching at MIT and began his work on generative grammar. Generative grammar was based on the question "how can we create new sentences which have never been spoken before?" How, in other words, do we get so creative, so generative? While considering this question, he familiarized himself with mathematical logic, the psychology of thought, and theories about thinking machines. He found himself, on the other hand, very critical of traditional linguistics and behavioristic psychology.
In 1957, he published his first book, Syntactic Structures. Besides presenting his generative grammar, it introduced the idea of an innate ability to learn languages: We are born with a "universal grammar" ready to absorb the details of whatever language is presented to us at an early age.
His book spoke about surface structure and deep structure and the rules of transformation that governed the relations between them. Surface structure is essentially language as we know it, particular languages with particular rules of phonetics and basic grammar. Deep structure is more abstract, at the level of meanings and the universal grammar.
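As an illustration, here is a toy generative grammar in Python. The rewrite rules are made-up examples, and a simple context-free grammar like this is far cruder than Chomsky's transformational machinery, but it shows how a finite set of rules can generate endless new sentences:

    import random

    # Each symbol rewrites to one of several sequences of symbols;
    # anything without a rule is a finished word.
    grammar = {
        "S":  [["NP", "VP"]],
        "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive!
        "VP": [["V", "NP"], ["V"]],
        "N":  [["linguist"], ["child"], ["sentence"]],
        "V":  [["surprises"], ["sleeps"], ["generates"]],
    }

    def generate(symbol="S"):
        """Recursively expand a symbol using randomly chosen rules."""
        if symbol not in grammar:
            return [symbol]
        expansion = random.choice(grammar[symbol])
        return [word for part in expansion for word in generate(part)]

    print(" ".join(generate()))
    # e.g. "the child that sleeps surprises the linguist"

The recursion in the NP rule is the key: because rules can invoke themselves, a handful of rules yields an unbounded supply of sentences -- Chomsky's point about our generativity.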
In the 1960’s, Chomsky became one of the most vocal critics of the Vietnam War, and wrote American Power and the New Mandarins, a critique of government decision making. He is still at MIT today and continues to produce articles and books on linguistics -- and politics!
Jean Piaget
Another neo-structuralist is Jean Piaget. Originally a biologist, he is now best remembered for his work on the development of cognition. Many would argue that he, more than anyone else, is responsible for the creation of cognitive psychology. If the English-speaking world had only learned to read a little French, this would be true without a doubt. Unfortunately, his work was only introduced in English after 1950, and only became widely known in the 1960's -- just in time to be a part of the cognitive movement, but not of its creation.
Jean Piaget was born in Neuchâtel, Switzerland, on August 9, 1896. His father, Arthur Piaget, was a professor of medieval literature with an interest in local history. His mother, Rebecca Jackson, was intelligent and energetic, but Jean found her a bit neurotic -- an impression that he said led to his interest in psychology, but away from pathology! The oldest child, he was quite independent and took an early interest in nature, especially the collecting of shells. He published his first "paper" when he was ten -- a one-page account of his sighting of an albino sparrow.
He began publishing in earnest in high school on his favorite subject, mollusks. He was particularly pleased to get a part-time job with the director of Neuchâtel's Museum of Natural History, Mr. Godet. His work became well known among European students of mollusks, who assumed he was an adult! All this early experience with science kept him away, he says, from "the demon of philosophy."
Later in adolescence, he faced a bit of a crisis of faith: Encouraged by his mother to attend religious instruction, he found religious argument childish. Studying various philosophers and the application of logic, he dedicated himself to finding a "biological explanation of knowledge." Ultimately, philosophy failed to assist him in his search, so he turned to psychology.
After high school, he went on to the University of Neuchâtel. Constantly studying and writing, he became sickly, and had to retire to the mountains for a year to recuperate. When he returned to Neuchâtel, he decided he would write down his philosophy. A fundamental point became a centerpiece for his entire life’s work: “In all fields of life (organic, mental, social) there exist ‘totalities’ qualitatively distinct from their parts and imposing on them an organization.” This principle forms the basis of his structuralist philosophy, as it would for the Gestaltists, Systems Theorists, and many others.
In 1918, Piaget received his Doctorate in Science from the University of Neuchâtel. He worked for a year at psychology labs in Zurich and at Bleuler's famous psychiatric clinic. During this period, he was introduced to the works of Freud, Jung, and others. In 1919, he taught psychology and philosophy at the Sorbonne in Paris. Here he met Simon (of Binet-Simon fame) and did research on intelligence testing. He didn't care for the "right-or-wrong" style of the intelligence tests and started interviewing his subjects at a boys' school instead, using the psychiatric interviewing techniques he had learned the year before. In other words, he began asking how children reasoned.
In 1921, his first article on the psychology of intelligence was published in the Journal de Psychologie. In the same year, he accepted a position at the Institut J. J. Rousseau in Geneva. Here, he and his students began to research the reasoning of elementary school children. This research became his first five books on child psychology. Although he considered this work highly preliminary, he was surprised by the strong positive public reaction to it.
In 1923, he married one of his student coworkers, Valentine Châtenay. In 1925, their first daughter was born; in 1927, their second daughter was born; and in 1931, their only son was born. They immediately became the focus of intense observation by Piaget and his wife. This research became three more books!
In 1929, Piaget began work as the director of the Bureau International d'Éducation (the International Bureau of Education), which would later collaborate with UNESCO. He also began large-scale research with A. Szeminska, E. Meyer, and especially Bärbel Inhelder, who would become his major collaborator. Piaget, it should be noted, was particularly influential in bringing women into experimental psychology. Some of this work, however, wouldn't reach the world outside of Switzerland until World War II was over.
In 1940, he became chair of Experimental Psychology, the Director of the psychology laboratory, and the president of the Swiss Society of Psychology. In 1942, he gave a series of lectures at the Collège de France, during the Nazi occupation of France. These lectures became The Psychology of Intelligence. At the end of the war, he was named President of the Swiss Commission of UNESCO.
Also during this period, he received a number of honorary degrees. He received one from the Sorbonne in 1946, the University of Brussels and the University of Brazil in 1949, on top of an earlier one from Harvard in 1936. And, in 1949 and 1950, he published his synthesis, Introduction to Genetic Epistemology.
In 1952, he became a professor at the Sorbonne. In 1955, he created the International Center for Genetic Epistemology, of which he served as director the rest of his life. And, in 1956, he created the School of Sciences at the University of Geneva.
He continued working on a general theory of structures and tying his psychological work to biology for many more years. Likewise, he continued his public service through UNESCO as a Swiss delegate. By the end of his career, he had written over 60 books and many hundreds of articles. He died in Geneva, September 16, 1980, one of the most significant psychologists of the twentieth century.
Jean Piaget began his career as a biologist -- specifically, a malacologist! But his interest in science and the history of science soon overtook his interest in snails and clams. As he delved deeper into the thought-processes of doing science, he became interested in the nature of thought itself, especially in the development of thinking. Finding relatively little work done in the area, he had the opportunity to give it a label. He called it genetic epistemology, meaning the study of the development of knowledge.
He noticed, for example, that even infants have certain skills in regard to objects in their environment. These skills were certainly simple ones, sensorimotor skills, but they directed the way in which the infant explored his or her environment and so how they gained more knowledge of the world and more sophisticated exploratory skills. These skills he called schemas.
For example, an infant knows how to grab his favorite rattle and thrust it into his mouth. He's got that schema down pat. When he comes across some other object -- say, daddy's expensive watch -- he easily learns to transfer his "grab and thrust" schema to the new object. This Piaget called assimilation, specifically assimilating a new object into an old schema.
When our infant comes across another object again -- say a beach ball -- he will try his old schema of grab and thrust. This of course works poorly with the new object. So the schema will adapt to the new object: Perhaps, in this example, “squeeze and drool” would be an appropriate title for the new schema. This is called accommodation, specifically accommodating an old schema to a new object.
Assimilation and accommodation are the two sides of adaptation, Piaget’s term for what most of us would call learning. Piaget saw adaptation, however, as a good deal broader than the kind of learning that Behaviorists in the US were talking about. He saw it as a fundamentally biological process. Even one’s grip has to accommodate to a stone, while clay is assimilated into our grip. All living things adapt, even without a nervous system or brain.
Assimilation and accommodation work like pendulum swings at advancing our understanding of the world and our competency in it. According to Piaget, they are directed at a balance between the structure of the mind and the environment, at a certain congruency between the two, that would indicate that you have a good (or at least good-enough) model of the universe. This ideal state he calls equilibrium.
As he continued his investigation of children, he noted that there were periods where assimilation dominated, periods where accommodation dominated, and periods of relative equilibrium, and that these periods were similar in their nature and their timing among all the children he looked at. And so he developed the idea of stages of cognitive development. These constitute a lasting contribution to psychology.
Donald O. Hebb
There are three psychologists who, in my opinion, are most responsible for the development of cognitive psychology as a movement as well as for its incredible popularity today. They are Donald Hebb, George Miller, and Ulric Neisser. There are no doubt others we could add, but I am sure no one would leave these three out!
Donald Olding Hebb was born in 1904 in Chester, Nova Scotia. He graduated from Dalhousie University in 1925, and tried to begin a career as a novelist. He wound up as a school principal in Quebec.
He began as a part-time graduate student at McGill University in Montreal. Here, he quickly became disillusioned with behaviorism and turned to the work of Wolfgang Köhler and Karl Lashley. Working with Lashley on perception in rats, he received his PhD from Harvard in 1936.
He took on a fellowship with Wilder Penfield at the Montreal Neurological Institute, where his research noted that large lesions in the brain often have little effect on a person’s perception, thinking, or behavior.
Moving on to Queen's University, he researched intelligence testing of animals and humans. He noted that the environment played a far more significant role in intelligence than generally assumed.
In 1942, he worked with Lashley again, this time at the Yerkes Lab of Primate Biology. He then returned to McGill as a professor of psychology, and became the department chairperson in 1948.
The following year, he published his most famous book, The Organization of Behavior: A Neuropsychological Theory. This was very well received and made McGill a center for neuropsychology.
The basics of his theory can be summarized by defining three of his terms: First, there is the Hebb synapse: When one neuron repeatedly takes part in firing another, growth or metabolic changes occur at the synapse that increase the efficiency of that synapse in the future. This is often called consolidation theory, and is the most accepted explanation for neural learning today.
Second, there is the Hebb cell assembly. There are groups of neurons so interconnected that, once activity begins, it persists well after the original stimulus is gone. Today, people call these neural nets.
And third, there is the phase sequence. Thinking is what happens when complex sequences of these cell assemblies are activated.
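Here is a minimal sketch of the Hebb synapse as modern modelers often render it -- "neurons that fire together, wire together." The activity values and learning rate are illustrative choices of mine; Hebb's own statement was qualitative:

    import numpy as np

    def hebbian_update(weights, pre, post, lr=0.1):
        """Strengthen each synapse in proportion to the correlated
        activity of its pre- and post-synaptic neurons."""
        return weights + lr * np.outer(post, pre)

    w = np.zeros((1, 2))          # one output neuron, two input synapses
    pre = np.array([1.0, 1.0])    # both inputs repeatedly active
    for _ in range(10):
        post = w @ pre + 1.0      # output also firing (driven externally)
        w = hebbian_update(w, pre, post)

    print(w)  # both synapses have grown more efficient through co-firing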
He humbly suggested that his theory was just a new version of connectionism -- a neo- or neuro-connectionism. This connectionism is today the basic idea behind most models of neurological functioning. It should be noted that he was president of both the APA and its Canadian cousin, the CPA. Donald Hebb died in 1985.
George A. Miller
George A. Miller, born in 1920, began his career in college as a speech and English major. In 1941, he received his master's in speech from the University of Alabama. In 1946, he received his PhD from Harvard and began to study psycholinguistics.
In 1951, he published his first book, titled Language and Communication. In it, he argued that the behaviorist tradition was insufficient to the task of explaining language.
He wrote his most famous paper in 1956: "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information." In it, he argued that short-term memory could only hold about seven pieces -- called chunks -- of information: Seven words, seven numbers, seven faces, whatever. This is still accepted as accurate.
In 1960, Miller founded the Center for Cognitive Studies at Harvard with the famous cognitive developmentalist Jerome Bruner. In that same year, he published Plans and the Structure of Behavior, written with Eugene Galanter and Karl Pribram, which outlined their conception of cognitive psychology. They used the computer as their model of human learning, and used such analogies as information processing, encoding, and retrieval. Miller went so far as to define psychology as the study of the mind, as it had been prior to the behaviorist redefinition of psychology as the study of behavior!
George Miller served as the president of the APA in 1969, and received the prestigious National Medal of Science in 1991. He is still teaching as professor emeritus at Princeton University.
Ulric Neisser
Ulric Neisser was born in 1928 in Kiel, Germany, and moved with his family to the US at the age of three.
He studied at Harvard as a physics major before switching to psychology. While there, he was influenced by Koffka's work and by George Miller. In 1950, he received his bachelor's degree, and in 1956, his PhD. At this point, he was a behaviorist, which was basically what everyone was at the time.
His first teaching position was at Brandeis, where Maslow was department head. Here he was encouraged to pursue his interest in cognition. In 1967, he wrote the book that was to mark the official beginning of the cognitive movement, Cognitive Psychology.
Later, in 1976, he wrote Cognition and Reality, in which he began to express a dissatisfaction with the linear information-processing model of cognitive psychology at that time, and with its excessive reliance on laboratory work rather than real-life situations. Over time, he would become a vocal critic of cognitive psychology, and moved towards the ecological psychology of his friend J. J. Gibson.
He is presently at Cornell University, where his research interests include memory, especially memory for life events and in natural settings; intelligence, especially individual and group differences in test scores, IQ tests and their social significance; and self-concepts, especially as based on self-perception. His latest works include The Rising Curve: Long-Term Gains in IQ and Related Measures (1998) and, with L. K. Libby, "Remembering life experiences" (in E. Tulving & F. I. M. Craik's The Oxford Handbook of Memory, 2000).
Conclusions?
As I said at the beginning of this chapter, it is impossible to tell whether cognitive psychology will prove to be THE psychology of the future. In fact, as I pointed out with Ulric Neisser, even some of its major proponents have their doubts. Cognitive psychology is far more sophisticated, philosophically, than behaviorism. And yet it lacks sophistication when compared, for example, to phenomenology and existentialism. It does, of course, have the tremendous advantage of being tied to the most rapidly developing technology we have ever seen -- the computer. But few people see the computer as ultimately being a good model for human beings, in some ways not even as good as the old white rat, which at least was alive!
© Copyright 2000 by C. George Boeree