William James’ Legacy

William James entered the field of psychology not with a bang or an explosion but as the morning dew distilling upon fields of clover. He was a reluctant psychologist who did not even want to be called a psychologist, yet he forever changed the course of modern psychology. William James not only changed the course of American psychology; in some ways he was the Course. James did not produce original research and did not perform experiments, yet he became the driving force behind psychology. How did he accomplish this feat?

James was born to wealthy parents. His grandfather was one of the richest men in America and William’s father inherited a considerable sum of money. William’s father did not fall into the frivolous failings of inherited money though. He was very involved in his children’s lives, educating them himself in matters both temporal and spiritual. He was not an indulgent father but allowed the children to make their own decisions about life and to formulate their own ideas. He also provided them with opportunities to experience the world’s cultures and diversities. William and his family took a number of trips to Europe in order to be exposed to languages, cultures, arts, and philosophies.

William became interested in art initially. He wanted to become an artist, but after studying for a time in Paris he decided that, while he was good, he was not good enough. So he decided to become a scientist. He started attending Harvard, where he worked with different scientists. He discovered that he abhorred scientific experimentation, finding it tedious. He appreciated the work of other scientists but did not want to do any experimentation himself. After graduation he started medical school at Harvard. While he enjoyed the subjects he studied, William did not want to be a practicing physician; he decided he liked philosophy best. James had by then also been exposed to the work of the great German psychologists like Wilhelm Wundt and was impressed by their research.

Ebbinghaus, the Father of Modern Memory Research

Ebbinghaus was the first modern researcher to systematically study memory. He was inspired in part by the work and writings of Fechner. Ebbinghaus was interested in associationism (a philosophy, or theory of the day, holding that people learn, remember, and organize concepts by ideas being attracted to each other in the mind, much in the same way that physical objects are attracted to each other through the laws of magnetism or gravity).

Once Ebbinghaus discovered the work of Fechner, he started formulating ideas for research into human memory. He was still interested in associations but needed a way to experimentally research learning and memory. Ebbinghaus started using short nonsense syllables and serial learning to test associationism. He discovered that people (i.e., himself – he was his own and only research participant) could only remember about 7 of the nonsense syllables from a series when allowed only 1 repetition. George Miller in the mid-1900s investigated this further and showed that humans can hold only about 7 (plus or minus 2) words or “chunks” of information in active memory. Ebbinghaus was also famous for establishing a forgetting curve for newly learned nonsense syllables. Without any relearning or repetition, people quickly forgot learned stimuli (down to less than 40% retention within 1 day) but could still remember about 20% 31 days later. Ebbinghaus’ research on memory spurred the large field of memory research that we have today.
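For readers who like to see the shape of that curve, here is a minimal sketch of a forgetting-curve model. The exponential form and the constants are illustrative assumptions (this is not Ebbinghaus’ own equation); they are chosen only so the output roughly matches the retention figures quoted above.

```python
import math

def retention(days, floor=0.20, rate=1.5):
    """Toy forgetting-curve model: fraction of material retained after
    `days` without relearning. `floor` is the long-term fraction still
    remembered and `rate` controls how quickly the rest decays; both
    values are illustrative, not Ebbinghaus' measured constants."""
    return floor + (1.0 - floor) * math.exp(-rate * days)

if __name__ == "__main__":
    for day in (0, 1, 2, 7, 31):
        print(f"day {day:>2}: about {retention(day):.0%} retained")
```

Running the sketch prints roughly 100% at learning, under 40% after a day, and about 20% a month later – the pattern described above.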

Wilhelm Wundt’s Early Research

Wilhelm Wundt is usually viewed as the first psychologist to set up an experimental laboratory. By doing so, Wundt was trying to establish psychology as a legitimate science, separate from philosophy. He wanted to show that researchers could run well-controlled psychological experiments and systematically measure human behavior. With his work he did not seek to rule out non-experimental aspects of psychology; he just tried to establish that at least some aspects could be measured in a laboratory.

One key component of behavior that Wundt measured in his laboratory was reaction time, the study of which he called mental chronometry. Wundt became interested in reaction time as a student of Helmholtz, who was the first to measure the speed of nerve impulses. Wundt wanted to know how the brain related to basic nerve transmission speed by testing reaction times. Wundt’s lab was able to incorporate the research of many other scientists (including Donders’ subtractive method, which expanded on simple reaction times by establishing a reaction-time baseline and then complicating the task with additional demands) and use it to further our knowledge of psychophysiology.
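The logic of the subtractive method is simple enough to show numerically. The sketch below uses invented reaction times (all values are hypothetical) to illustrate how the duration of an added mental stage is estimated by subtracting the simpler task’s mean reaction time from the more complex task’s.

```python
def mean_ms(values):
    """Mean reaction time in milliseconds."""
    return sum(values) / len(values)

# Hypothetical reaction times in milliseconds (invented for illustration).
simple_rt = [190, 205, 198, 210, 202]   # baseline: respond to any stimulus
choice_rt = [285, 300, 292, 310, 298]   # harder task: discriminate, then respond

# Donders' subtractive logic: the extra time on the harder task is
# attributed to the added mental stage (discrimination and response selection).
added_stage = mean_ms(choice_rt) - mean_ms(simple_rt)
print(f"Estimated duration of the added mental stage: {added_stage:.0f} ms")
```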

Wundt’s greatest accomplishment was the establishment of his laboratory, which not only produced a lot of research but also trained many future psychologist-researchers, a large number of whom came from America. These trainees went on to other universities and established their own labs, some much like Wundt’s but with what they thought were improvements. Now, psychology labs are ubiquitous on university campuses. Many of Wundt’s writings have still not been translated into English, so we (at least outside of German-speaking countries) do not know the entire significance of his work.

Fechner and the Development of Psychophysics

Fechner is considered by many to be the first experimental psychologist. It is more accurate to say that he was likely the first well-known, modern experimental psychologist. In any case, he was the first to publish a widely read experimental psychology textbook. He was more than a writer, though. He conducted a series of experiments investigating the nature of human sensation and perception. He even partially blinded himself by staring too long at the sun during studies of visual afterimages. His various studies solidified him as the premier psychophysicist of the day.

Fechner was greatly interested in applying mathematics to various bodily sensations and perceptions. He believed that he could accurately measure the workings of the brain by measuring the perceptions of the body. Fechner built a lot on the work of Weber, who formulated Weber’s law (jnd/S = k), which states that the just noticeable difference in a stimulus – for example, the change in the mass of a weight required for a person to sense a difference – is a constant proportion of the stimulus magnitude. Fechner expanded on this by recasting Weber’s law as S = k log R. Briefly, the equation states that the magnitude of a sensation (S) grows with the logarithm of the size or mass of the stimulus (R). Like Weber, he believed that there was a threshold that had to be crossed in order to perceive a difference in sensations. Fechner was the first psychophysicist to talk about an absolute threshold, the point at which a stimulus is first noticed. Once a stimulus was noticed, each “just noticeable difference” of the stimulus (i.e., each time, with increasing size or mass of the stimulus, that a person can perceive a difference) was termed a difference threshold by Fechner. He also came up with experimental methods for establishing thresholds in laboratory settings. The method of limits, for example, is a method in which a stimulus is presented either above or below threshold and then decreased or increased, respectively, in systematic steps until it crosses the absolute threshold. So, Fechner is usually regarded as the first modern experimental psychologist (some of his methods are still employed today). He not only built on Weber’s law but also greatly expanded the field of psychophysics.
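As a small numerical illustration of the two laws, the sketch below computes a just noticeable difference from Weber’s law and sensation magnitudes from Fechner’s law. The Weber fraction and the scaling constant are placeholder values assumed for the example, not historical measurements.

```python
import math

def weber_jnd(stimulus, k=0.02):
    """Weber's law: the just noticeable difference is a constant
    proportion k of the current stimulus magnitude."""
    return k * stimulus

def fechner_sensation(stimulus, threshold=1.0, k=1.0):
    """Fechner's law, S = k * log(R), with the stimulus R expressed
    relative to the absolute threshold so sensation is zero at threshold."""
    return k * math.log(stimulus / threshold)

# Example: a 500 g weight with an assumed 2% Weber fraction.
print(f"JND for a 500 g weight: {weber_jnd(500):.0f} g")

# Doubling the stimulus adds a constant increment of sensation rather than doubling it.
for r in (2, 4, 8):
    print(f"stimulus at {r}x threshold -> sensation {fechner_sensation(r):.2f}")
```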

Phrenology and the Clinical Method

Clinical neuropsychologists – indeed, all psychologists – use various methods to understand normal human behavior and brain-behavior relationships. One common way of understanding behavior is the clinical method, which is basically using abnormal behavior to make inferences about normal behavior. Neuropsychologists often study people with known brain damage or with abnormal behavior and then examine their brains post-mortem (this can also be done in vivo now with MRI and other neuroimaging techniques). The clinical method is important because it is one of the ways phrenology was disproven.

Phrenology started out as a good idea. Franz Gall was a physician and anatomist. He was a careful scientist and, for his day, an unmatched anatomist. Phrenology started out as the localization of intellectual and emotional functions to various regions of the brain, an idea that was partially supported by later research. However, with flawed methods, Gall assumed that what occurred in the brain was also manifest through the skull. In some ways he was correct – there is some degree of relationship between head size and overall intelligence (although this relationship is minimal – there is a better correlation between brain size and intelligence than between head size and intelligence). Gall and his followers also incorrectly localized functions, including many personality traits, to various regions of the brain (and skull). Another problem with phrenology is that Gall founded his theory mainly on anecdotal evidence. Instead of trying to disconfirm his theory, he paid attention only to stories and evidence that supported it. While phrenology started out as legitimate in many ways, it quickly degenerated into little more than pop psychology. Phrenology was discredited by the mid-1830s, but it still had many followers throughout the 1800s.

One way phrenology was discredited was through the clinical method. Researchers like Paul Broca were able to show that damage in certain areas of the brain did produce specific deficits, but that these deficits did not correspond with Gall’s theorized cranial areas. After many years of applying the clinical method to brain-behavior relationships, the early tenets of Gall – those of brain localization and contralateral function (each hemisphere controls the opposite half of the body) – have been largely supported. Phrenology has not.

Whytt and Magendie’s Reflexes

Before Robert Whytt, little was known about human reflexes. Whytt was able to advance our knowledge through a series of experiments; he published the results in 1751. Previous scientists had noticed that decapitated animals (and people) still had muscle twitches. Whytt used decapitated animals to systematically show that he could make their muscles twitch by poking or pinching a leg. Clearly, basic reflexes did not require the brain. Whytt went beyond that, though. He was able to dissociate reflex action from the brain by severing the nerves between the spinal cord and a limb: when the connection to the spinal cord was lost, there was no reflex. Whytt’s discoveries went beyond simple automatic reflexes. He recognized that there was voluntary and involuntary action; the reflexes he showed to depend on the spinal cord were involuntary. He distinguished between voluntary and involuntary actions and stated that voluntary actions, if practiced enough, become habits, and that habits become more and more automatic with further practice. So while automatic and voluntary actions are distinct, they are not mutually exclusive. Whytt’s research was a forerunner to 20th-century behaviorism, specifically classical conditioning.

A number of years after Whytt published his work, two researchers working independently, Sir Charles Bell and Francois Magendie, discovered the functions of the two main nerve tracts entering and leaving the spinal cord. While Bell receives a lot of credit, Magendie’s work was the more scientific and better documented – Bell just had a lot of political sway. What did these two researchers discover? Magendie exposed the spinal cord of a live dog and severed the posterior nerve roots. The dog could still move its limbs but had no sensation in the affected area. Magendie was then able to sever the anterior root of a nerve tract. He discovered that the animal still had sensation in the affected area but no movement. He put the findings together and stated that the efferent anterior nerves controlled movement while the afferent posterior nerves carried sensation. Bell had earlier discovered essentially the same thing. The law about the functioning of the nerve pathways became known as the Bell-Magendie law. Neurologists and neuropsychologists have been able to use this law as a basis for understanding different types of central nervous system injury.

The Age of Enlightenment and Truth

The Age of Enlightenment was a time when many scientific principles, and the methods for uncovering those truths, came to light. Prior to this period, oppressive governments, ideologies, and religion ruled supreme in establishing Truth. Many people had to spend all of their time on activities related to survival and basic living; there was little time for, and less encouragement of, critical thinking. Many people were taught to accept the Truth as established by kings and priests. The Age of Enlightenment arose in this context of stifled ideas. The scientists and researchers who drove it mainly developed their ideas and work not to discredit the church or governments but rather to show that truth could be discovered through objective, replicable means. Most of the scientists did not reject the teachings of the Church, except on scientific matters; they merely sought to establish objective methods for discovering truth independent of church or government. Some of the scientists stood up for their observations and theories and were excommunicated (or killed) by the Church, not so much because they refuted the teachings of the Church but because they stated that all Truth does not come solely from the Church.

Galileo was one of the Enlightenment-era scientists who spurred the age of science onward. He used systematic observation to track the movements of the planets, the moon, the sun, and the stars. His precision and sound methods led him to say without doubt that the earth was not the center of the universe; in fact, it revolved around the sun. His teaching of heliocentrism offended the Church’s leaders, who tried him for heresy, denounced his beliefs, and forced him to recant; he spent his final years under house arrest. The Church’s pronouncements, however, would not stop the progress of science and scientific investigation.

This scientific progress that was part of the Enlightenment continued on into the 20th century. The 19th century was a great time of scientific advancement. A lot of this advancement came through technology, which goes hand in hand with science: as science progresses, technology improves, and as technology improves, it is often used in turn to advance science. One such scientist who advanced science through technology, and technology through science, was Benjamin Franklin. Franklin, through a series of experiments, discovered a lot about electricity. He used this knowledge to invent the lightning rod, and through the lightning rod scientists were able to learn still more about electricity and continue to advance science and technology.

Building a Better Brain

Let’s look forward a number of years. Bioengineering is at the point where replacing people’s organs with lab-grown ones is standard procedure. Gone are the days of transplant patients taking anti-rejection medications for the rest of their lives. Transplanted organs are all manufactured using stem cells from the patient’s own body – from bone marrow, from skin, or from any number of other sources. New organs are rapidly grown using modified growth hormones to speed up their development. A complete new organ is grown within a few weeks, the surgery performed, and the transplant patient home within days. Because of the relatively low cost of such procedures, everyone has access to transplants. Replacing hearts, livers, lungs, kidneys, and other organs has increased life expectancy dramatically, with most people living well over 100 years. Scientists are on the verge of transplanting the first manufactured brain. Knowledge of neural networks and cognition is at the point where a person’s entire knowledge system and all memories can be downloaded and stored as a backup. Scientists are working on manufacturing an entire replica human body as a “clone” in case a person is seriously injured. While individual organs come fairly cheap, a whole body is prohibitively expensive; a large portion of the cost is the brain. Even though scientists have created working brains, their success rate is still only about 5% (but always getting better). They go through a lot of brains.

Some people use this new biotechnology to create backups of their bodies. Other people have started using it to enhance the performance of their existing bodies. In laboratory conditions scientists are able to create organs that are effectively perfect: they are created in well-controlled settings and don’t have to go through the gauntlet of normal development, with exposure to teratogens, fluctuations in nutrition, and all the other things that can affect development. Popular organs to replace are hearts and lungs; people are able to run faster than ever before thanks to more efficient hearts and lungs. Other people get new legs or arms with well-sculpted muscles. Still others receive nanotech implants to enhance normal biological performance. None of this is being done in the United States or in the United Kingdom, but there are plenty of countries that don’t outlaw the procedures.

With body enhancement now common, many people want to enhance their brains. They want a new brain created with certain gyri a little bit bigger and cortex a little bit thicker. Some researchers are working on improving the speed and efficiency of neurotransmission. Most of the improvements in brain design come from turning certain genes on and off at different points in development and from providing the lab-grown brains optimal nutrients and stimulation. These enhancements can create brains that can learn 1,000 times more in 1,000 times less time.

I’ve taken a bit of liberty in my hypothetical treatment of bioengineering and biotechnology in the unspecified future. There is little, scientifically speaking, that stands in the way of us as humans eventually reaching this point. The question is, should we? Should we seek to create immortal and essentially all-knowing humans through science? Supposing humans can build better brains and bodies, should they control and manipulate natural biological processes to the extent that they can create “superbeings”? I’m not going to answer any of these questions; I just want to raise them. With our great advances in bioengineering, technology, and neuroscience, where do we draw the line, assuming we draw one at all? Do we eradicate all developmental, genetic, and environmental diseases and disorders? Do we cure epilepsy, cancer, autism, Alzheimer’s disease, and every other disorder? Do we enhance some functions, such as hearts or muscles, but not the brain?

With all advances in science, we have to remain mindful of the underlying morality and ethics of those advances. We need to make sure that our advances do not outpace our morals.

Symptoms of Parkinson’s Disease

Parkinson’s disease (PD) affects an estimated 1.5 million Americans and about 2% of people over 65 in the U.K. Its prevalence increases with age, although roughly 15% of Americans with Parkinson’s disease are 50 or younger. Parkinson’s disease is part of a broader spectrum of disorders known as parkinsonism. While it was viewed as fairly homogeneous in the past, researchers and clinicians now recognize the complexity of the disease and its related disorders.

The defining neurological marker of Parkinson’s disease is the destruction of the substantia nigra pars compacta, a small nucleus that is one of the brain’s major dopamine-producing areas. Symptoms of PD are not evident until around 80% of the neurons in the substantia nigra (literally, “black substance”) are destroyed. Because the substantia nigra produces dopamine, an important neurotransmitter, the dopamine depletion associated with PD affects the striatum, which normally helps suppress the subthalamic nucleus. This in turn results in more activity in the globus pallidus and substantia nigra pars reticulata, which in the end leads to greater inhibition of the thalamic nuclei that are involved in motor functioning. To summarize, decreased dopamine results in decreased motor activation as well as other motor problems.

The common features of Parkinson’s disease are easily remembered by the mnemonic TRAP.

  1. T – Tremor, specifically resting tremor. Tremor that occurs when moving (e.g., reaching for an object) is called essential tremor and is not a defining characteristic of PD; in fact, it is a different but related disorder.
  2. R – Rigidity. Difficulty moving and stiffness of the arms and legs.
  3. A – Akinesia. Absence or slowness of movement.
  4. P – Postural instability. Posture problems.

Gait abnormality is also one of the common features of PD, and it is especially useful for detecting the disease early in its course. The common gait problems are decreased step height and length and reduced arm swing (i.e., walking with more of a shuffle than a normal gait). People with PD also often take very small steps when turning around.

PD patients often have difficulty swallowing saliva, so they often drool. They also often have micrographia (very small handwriting) that gets progressively smaller with prolonged writing. Depression is common in PD patients as well. If given levodopa (L-dopa), PD patients will typically respond. Symptoms of dementia often occur as well, but usually only a few years after diagnosis. However, there are often milder cognitive changes early in the disease process, such as slowed processing speed and slowed reaction time.

Reference

Frank, C., Pari, G., & Rossiter, J. P. (2006). Approach to diagnosis of Parkinson disease. Canadian Family Physician, 52, 862-868.

The Guillotine and Neuroscience

The air was chilly in 19th-century Paris as a criminal was led to his fate. The man had committed a crime and was sentenced to pay. A crowd gathered to watch his punishment. There standing before him was the fateful Madame, the progeny of a French engineer. This woman with the acerbic jaw was to seal the criminal’s fate. He faced the crowd wide-eyed and fearful, pleading for his life. His pleas seemed to fall on deaf ears as the frenzied crowd prepared for the spectacle. A German man stood waiting to play his part. Theodor Bischoff was not there to enjoy the public execution; he was there in the name of science. As the executioner led the criminal to the apparatus named after Joseph Guillotin (who, by the way, did not invent the guillotine), Bischoff approached. The blade fell and the criminal’s head dropped to the ground. Bischoff quickly rushed over to the head to perform his experiment.

Bischoff wanted to know whether or not consciousness was centered in the head – in the brain – and if any awareness resided after the beheading. He quickly thrust his fingers at the poor criminal’s eyes to see if there was any eye-blink. There was none. He placed smelling salts under the nose, with no reaction. Finally he spoke the word, “Pardon!” into an ear. Again, no response. He was satisfied with the results and concluded that consciousness did in fact reside in the brain and that it ended when the head was severed. His early neuroscience experiment was complete.

While this approach seems unorthodox at best today, early researchers sometimes had to resort to unusual techniques in order to investigate the influence of the brain on behavior, emotions, and consciousness. Their research methods were often seriously flawed, but the work they did was important; each new discovery contributed to our current understanding of the brain. So while we now have much better methods for researching the brain than antagonizing disembodied heads, our current work as neuropsychologists and neuroscientists is founded on the research of such creative men as Bischoff.

Note: I dramatized the story and, as such, it is a bit of historical fiction. I don’t know whether Bischoff was in Paris; he might have been in Germany when he did the experiment. However, Bischoff did perform this experiment.