GABA receptor role in postoperative cognitive decline

About 20-30% of older adults (over age 60) undergoing major surgery, particularly cardiac and orthopedic procedures, experience temporary (and generally reversible) memory and thinking deficits afterward. A small minority (<5%, probably much less) might not return to their cognitive baseline (how they were before surgery). The cause of this decline is unclear, although many attribute it to the anesthesia used. So far, however, research has been inconclusive as to the specific causes of cognitive difficulties after surgery, in part because surgeries are major events that affect most parts of the body, not just what is being operated on. They are stressful, both physically and emotionally.

Newly published research proposes one mechanism for memory problems after surgery: anesthesia acting on α5 subunit-containing ɣ-aminobutyric acid type A receptors (α5GABAA receptors). This new research suggests that the function of these receptors does not return to baseline until much later than previously believed. This means that normal chemical signaling in the brain, particularly signaling important for memory, might be disrupted for longer than expected and might play a role in the memory problems that some individuals experience after major surgery.


Zurek, A. A., Yu, J., Wang, D. S., Haffey, S. C., Bridgwater, E. M., Penna, A., … & Orser, B. A. (2014). Sustained increase in α5GABAA receptor function impairs memory after anesthesia. Journal of Clinical Investigation, 124(12).

Modeling the Human Brain

Wired has an article about Dr. Henry Markram’s goal to simulate an entire human brain within 10 years. While his goal will not be met within that time frame, this is important work to do. If we have a way to simulate brain development and function, it can help us understand how brain disorders arise and guide their treatment.

One of the great things about the project is the collaborative nature of it: “‘But the only way you can find out is by building it,’ [Markram] says, ‘and just building a brain is an incredible biological discovery process.’ This is too big a job for just one lab, so Markram envisions an estimated 6,000 researchers around the world funneling data into his model…. Neuroscientists can spend a whole career on a single cell or molecule. Markram will grant them the opportunity and encouragement to band together and pursue the big questions.”

Read the Wired article for more information about the project and the 1 billion euro grant Markram received.

Intelligence and Neurological Conditions

Intelligence is an interesting concept. We have tests that measure what we call intelligence, but such tests are limited and culture-centric (not that that is necessarily a negative thing). For the sake of discussion, however, I will operationally define aptitude (i.e., intelligence) as the Intelligence Quotient (IQ) so as to have a standard metric as a foundation for this post.

I spend time assessing people’s memory and thinking abilities. I almost always try to get some measure of baseline aptitude, either by estimating it (e.g., from years of education, vocabulary knowledge, or word-reading ability) or by measuring it formally with an intelligence test. Granted, this has limitations, but it allows me to estimate how well an individual’s brain should function across multiple domains of thinking (e.g., problem-solving, reasoning, memory, language, and so forth). In other words, the higher a person’s general aptitude, the better he or she will generally do across most cognitive domains, barring brain insult. This is certainly not a rule codified in stone and in triplicate, but it serves as a rubric to follow.

Intelligence as measured by IQ is generally quite stable across the lifespan but can improve modestly with diligence in informal or formal education. It can also decrease modestly if people are intellectually inactive, although such declines are slight. What can happen, though, is that as brains age, or if they are damaged by a pathological process or an injury, components of IQ can decrease. My primary clinical and research focus is understanding how brains and cognition change in old age, both naturally and in the presence of neurological (brain) insult. Remarkably, the measures we use for intelligence tend to be rather insensitive to aging and even to neurological insult; at least some components of intelligence are generally spared by brain damage. However, this leads to one area where our conceptualization of intelligence as IQ starts to break down.

As people age, their brains almost universally slow down. Wear and tear on the brain over decades of life affects how well and how quickly we can think. Blood, which is essential for life and for the functioning of the brain, happens to be toxic to brain cells. Sometimes the protections that keep blood far enough from brain cells (neurons) to protect them, yet near enough to feed and maintain them, start to break down over time. This can injure the brain and reduce how well it works, even lowering IQ. Now, does that mean that a person’s intelligence has decreased? If IQ = intelligence, then yes, it does. But contrary to how I operationalized intelligence earlier, intelligence is not synonymous with IQ. IQ can be a useful concept, but it is far from perfect, particularly if by using it one argues that someone is less intelligent simply because his head was injured in an accident or because she developed dementia or suffered a stroke.

This is an area that demonstrates the limitations of our current research and clinical conceptualizations of intelligence. However, understanding how IQ changes over time and how it is affected by neurological conditions is important information to have, as it can help localize areas of pathology.

Modems and White Matter

Yesterday my connection to the Internet stopped working. I tried restarting the cable modem, the wireless router, and the other attached devices. That didn’t fix the problem, though it’s usually a good first step. I saw that the Internet connectivity light was lit on the modem but the PC/Activity light was not, which suggested that maybe the router was bad. I then plugged my computer directly into the modem via Ethernet, and my computer did not recognize that a cable was plugged in. I had discovered what was wrong. While it hadn’t taken me long to figure out the problem, I did what many people do and looked for solutions in the hardware first rather than in the connections. That’s not necessarily wrong, since cables are hardier than electronic components, but it did reveal my biases. So what was the problem?

The components were all okay, modem and router alike, but the connections were not. Wiring was the problem. Being interested in the brain, I immediately knew this would make a great brain analogy.

When someone’s cognitive functioning changes, one of the first things clinicians usually jump to is which part of the cortical or subcortical gray matter went bad, so to speak. While those components can and do go bad, we often overlook, just as I did at first, the connections. In my case, the Ethernet cable had gone bad. There are many times when what’s affected in the brain is not the components but rather the wiring: the axons. White matter might be just as important as, or even more important than, gray matter for cognition, even if its contribution is more subtle. Much of my current research revolves around this idea.

So the moral of the story is that when things are not working correctly, the wiring might be the culprit.

How did my ethernet cable get damaged? Maybe it just stopped working spontaneously but it also had experienced a bit of acute stress earlier in the day (the modem fell off its stand). Something might have happened to the cable during this time. The white matter of our brain can similarly be affected by traumatic injury, nontraumatic injury (anoxia, hypoxia, etc.), stroke, or a long history of cerebrovascular problems. Just as we can take care of our electronic equipment (by not dropping it or knocking it off its home or stepping on it or whatever else we can do to our technology), we can take care of our white matter by avoiding similar injuries.

Exercise, weight control, and managing diabetes, blood pressure, and cholesterol can all help protect white matter from going bad and disconnecting different brain areas. We can’t connect to the Internet if our wiring is bad.

Can We Cure Parkinson’s Disease?

The National Parkinson Foundation produced a series of brief videos in which prominent clinicians and researchers in the field provide overviews of topics related to Parkinson’s disease. One video addresses the questions of whether we can cure Parkinson’s disease and how we treat it.

The short answer is: no, we cannot right now cure Parkinson’s disease. We have hopes that stem cell therapies will work but there are a number of issues related to stem cells that make them potentially problematic (e.g., how do we make sure they don’t turn into cancers).

We can, however, treat symptoms of Parkinson’s disease with drug, physical, and cognitive therapies. L-dopa is effective at reducing tremors in most people as well as at increasing the rate and speed of movement. In some cases, deep brain stimulation is warranted; it has been shown to be quite effective for many people. But for now, we cannot cure Parkinson’s disease.

Mad Cow And Alzheimer’s Have Surprising Link : NPR

If this research showing a link between prion proteins and the deleterious effects of beta-amyloid plaques in rat brains holds true in humans, it has huge implications for finding a cure for Alzheimer’s disease. That would be as big as the polio vaccine or the eradication of smallpox. This is research worth watching closely.

The Death of Psychotherapy

I’m going to preface this post by stating that it was written to help me think through the relationship between neuroscience and therapy. As such, it is a philosophical journey through some of my thoughts and is not necessarily even what I really believe, because I’m still working out what I believe. Thought exercises like this are one way I try to keep my beliefs about psychology and neuroscience balanced: if I start leaning too strongly one way, I’ll start looking for things that disconfirm those beliefs and see what I discover. It’s a bit of playing Devil’s Advocate with myself and a bit of philosophizing. Some of my friends and I used to do this in junior high and high school, holding philosophical discussions where we tried to argue for things we didn’t necessarily believe (e.g., classic topics such as supposing that this world and universe aren’t really real but are just reflections of reality; again, not something I believe, but we would speculate). What does this all have to do with psychology and neuroscience?

The brain is what drew me to psychology initially. However, I vowed I would never go into clinical psychology because I didn’t think I would like therapy or dealing with people’s problems. Over time I discovered neuropsychology. Most neuropsychologists are clinical psychologists, so in order for me to be a neuropsychologist, I had to be trained as a clinical psychologist. There are many things I enjoy about clinical psychology, but therapy is not one of them. Granted, most neuropsychologists do not actually do therapy, but we have to be trained in it. I enjoy talking with people in sessions, but I haven’t been that impressed with therapy as a whole so far. Maybe that’s just because I haven’t yet found the particular therapeutic method that really “clicks” with me. Cognitive-behavioral therapy is fine, but so much of actual therapy in practice is just plain common sense; not everyone has a lot of common sense, though, so they need some training in it. Part of me recognizes the validity of therapy, but another part of me struggles with it. Now on to my main point.

The more I study the brain and the more exposure I have to therapy (giving, not receiving), the more biased toward the brain I become. What I mean is that as we continue to discover more about the brain, the more behavior we can explain on the basis of biology or neurophysiology, and the less important I think therapy is. I’ve written about this topic in the past but wanted to briefly revisit it; this is somewhat of a second chapter to that post. Before I continue, I want to expose one of my biases: I believe humans have free will. Even though some of my beliefs about the brain could be seen as mechanistic and deterministic, I do not believe that a strongly biological foundation for behavior rules out free will. You can still assume biological foundations without assuming determinism if, for example, you hold a monistic set of assumptions that incorporates both mind (“nonmaterial”) and body (“material”) in one. [I put quotes around nonmaterial and material because, philosophically speaking at least, mind is not necessarily nonmaterial and body is not necessarily material.] Monism is similar in spirit to a unified field theory (e.g., a Grand Unified Theory) or the Theory of Everything for which some theoretical physicists are searching. That’s not what I’m going to write about, and if it didn’t make sense, don’t worry about it (I discussed this topic in a couple of different posts: here {I linked to that post previously} and here). To summarize, I view behavior through a strong biological bias, but I do not assume determinism.

As I said earlier, the more I learn about the brain and behavior (through research and observation), the more I lean toward neuroscience and away from “traditional psychology.” However, I still appreciate the psychosocial aspects of behavior; the nature versus nurture dispute will never be resolved because both are important. The environment, meaning all external stimuli, is important, but the problem with downplaying biology is that biology is the medium of behavior. What I mean is, everything we think, sense, perceive, or do is translated and transmitted through the firing of neurons. This means that all abnormal behavior, which is what psychologists are often interested in, originates in a neuron or related cell. Whether the cause of that behavior was internal or external is irrelevant; all events and stimuli are translated into patterns of neuronal firing.

This is why I think that understanding the biology of the brain is the best way to understand a person’s behavior. However, because we have an imperfect understanding of the biology of the brain, we have an imperfect understanding of the biological foundations of behavior. This means that until we have a perfect understanding, we cannot ignore the psychosocial aspects of behavior; and even with a perfect understanding we couldn’t, because even if we understood the “translation” process, we might not understand the origin of what needs to be translated. This is where traditional talk therapy can be most beneficial. Still, I believe less and less that talk therapy is the best solution for many psychological issues. Over time, as we discover more and more about the brain, therapy will become even less important.

That is a fairly radical position for a student of clinical psychology to take; it’s more in line with psychiatry, or rather, I believe, with neuroscience. I’m not saying that therapy is useless, only that as we gain a more perfect understanding of the brain and how various chemicals interact within it, we will have less need for people to help others by “talking” through their problems. The better we understand the physiology of the brain, the more natural our pharmaceuticals will be; in other words, it will be easier to mimic and create normal brain functioning. Of course, many will ask, “What is normal?” That’s a good question.

Some may argue that with depression, for example, many people have negative self-image and self-evaluations, which can lead to depression. That is true, but it’s the classic chicken-and-egg question: which came first? Did the negative thoughts cause the depression, or did the person experiencing negative thoughts have a biological predisposition to those thoughts and to depression? In other words, it is possible that biology originally led to the negative thoughts and not vice versa. This is all speculation, but I think there is increasing evidence for this view.

The big problem with my point, though, is that at some point it does become a deterministic system, in that we could conceivably medicate away people’s free will. That would be an unacceptable outcome. There would be a lot of power in this knowledge and many opportunities for abuse, but that’s an ethical discussion for another time.

To summarize, I think that as we (speaking collectively) gain a more perfect understanding of the brain (and even of individual differences in the brain), we will be better able to eradicate and prevent many or most psychological disorders. We could potentially stop schizophrenia through genetic engineering or other modifications. Again, I’m not addressing whether we should; I simply believe we will have the ability at some point. This is why, at the moment, I lean more toward neuroscience than toward psychotherapy. Of course, tomorrow I could [but won’t] write a post that completely contradicts this one. As I said, this is a process, and I think it’s important to argue both sides of the issue.

Learning and Recall – Hippocampal Firing

Today in Science, a team of scientists (Hagar Gelbard-Sagiv, Roy Mukamel, Michal Harel, Rafael Malach, and Itzhak Fried) at the Weizmann Institute of Science in Israel, UCLA, and Tel Aviv University published research in which they directly recorded, via implanted electrodes, the firing of hippocampal neurons during learning and free recall. This is the first time this has been done in humans. Here’s the abstract from Science:

The emergence of memory, a trace of things past, into human consciousness is one of the greatest mysteries of the human mind. Whereas the neuronal basis of recognition memory can be probed experimentally in human and nonhuman primates, the study of free recall requires that the mind declare the occurrence of a recalled memory (an event intrinsic to the organism and invisible to an observer). Here, we report the activity of single neurons in the human hippocampus and surrounding areas when subjects first view television episodes consisting of audiovisual sequences and again later when they freely recall these episodes. A subset of these neurons exhibited selective firing, which often persisted throughout and following specific episodes for as long as 12 seconds. Verbal reports of memories of these specific episodes at the time of free recall were preceded by selective reactivation of the same hippocampal and entorhinal cortex neurons. We suggest that this reactivation is an internally generated neuronal correlate of the subjective experience of spontaneous emergence of human recollection. (Published Online September 4, 2008; Science DOI: 10.1126/science.1164685)

The New York Times also has an article about the research.

Building a Better Brain

Let’s look forward a number of years. Bioengineering is at the point where replacing people’s organs with lab-grown ones is standard procedure. Gone are the days of transplant patients taking anti-rejection medications for the rest of their lives. Transplanted organs are all manufactured using stem cells from the patient’s own body, whether from bone marrow, skin, or any number of other sources. New organs are rapidly grown using modified growth hormones to speed up their development: a complete new organ is grown within a few weeks, the surgery performed, and the transplant patient home within days. Because of the relatively low cost of such procedures, everyone has access to transplants. Replacing hearts, livers, lungs, kidneys, and other organs has increased life expectancy dramatically, with most people living well over 100 years. Scientists are on the verge of transplanting the first manufactured brain. Knowledge of neural networks and cognition is at the point where a person’s entire knowledge system and all memories can be downloaded and stored as a backup. Scientists are working on manufacturing an entire replica human body as a “clone” in case a person is seriously injured. While individual organs come fairly cheap, a whole body is prohibitively expensive; a large portion of the cost is the brain. Even though scientists have created working brains, their success rate is still only about 5% (though always getting better). They go through a lot of brains.

Some people use this new biotechnology to create backups of their bodies. Others have started using it to enhance the performance of their existing bodies. In the laboratory, scientists are able to create organs that are effectively perfect: they are created under well-controlled conditions and don’t have to run the gauntlet of normal development, with its exposure to teratogens, fluctuations in nutrition, and all the other things that can affect development. Popular organs to replace are hearts and lungs; people are able to run faster than ever before thanks to more efficient versions of both. Other people get new legs or arms with well-sculpted muscles. Still others receive nanotech implants to enhance normal biological performance. None of this is being done in the United States or the United Kingdom, but there are plenty of countries that don’t outlaw the procedures.

With body enhancement now common, many people want to enhance their brains. They want a new brain created with certain gyri a little bigger and cortex a little thicker. Some researchers are working on improving the speed and efficiency of neurotransmission. Most of the improvements in brain design come from turning certain genes on and off at different points in development and providing the lab-grown brains optimal nutrients and stimulation. These enhancements can create brains that can learn 1000 times more in 1000 times less time.

I’ve taken a bit of liberty in my hypothetical treatment of bioengineering and biotechnology in the unspecified future. Scientifically speaking, little stands in the way of humans eventually reaching this point. The question is, should we? Should we seek to create immortal and essentially all-knowing humans through science? Supposing humans can build better brains and bodies, should they control and manipulate natural biological processes to the extent that they can create “superbeings”? I’m not going to answer these questions; I just want to raise them. With our great advances in bioengineering, technology, and neuroscience, where do we draw the line, assuming we draw one at all? Do we eradicate all developmental, genetic, and environmental diseases and disorders? Do we cure epilepsy, cancer, autism, Alzheimer’s disease, and every other disorder? Do we enhance some functions, such as hearts or muscles, but not the brain?

With all advances in science, we must always be mindful of the underlying morality and ethics of those advances. We need to make sure that our advances do not outpace our morals.

The Guillotine and Neuroscience

The air was chilly in 19th-century Paris as a criminal was led to his fate. The man had committed a crime and was sentenced to pay. A crowd gathered to watch his punishment. There, standing before him, was the fateful Madame, the progeny of a French engineer; this woman with the acerbic jaw was to seal the criminal’s fate. He faced the crowd wide-eyed and fearful, pleading for his life. His pleas seemed to fall on deaf ears as the frenzied crowd prepared for the spectacle. A German man stood waiting to play his part. Theodor Bischoff was not there to enjoy the public execution; he was there in the name of science. As the executioner led the criminal to the apparatus named after Joseph Guillotin (who, by the way, did not invent the guillotine), Bischoff approached. The blade fell, and the criminal’s head dropped to the ground. Bischoff quickly rushed over to the head to perform his experiment.

Bischoff wanted to know whether consciousness was centered in the head, in the brain, and whether any awareness remained after the beheading. He quickly thrust his fingers at the poor criminal’s eyes to see if there was any eye-blink; there was none. He placed smelling salts under the nose, with no reaction. Finally, he spoke the word “Pardon!” into an ear. Again, no response. Satisfied with the results, he concluded that consciousness did in fact reside in the brain and that it ended when the head was severed. His early neuroscience experiment was complete.

While this approach seems unorthodox at best today, early researchers sometimes had to resort to interesting techniques to investigate the influence of the brain on behavior, emotions, and consciousness. Their research methods were often seriously flawed, but the work they did was important; each new discovery led to our current understanding of the brain. So while we now have much better ways to study the brain than antagonizing disembodied heads, our current research as neuropsychologists and neuroscientists is founded on the work of creative men like Bischoff.

Note: I dramatized the story, so it is a bit of historical fiction. I don’t know whether Bischoff was in Paris; he might have been in Germany when he did the experiment. However, Bischoff did perform this experiment.