Some people continuously worry about getting cancer. In fact, a population-based survey found that although a third of respondents never worried about getting cancer, more than half worried occasionally and 6% worried often.[1]

When the persistent fear of cancer rises to the level of an overt phobia, it is known as cancer phobia or carcinophobia. It may lead to repeated medical examinations that fail to reveal a malignancy. Despite this, people with this condition cannot be reassured by their clean bill of health for any length of time. This article will explore why most of us probably don’t need to worry about getting cancer.

Luckily, as the survey demonstrated, most of us are not overtly phobic about cancer, even though it may be lurking deep in our subconscious. Why is it not an active fear for the bulk of the population? It is likely because in people without any signs of cancer, it is not perceived as an imminent threat. We are hard-wired to fear clear and present dangers. Risks and threats far into the future don’t get as much priority in our constellation of daily fears.

Examples of this from our daily life abound. For example, research has demonstrated that most people are not willing to take urgent action on climate change if it is presented as a distant threat.[2] But if portrayed as proximal in time and place, more people are willing to act with urgency.

This may seem unrelated to worrying about cancer, but the underlying neurobiological mechanism is the same. We’ll explore that later in this post.

So, should we worry more about getting cancer?

George Klein (1925-2016) was Professor Emeritus at the Microbiology and Tumor Biology Center at the Karolinska Institute in Stockholm, Sweden when he published a fascinating article in The Scientist. The article makes the point that approximately one in three people will be struck by cancer in their lifetime.[3] But, the other side of that coin is that two out of three people remain unaffected. Even the majority of heavy smokers who bombard their lungs with carcinogens and tumor promoters over many years remain cancer-free.

A systematic review of autopsy studies revealed that the prevalence of incidental prostate cancer ranged from less than 5% in men under age 30 to almost 60% by age 70.[4] A not-insignificant percentage of these cancers, when localized and low-risk, never progress to clinically significant disease during the person’s lifetime. This has led to active surveillance being offered as an option in place of immediate treatment.[5]

It is also known that circulating tumor cells (CTCs) are present in many cancer patients. However, only a portion of these cells will enter and persist in distant parts of the body.[6] These are known as disseminated tumor cells or DTCs. And only a fraction of them develop into secondary tumors (metastases).

What keeps these micro-cancers in check?

They are kept in check by a mix of factors, including immune surveillance that detects and eliminates abnormal cells before they can grow into tumors [7] and the inability of dormant micro-tumors to recruit the new blood supply (angiogenesis) they need to expand [8].

In other words, when it comes to getting or not getting cancer, the glass is more than half-full.

So, should we just relax and not worry? Actually, that’s not a productive question to ask. A more interesting one, one that may actually yield useful answers, is:

What makes most people resistant to cancer?

What causes cancer?

As the Cancer Society of Finland succinctly states [9]:

“Cancer is caused by accumulated damage to genes. Such changes may be due to chance or to exposure to a cancer causing substance.” 

Risks related to developing cancer

There are a number of ways this damage can occur, including, but not limited to, the following:

      • Lifestyle factors that expose us to carcinogens, including
        • tobacco smoke
        • alcohol
        • UV radiation in sunlight
        • food factors, such as nitrites
      • Occupational exposures, such as  
        • asbestos
        • tar and pitch
        • polynuclear hydrocarbons
        • some metal compounds
        • certain plastic compounds
      • Infections with certain viruses or bacteria (Helicobacter pylori, hepatitis B, or Epstein-Barr)
      • Radiation exposure
      • Some drugs, in particular,
        • medications that weaken the immune system
        • anticancer drugs
        • certain hormones
      • Genetic predisposition (for example, Lynch Syndrome [10])
      • Factors not yet identified

We know that colon cancer, breast cancer, and prostate cancer develop through progressive stages of mutations that ultimately cause cell division to spin out of control and cells to proliferate wildly.

Related content:
Have Breast Implants? Should You Be Worried About Cancer
A Lump or Bump on the Eyelid. Beware It Could Be Skin Cancer

Is cancer resistance simply the absence of mutations?

You might be wondering at this point whether not getting cancer is merely the absence of harmful mutations. If that were the case, then is not getting cancer simply a matter of luck? To answer this question, let me paraphrase Albert Einstein’s quip about quantum mechanics: evolution doesn’t play dice. It increases its odds with natural selection.

It turns out that mutations, harmful or otherwise, occur all the time in all of us. With a few exceptions related to certain genetic or pathological conditions, most of the rest of us possess several well-known anti-cancer mechanisms.

The body’s anti-cancer mechanisms

In a classic paper published in PNAS, George Klein identified five kinds of anti-cancer mechanisms [11]:

1. Immunologic

The first type of resistance Klein describes is immunological. For example, researchers have compared the antibody responses of the squirrel monkey and the marmoset when infected with Herpesvirus saimiri. Marmosets, but not the squirrel monkeys, develop rapidly growing lymphomas after exposure to the virus. Of note, the virus is endogenous to squirrel monkeys, but marmosets never encounter it. 

The researchers found a striking difference in the timing of each animal’s antibody response. In the tumor-resistant squirrel monkeys, the antibodies rose to a high level just three days after the infection. However, in the marmosets, the response took three weeks, too late to stop the virus-driven lymphoma.

The dynamics of the antibody response suggest that the squirrel monkeys had pre-existing memory T cells against the virus, whereas the marmosets had to develop them first before a full-blown antibody response could be mounted, a process that takes about three weeks.

2. Genetic

The second mechanism Klein describes is genetic. Our cells are constantly subjected to DNA damage. And, there are individual variations in the efficiency of the repair mechanisms.

Although in the vast majority of us these mechanisms repair the damage quickly, in some people they do not. An example is a DNA repair deficiency disorder called xeroderma pigmentosum [12]. Individuals with this deficiency are highly sensitive to ultraviolet light. Even with careful protection, they develop multiple skin cancers as a result.


3. Epigenetic

According to a review of cancer epigenetics, the term refers to the “study of heritable changes in gene expression without alterations in DNA sequences.” [13] Unlike changes in the genome itself, epigenetic changes are reversible. Some key epigenetic processes include the following:

        • changes in DNA methylation
        • chromatin modifications
        • alterations of nucleosome positioning
        • changes in non-coding RNA profiles

These alterations can lead to altered gene function as well as a neoplastic transformation of cells.

The next two mechanisms are, for some reason, my favorites.

4. Apoptosis or cell death

As part of an intracellular defense, a cell can trigger apoptosis (cell death) if it detects extensive DNA damage. This prevents the cell from reproducing and spreading the damage. It is the ultimate altruism on the cellular level.

In some individuals, this mechanism fails. For example, the cellular protein p53 is a tumor suppressor. Inherited mutations in its gene cause Li-Fraumeni syndrome [14], a rare disease in which patients develop multiple cancers starting in childhood.

5. Factors in the tissue’s microenvironment

The last mechanism of defense against tumors resides in the microenvironment in which tissues are embedded. Here is a striking example. The naked mole-rat (NMR) and the blind mole rat (BMR) live up to about 30 and 20 years, respectively, and never develop cancer. How do they pull off this trick?

The naked mole rat

The naked mole-rat (NMR) displays exceptional longevity, with a maximum lifespan exceeding 30 years [15]. This is the longest reported lifespan for a rodent species. It is especially striking considering its small body mass. In comparison, a similarly sized house mouse has a maximum lifespan of 4 years. In addition to their longevity, naked mole-rats show an unusual resistance to cancer.

The NMR is a social species that lives in highly organized matriarchal societies. It has to force its way through narrow and often sinuous underground tunnels. The connective tissue in its skin contains a high-molecular-weight form of hyaluronic acid (HA) that makes the animal’s skin malleable. The corresponding HA in mice and humans has less than one-fifth of the molecular weight.

The heavy form of HA that occurs in the NMR is not only beneficial for the animal’s locomotion. It also prevents the transformation of normal cells in cell culture into cancer cells. Only after it has been removed can the NMR’s cells be transformed into cancer cells. The NMR cells also display an extreme sensitivity to contact inhibition. The cells stop dividing when barely touching each other.

The blind mole rat

Several species of blind mole rats (Spalax judaei and Spalax golani) are common in Israel and surrounding countries. BMRs are small subterranean rodents distinguished by their adaptations to life underground and their longevity, with a maximum documented lifespan of 21 years. They also show remarkable resistance to cancer.

In tissue culture, when overproliferation starts taking place after several cell divisions, BMR cells begin secreting interferon-β, which triggers a massive, concerted cell suicide response [16]. The Masada phenomenon is apparently alive and well in this Middle Eastern species.

In case you conclude that it is subterranean living or the small size that protects these animals from getting cancer, think again—the blue whale is cancer-resistant as well.  So, we don’t have to live underground or go back to the ocean where our very distant ancestors came from.

The bottom line is that most of us don’t need to worry about getting cancer

Rather, we can take a deep breath and relax because two-thirds of us will never develop cancer for all of the reasons described in this article.

As for the other third, don’t despair. New diagnostics and therapeutics for cancer are being developed at a rapid rate. That doesn’t mean all cancers are curable yet. But I, for one, am putting my faith in human ingenuity to one day make cancer much less feared than it is today.

References:

  1. Murphy P, Marlow L, Waller J, et al. What is it about a cancer diagnosis that would worry people? A population-based survey of adults in England, BMC Cancer, 2018 Jan 24. https://pubmed.ncbi.nlm.nih.gov/29361912/
  2. Brügger A,  Morton T,  Dessai S. Hand in Hand: Public Endorsement of Climate Change Mitigation and Adaptation, PLOS One, 2015 Apr 29,  https://doi.org/10.1371/journal.pone.0124843
  3. Klein G. Resisting Cancer, The Scientist, 2015 Apr 1, https://www.the-scientist.com/features/resisting-cancer-35711
  4. Bell K, Del Mar C, Wright G, et al. Prevalence of Incidental Prostate Cancer: A Systematic Review of Autopsy Studies, Int. J. Cancer, 2015 Oct 1, https://pubmed.ncbi.nlm.nih.gov/25821151/
  5. Garisto J, Klotz L. Active Surveillance of Prostate Cancer: How to Do It Right, Oncology (Williston Park), 2017 May 15, https://pubmed.ncbi.nlm.nih.gov/28512731/
  6. Dasgupta A, Lim A, Ghajar C. Circulating and Disseminated Tumor Cells: Initiators or Harbingers of Metastases? J Mol Oncol, 2017 Jan 9, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5423226/
  7. Gonzalez H, Hagerling C, Werb Z. Roles of the Immune System in Cancer: from Tumor Initiation to Metastatic Progression, Genes Dev 2018 Oct. 1, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6169832/
  8. National Foundation for Cancer Research. What is Angiogenesis? Accessed 1/24/21. https://www.nfcr.org/research-programs/research-focus-areas/angiogenesis
  9. Cancer Society of Finland. All About Cancer: Facts About Cancer, Accessed 2021 Jan 24, https://www.allaboutcancer.fi/facts-about-cancer/what-causes-cancer/#d8f54e89
  10. Cancer.Net Editorial Board. Lynch Syndrome, Cancer.Net, 2020 Jan. https://www.cancer.net/cancer-types/lynch-syndrome. Accessed Jan 24, 2021
  11. Klein G. Towards a Genetics of Cancer Resistance. PNAS, 2009 Jan. 20, https://www.pnas.org/content/pnas/106/3/859.full.pdf?sid=f73133d0-35bf-4d9b-a3eb-a3fac2ba089f 
  12. MedlinePlus. Xeroderma Pigmentosum, National Library of Medicine, Accessed 1/24/2021, https://medlineplus.gov/genetics/condition/xeroderma-pigmentosum/
  13. Kanwal R, Gupta K, Gupta S. Cancer Epigenetics: An Introduction, Methods Mol Biol, 2015, 1238:3-25, https://pubmed.ncbi.nlm.nih.gov/25421652/
  14. Medline Plus. Li Fraumeni Syndrome. National Library of Medicine, Accessed 1/24/2021, https://medlineplus.gov/genetics/condition/li-fraumeni-syndrome/
  15. Tian X, Azpurua J, Hine C, et al. High-molecular-mass Hyaluronan Mediates Cancer Resistance in the Naked Mole Rat, Nature, 2013 June 19, https://www.nature.com/articles/nature12234
  16. Gorbunova V, Hine C, Tian X, et al. Cancer resistance in the blind mole rat is mediated by concerted necrotic cell death mechanism, PNAS, 2012 Nov 12, https://www.pnas.org/content/pnas/109/47/19392.full.pdf

First published 5/3/15. Updated 3/25/18. Major revision 11/9/19. Updated 1/24/21. Updated 2/5/21.

A while back, I had an epiphany. No, it was more like a rude awakening. Specifically, I had great faith in science and the mechanisms it has designed to ensure that what is reported is “the truth, the whole truth, and nothing but the truth.”

In case you are worried that I went over to the dark side of evolution or climate change deniers, rest assured that I haven’t lost my sanity yet. I still believe in science. But when a serious flaw in its procedures is uncovered, it is important to shine a bright light on it.

The story begins with a paper published in Nature with the unassuming, even boring, title: Analysis of protein-coding genetic variation in 60,706 humans. In fact, it was so unassuming that I thought it was just another “inside baseball” article written by and for professional geneticists, so I skipped it.

It was only a few months later that I noticed a new article in the same magazine with a much more seductive title: A radical revision of human genetics: why many “deadly” gene mutations are turning out harmless

My interest was piqued; I had to read it. So here is the story as related in the article by Erika Check Hayden, a science writer.

A rare and mysterious disease

In 2010, Sonia Vallabh’s mother died of a rare and mysterious disease called fatal familial insomnia. With ‘familial’ in the name of the disease, it was only natural that Vallabh sought genetic counseling to see if she carried the fatal mutation of the prion protein gene PRNP, called D178N. She did. Prions are proteins that can misfold and self-aggregate, a process made far more likely by certain mutations, causing several degenerative neurological diseases.

Refusing to accept this virtual death sentence without a fight, she and her husband quit their respective careers and enrolled as graduate students in biology. One of their aims was to determine whether or not the D178N mutation definitely caused the disease.

Enter Dr. Daniel MacArthur of Massachusetts General Hospital, a Harvard teaching hospital in Boston. He asked the very same question when he started his research into rare muscle diseases:

How rare are these diseases?

For that, you need to know how many people have the mutation and how many don’t. So, by persuading many geneticists to pool their genomic data, MacArthur and his colleagues created a database of the protein-coding sequences (exomes) of more than 60,000 people.

This database, called ExAC (for Exome Aggregation Consortium) gave him the power to answer his original question about how common or rare a given mutation in humans is.

But there was an additional insight. If a certain variation (mutation) in the genetic sequence occurs at a much lower frequency than statistically expected, then one could conclude that the mutation is indeed lethal.

Conversely, suppose a mutation is branded lethal but is common in the general population without showing any pathology. The inescapable conclusion would be that the mutation is not lethal.

Indeed, MacArthur and his group found nearly 180,000 instances of mutations so severe that they should render their protein products completely inactive. One would expect that these mutations would be either lethal or severely pathogenic. But they weren’t. The individuals carrying them were happily healthy and well.

What about the Vallabh mutation?

What about Sonia Vallabh and her mutation? She and her husband Eric Minikel joined MacArthur’s lab. There they gathered genetic data from more than 16,000 people who had been diagnosed with prion diseases and compared them with data from more than 600,000 healthy individuals.

They found that 52 people in the ExAC database had PRNP mutations that have been linked to prion diseases. But based on the prevalence of the disease, they would have expected about two. Again, the obvious conclusion is that not all PRNP mutations are lethal.
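
To see why 52 carriers is such a striking excess, it helps to spell out the back-of-the-envelope arithmetic. The little sketch below (in Python) only illustrates the logic; the lifetime-risk figure is a placeholder I picked for the example, not the number Minikel and colleagues actually used.

```python
# Back-of-the-envelope check behind the "expected about two" figure: if a
# variant really were fully penetrant (i.e., every carrier gets the disease),
# its carriers should be roughly as rare as the disease itself.
# The lifetime-risk figure below is a placeholder, not the study's number.

def expected_carriers(db_size, lifetime_disease_risk, fraction_of_cases_with_variant, penetrance):
    """Expected number of carriers of a variant class in a population database."""
    carrier_frequency = lifetime_disease_risk * fraction_of_cases_with_variant / penetrance
    return db_size * carrier_frequency

expected = expected_carriers(
    db_size=60_706,                      # exomes in ExAC
    lifetime_disease_risk=1 / 50_000,    # hypothetical lifetime risk of genetic prion disease
    fraction_of_cases_with_variant=1.0,  # assume these variants explain all such cases
    penetrance=1.0,                      # assume every carrier eventually gets sick
)
observed = 52                            # reportedly pathogenic PRNP carriers actually found

print(f"expected ~ {expected:.1f} carriers, observed = {observed}")
# When observed vastly exceeds expected, the variants cannot all be fully penetrant.
```

The punchline is the comparison, not the exact numbers: when carriers of a supposedly fully penetrant, fatal variant turn up dozens of times more often than the disease itself, the penetrance claim cannot be right for all of them.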

However, the story doesn’t end there. PRNP can have several mutations. Some may be benign, as we saw in the 52 individuals that have them. But some may indeed be lethal.

What about Vallabh’s specific mutation in the PRNP gene, D178N? Vallabh and Minikel found that it was absent from the hundreds of thousands of healthy genomes they examined, unfortunately suggesting that this mutation is indeed rare and, in all likelihood, lethal.

How did we end up with this situation?

We’ll probably never know how many individuals receive the bad news that they carry some really dreadful mutation, and are living in dread of what their future holds. How could that happen?

Simply put, the scientists made the same mistake that non-scientists make: they equated association with causation. Just because a protein is so severely mutated as to render it inactive doesn’t mean that it will inevitably cause pathology.

There could be other mutations that counter the pathogenic effects of the inactive protein. Or there could be environmental factors that modify, or completely neutralize, the bad gene (epigenetic modification). There could be alternative metabolic pathways that circumvent the inactive one.

Related: Most Americans Do Know Something About Science After All

The case of Woody Guthrie

Case in point: the folk singer Woody Guthrie (“This Land Is Your Land”) died of Huntington’s disease, a devastating degenerative neurological disease. His son Arlo (“Alice’s Restaurant”) and Arlo’s own children, all musicians, are free of the disease. In genetics, the term incomplete penetrance describes this phenomenon, although it doesn’t explain its cause(s).

An editorial in Nature states that

“…researchers who hunt for genetic mutations likely to cause disease need to be cautious. Many, it seems, have not required enough evidence before asserting that a particular variant is harmful.”

And,

“It is becoming clear that many human genetic variations are relatively rare, and when researchers do not examine large enough groups of people with and without disease when scanning for pathogenic mutations, they are likely to mistakenly conclude that particular variants of interest turn up only in people with disease. The truth may be that they haven’t looked hard enough for these variants elsewhere. These conclusions have consequences for real people.”

In praise of science

There is another aspect to this story: the unstinting self-criticism of the scientific community. What other discipline has the courage and the integrity to expose its flaws and take steps to correct them?

Our legal system? Witness the resistance of states to acknowledging the grave errors committed in their courts in the face of irrefutable DNA evidence. Our political system? Have you heard any politician, from our president on down, admit that they screwed up? Don’t hold your breath.

This is the essence and beauty of the scientific method. It is not infallible, but it is self-correcting in its ceaseless search for the truth, the whole truth, and nothing but.


This article was originally published on 1/23/17. It was reviewed and updated by the author for republication on 7/22/20.

We used to have a dog, Hubert Beagle-Basset, who suffered from a severe case of separation anxiety. Whenever we came back home, even if we had only been gone for a short period of time, he used to run joyous circles around the dining room table—we used to call them victory laps (My humans came home! My humans came home!)

Our present dog Sherman, a big black lab, suffered from depression when we got him from the San Francisco SPCA shelter. He had already been there twice. We tried to let him know that he had hit the jackpot coming to live with us. But he was so insecure that he didn’t wag his tail for a whole year.

He’s fine now—he wags his tail a lot—but he is still pretty “weird.” If he isn’t out for a walk or eating, he is hiding out in our closet. He loves the small dark space and he loves being alone.

Dog psychiatry

I am pretty sure Sherman would qualify for the canine diagnosis of an introverted personality. Ask any veterinary psychiatrist (yes, they exist) and they will tell you that dogs suffer from almost every psychiatric disorder that afflicts humans—all except one: schizophrenia.

If we played the quasi-philosophical game of “what defines us as human?” my first choice would be our capacity to hope and the ability to plan for the medium and distant future. But a close second is the uniquely human malady of schizophrenia.

Now, I am not being flippant. No other species is “endowed” with this psychiatric disorder. Interestingly, as I have written before, schizophrenia is associated with creative genius, a characteristic that is also uniquely human. What’s the connection, if any?

Related content: Does Your Dog Have Personality? But of Course!

Genetics of schizophrenia

If you wanted to identify all the genes that are somehow associated with schizophrenia, you would do the obvious. That is, compare the whole genomes of people with schizophrenia to those of people without the disorder. That sounds easy, but actually, it’s quite an undertaking.

Despite the huge strides made in DNA sequencing, the accurate sequencing of a whole human genome is still far from trivial. Also, you would need to sequence thousands of individuals both with and without the disorder.

Why the need for massive numbers? Because the genome of every individual is, well, uniquely individual. This is because we are all continuously subject to random mutations, the vast majority of which are ‘neutral’, neither beneficial nor deleterious.

Furthermore, we all live in different environments. And as it turns out, the environment can exert its influence on our genes by inducing chemical changes, called epigenetic changes, that affect the expression of specific genes.

So, to get to the “core genome,” we have to cancel out all the “noise” in any individual genome. This can only be done by determining the sequence of thousands of genomes.

No single scientist could possibly accomplish such an undertaking. It would require the collaboration of hundreds of laboratories around the world. But, indeed this was done.

The Schizophrenia Working Group study

A study, published in Nature, is the result of a collaboration among more than 300 scientists from 35 countries. This collaboration is called the Schizophrenia Working Group of the Psychiatric Genomics Consortium. The researchers compared the whole genomes of nearly 37,000 people with schizophrenia with more than 113,000 people without the disorder. And the results?

They found 128 gene variants associated with schizophrenia, in 108 distinct locations in the human genome. The vast majority of them had never before been linked to the disorder.
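
For readers curious about what “associated” means here, the sketch below shows, in miniature, the kind of case-versus-control comparison that sits behind each of those hits. The allele counts are invented for illustration (and the example needs the SciPy library); a real genome-wide study adds heavy quality control, corrections for ancestry, and a far stricter significance threshold because millions of variants are tested at once.

```python
# A miniature version of the statistical comparison behind "variant X is
# associated with schizophrenia": are the allele counts in cases and controls
# more different than chance would allow? The counts below are made up.
from scipy.stats import chi2_contingency

# Rows: cases, controls. Columns: risk-allele count, other-allele count.
# Each person contributes two alleles (~37,000 cases, ~113,000 controls).
allele_counts = [
    [8_200, 65_800],     # hypothetical cases
    [23_300, 202_700],   # hypothetical controls
]

chi2, p_value, dof, expected = chi2_contingency(allele_counts)
print(f"chi-square = {chi2:.1f}, p = {p_value:.2e}")
# A tiny p-value flags association, not causation: the variant (or something
# nearby inherited along with it) is merely more common in cases than controls.
```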

Bear in mind, a study like that cannot identify specific genes that cause the disease. But, it does provide a list of genes that will become the subject of detailed investigations as to their role in the causation of the disease. But with such a long list of genes, where do you start?

An evolutionary approach to the genetics of schizophrenia 

Why is schizophrenia uniquely human? Researchers at the Icahn School of Medicine at Mount Sinai came up with a brilliant evolutionary approach to the question.

Schizophrenia is relatively prevalent in humans despite being detrimental. The condition affects over 1% of adults. So it must be associated with something that confers a selective advantage. And that “something” must be uniquely human.

Indeed, there are segments of our genome that are called human accelerated regions, or HARs. HARs are short stretches of DNA that, while conserved in other species, underwent rapid evolution in humans following our split with chimpanzees. This is presumably because they provided some benefits specific to our species.

What do HARs do?

The genes found in those HAR stretches don’t code for proteins; instead, they regulate other genes in their vicinity. Could some schizophrenia-associated genes happen to be in the neighborhood of some HARs?

To find out, Dudley and colleagues used data culled from the Psychiatric Genomics Consortium that I mentioned above. They first assessed whether schizophrenia-related genes sit close to HARs along the human genome—closer than would be expected by chance.

It turns out they do, suggesting that HARs may play a role in regulating genes contributing to schizophrenia. And, what makes those genes even more interesting is that they were found to be under stronger evolutionary selective pressure compared with other schizophrenia genes. This implies that the human variants of these genes are beneficial to us in some way despite harboring schizophrenia risk.
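
Here, as a toy illustration, is what a “closer than expected by chance” test can look like. The coordinates are random placeholders, so, unlike the real data, no enrichment should appear; the point is the shape of the permutation logic, not the numbers.

```python
# A toy version of the proximity question: do schizophrenia-associated loci sit
# closer to HARs than random positions would? We compare the observed mean
# distance-to-nearest-HAR against a null built from random loci. A real analysis
# works per chromosome and controls for gene density, assembly gaps, etc.
import bisect
import random

GENOME_LENGTH = 3_000_000_000                               # ~3 Gb flattened into one line
hars = sorted(random.sample(range(GENOME_LENGTH), 2_700))   # stand-in HAR positions
scz_loci = random.sample(range(GENOME_LENGTH), 108)         # 108 stand-in risk loci

def mean_distance_to_nearest(points, sorted_targets):
    """Average distance from each point to its nearest target (binary search)."""
    total = 0
    for p in points:
        i = bisect.bisect_left(sorted_targets, p)
        neighbours = []
        if i < len(sorted_targets):
            neighbours.append(sorted_targets[i] - p)
        if i > 0:
            neighbours.append(p - sorted_targets[i - 1])
        total += min(neighbours)
    return total / len(points)

observed = mean_distance_to_nearest(scz_loci, hars)

# Null distribution: the same number of loci dropped at random, many times over.
null = [mean_distance_to_nearest(random.sample(range(GENOME_LENGTH), len(scz_loci)), hars)
        for _ in range(1_000)]
p_value = sum(d <= observed for d in null) / len(null)
print(f"observed mean distance = {observed:,.0f} bp, permutation p = {p_value:.3f}")
```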

Beneficial HARs

To help understand what these benefits might be, Dudley’s group then turned to gene expression profiles. Gene sequencing is important, but it can only tell us a gene’s structure, not its function. So the most we could say about these genes is that they are associated with the disease.

To find a causal connection we need to know the function of the gene when it is turned on and off, and in what tissues. That’s what gene profiling does.

Dudley’s group found that HAR-associated schizophrenia genes are found in regions of the genome that influence other genes expressed in the prefrontal cortex (PFC). Inputs into this area arrive from the rest of the brain and are integrated to carry out higher cognitive functions that we associate with being human, such as judgment, planning, decision making, and the like.

Many of those inputs are mediated by the neurotransmitter dopamine. Others are mediated by acetylcholine, norepinephrine, and glutamate. But all of them are excitatory. They deliver a positive signal.

Nothing in biology is unregulated

Now, nothing in biology is left unchecked or unregulated. Too much of a good thing can be highly disruptive to the stability of the system. Just imagine if a whole cacophony of signals assaulted the PFC. Ideas rushing in uncensored, images flooding in unfiltered, voices unrelentingly filling our consciousness—we would go crazy.

GABA in relation to schizophrenia

Image: GABAergic neurons (via Wikipedia)

To prevent this dismal state of affairs, we need an inhibitory system, a yang force to counteract the yin, if you will.

Thankfully, we have such a system. There are neurons that secrete an inhibitory neurotransmitter called GABA, which tamps down the cacophony of the various signals and maintains our sanity.

So what did the gene profiling of those HAR-associated genes find? It found that they are involved in various essential human neurological functions within the PFC, including the synaptic transmission of the neurotransmitter GABA.

Not surprisingly, GABA’s impaired transmission is thought to be involved in schizophrenia. If GABA malfunctions, dopamine runs wild, contributing to the hallucinations, delusions, and disorganized thinking common to psychosis. In other words, the schizophrenic brain lacks restraint.

Schizophrenia: It’s all about balance

Very few things in biology are all-or-none, like a light switch. They are more like a rheostat, dimming or brightening the light. In biology, we also refer to it as a dose-response. If you have a strong stimulus, you get an appropriately strong response.

So would it be much of a stretch to suggest that, neurobiologically speaking, creative geniuses have a hyper-stimulated dopamine system or, alternatively, an underactive GABA system?

If so, it could go a long way toward explaining their almost universal description of ideas flooding in, of visualizing sounds, of hearing conversations in their heads. And how far is that from crossing the threshold into the incapacitating pathology that we label schizophrenia?

Psychiatric disorders and imbalance

Of course, we still don’t know for a fact that all this really happens in the brains of creative people. But one thing does seem clear: many psychiatric conditions, in addition to schizophrenia, may be related to an imbalance of the signals reaching the PFC.

Paranoid ideation is closely related to schizophrenia. And, OCD, despite its frequent portrayal as a behavioral quirk, is a vicious and debilitating mental illness, with some similarities to the experiences of schizophrenia.

People with OCD can have some of the same dark ideas, thoughts, and images as someone with schizophrenia, but the person with OCD is fully aware that they generate the thoughts themselves.

The bottom line: Why dogs don’t get schizophrenia

The studies we cited reveal how wonderfully complex the human brain is, and how exceptional the human species is. They also make it increasingly clear that schizophrenia and its associated psychiatric disorders are part and parcel of our becoming what we are today. They are the price we pay for our wonderfully crafted, uniquely human brain.

More by the author:
Age-Related Memory Loss and What Can You Do About It?
Want to Know Why You Procrastinate?



This post was first published on 09/20/15. It has been reviewed and updated by the author for republication on 3/22/2020  

“What makes the human superior to field animals?” So mused King Solomon, the wisest man of his times (10th century BCE), in Proverbs. Since then, the question of how humans are different from animals has occupied the best minds of the human race, from Plato in the 5th century BCE to the molecular biologists, neurobiologists, neuropsychologists, and philosophers of the 21st century.

For a long while, we thought that it was intelligence that set us apart. But, we now know better. Whales, dolphins, crows, parrots, and apes, to name a few, have been shown to possess a high level of intelligence.

Then we wondered if it was our self-awareness that makes us unique. Not quite. Apes show various degrees of self-awareness.

So, is it our communication skills? They are indeed highly developed but, again, they are not unique. Whales, dolphins, birds, and apes all communicate via quite complex languages.

It has been suggested that our capacity to feel and show empathy is uniquely human. However, have you seen a mother elephant grieving over her dead infant? Or her whole herd commiserating with her? And, what about the African buffaloes who form a protective shield around a female who is giving birth in order to ward off predators and vultures?

In short, we are becoming increasingly aware that all these “human” traits started evolving millions of years before the first human descended from the trees to take his first tentative steps in the African savannah. That being said, there are some characteristics of humans that are truly unique and different from “lower animals.” Let’s explore some of them.

Our exceptional neurobiology allows us to plan for the future

Daniel Gilbert points out in his bestseller “Stumbling on Happiness” that,

“The human being is the only animal that thinks about the future.”

Note also, that he adds a significant caveat, “…the long-term future.”

Now, my dog does seem to plan for the near future (minutes from now). He stations himself by his food bowl about 9 AM when his breakfast time rolls around. And he starts to bark at me when it is time for his afternoon walk. But is he planning to send his offspring to dog school? Of course not.

Does the silverback gorilla in the impenetrable forest of Uganda worry about the potential effect of global warming on the food supply for his troupe five years from now? Not that we know. In fact, experimental evidence suggests that they don’t.

Whatever looks like a long-term pre-planned activity in animals, like birds building a nest for the future chicks, is believed to be the result of genetically pre-determined, automatic behavior.

If we accept the notion that we are the only animals that plan for the future, the question arises:

What is the underlying genetic and neuronal basis for such a breathtaking jump from an animal living in the present to one that is worried about the future and is planning for it?

As an extension of that, let me add the observation that we are the only species that, as part of our awareness of the future, wonders about our role in the world, and is concerned (frightened?) about dying one day.

The progress of evolution

Before we examine the changes in the brain that made it possible for us to plan for the future when our closest evolutionary cousins, the great apes, apparently cannot, let’s take a look at how evolution progresses.

Kelley Harris, now an Assistant Professor of Genome Sciences at the University of Washington, studies the evolution of mutagenesis. That is, how genomes change over time to produce variations in the traits of humans and other animals, including some that allow new species to emerge.

The popular “molecular clock” model posits that mutation rates evolve very slowly, perhaps over tens of millions of years. This is the basis of the gradual, quantitative accretion of mutations over evolutionary time.

For example, our molecular clock and that of the lowly yeast are very similar. Yeast emerged about 1.5 billion years ago; archaic Homo evolved about 600,000 years ago, and modern H. sapiens about 300,000 years ago. Think of that: over a billion years with hardly any significant change in that trait.
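
To make the clock idea concrete, here is the kind of back-of-the-envelope arithmetic geneticists do with it, using deliberately round, approximate numbers to date the human-chimpanzee split.

```python
# Molecular-clock arithmetic with round, approximate numbers.
# If two lineages accumulate neutral substitutions at a steady rate r per site
# per year, their sequence divergence d after t years of separate evolution is
# roughly d = 2 * r * t, so t = d / (2 * r).

neutral_divergence = 0.012        # ~1.2% difference between human and chimp DNA
rate_per_site_per_year = 1e-9     # a commonly used rough substitution rate

split_time_years = neutral_divergence / (2 * rate_per_site_per_year)
print(f"estimated human-chimpanzee split: about {split_time_years / 1e6:.0f} million years ago")
# Roughly 6 million years, in line with conventional estimates.
```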

Harris’s work, however, demonstrates that DNA replication fidelity is a lot like other biological traits, sometimes evolving at a snail’s pace and sometimes evolving by leaps and bounds for reasons that usually elude us.

The evolution of our brains

One could speculate, then, that if there is a qualitative difference between us and our closest relatives, the gorillas, chimps, and bonobos, it must have been one of those “leaps and bounds” that Harris’s work demonstrated.

But why speculate? An international team of 38 scientists led by Nenad Sestan of Yale University published a magnificent accomplishment in the quest to understand what makes the human brain unique.

The investigators focused on 16 regions of the brains of adult humans, chimpanzees (ape), and macaques (monkey) involved in higher-order cognition and behavior. They looked at the genetic information in the cells of these regions by sequencing the total mRNA of each cell. mRNA is the molecule that carries a gene’s code from the DNA to the cell’s protein-making machinery. The total population of a cell’s mRNA is known as its transcriptome.

Then they went further. They overlaid the data of each cell’s transcriptome on histological sections of these tissues. They did this so they could get an integrated picture of every cell in a tissue, including its genetic and protein content.

Human-specific cells

In addition to all kinds of variations in the molecular and cellular features between humans and chimpanzees, there was one finding that takes your breath away. They found some rare cells that are present in humans and are completely absent in chimpanzees and macaques.
These human-specific cells are located in the striatum, a nucleus (an agglomeration) of neurons deep in the forebrain. The name striatum comes from its appearance as stripes of gray and white matter.

Some cells in the striatum are activated by the neurotransmitter dopamine. Functionally, these dopaminergic (dopamine-responsive) striatal cells coordinate multiple aspects of cognition, including motor and action planning, decision-making, motivation, reinforcement (which, carried to an extreme, can end up in addiction), and reward perception.

The newly discovered human-specific cells (called dopamine interneurons) were found to secrete dopamine. And, these interneurons, in turn, activate the dopamine responsive neurons. Could it be that this is the location in the brain that makes us exceptionally, well…human? We simply do not know yet.

Some unanswered questions

There are still some important unanswered questions:

  • Exactly which cells in the striatum do those dopamine interneurons communicate with?
  • What functions do these cells perform?

But Daniel Gilbert’s observation that humans are the only animals that think about the future may be getting a solid cellular and molecular basis. Although we don’t yet know if the anatomical finding and the psychological observation are at all related, these findings are nevertheless, quite intriguing.

Anatomically, we are the only mammals that have this specific dopaminergic cell type. And we are apparently the only animal that engages in long-term planning.

But that isn’t the only way in which we differ from other animals. Let’s examine another fascinating way in which we are unique.

What on earth is glycobiology?

In an article in Nature magazine, Bruce Lieberman reviewed the fascinating work of Ajit Varki of the University of California, San Diego. Dr. Varki is trying to uncover the mystery of human uniqueness. Now, if you guessed that Dr. Varki is a trained anthropologist, or a neurobiologist, or even a philosopher, I wouldn’t blame you. These are the usual suspects in this field. But in actual fact, he is a glycobiologist. What’s that anyway?

Glycobiology is the study of sugars in biology. Until quite recently, this field was the backwater of biochemical research. And why not? DNA could crow about its function in storing all our genetic information. RNA could claim to be the crucial bridge between the information stored in DNA and the formation of proteins. And proteins had bragging rights as the machinery of life, performing all the functions that are critical for any living organism.

But sugars? These molecules can occur as single units (monosaccharides), such as glucose or fructose, or form chains called polysaccharides. But they are totally unglamorous. Glucose provides energy to the cell. Polysaccharides mainly cover the cell surface. They are basically dumb molecules. They have none of the sophisticated functions of information storage or enzymatic activity.

Now bear with me for a second, and don’t get intimidated by the chemical terminology. You’ll be rewarded with an amazing insight.

Related content:
What We Can Learn About Ourselves from the Genome of the Honeybee
A Brief But Powerful History of the Colors Purple and Blue

Vive la petite difference

Image: N-acetylneuraminic acid (top) and N-glycolylneuraminic acid (bottom)

What kind of polysaccharides cover the cell surface? In humans, the most common is a type of sialic acid called N-acetylneuraminic acid or Neu5Ac.

Dr. Varki discovered that we are the only animal that carries this molecule exclusively. Most other mammals also display a second sialic acid on their cell surfaces, called N-glycolylneuraminic acid or Neu5Gc, which humans lack.

Look at the two molecules. You don’t have to be a chemist to realize that the difference between us and the rest of the animal kingdom is tiny: a single oxygen atom, the extra oxygen in Neu5Gc’s glycolyl group.

In fact, Varki found that a mutation in the enzyme involved in the synthesis of Neu5Gc rendered it inactive. That’s how we humans ended up with Neu5Ac alone. In a 2019 study published in PNAS, Varki’s group showed that this human-specific mutation, by changing the sialic acids on our cell surfaces, may be one factor behind our species’ unusual propensity for atherosclerosis.

They also show that the same mutation can help explain the apparently human-specific increased risk of cardiovascular disease (CVD) events associated with red meat consumption. Now, here is a mutation that is not merely anthropologically intriguing; it is probably of paramount importance in understanding why we are uniquely prone to coronary artery disease, a risk that may be related, in part, to the consumption of red meat.

One small step in glycobiology, one giant step for humanity

How so? For that, we should ask a question that is basic to evolution: Why did this mutation survive? What selective advantage did it confer on the newly minted humans compared to their ancient evolutionary cousins, the chimps and bonobos?

The answer is not known yet, but Varki points out a tantalizing clue. Humans are not susceptible to a type of malaria organism that afflicts our close evolutionary cousins, the chimpanzees. That organism is Plasmodium reichenowi. This parasite attaches itself to the cell surface by binding to Neu5Gc; since we don’t have Neu5Gc, we don’t get infected.

On the other hand, chimpanzees are not susceptible to Plasmodium falciparum, the human malaria organism. So the overall picture is becoming clear: a single mutation allowed us to escape from at least one devastating disease, and maybe more. This is an enormous selective advantage.

You may also enjoy: Do Dogs Have Personality? But Of Course!

No free lunch

But after all, we do get malaria, albeit from a different species (P. falciparum). Interestingly, genetic analysis of this species shows that it evolved in Africa, alongside the evolving humans. Further, it accompanied the bands of early humans as they migrated out of Africa.

This is not the only disease that we acquired by becoming human. Asthma is pretty unique to us, as are rheumatoid arthritis, Alzheimer’s, and Parkinson’s. The list goes on and on.

Does the sialic acid mutation play a role in all those uniquely human diseases? We don’t know yet. But what we do know is that sialic acid, carpeting the cell surface, is critical to interactions between cells.

And such interactions are critical to the immune response, to communication between neurons, to hormones binding to their target cells, etc, etc. It would not be surprising to find this molecule in the center of physiological and pathological processes that are, well, uniquely human.

So there you have it. One tiny difference in a single molecule, and what momentous consequences it has wrought.

The bottom line

Humans have always thought of themselves as exceptional and unique. However, some of our early ideas about our uniqueness have been debunked.

We are not the only animals that are intelligent and we are not the only animals that can communicate with each other.

That being said, some amazing science has demonstrated that there are some intriguing ways in which our behavior and even our biochemistry have truly rendered us one-of-a-kind.

Other stories about evolution: The Fascinating Case of the Hairy Penis

***



Originally published on July 22, 2015, this story has been reviewed and updated by the author for republication on October 4, 2019.

A while back, Nature magazine published a very interesting article about honeybees entitled “Insights into Social Insects from the Genome of the Honeybee Apis mellifera.” The article is the product of a remarkable collaboration of 90 research institutions around the world, involving hundreds of scientists. Why should so many scientists care enough to devote a significant part of their careers to this enterprise? And why should we care about what they learned? It’s because we can learn a lot about ourselves from the genome of honeybees.

First, a bit about honeybees

The honeybee has always fascinated humans. Ancient Egyptians considered it (and the dung beetle) to be deities. Apparently, this was because of their industry and seeming intelligence.

The biblical Israelites honored some of their daughters with the name Dvora, which means “bee”. The name is still common in present-day Israel. In English-speaking countries, the name has become Deborah.

Just like the Ancients, the Mormons have incorporated the bee into their belief system. For them, the bee is a symbol of industry, perseverance, and intelligence. And, in common parlance, we say that hard-working people are “busy as bees.” Over and over again, we find that human beings are fascinated by and admire the industry of bees.

Bee science

When I was a high school student, I vividly remember reading about Karl von Frisch’s work decoding the language of the bee dance. Through a “waggle dance”, they communicate to each other the location, direction, and distance of food sources. They even maximize the efficiency of the pollen and nectar collection through two other types of dances:

  • a “shaking dance” to recruit more foragers
  • and a “tremble dance” to draft more bees to handle the food inside the hive.
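
Von Frisch’s decoding can almost be written down as a recipe: the angle of the waggle run relative to vertical on the comb gives the direction of the food relative to the sun, and the duration of the run scales with distance. Here is a toy decoder; the one-second-per-kilometer calibration is only a rough, colony-dependent approximation.

```python
# A toy decoder for the waggle dance, based on von Frisch's classic findings:
# the angle of the waggle run relative to vertical indicates the direction of
# the food source relative to the sun's azimuth, and the duration of the run
# scales with distance (roughly, and variably, ~1 second per kilometer).

def decode_waggle_dance(waggle_angle_deg: float, waggle_duration_s: float,
                        sun_azimuth_deg: float, metres_per_second: float = 1000.0):
    """Translate a dance into an approximate compass bearing and distance."""
    bearing = (sun_azimuth_deg + waggle_angle_deg) % 360   # straight up on the comb = toward the sun
    distance_m = waggle_duration_s * metres_per_second
    return bearing, distance_m

# Example: a run angled 40 degrees clockwise from vertical, lasting 1.5 seconds,
# while the sun sits at an azimuth of 120 degrees.
bearing, distance = decode_waggle_dance(40, 1.5, sun_azimuth_deg=120)
print(f"food source ~ {distance:.0f} m away, bearing ~ {bearing:.0f} degrees")
```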

Prof E.O. Wilson of Harvard, an eminent biologist and the “father” of the field of Sociobiology, quotes Karl von Frisch in his commentary on the Nature article:

“The life of bees is like a magic well. The more you draw from it, the more there is to draw.”

The genome of honeybees

There are many aspects of honeybee genome work that are of purely scientific interest. One of these is the similarities and differences of the honeybee genome as compared with the genomes of the fruit fly (Drosophila) and the malaria-bearing mosquito (Anopheles). Another is the bee’s origin in Africa.

But several aspects are of great interest to us non-experts. Take for instance the relationship between energy metabolism and longevity. It is well known that the hormone insulin and another protein called insulin-like growth factor (IGF-1) bind to the same receptor on the cell surface – the insulin receptor (IR).

Once either one of these molecules binds to the receptor, a whole cascade of signals is propagated, through specific pathways, to the DNA in the cell nucleus. Some genes are activated and some are silenced, but the common denominator to all of them is that they are involved in the regulation of the most critical functions of our lives, such as:

  • energy metabolism
  • fertility
  • aging

These are just a few of the many important biological processes regulated by insulin, IGF, and the insulin receptor.

Energy balance, reproduction, and lifespan

Why is this so important to understand? It is because we now understand the code that will allow us to decipher the relationship between food intake and aging. We have known for a long time that obesity is linked to a shortened lifespan. It is also known that the higher the reproductive capacity of a species, the shorter its lifespan.

But these relationships are turned upside down when it comes to certain bees. Queen bees are gluttons. They are constantly being pampered and fed by specialized worker bees. The worker bees make sure the queen of the hive eats a lot and doesn’t exercise too much. That way she can devote all her energies to one task: laying eggs. And this she does prodigiously, usually laying about 1500 eggs per day!

Yet, despite the gluttonous lifestyle of the queen bee, she has a lifespan of 1-2 years. This is quite remarkable when compared to worker bees, whose lifespan is only 1-2 months. If we assume that humans have a mean lifespan of about 80 years, the queen bee’s extended lifespan would be equivalent to some of us living 800 years. Methuselah incarnate!

Related content: What Helps Ensure Cooperation in Diverse Societies?

So, how did the queen bee manage to upend the seemingly iron-clad relationships between energy balance, reproductive capacity, and lifespan? If we could understand these genetic tricks that allow this to occur, it could have direct implications for human energy metabolism and longevity. Perhaps, if we could figure out how to manipulate human genes in the same way as the queen bee, we could finally be able to have our cake and eat it too – and still enjoy a long life.

The peculiar nature of honeybee society

There is another aspect to the bee genome that is a bit more somber but absolutely fascinating. Bee society is highly organized in a caste system. The Queen is at the top (Hail to the Queen!).

The male bees, called drones, are next in line. Their only function is to fertilize the queen. After performing their duty, they are killed or die on their own. At the bottom of the caste system are the worker bees. They are sterile females that divide among themselves all the tasks required to maintain the hive.

How is this amazing feat of social control achieved? It turns out that it is managed chemically. First of all, the Queen maintains her status as an exalted egg layer by keeping the other female bees sterile. She does this by secreting chemicals called pheromones. The worker bees that constantly lick and groom her pick up these pheromones, which keep them sterile.

The caste system is maintained through the action of another chemical, the “royal jelly”. This jelly is secreted from glands in the heads of adult worker bees. It serves as food for the rest of the brood. The amount of royal jelly fed to a developing bee determines its role in the hive. If you get a lot of royal jelly, for example, you may become a guard bee. If you get less, you may become a hive cleaner, and so on.

1984 and beyond

Move over George Orwell, the honeybee created a “1984 society” millions of years ago! This chemically controlled, highly organized society was formed by what E.O. Wilson aptly called a “revolution at the genomic level.” It should give us pause.

Could we also be susceptible to chemical manipulation? There is some evidence that this is the case, at least to some extent. Just think of the behaviors we are describing when we say an adolescent human male is being controlled by his “raging hormones”.

Will the nascent science of neurobiology allow scientists to find new ways of controlling human behavior through chemical and non-chemical means in the future? Who knows? These are pretty scary thoughts. But, I believe that the beauty of the bee genome work is that it gives us a thorough understanding of these complex biologic interactions. Armed with knowledge, we should be able to resist and counteract any sinister schemes to control behavior through chemicals.

More from this author:  Science and Truth: Learning from a Fatal Mutation

The bottom line

Who would have expected that such important connections exist between the “lowly” honeybee and us humans? The more we learn about genomics, the more we come to realize that, in a sense, we are all One.

***



First published October 31, 2006, this article has been updated by the author for republication.

Renowned biochemist and Nobel Laureate Paul Berg said, “I start with the premise that all human disease is genetic.” Indeed, advances in medicine have seen the transformative utility of genetic testing for cancer and other medical conditions. However, genetic testing has only recently seen utilization in psychiatry. The complexities of our brain have long eluded precise examination and treatment.

For people with mental illness and their loved ones, the journey to better health can be a long and troubled passage. Patients and their caregivers can face stressful days and restless nights, filled with profound physical and emotional challenges.

For the more than 46 million Americans whose lives are impacted by mental health conditions1, the burden can be costly as well. According to an editorial in the American Journal of Psychiatry, serious mental illnesses cost patients more than $193 billion per year in lost earnings.2  The emotional cost is boundless.

Genetic testing can reduce the pain of trial and error

The financial and emotional strain of prolonged treatment failure worsens the struggle for many people with mental illness. Unfortunately, this is an unintended consequence of psychiatric practice that often utilizes an informed trial-and-error process in selecting medications.

Going through many iterations of medicines unsuccessfully often deepens a patient’s sense of despair and hopelessness. For some patients, it may take months or years to get to the right regimen.

I’ve witnessed the impact on patients and their families firsthand. It’s not uncommon for our practice to see about 40 new patients a month, 70% of whom have been treated by between one and thirteen other doctors.

These patients have suffered through many medication missteps. I even had one patient who had been on fifteen different psychotropics in the last three to four years, none of which helped, and some of which made him worse.

Related Content: Will Polygenic Risk Scoring Change Medicine in the Future?

The challenge of treatment resistance

Shortening the time it takes to get a patient on the right medicine, or combination of medications, can do more than just treat their condition. It can also help avoid the risk of further emotional and cognitive decline and even prevent dementia or suicide.

However, achieving greater precision in prescribing drugs for mental health conditions can be a daunting task for any clinician, particularly when it involves treatment-resistant patients who go through multiple rounds of failed medications.

Addressing the challenge of treatment resistance is critical to stemming the burgeoning mental health crisis in the U.S. The Centers for Disease Control and Prevention predicts that 50% of all Americans will be diagnosed with a mental illness or disorder at some point in their lifetime.3

What makes this forecast more stark is the fact that an estimated two out of three people with depression will be prescribed an ineffective medication at the start of treatment. And, one in three people will become treatment resistant. This is according to a study published in the American Journal of Managed Care. The same study also showed that people who are resistant to treatment will pay 40% more in healthcare costs.4

Embracing innovation in mental health care

To improve the process of prescribing drugs for mental health conditions, doctors are increasingly utilizing new technologies like genetic testing to enhance their decision-making. Genetic testing brings much-needed innovation to the classic approach to treating psychiatric conditions that rely on treatment guidelines, physician training, population-based studies, and trial-and-error.

It augments diagnostic-specific treatment strategies and personalized patient data, such as symptom profile, family history, and past responses to medications, with precision medical information that is based on a patient’s personal genome. Since each person’s response to medicine can be impacted by their genetic variations, genetic testing provides actionable information that narrows the choice of medication for each individual patient.

Related content: 6 Trends Towards Reimagining Mental Health and Psychiatry 

While no single test enables a psychiatrist to prescribe with surgical precision, genetic testing makes us more precise prescribers. It identifies which drugs are more likely to work and which are more likely to have unwanted side effects. This can shorten the time it takes to prescribe the most effective medication. Each day saved is one less day of emotional torment for patients and their loved ones.

Genetic testing and mental health: What you need to know

Genetic testing is driving a paradigm shift in the personalized treatment of mental illness. As more and more doctors rely on genetic testing to support drug selection for mental health conditions, here are answers to some commonly asked questions that may be helpful for patients:

  • How is a genetic test administered?

The test is conducted in a doctor’s office and requires only a simple swab of the inside of your cheek.

  • What does it test for?

Genetic testing in mental health analyzes patients’ genes from two critical standpoints:

    1. the effect a drug has on their body (pharmacodynamics)
    2. how their body metabolizes that drug (pharmacokinetics).
  • How many genes and medications are analyzed?

The number of genes and medications covered varies depending on which commercially available genetic test is used. In my practice, I have worked extensively with the Genomind test. It covers 24 genes and 130 FDA-approved medications (36 of which carry genetic guidelines in their labeling) that are specific to a range of mental health conditions.

These include:

    • depression
    • anxiety
    • obsessive-compulsive disorder (OCD)
    • attention-deficit/hyperactivity disorder (ADHD)
    • bipolar disorder
    • post-traumatic stress disorder (PTSD)
    • autism
    • schizophrenia
    • chronic pain
    • substance abuse
  • How are test results reported?

Following lab processing, genetic test results are provided to your doctor in 3 to 5 days. Healthcare professionals utilizing the Genomind test can also arrange for a free consult with the company’s pharmacogenetic experts to discuss the results and help interpret the findings.

  • Is the test reimbursed by insurance?

Genetic testing can be submitted for reimbursement to commercial and third-party payers, Medicare and Medicaid, and some other government programs. Many private companies are also providing access to genetic testing through employee benefit programs.

Evidence of positive outcomes

Genetic testing is already proving to be a valuable asset in helping doctors improve patient outcomes in mental health care. Published studies have brought forward evidence of its utility in the doctor’s toolkit for guiding treatment strategies for various conditions.

Among these is a study published in Primary Care Companion for CNS Disorders that examined the effectiveness of pharmacogenetic testing to guide treatment in patients with mood and anxiety disorders. It found that 87% of patients (685 total) reported measurable improvements on multiple analyses of their symptoms, adverse effects, and quality of life over three months.5

Another independent study published in the Journal of Depression and Anxiety examined 817 patients with mood and anxiety disorders whose treatment was guided by a commercially-available genetic test. The researchers compared their outcomes with those of matched control patients whose treatment did not involve pharmacogenomic testing.

The results showed the patients who used the testing service had 40% fewer emergency room visits and 58% fewer inpatient hospitalizations in the six-month period following testing. Moreover, healthcare utilization costs decreased by $1,948 per patient in the same six-month period.6

Related Content:  How Virtual Reality is Improving Care for Mental Health Disorders

Final thoughts

Pulitzer Prize-winning author Dr. Siddhartha Mukherjee has said,

“In the twenty-first century… we are constructing a new epidemiology of self: we are beginning to describe illness, identity, affinity, temperament, preferences – and, ultimately, fate and choice – in terms of genes and genomes. The influence of genes on our lives and beings is richer, deeper, and more unnerving than we had imagined.”  

As the integration of genetic testing into mental health care continues, we still have a lot to learn. As with any frontier science or disruptive technology, there has been fierce debate. But mental illness is a global crisis that shows no signs of abating. As healthcare practitioners, now is the time for us to embrace innovation in order to improve our efforts to help patients.

Genetic testing offers a path forward, allowing us to partner with our patients in a more personalized, precise approach. If we truly empathize with their plight, we must leave no stone unturned in shortening their time to a full recovery.

For more information on the pharmacogenomic test we use in our practice, visit www.genomind.com.

Author’s financial disclosure: “Genomind and Potomac Psychiatry have an ongoing marketing collaboration aimed at raising visibility for Genomind pharmacogenomics services and Potomac Psychiatry’s Genetic Testing Consultations.”


References

  1. NIMH Report: Prevalence of Mental Illness (2017)
  2. Insel TR. Assessing the Economic Costs of Serious Mental Illness. The American Journal of Psychiatry. 2008;165(6):663-665.
  3. Centers for Disease Control and Prevention: https://www.cdc.gov/mentalhealth/learn/index.htm
  4. Fagerness J, Fonseca E, Hess GP, Scott R, Gardner KR, Koffler M, Fava M, Perlis R, Brennan FX, Lombard J. Pharmacogenetic-guided psychiatric intervention associated with increased adherence and cost savings. American Journal of Managed Care. 2014 May;20(5):e146-56.
  5. Brennan FX, Gardner KR, Lombard J, Perlis RH, Fava M, Harris HW, Scott R. Prim Care Companion CNS Disord. 2015 Apr 16;17(2). doi: 10.4088/PCC.14m01717
  6. Perlis R, et al. Pharmacogenetic testing among patients with mood and anxiety disorders is associated with decreased utilization and cost: A propensity-score matched study. Depression and Anxiety. 2018.

 

Whatever happened to the hairy penis?

I am sure this burning question has been on your minds for a while. It’s been on mine too. Since it is blogging day in Larkspur, I finally decided to research what happened to the hairy penis. And, I am pretty sure you are going to like this one.

Now, rest assured, I will get to the answer eventually. If you are one of my regular readers, you know that I like to start by meandering through some evolutionary biology first.

So, what happened on the way to Homo sapiens?

Linear vs non-linear thinking

Here is an example of linear thinking:

Q: Can you propose an experiment that will answer, “What makes us human?”
A: Study the human genome and find genes that are unique to us and missing in chimpanzees, the species closest to us, whose lineage diverged from ours about 5-7 million years ago.

Here is a lateral way of thinking about the problem: Compare the human and chimp DNA and find the areas that chimps have and we don’t. In other words, maybe it’s what we don’t have that makes us human.

The Stanford study

A number of years ago, a story in the scientific journal Nature reported on exactly this kind of ingenious thinking.

Two Stanford molecular geneticists, Gill Bejerano and David Kingsley, set out to search for areas of DNA that were present in chimpanzees but missing in humans. The missing areas are called deletion mutations.

They found 510 such areas but to their surprise, most of them were not genes coding for specific traits. Instead, they were what is called non-coding DNA. These are the areas between our genes. They make up 98% of the total DNA whereas genes for traits make up only 2% of our DNA.

The importance of “junk DNA.”

The endless stretches of non-coding DNA were dubbed “junk DNA”. But, like all disparaging labels for things we don’t understand, it turned out to be wrong.

We now know that non-coding DNA is actually very important. It controls the expression of the genes located nearby. And, it is critical to the command-and-control part of an animal’s genome.

The study sequences

Of the 510 areas of deletions, Bejerano and Kingsley chose two that looked interesting to study in depth. One is near GADD45G, which is a gene involved in tumor suppression. The other is next to the AR gene, which codes for the androgen (or testosterone) receptor. 

They inserted these sequences into mouse embryos to see what kind of functions they control. The first sequence acted as a brake on the development of certain brain areas. The other sequence (adjacent to the testosterone receptor) caused the mice to develop the hard spines that chimps have on their penises. [See, I told you I would get there eventually]

As you might expect, this research attracted more than casual interest. Commentary flooded the scientific literature, and the story got plenty of attention in the popular press. There is something about the idea of hairy penises that captures the imagination.

What is the evolutionary value of the hairy penis?

Many male mammals have stiff sensory hairs on their penis. However, we don’t know for certain what value they brought to the animals that evolved them.

Some evolutionary biologists theorize that they function as a broom, sweeping out the sperm of other males who had paid a visit to the missus. This would confer an evolutionary advantage because it would increase the odds that the last guy’s own DNA got the prize: offspring to whom he passed his genetic legacy.

You might also enjoy: The Ancient Human Language of Music

The theory sounds good (at least for the males). But I can’t help but wonder how the females feel about it.

Let’s not forget the second deletion

I think the second deletion in the Bejerano and Kingsley study is potentially the more interesting finding.

By removing the sequence that inhibits the development of certain brain regions, the human brain was freed to expand the neocortex. This is the outer layer of neurons that is the anatomic basis of some of the traits that distinguish us from “lower” animals. These traits include cognition as well as meta-cognition (thinking about thinking).

Related content: What Makes Humans Exceptional

Now, chimps do have a neocortex, but it is greatly attenuated compared to ours. Apparently, deletion of the DNA next to GADD45G released the neocortex from its developmental shackles and an evolutionary star was born.

Well, not quite. The emergence of Homo sapiens is actually a lot more complex, but it’s a good start or at least, a good story.

Final thoughts on the hairy penis

Whatever the true function and evolutionary advantage of the hairy penis, I for one am glad that it was deleted. I couldn’t imagine living with one (and neither can my wife).

However, the beauty of being a human being is that we can use our over-developed brains to create things that can compensate for losses.

Here is one. It is called the French Tickler.

[Image: French ticklers. Photo source: screenshot from eBay]

There are no evolutionary theories about the utility of these devices. As far as I know, there are no double-blind studies testing their efficacy. But I can tell you that some of the testimonials that I read about them, albeit anecdotal, were ecstatic. And perhaps, that is good enough.


This post was first published on June 25, 2015. It has been revised and updated for re-publication by the author. 

Some new healthcare companies claim to “reverse diabetes” with a lifestyle intervention that primarily focuses on the blood glucose and hemoglobin A1c level. That claim may be a clever marketing ploy, but it has little to do with the latest science on diabetes.

Their intervention does reduce carbohydrate and sugar intake dramatically, and it does lower blood glucose. The problem is that it does not address the core issue in Type 2 diabetes.

The definition of Type 2 diabetes

Type 2 diabetes is defined by a fasting blood glucose of 126 mg/dL or higher or a hemoglobin A1c of 6.5% or higher. If you introduce sugar and carbohydrate restriction and cause weight loss, the glucose and hemoglobin A1c levels will fall below those numbers and so, by that definition, diabetes is “cured.” The patient no longer meets the criteria for a diabetes diagnosis.
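To see just how purely numerical that definition is, here is a minimal sketch in Python. The thresholds are the ones quoted above; the function name and the example lab values are hypothetical and for illustration only, and checking numbers is obviously not the same as making a clinical diagnosis.

```python
def meets_t2d_thresholds(fasting_glucose_mg_dl: float, hba1c_percent: float) -> bool:
    """Return True if either lab value meets the numeric thresholds quoted above.

    This only compares numbers. A real diagnosis requires confirmatory testing and
    clinical judgment, and falling below the thresholds does not undo the underlying
    disease biology discussed below.
    """
    return fasting_glucose_mg_dl >= 126 or hba1c_percent >= 6.5


# Hypothetical example: before and after aggressive carbohydrate restriction and weight loss.
print(meets_t2d_thresholds(152, 7.2))  # True  -> meets the numeric definition of diabetes
print(meets_t2d_thresholds(118, 6.2))  # False -> no longer meets the numeric definition
```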

But this premise ignores fundamental and important realities. High glucose levels cause changes in gene expression that persist after the glucose returns to normal, causing atherosclerosis and other complications. This well-known phenomenon is called metabolic memory.

Metabolic memory

Metabolic memory is a critical concept that points to the broader reality of people living with Type 2 diabetes. The high glucose levels found in people with the condition are only one part of the complex molecular biology that causes diabetic complications.

The cardiovascular complications of diabetes begin in the fetus and even in the lifestyle choices of prior generations. Overfeeding or underfeeding in the parental generation can produce infants that are too small or too large, based on inappropriate activation or inactivation of genes (epigenetics). That inappropriate switching on or off is persistent, and many genes are involved, causing hypertension, diabetes, and cardiovascular complications in adults.

Patients may themselves switch these genes on or off later in life by gaining weight or smoking. These genes cause diabetes by killing insulin-producing cells in the pancreas and increasing insulin resistance. The result is high glucose, which further increases oxidative particles and switches on additional genes that cause diabetic complications.

These genes remain switched on even when the glucose is lowered. Therefore, it is very important to continue to take medications that interfere very specifically with the signaling that these genes generate.

Metformin

Promoting the message that cardiometabolic medications should be stopped when the glucose falls below the diabetic level is not in keeping with the latest science and is contrary to diabetes guidelines.


The American Diabetes Association (ADA) guidelines recommend metformin use even in prediabetic patients who have a BMI of 35 or greater, a history of gestational diabetes, or a rising fasting glucose or hemoglobin A1c. That is because metformin reduces progression to diabetes by about 31%.

Roughly half of prediabetic patients will progress to diabetes despite weight loss. Even patients in the later stages of prediabetes may have lost 70-80% of their insulin-producing function.

These prediabetic patients should not stop metformin when they have lost weight and their glucose improves. There is a tendency for diabetes to reappear with age. Certainly, metformin should not be stopped in patients who already have diabetes regardless of their glucose level.

How metformin works

Metformin reduces diabetic complications by interfering with signaling that has nothing to do with blood glucose levels. It inhibits the same master metabolic switch as the active ingredient in the drug-eluting coronary artery stent.

The increased risk of major cardiovascular events does not disappear with diet and weight loss; metformin, however, directly reduces that risk. Type 2 diabetics lose about 10 years of life, yet those on metformin live slightly longer than people without diabetes. Metformin is safe and costs only about $4 a month. It is dangerous to recommend “stopping diabetic medications once the glucose is controlled by lifestyle interventions.”

Controlling glucose

Controlling glucose levels has beneficial effects on microvascular complications like retinopathy, neuropathy, and kidney damage, but it does not reduce the problems that cause the most serious cardiovascular complications, deaths, disability, and costs. Most diabetics die of cardiovascular disease.

Lowering the glucose with lifestyle or any medication approved for the purpose does NOT reduce cardiovascular events. In fact, aggressive glucose lowering with medication and lifestyle caused more people to die.

This is especially important because cardiovascular event incidence and other complications can be dramatically mitigated with a comprehensive solution that brings to bear lifestyle management and medications that interfere with the molecular biology that causes cardiometabolic complications.

Lowering the glucose by any means below the diabetic level, stopping the medication, and expecting complications to fall may sound sensible, but that idea is not supported by the evidence.

Drugs that block epigenetic signaling

Take a look at the diagram below. Type 2 diabetes occurs mostly in patients with extra abdominal fat caused by poor diet. Excess nutrition and fat switch genes on and off in ways that increase angiotensin II, aldosterone, HMG CoA reductase, and mTOR activity.

Angiotensin receptor blockers (ARBs), spironolactone, statins, and metformin interfere directly with the molecular signaling cascades that these genes activate. Blocking angiotensin II with an ARB, aldosterone with spironolactone, HMG CoA reductase with a statin, and mTOR with metformin dramatically reduces cardiovascular events in multiple settings. The genes involved are switched on before diabetes develops.

In fact, these elements contribute to diabetes development. Increased mTOR activity directly increases insulin resistance. Blocking mTOR with metformin reduces progression to diabetes.

Elevated aldosterone levels make a strong contribution to metabolic syndrome development. Blocking angiotensin II reduces progression to diabetes.

Blocking HMG CoA reductase with a statin actually increases the incidence of diabetes but has a powerful impact on cardiovascular events and other outcomes. Even that can likely be explained by a hard look at the molecular biology. Blocking HMG CoA reductase with a statin lowers LDL cholesterol, but it also reduces Coenzyme Q10 production. Coenzyme Q10 is a powerful antioxidant and is important for mitochondrial function. Impaired mitochondrial function may explain the modestly increased risk of diabetes in patients who take statins.

Type 2 Diabetes Treatment

The best treatment for type 2 diabetes is not a matter of either lifestyle or medication. Effective treatment requires best practice implementation of lifestyle measures along with medications that interfere with the molecular biology causing disease. For most patients, medical treatment will include metformin, statins, and angiotensin receptor blockers.

It is important to move beyond a medical system focused on risk factors and organ systems.

There is a very complex interplay in the molecular biology that causes hypertension, diabetes, high cholesterol, and related complications. Reversing diabetes with weight loss and dietary interventions makes perfect sense but improving cardiovascular outcomes and reducing costs to the fullest extent requires a more comprehensive solution. We can treat chronic disease with precision medicine and molecular biology now.

Related Content:
How to Make Managing Diabetes a Habit
Small Victories Make a Big Difference in Diabetes
How to Achieve High Value from your Executive Health Program
A Type 1 Diabetes Diagnosis Didn’t Stop Her from Climbing

 

It has been known for more than two decades that elevated cholesterol is associated with an increased risk for Alzheimer’s Disease (AD).[1] It is also known that the ApoE gene produces a protein that transports fats, including cholesterol, into brain cells. In the human population, there are three variants of the ApoE gene. Seven percent of the population has ApoE2, which confers increased risk for atherosclerosis; 79% has ApoE3, which confers no disease risk; and 14% has the ApoE4 variant, which increases the risk for AD.

But not everyone with either the ApoE4 gene or elevated cholesterol gets AD. Could there be an interaction between higher concentrations of cholesterol and a specific ApoE gene variant that increases the risk? The answer seems to be yes. Research has demonstrated that modulating cholesterol alters ApoE gene expression.[2]

The impact of a high-fat diet

Further research has discovered a nexus between these two factors. A high-fat diet was demonstrated to increase insulin resistance and cognitive decline in all groups, whether ApoE4 or ApoE3. However, those with the ApoE4 gene had “exaggerated” deficits in the part of the brain specific to new learning and forming new memories (the hippocampus). But when those with the ApoE4 gene were placed on a low-fat diet for one month, all deficits reversed and learning and cognitive function returned to normal![3] The researchers concluded that those with two copies of the ApoE4 gene were particularly susceptible to neuronal and cognitive impairments due to insulin resistance caused by a high-fat diet.

What about cholesterol-lowering medications?

Having demonstrated that low-fat diets result in improved cholesterol profiles and subsequent improvement in brain and cognitive function, especially in those with two copies of the ApoE4 gene, researchers examined whether cholesterol-lowering medications could offer the same benefit.

Data from long-term clinical trials have demonstrated that some, but not all, cholesterol-lowering medications conferred reduced risk of AD and better cognitive performance, especially in those with two copies of the at-risk ApoE4 gene. The greatest positive effect was seen with atorvastatin (P = .026) and the least with lovastatin (no significant difference found). Individuals with two copies of the at-risk gene who already had symptoms of AD but received statin medication had significantly better cognitive function over the course of a 10-year follow-up compared with those who did not receive statins (P < .01).[4]

The NPTX2 gene

Recently, researchers from Johns Hopkins University discovered another brain protein that appears to work in concert with elevated β-amyloid to cause the cognitive and memory impairments of AD. The NPTX2 gene is one of the first genes to be activated when new memories are forming. If you are trying to remember what you are reading in this article, then normally NPTX2 would activate and produce the protein of the same name (NPTX2).

This protein acts as an instigator and activator of the synaptic signaling and neural circuit recruitment critical to the formation of new memories. Without this protein, the neural circuits cannot effectively synchronize to form new memories. When this gene is turned down at the same time that β-amyloid is building up in the brain, the neural circuits’ ability to adapt and organize is impaired, contributing to the cognitive and memory decline of AD.

Individuals with high β-amyloid and high NPTX2 did not show the cognitive changes of AD. And individuals with low NPTX2 and low β-amyloid also did not show impairment of cognition and memory. This study documented that both high β-amyloid and low NPTX2 were required for the negative outcomes. The good news is that whatever suppresses NPTX2 is different from what causes elevations in β-amyloid.[5] This provides additional opportunities to make lifestyle changes that protect our brains and prevent dementia, even if one has the at-risk genes.

Why staying mentally engaged is important

So, what turns on the NPTX2 gene? The activity of the neurons themselves![6] Staying mentally engaged and cognitively active keeps the neurons firing: lifelong learners keep NPTX2 turned on and have a reduced risk of AD. Additionally, externally firing the neurons with treatments such as electroconvulsive therapy has been documented to increase the expression of this gene.[7] These two findings make it likely that any activity that increases neuronal firing will activate the NPTX2 gene. This may also be one of the benefits of transcranial magnetic stimulation, which triggers neuronal firing via magnetic waves rather than electrical pulses.

Not only does NPTX2 enhance learning, neural circuit recruitment, synchronicity, and brain plasticity, it also modulates a receptor (AMPA) involved in non-programmed cell death. Therefore, while normal amounts of NPTX2 are neuroprotective and low amounts increase the risk of dementia, significantly higher-than-normal NPTX2 activity can trigger AMPA and instigate unscheduled cell death. This, unfortunately, appears to occur in persons with Parkinson’s disease and Lewy body dementia, where NPTX2 is increased by more than 800% in the motor pathways.[8]

Related Content: Is This New Alzheimer’s Drug Really a Breakthrough?

REST factor

Another protein critical to maintaining brain health is repressor element 1-silencing transcription (REST) factor. REST functions within the cell like the conductor of an orchestra, directing various genes to sound out (express themselves) or be silent (turn off). As a result, REST is involved in determining how neurons develop, what functions they fulfill, and how they connect and network with other neurons. As expected, it is highly active in childhood during the massive remodeling of brain development.

In the past, it was believed REST became inactive after a person reached adulthood. However, recent research has discovered REST is active in older brains and functions to protect the memory circuits (hippocampus) from damage due to hyperexcitation. It also plays a key role in protecting the brain from damage associated with aging. Reduced levels of REST are associated with loss of brain volume in the hippocampus (memory circuits) and increased cognitive impairment.

In persons who have the toxic protein buildup associated with Alzheimer’s dementia (amyloid and tau), those with high REST activity did not demonstrate cognitive decline or progress to dementia, supporting the idea that REST is neuroprotective. The critical question: What affects the availability of REST? Chronic mental stress suppresses REST, contributing to accelerated aging and cognitive decline, whereas stress-reducing meditation is associated with increased levels of REST and subsequent brain health.[9]

With all of this in mind, genetics appears to account for about one-third of the risk of developing AD. What, then, contributes to developing AD if it isn’t simply genetics? Strong evidence points to inflammation, which contributes to insulin resistance in the brain and sets off a cascade of events resulting in the death of brain cells and the development of AD. Exercise, along with most of the other modifiable factors (sufficient sleep, an anti-inflammatory diet, stress management, etc.), reduces inflammation and insulin resistance and keeps neurotrophins (proteins that act like fertilizer for the neurons), REST, NPTX2, and other protective factors turned on, thereby helping to prevent the development of AD.

While aging is inevitable, dementia is not!

The Aging Brain

We can make choices to protect our brains and prevent the development of late-life Alzheimer’s dementia. You can learn more about this in my new book, The Aging Brain: Proven Steps to Prevent Dementia and Sharpen Your Mind.

It is an integrative examination of the various factors contributing to AD, and it outlines a comprehensive action plan to slow the aging process and keep our brains healthy.

_______________________________

References:

[1] Jarvik GP, et al. Interactions of apolipoprotein E genotype, total cholesterol level, age, and sex in prediction of Alzheimer’s disease: a case-control study. Neurology. 1995;45(6):1092–6.

[2] Petanceska SS, et al. Changes in apolipoprotein E expression in response to dietary and pharmacological modulation of cholesterol. J Mol Neurosci. 2003;20(3):395–406.

[3] Johnson LA, Torres ER, Impey S, et al. Apolipoprotein E4 and insulin resistance interact to impair cognition and alter the epigenome and metabolome. Sci Rep. 2017;7:43701

[4] Geifman N, Brinton RD, Kennedy RE, et al. Evidence for benefit of statins to modify cognitive decline and risk in Alzheimer’s disease. Alzheimers Res Ther. 2017;9:10.

[5] Xiao MF, Xu D, Craig MT, et al. NPTX2 and cognitive dysfunction in Alzheimer’s disease. eLife. 2017 March 23;6.

[6] Reti, IM, et al., Prominent Narp expression in projection pathways and terminal fields. J Neurochem. 2002 Aug;82(4):935-44.

[7] Reti, IM, Baraban JM, Sustained Increase in Narp Protein Expression Following Repeated Electroconvulsive Seizure, Neuropsychopharmacology (2000) 23, 439–443. doi:10.1016/S0893-133X(00)00120-2

[8] Moran, LB, et al., Neuronal pentraxin II is highly upregulated in Parkinson’s disease and a novel component of Lewy bodies, Acta Neuropathol. 2008 April; 115(4): 471–478.

[9] Ashton N, Hye A, Leckey C et al. Plasma REST: A Novel Candidate Biomarker of Alzheimer’s Disease Is Modified by Psychological Intervention in an At-Risk Population. Transl Psychiatry. June 6, 2017; 7(6): e1148

 

Imagine a world where each of us knows our exact genetic fingerprint of disease risk. Your doctor will provide precisely tailored medical advice and preventive medicine to match your specific DNA sequence.

  • Will the new technique known as Polygenic Risk Scoring usher in this new and universal age of precision medicine?
  • Will each of us soon carry our fully-sequenced genome in our cell phones, giving our physicians instant access to our exact disease risk profiles rather than having to rely on the blunt instrument of a family history?

A paper on Polygenic Risk Scoring was recently published in Nature Genetics.[1] It hints at a breakthrough in genetic diagnosis that appears to be both universally applicable and universally beneficial. I have no doubt that your genome will become a very important part of the doctor-patient relationship in the very near future.

Why is this important? I’ll get there, but first, let me provide some historical perspective.

B.G. Before Genomics

Back in the good old days, a doctor would estimate your risk of common diseases from your family history. If you had first-degree relatives with coronary heart disease, breast cancer, diabetes, and the like, you would be declared at higher risk as well. General advice was then dispensed: don’t smoke, watch your weight, get regular mammograms.

On rare occasions, we knew your family disease history was caused by your genes, directly heritable and a clear and present danger to you. The classic example is Huntington’s Disease, a fatal degenerative neurologic disease inherited in an autosomal dominant fashion that puts the children of HD victims at a 50% risk of developing the disease themselves. It was only in those obvious circumstances that we could guess at the genetic origin of your personal risk of disease.

Gene sequencing and specific mutations

In the infancy of the study of the human genome, scientists discovered individual gene mutations that put individual people at dramatically increased risk for specific diseases. The most widely known example is probably the BRCA1 mutation, which increases the risk of breast cancer. Angelina Jolie is just one of the thousands of women who chose bilateral prophylactic mastectomy to mitigate the increased risk conferred by a BRCA1 mutation.

There are hundreds of examples of known mutations putting individuals at increased risk of specific diseases. Sometimes that person also has an obvious family history of the same disease, but often spontaneous mutations or de novo gene combinations can cause new genetic abnormalities in the absence of a family history. Sometimes we don’t find the genes until after you have the disease.

 Your genetic fingerprint and polygenic risk

Enter this week’s paper. Unfortunately, the full paper is behind a paywall, but here are the highlights:

Researchers analyzed 400,000 individual genomes to “identify genetic variants associated with coronary artery disease, atrial fibrillation, type 2 diabetes, inflammatory bowel disease, or breast cancer.” They identified all the variations that produced even a small bump in disease risk, not just the major mutations like BRCA1 or the gene for Huntington’s Disease. Beneath the major mutations lies a large number of minor variations that add up to increased disease risk in individuals with multiple “hits.” This pattern of increased risk can cause disease in the absence of an obvious family history.

This allows scientists to take anyone’s genome and calculate that person’s aggregate risk for these diseases, even in the absence of one of the known major mutations. They call it Polygenic Risk Scoring (poly = more than one, genic = gene). Your polygenic risk score is the total of all the minor gene variations that increase your disease risk.
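For readers who want to see the arithmetic behind the idea, here is a minimal sketch in Python of an additive polygenic score. The variant IDs and effect weights below are made-up placeholders, not values from the Nature Genetics paper, which combined millions of variants with far more sophisticated statistical weighting.

```python
# A toy additive polygenic risk score: sum of (effect weight x number of risk alleles).
# Variant IDs and weights are hypothetical placeholders for illustration only.

effect_weights = {      # per-variant effect sizes (e.g., log odds ratios)
    "rs1000001": 0.05,
    "rs1000002": 0.02,
    "rs1000003": 0.08,
    "rs1000004": 0.01,
}

person_genotype = {     # number of risk alleles this person carries at each variant (0, 1, or 2)
    "rs1000001": 2,
    "rs1000002": 1,
    "rs1000003": 0,
    "rs1000004": 2,
}

def polygenic_risk_score(weights: dict, genotype: dict) -> float:
    """Add up each small per-variant effect, weighted by how many risk alleles are carried."""
    return sum(w * genotype.get(variant, 0) for variant, w in weights.items())

score = polygenic_risk_score(effect_weights, person_genotype)
print(f"Toy polygenic risk score: {score:.2f}")
# In practice, a score like this is compared against the population distribution
# to flag people in, say, the top few percent of aggregate risk.
```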

This is a powerful upgrade to your doctor’s ability to predict disease in any given patient. We are no longer in the darkness of the B.G. age with only the family history to guide us. And, we aren’t just looking for major, single gene mutations.

Polygenic Risk Scoring allows researchers to take a deep, wide look into your risks at the level of your individual DNA.

It is a fingerprint and a method of risk stratification that is relevant to every single human on the planet, not just those with a significant family history of a specific disease.

Dr. Amit V. Khera, a cardiologist and lead author of the study, said:

“These individuals, who are at several times the normal risk for having a heart attack just because of the additive effects of many variations, are mostly flying under the radar. If they came into my clinical practice, I wouldn’t be able to pick them out as high risk with our standard metrics. There’s a real need to identify these cases so we can target screening and treatments more effectively, and this approach gives us a potential way forward.”

Sekar Kathiresan, MD, a co-author of the study and Director of the Center for Genomic Medicine (CGM) at Massachusetts General Hospital, predicts:

“Ultimately, this is a new type of genetic risk factor. We envision polygenic risk scores as a way to identify people at high or low risk for a disease, perhaps as early as birth, and then use that information to target interventions — either lifestyle modifications or treatments — to prevent disease.”

It’s only the beginning of this diagnostic revolution

I expect this technology will advance very rapidly in the years ahead. This study drew on only 400,000 genomes’ worth of disease associations and predictive power out of the roughly 7.6 billion humans on earth (about 0.005%). Over time, researchers will expand the number of genomes in the analysis, improve the accuracy, and expand the scope of diseases they can predict. I suggest all of us keep an eye on this technology and research going forward.

Reference

Amit V. Khera, et al. Genome-wide polygenic scores for common diseases identify individuals with risk equivalent to monogenic mutations. Nature Genetics, 2018. DOI: 10.1038/s41588-018-0183-z [Paywall]

 

Nobel prize winner Daniel Kahneman devoted his life’s work to acquiring an understanding of how we make decisions. In his best-selling book, “Thinking, Fast and Slow,” he describes two modes of thinking:

  1. System 1, the automatic system, is fast, instinctual, and operates with no sense of voluntary control. If what you did brought you before a judge, you’d argue that “my impulses made me do it.”
  2. The other system (the analytical system, or System 2) is slow and deliberate, and allocates attention to the effortful mental activities that demand it. It gives us a subjective sense of agency and choice. If you watch the news on TV and yell at the liar on the screen, you have actually analyzed what she said, compared it with your fund of knowledge, and arrived at the conclusion that she is lying. That is your analytical mode of thinking at work. And if you feel exhausted after watching too much of it, that is no accident. This mode of thinking requires a lot of metabolic activity.

As Kahneman states, “when we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to do.” But that’s not what actually happens. Our behavior is dictated, by and large, by our instincts, previous experiences stored in our memory bank, prejudices, and beliefs.

The reasons why we act on our instincts rather than think first and act later are obvious. First, it has survival value. If we had to ponder the issue of an approaching lion, we wouldn’t be around very long. There is also a metabolic reason. Brain activity, including thinking, is metabolically very expensive. Our whole metabolism is geared toward minimizing the expenditure of energy, and System 1 is metabolically less expensive than System 2. In other words, our brain is by its very nature lazy!

 

The origins of instincts

Now that we understand why we act on instincts, the question arises: how did we acquire them? How is it that an infant will turn his head toward a sudden loud noise, just as an adult would? The facile answer would be: The sudden noise implies possible danger, and all animals respond instinctively to a danger signal. This is obviously an adaptive response that arose through mutations in very ancient evolutionary times; natural selection then favored animals that manifested the trait earlier in development.

But are all, or even most, instincts the result of such mutations? Here is an example of an instinct that is less likely to be embedded in our DNA. Unlike conscious bias, of which we are fully aware, unconscious bias (also called implicit bias) is a bias that we are unaware of and that operates outside of our control. It happens automatically, triggered by our brain making quick judgments and assessments of people and situations, influenced by our background, cultural environment, and personal experiences. For many reasons, such an instinct is unlikely to be rooted in a mutation in our DNA. This unconscious racial bias is probably ultimately rooted in tribalism. So how did this human trait, dating from hunter-gatherer days, get transmitted from one generation to another?

 

Epigenetics and behavior

To understand how behavior can be transmitted through the generations, we need to understand a bit of molecular biology. Stay with me, I’ll make it easy.

We are used to thinking of a change in gene expression as the result of mutation: replace one nucleotide with another and you’ve affected the gene’s function. For example, some genes suppress tumor growth. Mutate them and their original function is reduced or eliminated, and you’ve got cancer. But there are more subtle changes that affect gene expression. These are called epigenetic changes.

For instance, the nucleotide cytosine, one of the four building blocks of DNA, can be methylated and, thus, change the expression of the gene containing it. Notice that no change has occurred in the DNA sequence itself: Cytosine is still there, but it is now chemically modified by the addition of a methyl group. There is also a group of RNA molecules, called non-coding RNA, which can suppress the expression of genes. Lastly, the chromosome is wrapped in a group of proteins called histones. Chemically changing some of the amino acids that make up the histones changes the histone’s structure, and this, in turn, changes the expression of the genes. So as you can see, none of the underlying DNA has been changed—rather some chemical modifications caused changes in the expression of the gene. These changes are epigenetic, “on top of” or “in addition to” genetic mutation.

The field of epigenetics is not only important, it is exciting. Why? Because it provides a deeper understanding of how the environment affects genetics. It neatly resolves the age-old debate of nature vs. nurture. There is no conflict between the two anymore: It is nature and nurture, intertwined, at times collaborating and at times clashing, that in the final analysis determine who we are.

 

But isn’t the brain the final arbiter of behavior?

Indeed it is. Consider the following example:

Female rats that exhibit lower reactivity to stress lick, groom, and nurse their pups extensively. Their offspring also react less to stress. So, how is it that all rat pups are born with the same neuroanatomical structure, yet the ones that were coddled by their mothers react less to stress? It turns out their reduced stress levels are associated with epigenetic changes in the hippocampus, the brain structure involved in memory and in hormone signaling.

Another example is the well-known phenomenon of imprinting. Ducklings follow their mother because, within hours of hatching, their brains imprinted on her. But this is not mother-specific. If the mother was not present, but another duck, or even a human, was present during the “critical period” after hatching, the ducklings would imprint on them and follow them instead. How did that happen? Through epigenetic changes in certain genes in the brain.

These behavioral changes could not have happened if the brain were a static, immutable piece of tissue. But it isn’t. The brain is plastic: it can change, and the changes can last throughout an individual’s life course and, in many cases, be transmitted to the offspring.

So regardless of whether certain behaviors, such as the startle reaction to a loud “boo!” or unconscious biases, are rooted in a genetic mutation or are the product of epigenetic changes in the brain, both depend on the plasticity of the brain, as Gene E. Robinson of the University of Illinois and Andrew B. Barron of Macquarie University in Sydney, Australia, proposed in a recent article in Science magazine.

Is it good or bad? At first blush, it looks pretty discouraging. Some of our worst instincts seem forever (at least during our lifetime) embedded in our brain. But not so fast. The brain is plastic throughout our lifetime and is always amenable to modifications. If we have poor anger control or implicit or explicit biases, if System 1 “made us do it”, there is always System 2, the thoughtful, albeit a bit ponderous, alternative. It can, through epigenetic modifications and brain plasticity, effect behavioral changes. To paraphrase Abe Lincoln, it can “call on our better angels.”