1.) 10%: As this story goes, we humans use only about 10 percent of our brain’s capacity. This long-debunked myth is so well ensconced that a film was built around the premise as recently as 2014. That movie, “Lucy”, features a title character who accidentally ingests an overdose of a drug that allows her to exploit ever-greater levels of her mental capacity. Admittedly, by the movie’s end the 10% myth is one of the lesser violations of reality, because as Lucy approaches 100% of her mental capacity all the laws of physics dissolve in her presence.
The reference at the end of this paragraph lays out the full evidence against this belief; here I’ll just sketch part of the evolutionary argument. The fundamental rule of the biological world is that mother nature doesn’t over-engineer. Once there is no further benefit in terms of enhanced likelihood of survival, we don’t evolve new and costly capacities. We don’t see people who can run 500 miles per hour or jump 50 feet vertically from a standstill; those capacities weren’t necessary to survive the beasts that preyed upon us. Our brains are very costly: they consume 20 to 25% of our energy intake. [For the science, see: Beyerstein, Barry L. 1999. “Whence Cometh the Myth that We Only Use 10% of our Brains?” In Mind Myths: Exploring Popular Assumptions About the Mind and Brain, ed. Sergio Della Sala. Hoboken, NJ: Wiley. pp. 3–24.]
Why does this myth get so much play? There are two likely reasons. One is wishful thinking. We’d all like to think there is much more available to us, and in lucid moments of meditation or flow we may even feel that we have tapped into a vast dormant capability. (In both cases, it’s interesting that the enhanced performance we may experience is a function of parts of the brain shutting down, not of increased capacity ramping up.) The other reason is that people see savants of various sorts but don’t account for the full picture. There are people who can carry out activities with their brains that seem supernatural. However, it should also be noted that savants who can memorize phone books or tell you the day of the week for a random date hundreds of years in the past [or the future] often suffer corresponding downsides with respect to their brain activity. And of course, there are people who are simply geniuses, endowed with more intelligence than most of us. They have a bigger pie; they aren’t just eating a bigger slice.
2.) Left and Right Brained: The myth is that some people use one hemisphere of their brain much more than the other, and that this explains why some people are creative and artsy while others are logical and mathematical. We must be careful to separate the myth from the reality. The science is NOT saying that there aren’t some people inclined to be “artsy” and some inclined to Spock-like rationality; clearly, these personality types exist. Nor is the science suggesting that there aren’t some functions carried out predominantly in one hemisphere–e.g. language is largely a left-brain function. The myth is the claim that predominant use of one hemisphere causes these extremes of personality type. This idea has had–and continues to have–a great following, but it’s not supported by studies that use the latest brain imaging technology to see exactly where the brain is active.
For the science, see: Nielsen JA, Zielinski BA, Ferguson MA, Lainhart JE, Anderson JS. 2013. “An Evaluation of the Left-Brain vs. Right-Brain Hypothesis with Resting State Functional Connectivity Magnetic Resonance Imaging.” PLoS ONE. 8(8): e71275.
Why is this a persistent myth? First, we all know people who fit neatly into one of the boxes, either “artsy” or “logical.” The left-brained / right-brained story is as good a way as any to explain these differences in the absence of evidence. It may also have been a way for people to attribute their weaknesses to an uncontrollable cause (always a popular endeavor among humans). Second, once the idea caught on, a lot of people built it into their teachings, businesses, and academic work. Yogis used the notion to support ideas about imbalances in the “nadi” (channels). Psychologists used it in their personality tests and profiles. In short, many people had a vested interest in continuing the false belief. Third (maybe), there could be something to how we notice differences versus similarities. (E.g., a person says, “Hey, look, Jimi Hendrix is playing a left-handed guitar. Lefties are creative.” The next thing one knows, people are disproportionately noticing the left-handed individuals in the arts [and failing to notice the many (more) righties.])
3.) The Conscious Mind Makes All Our Decisions: We all have a conscious mind that we think of as our ultimate decision-maker. The evidence, however, suggests that this is wrong. It turns out that it’s entirely possible for an entity to think it has decision-making authority when–in fact–it’s learning about done-deal decisions after the fact. Like the left / right brain myth, this idea was firmly entrenched right up until brain imaging technology became sophisticated enough to show which parts of our brain were firing and when. In the wake of such imaging studies, we could see that the subconscious mind does its work first, and this has led to the widespread (though perhaps not consensus) view that decisions are made subconsciously before they filter up to our conscious minds.
There are many sources of information on this idea, but one popular book built around it is David Eagleman’s “Incognito”, which devotes itself to the part of our neural-load iceberg that sits below the waterline (i.e. the subconscious). Scientists often compare consciousness to the CEO of a large and complex corporation. The CEO doesn’t personally make every decision. Instead, the CEO sets the agenda and strategy, and–if all goes as planned–the decisions that are made are consistent with those overarching ideas.
It’s easy to see why this myth is persistent. First, if part of a process is buried from view, as subconscious thought has been (and–to a large extent–continues to be), then it’s easy to see how humanity would develop a story that excludes it and fills in from the visible parts. Second, there is immense vested interest in protecting the various views of consciousness that are embedded in religions and quasi-scientific undertakings.
Third, people have a deep-seated need to feel in control, and reducing the role of consciousness to long-term strategist and rationalizer of decisions would seem to make free will illusory. A number of scientists and scholars (e.g. Sam Harris) do argue that free will is an illusion. However, it should be noted that others suggest otherwise (e.g. Daniel Dennett and Michael Gazzaniga). These “compatibilist” scholars aren’t necessarily arguing that the conscious mind is the immediate decision-maker, in contradiction of the scientific evidence. What they are arguing is that through learning, thinking, and agenda-setting, people can influence the course of future decisions–perhaps imperfectly, as when one eats a sleeve of Oreos after contemplating what one has learned about why that’s not good for you. (This goes back to mother nature not over-engineering. The conscious mind must have some role in facilitating survival, or it–being incredibly costly–wouldn’t have evolved. If it can’t influence our path, it can’t enhance our likelihood of survival.)
I’ll attach this video by Alfred Mele, which contradicts the notion that free will has been proven an illusion. This isn’t to say I’m convinced Mele is right, but he lays out the issues more clearly than most.
4.) The Sleeping Brain Shuts Down: I won’t spend a lot of time on this one because: a.) in a sense it’s a continuation of the consciousness myth (i.e. our conscious mind loses track of that time, and so we assume nothing is happening because our conscious experience = what we believe our world to be), and b.) it’s not as ingrained a myth as some of the others. Perhaps this is because it began to be debunked with electroencephalogram (EEG) studies decades ago–in the 1950s–well before the functional Magnetic Resonance Imaging (fMRI) that has been providing many of our most recent insights into the brain. (The EEG measures brainwaves; the fMRI measures blood flow.)
The NIH (National Institutes of Health) offers a quick and clear overview of this science that can be viewed here.
Our bodies go from a very relaxed state to a completely paralyzed one over the course of a night’s sleep. The paralyzed state occurs during Rapid Eye Movement (REM) sleep and may be an adaptation that kept our ancestors from fleeing out of trees or cliff-side caves during their dreams. If the body is essentially immobile, it’s not much of a stretch to imagine that the brain is as well. However, the brain is like a refrigerator–always humming in the background (part of the reason it uses between a fifth and a quarter of our energy).
5.) Adults Can’t Generate New Brain Cells: This was the prevailing view until quite recently. It was believed that one’s endowment of cells–at least as far as the Central Nervous System (CNS) is concerned–didn’t change or replenish once one reached adulthood. It turns out that, at least for the hippocampus, there is now evidence for CNS neurogenesis (the production of new nerve cells).
There’s a TED Talk by Sandrine Thuret that explains the current state of understanding on this topic, including which activities and behaviors facilitate neurogenesis. (Long story short: exercise and certain healthy foods are good; stress and sleep deprivation are bad.)
What is the basis of this myth? First, one must recognize that the studies don’t show that any and all CNS nerve cells are regenerated. That means there is an element of truth to the myth; stated more precisely, it would match the best current understanding of medical science. Of course, telling teenagers that every beer they drink irrevocably kills 20,000 brain cells has probably proved a popular–if ineffective–reason for the myth’s persistence. (Note: heavy drinking, at least, is definitely damaging to the brain, though by damaging and interfering with dendrites rather than by “killing brain cells,” and the damage isn’t believed to be irreversible.)
6.) Emotion and Reason Are Forever at Odds: Most people have had the experience of boiling over with emotion. That is, they’ve experienced instances in which they believed a particular emotion didn’t serve them and they didn’t want to be caught up in it, and yet they couldn’t help themselves. It’s clear that we have some ability to inhibit or suppress emotions; recent findings have suggested that the neural pathways involved in voluntary suppression are different from those used when one is persuaded to suppress an emotion.
Of course, there’s also evidence that continually suppressing emotion can have a downside. While the correlation remains unclear, it’s commonly believed that suppression of emotion is related to untimely deaths from certain diseases–e.g. cancer–and there has been some evidence to support this.
Still other evidence supports the notion that there are healthy ways of regulating one’s emotional life, as opposed to the ineffective and counterproductive process of simply suppressing emotions. The key may lie in changing one’s way of perceiving events rather than telling oneself not to show emotion. Rather than having the conscious mind wrestle with the emotion, practices like breath control have proven effective in regulating one’s response to stressful situations.
The brain is an organ we all possess, and we intuitively think we’ve got a grasp of it. Yet brain science is one of the scientific disciplines in which we have the most to learn.