Information

Does thinking or focusing on something alter a neuron's speed?

I Googled this but couldn't find anything precise: does thinking hard about something alter the speed of a neuron's impulses? I have heard that conduction velocity can vary, but I want to understand the exact circumstances in which it varies.


There are, as far as I know, 2 main factors that determine the speed of action potential propagation:

  • Diameter: Axons with a large diameter have lower internal (axial) resistance, and hence higher conduction velocities;
  • Myelination: Myelin prevents passive diffusion of ions through the cell's membrane and increases charge separation across the cell membrane, speeding up conduction. Additionally, the gaps between the myelin sheaths, known as the nodes of Ranvier, facilitate saltatory conduction, which is akin to the action potential 'jumping' from node to node, with a net increase in propagation speed as a consequence.

These factors are physical properties of a neuron and hence are pretty much constant under physiological conditions, and are not influenced by thinking or focus.
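
For intuition, here is a minimal sketch of how those two structural factors set the scale of conduction velocity. The constants are rough rules of thumb, not exact physiological values: roughly 6 m/s per micrometer of diameter for large myelinated fibers, and a square-root dependence for unmyelinated fibers, normalized so a 1 µm fiber conducts at about 1 m/s.

```python
# Rough, order-of-magnitude sketch of how axon structure sets conduction velocity.
# The constants below are approximate rules of thumb, not exact physiological values.

import math

def conduction_velocity(diameter_um: float, myelinated: bool) -> float:
    """Estimate conduction velocity in m/s from axon diameter in micrometers."""
    if myelinated:
        # Large myelinated fibers: velocity scales roughly linearly with diameter,
        # on the order of ~6 m/s per micrometer.
        return 6.0 * diameter_um
    # Unmyelinated fibers: velocity scales roughly with the square root of diameter,
    # normalized so a ~1 um fiber conducts at ~1 m/s.
    return 1.0 * math.sqrt(diameter_um)

if __name__ == "__main__":
    for d, myel in [(1.0, False), (1.0, True), (10.0, True)]:
        kind = "myelinated" if myel else "unmyelinated"
        print(f"{d:5.1f} um, {kind:12s}: ~{conduction_velocity(d, myel):5.1f} m/s")
```

The point of the sketch is that velocity is set by structure (diameter and myelination), which is why momentary changes in thinking or attention are not expected to change it.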


Turning Off The Switch

Psychologist Alia Crum and her research team at Yale University found that to keep stress from taking over and dominating your life, your thoughts and beliefs about it must change.

Crum goes on to say that how we view stress largely shapes how our brains respond to it. If we see stress only as something horrible and react accordingly, a thought loop takes hold, and every stressful event we encounter sends the brain into a frenzy.

Instead, Crum recommends recognizing that stress can help you grow and develop internally. When study participants adopted this mindset, the things that used to cause so much stress suddenly no longer had the impact they once did.

And so I leave you with this: view the world through rose-colored lenses and you will find that stress, while it can be a killer, can also propel you forward. It's all about how YOU react, not the nature of the stress.


A secret of science: Mistakes boost understanding

Many kids fear making a mistake will label them as a failure. In fact, science is built on a mountain of mistakes, many made by the greatest minds. The trick is to view each mistake as a step along the path to understanding something better.


September 10, 2020 at 6:30 am

This past spring, before COVID-19 turned the world on its head, Anne Smith’s 9th-grade physics class was learning about electric circuits. Smith teaches science at Carmel Catholic High School in Mundelein, Ill. She gave her students paper clips, batteries, tape and a lightbulb. Then she said, “Have a go. See if you can make the bulb light up.”

Students in Anne Smith’s 9th grade chemistry lab had to determine their own methods for deciding which solutions were best for making bubbles. A. Smith

Smith sees value in letting her students experiment. She believes that some of the most powerful learning happens through trial and error. “When students are allowed to struggle through difficult material, they gain confidence,” she says. “They learn that making mistakes is part of the scientific process.”

This isn’t to say that once the task is given, Smith sits back and watches her students fail. Instead, she selects activities that may have more than one answer. Then she encourages students to try multiple approaches. She wants them to think about different ways to solve a problem.

Throughout the lesson, the students engage in group discussions. Their observations and reflections focus on the process, not the outcome.

Smith praises students for working through hard tasks. She wants to highlight how their struggles can reward them with benefits. “The point,” Smith says, “is to explore ideas and evaluate the methods [being] tried.” In doing so, students learn to value mistakes. Indeed, she finds, mistakes are an essential part of learning.

Failing to succeed

“Failure is the most important ingredient in science,” says Stuart Firestein. He studies the biology of the brain at Columbia University in New York City. He also wrote a book called Failure: Why Science is So Successful.

“When an experiment fails or doesn’t work out the way you expected, it tells you there is something you didn’t know,” he says. It suggests you need to go back and rethink: What went wrong? And why? Was there a problem with the idea? With your approach or assumptions? With your measurements? In the environment, such as temperature, lighting or pollution?

Thomas Edison, at left, poses with an early electric car that ran on his storage batteries. It took Edison thousands of failed experiments over many years to develop these batteries. National Park Service

This is the value of failure. It leads us to what Firestein calls “the portal of the unknown.” It is where the deepest and most worthwhile questions come from. And asking those questions can spark new ideas and types of experiments. The best thing a scientist can discover is “a new or better question,” Firestein says. “Failure is where the action is. It propels science forward.”

Thomas Edison is reputed to have said much the same thing, according to a 1910 biography. He wanted to make a better battery. But after working seven days a week for more than five months, he still hadn’t succeeded. He told a friend, Walter S. Mallory, that he had already done more than 9,000 experiments for the project. According to the book, Mallory replied: “Isn’t it a shame that with the tremendous amount of work you have done you haven’t been able to get any results?” The book goes on to say that Edison “with a smile replied: ‘Results! Why, man, I have gotten a lot of results! I know several thousand things that won’t work.’”

Nor did the dogged inventor stop. Eventually, he got the new battery to work. He patented it, too. Although Edison is best known for the light bulb, those batteries eventually became the most commercially successful product of his later life.


Most schools do not encourage students to fail

Smith’s and Edison’s approach is different from the way science is carried out in most classrooms. Schools tend to focus on covering lots of topics and memorizing countless facts. Many classes rely on textbooks to give students as much info as quickly as possible. The problem with those books, explains Firestein, is “they have no context.” They state the results of experiments but they don’t tell us why people did them. Nor do they describe experiments that didn’t work. By focusing on successful outcomes, Firestein says, “we leave out 90 percent of science.”

Instead, he suggests, science learning should include details of those failures. This shows the realistic process of getting to an answer. Also, students can discover why specific scientific questions arose and see how people arrived at the answer we have now.

You thought you followed the directions carefully, but the cookies still burned. What went wrong? A scientist would look at and test each of the factors that could have led to the problem. Is the oven thermostat reading the right temperature? Did you misread the baking time? Zhenikeyev/iStock/Getty Images Plus

When we fail, we question thoughts, opinions and ideas. This is what teachers refer to as critical thinking. Through such questions, we connect ideas and challenge reasoning. Both skills are highly valued in a scientist, says Firestein.

Learning about failure also makes science more approachable. Science is not just a sequence of geniuses making one discovery after another. Rather, the history of science is full of mistakes and wrong turns.

Some of the most well-known scientific facts follow a trail of failures. For example, physics icon Isaac Newton was wrong about gravity. Firestein explains that although Newton's laws of motion are "great for launching satellites and building bridges," his idea about how gravity works was wrong. It was Albert Einstein, more than 200 years later, who corrected it, again as a result of a failed experiment. He studied Newton's idea until he arrived at his general theory of relativity, which changed science's understanding of gravity. The scientific process is one in which you arrive at the truth by making mistakes, Firestein explains, "each a little less of a mistake than the one before."


Failures have also led to great discoveries. Penicillin, X-rays and insulin are all the results of experiments gone wrong. "Two-thirds of Nobel laureates have announced their winning discovery was the result of a failed experiment," says Firestein. This explains why Isaac Asimov, an American writer and biochemist, is credited with saying: "The most exciting phrase to hear in science, the one that heralds new discoveries, is not 'Eureka!' but 'That's funny.'"

The importance of failure is just as prevalent in other fields. Take this observation by professional basketball player Michael Jordan in a 1997 Nike commercial: “I’ve missed more than 9,000 shots in my career. I’ve lost almost 300 games. Twenty-six times, I’ve been trusted to take the game-winning shot — and missed. I’ve failed over and over and over again in my life. And that is why I succeed.”

A changeable brain

Michael Merzenich is a neuroscientist who worked at the University of California, San Francisco. In the 1970s, he found evidence that brains can rewire themselves over time. His work challenged the common idea that people were born with a fixed number of brain cells organized in unchanging paths. Perhaps our potential to think, learn and reason was not set from birth, he proposed.

Merzenich and his team began their research with monkeys. They aimed to map which brain cells fired when the monkeys completed a given task. The resulting "brain maps" amazed the scientific community. But he found an even bigger surprise when he later revisited the maps: The monkeys' neural pathways had changed. "What we saw," said Merzenich, "was absolutely astounding. I couldn't understand it." The only possible explanation was that the monkeys' brains had wired new neural pathways, he decided. Norman Doidge recounted the observation in his book The Brain That Changes Itself.

Merzenich’s research pointed to a concept that would come to be known as “brain plasticity.” It’s the ability of the brain to adapt and change in response to experiences. His studies went on to show that when we learn something new, an electrical signal fires and connects cells in different parts of the brain.


The place where these electrical sparks jump between brain cells is called a synapse. Synapses fire when we do things such as read a book, play with toys or have conversations. That firing strengthens connections between brain cells. If we do something only once, synaptic pathways can fade away. But if we practice and learn something deeply, the synaptic activity will form lasting networks in the brain. Indeed, learning rewires the brain.
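
As a purely illustrative sketch of that "use it or lose it" idea, here is a toy model in the spirit of Hebbian learning, in which a repeatedly used connection strengthens and an unused one fades. The learning and decay rates are arbitrary; this is not a model of real synapses.

```python
# Toy illustration: a repeatedly used connection strengthens toward a ceiling,
# while an unused one decays. Rates are arbitrary; this is not a biological model.

def practice(weight: float, repetitions: int, learning_rate: float = 0.2) -> float:
    """Strengthen a connection a little each time it is used."""
    for _ in range(repetitions):
        weight += learning_rate * (1.0 - weight)  # grow toward a ceiling of 1.0
    return weight

def neglect(weight: float, days: int, decay_rate: float = 0.1) -> float:
    """Let an unused connection fade a little each day."""
    for _ in range(days):
        weight *= (1.0 - decay_rate)
    return weight

if __name__ == "__main__":
    once = neglect(practice(0.1, repetitions=1), days=30)
    often = neglect(practice(0.1, repetitions=30), days=30)
    print(f"practiced once, then ignored for a month:  {once:.2f}")
    print(f"practiced daily, then ignored for a month: {often:.2f}")
```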

If learning can cause our brain to adapt and change, what happens when we make a mistake? In 2011, Jason Moser studied how the brain reacts when people make an error. Moser is a psychologist at Michigan State University in East Lansing. He teamed up with four other researchers. They asked 25 participants to complete a test with 480 questions. During the test, each person wore a stretch cap with electrodes that recorded activity in different parts of the brain.

The participants’ brain activity rose when they made a mistake, Moser and his colleagues found. “When a participant experienced conflict between a correct response and an error, the brain was challenged,” he says. “Trying to make sense of this new knowledge was a time of struggle and need for change.” This is when the brain reacted most strongly.

He also found two typical brain responses to a mistake. The first response indicated that something went wrong. The second reaction only came when test-takers treated the mistake as a problem that needed greater attention. Participants who responded to their error by giving it more consideration were able to do better on the test after making their mistake. Moser concluded that “by thinking about what we got wrong, we learn how to get it right.”

Even in math, a right answer isn’t the only goal. Showing classmates why you’re taking certain steps to a solution can help students learn together — or together troubleshoot where something went wrong. SDI Productions/E+/Getty Images

A new view of mistakes

Championing the value of mistakes, Jo Boaler has started a revolution in math. She teaches math education at Stanford University in California. In her 2019 book, Limitless Mind, she said people need to give up the idea that one’s ability to learn is fixed, or unchanging. Instead, she argues, we should view learning as putting us “all on a growth journey.”

She wanted to give students positive messages about mistakes and create a “mistakes-friendly” environment where students celebrate errors. Seeking to bring this idea into the classroom, Boaler established a three-week summer math camp known as Youcubed. (The last in-person session was in 2019. She now offers it as an online course.) The aim of this program is to boost confidence in math among 6th and 7th graders. When kids give answers, they are encouraged to explain their thinking. Discussing the process helped other students analyze their reasoning. This pushed them to keep trying.

At the beginning of camp, students often reported that struggling with math was a sign you weren’t doing well. But by the end of three weeks, most reported feeling more positive about making mistakes. They enjoyed being challenged and described having higher self-esteem. “When students see mistakes as positive,” Boaler says, “it has an incredibly liberating effect.”

Anne Smith’s students designed balloon rockets and then explained why they succeeded or failed using Newton’s laws. A. Smith

Learning through collaboration can also help us see mistakes in a more positive way. Janet Metcalfe studies the effects of errors and how they can benefit learning. A psychologist at Columbia University in New York City, she observed several middle-grade math classes. The most effective learning technique, she found, was giving students a chance to discuss their errors.

They might be asked: What do you think about that? How did you get your answer? Sharing the way they did a problem took much of the focus off of mistakes. Instead they described their theories and ideas. This collaboration with classmates resulted in higher test scores.

“When you connect with someone else’s ideas,” Metcalfe points out, “you go deeper.” Mistakes are simply the starting point for discussion. “You have only got something to learn if you make a mistake,” she concludes.

And that is the message Smith tries to give to her high school physics students. Yet some still come to class with a fear of failure. They believe that a wrong answer means they are not smart. Some give up before they even try because they are so scared of being incorrect.

“It is especially important for these students to see mistakes as a learning opportunity,” Smith says. Getting people to see mistakes as a natural part of learning takes time. We have all been embarrassed by making a mistake in public. But finding success as a result of struggling toward a solution will hopefully help students become more eager to approach future challenges.

To Smith, being able to “approach mistakes with confidence” is more important than anything else a student can learn.

Power Words

battery: A device that can convert chemical energy into electrical energy.

biology: The study of living things. The scientists who study them are known as biologists.

brain plasticity: The ability of the brain to essentially re-wire itself by changing directions in the flow of nerve signals or changing connections between neurons. Such changes allow the brain to grow and mature from infancy through old age, and sometimes to even repair brain injuries.

circuit: A network that transmits electrical signals. In the body, nerve cells create circuits that relay electrical signals to the brain. In electronics, wires typically route those signals to activate some mechanical, computational or other function.

COVID-19: The name given to the potentially lethal disease caused by a coronavirus that triggered a massive outbreak beginning in December 2019. Symptoms included pneumonia, fever, headaches and trouble breathing.

critical thinking: Sometimes described as a scientific attitude of mind, it is the careful and probing consideration of any belief, purported fact, idea or values, based upon the data or experiences available — and then using that assessment to make some conclusion.

develop: To emerge or to make come into being, either naturally or through human intervention, such as by manufacturing.

electric current: A flow of electric charge — electricity — usually from the movement of negatively charged particles, called electrons.

environment: The sum of all of the things that exist around some organism or the process and the condition those things create. Environment may refer to the weather and ecosystem in which some animal lives, or, perhaps, the temperature and humidity (or even the placement of things in the vicinity of an item of interest).

error: (In statistics) The non-deterministic (random) part of the relationship between two or more variables.

fire: (in neuroscience) The activation of a nerve or neural pathway.

focus: (in behavior) To look or concentrate intently on some particular point or thing.

gravity: The force that attracts anything with mass, or bulk, toward any other thing with mass. The more mass that something has, the greater its gravity.

high school: A designation for grades nine through 12 in the U.S. system of compulsory public education. High-school graduates may apply to colleges for further, advanced education.

icon: (adj. iconic) Something that represents another thing, often as an ideal version of it.

insulin: A hormone produced in the pancreas (an organ that is part of the digestive system) that helps the body use glucose as fuel.

Isaac Newton: This English physicist and mathematician became most famous for describing his law of gravity. Born in 1642, he developed into a scientist with wide-ranging interests. Among some of his discoveries: that white light is made from a combination of all the colors in the rainbow, which can be split apart again using a prism; the mathematics that describe the orbital motions of things around a center of force; that the speed of sound waves can be calculated from the density of air; early elements of the mathematics now known as calculus; and an explanation for why things "fall:" the gravitational pull of one object towards another, which would be proportional to the mass of each.

network: A group of interconnected people or things.

neuron: An impulse-conducting cell. Such cells are found in the brain, spinal column and nervous system.

neuroscientist: Someone who studies the structure or function of the brain and other parts of the nervous system.

penicillin: The first antibiotic (although not the first one used on people), it’s a natural product that comes from a mold. In 1928, Alexander Fleming, a British scientist, discovered that it could kill certain bacteria. He would later share the 1945 Nobel Prize in Medicine for it.

perception: The state of being aware of something — or the process of becoming aware of something — through use of the senses.

physics: The scientific study of the nature and properties of matter and energy. Classical physics is an explanation of the nature and properties of matter and energy that relies on descriptions such as Newton’s laws of motion. A scientist who works in such areas is known as a physicist.

plasticity: Adaptable or reshapable. (in biology) The ability of an organ, such as the brain or skeleton to adapt in ways that stretch its normal function or abilities. This might include the brain’s ability to rewire itself to recover some lost functions and compensate for damage.

psychologist: A scientist or mental-health professional who studies the human mind, especially in relation to actions and behaviors.

relativity: (in physics) A theory developed by physicist Albert Einstein showing that neither space nor time are constant, but instead affected by one’s velocity and the mass of things in your vicinity.

satellite: A moon orbiting a planet or a vehicle or other manufactured object that orbits some celestial body in space.

sequence: The precise order of related things within some series.

synapse: The junction between neurons that transmits chemical and electrical signals.

theory: (in science) A description of some aspect of the natural world based on extensive observations, tests and reason. A theory can also be a way of organizing a broad body of knowledge that applies in a broad range of circumstances to explain what will happen. Unlike the common definition of theory, a theory in science is not just a hunch. Ideas or conclusions that are based on a theory — and not yet on firm data or observations — are referred to as theoretical. Scientists who use mathematics and/or existing data to project what might happen in new situations are known as theorists.

trait: A characteristic feature of something.

transmit: (n. transmission) To send or pass along.

wave: A disturbance or variation that travels through space and matter in a regular, oscillating fashion.

X-ray: A type of radiation analogous to gamma rays, but having somewhat lower energy.

Citations

Journal:​ J. Metcalfe. Learning from errors. Annual Review of Psychology. Vol. 68, January 2017, p. 465. doi: 10.1146/annurev-psych-010416-044022.

Book: S. Firestein. Failure: Why Science is So Successful. Oxford University Press. October 2015.

Journal:​ J.S. Moser et al. Mind your errors: Evidence for a neural mechanism linking growth mind-set to adaptive posterror adjustments. Psychological Science. Vol. 22, Oct. 31, 2011, p. 1484. doi: 10.1177/0956797611419520.

Book: F.L. Dyer and T.C. Martin. Edison: His Life and Inventions. Harper and Bros. 1910. Available as free e-book as part of Project Gutenberg.



How to Work Well with Others: The Psychology Behind Group Work

Do your perceptions change when someone discusses an opinion different from your own? Do you find yourself more engaged when your teammates smile and agree with you, or do you find yourself lost for words when someone is talking over you, uptight, and not listening? Did you know that research identifies five specific personality traits that reflect positive leadership? Or that, through imitation, you can have a powerful emotional and mental impact on others, and they may have the same impact on you?

Two words come into play when dealing with group work: Social contagion.

What exactly does this mean?

Social contagion theory (also known as emotional contagion theory) is a psychological phenomenon indicating that, to a certain degree, people have power and influence over you. You are somewhat of a product of whom you know, such as your close friends.

Your friends can help shape your perceptions, outlook, values, culture, emotions, and behaviors.

It’s based on interpersonal relationships we form, but it’s not just limited to the people we know.

Social contagion happens in many places on the internet, from human rights campaigns to changing your profile picture to the equal sign to support marriage equality. You may be influenced because of how many people are doing it.

Which of these describes your experience with group work? Photo credits—Left: Be-Younger.com Right: Andrew McCluskey

Social scientific research continues to confirm the idea that attitudes, belief systems, behaviors, and the effects of others can most definitely spread through populations with great speed, like an epidemic.

But isn’t it peculiar to find yourself (or watch someone else) being convinced of something simply because another person explains themselves more assertively or dominantly? This spreading of ideas can directly affect groups through imitation and conformity to specific ideas.

Assertiveness is not the same as leadership

In a study conducted at the University of California, Berkeley, individuals with a dominant personality trait consistently held significant influence over others in a group setting. Dominance-related traits include assertiveness, independence, confidence, and fearless, original thinking.

However, research shows this does not necessarily mean that people can attain influence by acting only assertively or forcefully. Behavior such as bullying and intimidation does not lead to lasting influence. Rather, leadership skills, interpersonal skills, and emotional intelligence play a huge role alongside traits such as assertiveness, making the dominant-trait individual appear competent, even if they actually lack competence.

Individuals high in a dominance trait tend to:

  • speak more
  • gain more control over group processes
  • hold increased levels of perceived control over group decisions

In contrast, people with high leadership skills:

  • possess social skills that allow them to take lead
  • communicate very well
  • motivate others

Good communication requires listening, and it can make a huge difference in group work. For example, by listening to various opinions, you could take the lead by incorporating all of them, or by explaining why only one of them will work. Teammates should feel they have a voice, which is very important, but if you want to be more influential, you can be more engaged (talkative), show how motivated you are (while motivating others), and share your own perceptions frequently. Your peers will view you as someone they can look up to and work well with.

This could be a “confidence boost” for the next time you are ill-prepared to discuss something out loud.

You can be more influential by imitating those with influence

Mirror neurons are brain cells that activate both when we observe someone performing an action and when we perform the same action ourselves, such as throwing a ball and then watching someone throw it back to us. Neuroscientists believe that mirror neurons help people read the minds of others and empathize with them.

Mirror neurons have helped developmental and evolutionary psychologists understand human behavior, and how the brain can observe the actions of others to help us understand our world. A research study at the University of Washington showed that infants and adults do indeed imitate adults to understand perceptions of themselves and others, and to understand how they might be different or similar.

Another study conducted at the University of California shows that culture can influence mirror neurons. When participants in the study looked at someone who shared their culture, the mirror neurons had higher activity compared to when the participant looked at someone who didn’t share the same culture.

This shows that mirror neurons play an important role in learning through imitation, in physical actions, and in understanding other people's behavior.

Do what the best influential novice can do: observe and imitate the social constructs of someone you respect.

What behaviors do influential people typically demonstrate?

  • They express their opinions more frequently.
  • They make more direct eye contact.
  • They use a relaxed, expansive, and welcoming posture.

In school and work, we have the ability to think and speak for ourselves and with one another. We’re in our own personalized groups — such as friends, students, colleagues, teachers, and staff. At Penn State World Campus, we often connect with and influence others through our integrative chat rooms, email, discussion forums, and social networks.

Some personality traits are more effective for group work

A study conducted at Yale University compared groups in which one person consistently displayed either positive or negative emotions. The findings? The group whose member focused on positive emotions showed improved cooperation, less conflict, and higher perceived task performance.

Aside from being positive, what are the top personality traits involved with group work?

Two researchers explored these personality traits: Warren Norman, who coined the name the Five-Factor Model in 1963, and Lewis Goldberg, who dubbed the traits the Big Five in 1990. These traits have been linked in research to outcomes such as job performance, job satisfaction, turnover rate, and interpersonal skills among colleagues.

“Big Five” personality traits

  1. Extraversion — talkative and outgoing; associated with behaviors such as being sociable, gregarious, active, and assertive
  2. Agreeableness — friendly, cooperative, good-natured, flexible, courteous, and tolerant
  3. Conscientiousness — particularly self-disciplined, organized, thorough, hardworking, and achievement-oriented
  4. Emotional Stability — calm, secure, poised, relaxed, aware, and rational
  5. Openness to Experience — imaginative, original, intelligent, artistically sensitive, attentive to inner feelings, and intellectually curious

Conscientiousness, agreeableness, and emotional stability are positively related to job performance, according to a Vanderbilt University study. In addition, emotional stability and agreeableness are strongly related to teamwork that involves working interdependently with other employees, as opposed to working mainly with other people such as customers or clients. This shows that these personality characteristics are important at work and for predicting work outcomes among employees.

Further research in this area examined this question: How does the team’s composition of personalities influence team members’ level of satisfaction? Results found that the more agreeable and emotionally stable team members are, the more satisfied they are with their team.

On the other hand, the more dissimilar team members are from their teammates in conscientiousness, the less satisfied they are with their team. In short, satisfaction with teamwork increases when teammates are more agreeable and emotionally stable, similarly conscientious, and less extraverted. Overall, agreeableness and emotional stability had the greatest effects on satisfaction within a team.

Final thoughts

It’s okay if you are not a leader or the most influential. Some people like to follow others. People sometimes don’t even realize they can be a leader until something they believe in or are passionate about becomes the center of their universe. Then, their influence can become contagious.


Summary

Consciousness is our subjective awareness of ourselves and our environment including bodily sensations and thoughts.

Consciousness is functional because we use it to reason logically, to plan activities, and to monitor our progress toward the goals we set for ourselves.

The French philosopher René Descartes (1596-1650) was a proponent of dualism, the idea that the mind, a nonmaterial entity, is separate from (although connected to) the physical body. In contrast to the dualists, psychologists believe the consciousness (and thus the mind) exists in the brain, not separate from it.

Several philosophical theories of human consciousness inform the present study of behaviour and mental processes. Socrates (c. 470–399 BC) argued that free will is limited, or at least appears to be, after he noticed that people often do things they really do not want to do. He called this akrasia, a lack of control over oneself.

A few centuries later, the Roman thinker Plotinus (AD 205–270) was presumably the first to allude to the possibility of unconscious psychological processes where he noted that the absence of conscious perception does not necessarily prove the absence of mental activity.

Consciousness has been central to many theories of psychology. Freud’s personality theories differentiated between the unconscious and the conscious aspects of behaviour, and present-day psychologists distinguish between automatic (unconscious) and controlled (conscious) behaviours and between implicit (unconscious) and explicit (conscious) cognitive processes.

Freud introduced the concept of the subconscious to account for things like memory and motivation that remain outside of the realm of consciousness.

The concept of preconscious refers to information that we could pay attention to if we wanted, and where memories are stored for easy retrieval.

Awareness operates on two levels and humans fluctuate between these high and low thinking states.

Low awareness of subtle and even subliminal influences can become conscious as a result of cues.

Cues are stimuli that carry significant meaning.

High awareness refers to consciousness of what is going on around us.

Mindfulness is a state of heightened awareness, focus, and evaluation of our thoughts.

Attention is a mental resource that can be vigilant and sustained or divided and selective. William James referred to attention as a concentration of consciousness.

Priming studies aim to activate certain concepts and associations in people’s memory below conscious awareness in order to understand the effect on subsequent behaviour.

Researchers can use the Implicit Association Test (IAT) to study unconscious motives and beliefs.

The Flexible Correction Model suggests that humans have the ability to correct or change beliefs and evaluations that have been influenced or biased by an undue outside source.

Because the brain varies in its current level and type of activity, consciousness is transitory. If we drink too much coffee or beer, the caffeine or alcohol influences the activity in our brain, and our consciousness may change. When we are anesthetized before an operation or experience a concussion after a knock on the head, we may lose consciousness entirely as a result of changes in brain activity. We also lose consciousness when we sleep.

Sleep is unique because while we lack full awareness in this state of consciousness, the brain is still active.

Sleep serves the function of mental and physical restoration.

The behaviour of organisms is influenced by biological rhythms, including the daily circadian rhythms that guide the waking and sleeping cycle in many animals.

Sleep researchers have found that sleeping people undergo a fairly consistent pattern of sleep stages, each lasting about 90 minutes. Each of the sleep stages has its own distinct pattern of brain activity. Rapid eye movement (REM) accounts for about 25% of our total sleep time, during which we dream. Non-rapid eye movement (non-REM) sleep is a deep sleep characterized by very slow brain waves, and is further subdivided into three stages: N1, N2, and N3.

Sleep has a vital restorative function, and a prolonged lack of sleep results in increased anxiety, diminished performance, and, if severe and extended, even death. Sleep deprivation suppresses immune responses that fight off infection, and it can lead to obesity, hypertension, and memory impairment.

Some people suffer from sleep disorders, including insomnia, sleep apnea, narcolepsy, sleepwalking, and REM sleep behaviour disorder.

Dream theories suggest that dreaming is our nonconscious attempt to make sense of daily experience and learning.

According to Freud, dreams represent troublesome wishes and desires. Freud believed that the primary function of dreams was wish fulfilment, and he differentiated between the manifest and latent content of dreams.

Other theories of dreaming propose that we dream primarily to help with consolidation — the moving of information into long-term memory. The activation-synthesis theory of dreaming proposes that dreams are simply our brain’s interpretation of the random firing of neurons in the brain stem.

Hypnosis is a trancelike state of consciousness, usually induced by a procedure known as hypnotic induction, which consists of heightened suggestibility, deep relaxation, and intense focus. Hypnosis also is frequently used to attempt to change unwanted behaviours, such as to reduce smoking, eating, and alcohol abuse.

Sensory deprivation is the intentional reduction of stimuli affecting one or more of the five senses, with the possibility of resulting changes in consciousness. Although sensory deprivation is used for relaxation or meditation purposes and to produce enjoyable changes in consciousness, when deprivation is prolonged, it is unpleasant and can be used as a means of torture.

Meditation refers to techniques in which the individual focuses on something specific, such as an object, a word, or one’s breathing, with the goal of ignoring external distractions. Meditation has a variety of positive health effects.

A trance state involves a dissociation of the self where people are said to have less voluntary control over their behaviors and actions.

In some cases, consciousness may become aversive, and we may engage in behaviours that help us escape from consciousness, through the use of psychoactive drugs, for example.

Some substances can have a powerful effect on perception and on consciousness.

Psychoactive drugs are chemicals that change our states of consciousness, and particularly our perceptions and moods. The use (especially in combination) of psychoactive drugs has the potential to create very negative side effects, including tolerance, dependence, withdrawal symptoms, and addiction.

Depressants, including alcohol, barbiturates, benzodiazepines, and toxic inhalants, reduce the activity of the CNS. They are widely used as prescription medicines to relieve pain, to lower heart rate and respiration, and as anticonvulsants. Toxic inhalants are some of the most dangerous recreational drugs, with a safety index below 10, and their continued use may lead to permanent brain damage.

Stimulants speed up the body’s physiological and mental processes. Stimulants, including caffeine, nicotine, cocaine, and amphetamine, are psychoactive drugs that operate by blocking the reuptake of dopamine, norepinephrine, and serotonin in the synapses of the CNS. Some amphetamines, such as Ecstasy, have very low safety ratios and thus are highly dangerous.

Opioids, including opium, morphine, heroin, and codeine, are chemicals that increase activity in opioid receptor neurons in the brain and in the digestive system, producing euphoria, analgesia, slower breathing, and constipation.

Hallucinogens, including cannabis, mescaline, and LSD, are psychoactive drugs that alter sensation and perception, and may create hallucinations.

Even when we know the potential costs of using drugs, we may engage in using them anyway because the rewards from using the drugs are occurring right now, whereas the potential costs are abstract and only in the future. And drugs are not the only things we enjoy or can abuse. It is normal to refer to the abuse of other behaviours, such as gambling, sex, overeating, and even overworking, as “addictions” to describe the overuse of pleasant stimuli.


Playing board games may help protect thinking skills in old age

People who play games -- such as cards and board games -- are more likely to stay mentally sharp in later life, a study suggests.

Those who regularly played non-digital games scored better on memory and thinking tests in their 70s, the research found.

The study also found that a behaviour change in later life could still make a difference.

People who increased game playing during their 70s were more likely to maintain certain thinking skills as they grew older.

Psychologists at the University of Edinburgh tested more than 1000 people aged 70 for memory, problem solving, thinking speed and general thinking ability.

The participants then repeated the same thinking tests every three years until aged 79.

The group were also asked how often they played games like cards, chess, bingo or crosswords -- at ages 70 and 76.

Researchers used statistical models to analyse the relationship between a person's level of game playing and their thinking skills.

The team took into account the results of an intelligence test that the participants sat when they were 11 years old.

They also considered lifestyle factors, such as education, socio-economic status and activity levels.
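
The analysis described here is, in spirit, a regression of later-life cognitive scores on game playing, with childhood IQ and lifestyle measures as covariates. The sketch below shows that kind of model on synthetic data; the variable names (games_70, iq_age_11, education_years, activity, cognition_79) are hypothetical stand-ins, not the study's actual variables, and the real analysis was more sophisticated.

```python
# Illustrative sketch only: the variables below are hypothetical stand-ins generated
# at random, not the Lothian Birth Cohort data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "games_70": rng.integers(0, 5, n),          # how often games were played at age 70
    "iq_age_11": rng.normal(100, 15, n),        # childhood intelligence test score
    "education_years": rng.integers(9, 18, n),  # lifestyle covariates
    "activity": rng.normal(0, 1, n),
})
# Synthetic outcome: later-life cognition loosely related to the predictors.
df["cognition_79"] = (
    0.3 * df["games_70"] + 0.5 * (df["iq_age_11"] - 100) / 15 + rng.normal(0, 1, n)
)

# Association between game playing and later cognition, adjusting for covariates.
model = smf.ols(
    "cognition_79 ~ games_70 + iq_age_11 + education_years + activity", data=df
).fit()
print(model.params["games_70"])  # adjusted association with game playing
```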

People who increased game playing in later years were found to have experienced less decline in thinking skills in their seventies -- particularly in memory function and thinking speed.

Researchers say the findings help to better understand what kinds of lifestyles and behaviours might be associated with better outcomes for cognitive health in later life.

The study may also help people make decisions about how best to protect their thinking skills as they age.

Dr Drew Altschul, of the University of Edinburgh's School of Philosophy, Psychology and Language Sciences, said: "These latest findings add to evidence that being more engaged in activities during the life course might be associated with better thinking skills in later life. For those in their 70s or beyond, another message seems to be that playing non-digital games may be a positive behaviour in terms of reducing cognitive decline."

Professor Ian Deary, Director of the University of Edinburgh's Centre for Cognitive Ageing and Cognitive Epidemiology (CCACE), said: "We and others are narrowing down the sorts of activities that might help to keep people sharp in older age. In our Lothian sample, it's not just general intellectual and social activity, it seems it is something in this group of games that has this small but detectable association with better cognitive ageing. It'd be good to find out if some of these games are more potent than others. We also point out that several other things are related to better cognitive ageing, such as being physically fit and not smoking."

Caroline Abrahams, Charity Director at Age UK, said: "Even though some people's thinking skills can decline as we get older, this research is further evidence that it doesn't have to be inevitable. The connection between playing board games and other non-digital games later in life and sharper thinking and memory skills adds to what we know about steps we can take to protect our cognitive health, including not drinking excess alcohol, being active and eating a healthy diet."

The participants were part of the Lothian Birth Cohort 1936 study, a group of individuals who were born in 1936 and took part in the Scottish Mental Survey of 1947.

Since 1999, researchers have been working with the Lothian Birth Cohorts to chart how a person's thinking power changes over their lifetime. The follow-up times in the Cohorts are among the longest in the world.


How To Make Learning Stick: Top Tips From Learning Psychology

The "sheep dip" exercises of old just don’t cut it if you’re looking for eLearning to have any long-term impact on behaviors and performance. Getting your head around the basics of learning psychology will help you create brain-friendly learning experiences that genuinely help people retain information and improve and develop their skills in a way that sticks.

In a recent interview, Learning Psychologist Stella Collins shared her advice for creating brain-friendly learning using a handy checklist with the acronym "LEARNS." We’ve picked out some of Stella’s top tips that you can try today:

1. Encourage Active Engagement

To truly learn something, you can’t expect to simply read a book or watch a video. Think about how you can encourage learners to actively engage—for example, giving opportunities to practice and gain feedback.

2. Use What People Already Know

When we are learning, the brain tries to link neurons together. If your learning content can hook onto something that people already know, it helps the brain form those links. One way to help create links is to get people to guess the answer. This has been shown to increase their ability to recall the right answer, even if they guessed wrong.

3. Tap Into Emotions

Emotions help the brain make decisions. If you’re trying to help people make better decisions in certain situations, creating emotional ties can help. A problem scenario that is connected to a healthy dose of fear, for example, makes your learner more likely to remove the obstacle and feel good about it afterward.

4. Make Deliberate Links

Use anchors to create links between 2 things. This can help to trigger memories or encourage people to act. For example, think about how certain songs remind you of a particular TV or radio advertisement—the advertisers have created an anchor in our mind. You can do this with visual prompts, words, sounds etc.

5. Repeat

To make neural connections stronger and faster, they need to be used repeatedly—think of it like building brain muscle. So, build practice into learning experiences so those connections keep firing.

6. Keep It Fresh

We zone out of things we’re used to seeing. Marketers are always having to think of new ways to attract attention, and learning designers need to do this too. Think about what you can do to surprise people or be different so it becomes memorable and worthy of talking about.

7. Use Storytelling

Great stories are remembered and retold. There’s a reason for that, and it’s because they tick every box of Stella’s "LEARNS" acronym. Check out this great eLearning example that uses immersive storytelling (and shares 4 tips).

The LEARNS Acronym For Brain-Friendly Learning

Stella Collins has provided six fantastic devices to help you make learning that sticks, all wrapped up in the memorable anchor of LEARNS. Get more detail on what the acronym stands for and how you can create brain-friendly learning.

Trying to come up with a winning brain-friendly approach for your project? Get our guide to Conceptualizing eLearning ideas.


Does Thinking Really Hard Burn More Calories?

Between October and June they shuffle out of auditoriums, gymnasiums and classrooms, their eyes adjusting to the sunlight as their fingers fumble to awaken cell phones that have been silent for four consecutive hours. Some raise a hand to their foreheads, as though trying to rub away a headache. Others linger in front of the parking lot, unsure of what to do next. They are absolutely exhausted, but not because of any strenuous physical activity. Rather, these high school students have just taken the SAT. "I was fast asleep as soon as I got home," Ikra Ahmad told The Local, a New York Times blog, when she was interviewed for a story on "SAT hangover."

Temporary mental exhaustion is a genuine and common phenomenon, which, it is important to note, differs from chronic mental fatigue associated with regular sleep deprivation and some medical disorders. Everyday mental weariness makes sense, intuitively. Surely complex thought and intense concentration require more energy than routine mental processes. Just as vigorous exercise tires our bodies, intellectual exertion should drain the brain. What the latest science reveals, however, is that the popular notion of mental exhaustion is too simplistic. The brain continuously slurps up huge amounts of energy for an organ of its size, regardless of whether we are tackling integral calculus or clicking through the week's top 10 LOLcats. Although firing neurons summon extra blood, oxygen and glucose, any local increases in energy consumption are tiny compared with the brain's gluttonous baseline intake. So, in most cases, short periods of additional mental effort require a little more brainpower than usual, but not much more. Most laboratory experiments, however, have not subjected volunteers to several hours' worth of challenging mental acrobatics. And something must explain the feeling of mental exhaustion, even if its physiology differs from physical fatigue. Simply believing that our brains have expended a lot of effort might be enough to make us lethargic.

Brainpower
Although the average adult human brain weighs about 1.4 kilograms, only 2 percent of total body weight, it demands 20 percent of our resting metabolic rate (RMR), the total amount of energy our bodies expend in one very lazy day of no activity. RMR varies from person to person depending on age, gender, size and health. If we assume an average resting metabolic rate of 1,300 calories, then the brain consumes 260 of those calories just to keep things in order. That's 10.8 calories every hour or 0.18 calories each minute. (For comparison's sake, see Harvard's table of calories burned during different activities). With a little math, we can convert that number into a measure of power:

  • Resting metabolic rate: 1,300 kilocalories, or kcal, the kind used in nutrition
  • 1,300 kcal over 24 hours = 54.16 kcal per hour = 15.04 gram calories per second
  • 15.04 gram calories/sec = 62.93 joules/sec = about 63 watts
  • 20 percent of 63 watts = 12.6 watts
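
A minimal sketch that reproduces the arithmetic above, assuming the article's round figures of a 1,300 kcal resting metabolic rate and a 20 percent brain share:

```python
# Reproduce the back-of-the-envelope estimate above using the article's round numbers.

RMR_KCAL_PER_DAY = 1300        # assumed resting metabolic rate
BRAIN_SHARE = 0.20             # brain's share of resting energy use
KCAL_TO_JOULES = 4184          # 1 kcal = 4,184 J
SECONDS_PER_DAY = 24 * 60 * 60

body_watts = RMR_KCAL_PER_DAY * KCAL_TO_JOULES / SECONDS_PER_DAY
brain_watts = BRAIN_SHARE * body_watts

print(f"whole body: ~{body_watts:.1f} W")   # ~63 W
print(f"brain:      ~{brain_watts:.1f} W")  # ~12.6 W
```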

So a typical adult human brain runs on around 12 watts, a fifth of the power required by a standard 60-watt lightbulb. Compared with most other organs, the brain is greedy; pitted against man-made electronics, it is astoundingly efficient. IBM's Watson, the supercomputer that defeated Jeopardy! champions, depends on ninety IBM Power 750 servers, each of which requires around one thousand watts.

Energy travels to the brain via blood vessels in the form of glucose, which is transported across the blood-brain barrier and used to produce adenosine triphosphate (ATP), the main currency of chemical energy within cells. Experiments with both animals and people have confirmed that when neurons in a particular brain region fire, local capillaries dilate to deliver more blood than usual, along with extra glucose and oxygen. This consistent response makes neuroimaging studies possible: functional magnetic resonance imaging (fMRI) depends on the unique magnetic properties of blood flowing to and from firing neurons. Research has also confirmed that once dilated blood vessels deliver extra glucose, brain cells lap it up.

Extending the logic of such findings, some scientists have proposed the following: if firing neurons require extra glucose, then especially challenging mental tasks should decrease glucose levels in the blood and, likewise, eating foods rich in sugars should improve performance on such tasks. Although quite a few studies have confirmed these predictions, the evidence as a whole is mixed, and most of the changes in glucose levels range from the minuscule to the small. In a study at Northumbria University, for example, volunteers who completed a series of verbal and numerical tasks showed a larger drop in blood glucose than people who just pressed a key repeatedly. In the same study, a sugary drink improved performance on one of the tasks, but not the others. At Liverpool John Moores University, volunteers performed two versions of the Stroop task, in which they had to identify the color of ink in which a word was printed, rather than reading the word itself. In one version, the words and colors matched (BLUE appeared in blue ink); in the tricky version, the word BLUE appeared in green or red ink. Volunteers who performed the more challenging task showed bigger dips in blood glucose, which the researchers interpreted as a direct cause of greater mental effort. Some studies have found that when people are not very good at a particular task, they exert more mental effort and use more glucose and that, likewise, the more skilled you are, the more efficient your brain is and the less glucose you need. Complicating matters, at least one study suggests the opposite: that more skillful brains recruit more energy.*
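
To make the two Stroop conditions concrete, here is a tiny sketch that generates congruent and incongruent trials. The color list is an arbitrary choice for illustration, not the stimuli used in the study.

```python
# Tiny illustration of the two Stroop conditions described above.
# The color list is arbitrary; these are not the study's actual stimuli.

import random

COLORS = ["red", "green", "blue"]

def make_trial(congruent: bool) -> tuple[str, str]:
    """Return (word, ink_color): matching for congruent trials, mismatched otherwise."""
    word = random.choice(COLORS)
    if congruent:
        return word, word
    ink = random.choice([c for c in COLORS if c != word])
    return word, ink

if __name__ == "__main__":
    for congruent in (True, False):
        word, ink = make_trial(congruent)
        label = "congruent" if congruent else "incongruent"
        # The task is to name the ink color while ignoring the printed word.
        print(f"{label:11s}: word={word.upper():5s} ink={ink}")
```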

Not so simple sugars
Unsatisfying and contradictory findings from glucose studies underscore that energy consumption in the brain is not a simple matter of greater mental effort sapping more of the body's available energy. Claude Messier of the University of Ottawa has reviewed many such studies. He remains unconvinced that any one cognitive task measurably changes glucose levels in the brain or blood. "In theory, yes, a more difficult mental task requires more energy because there is more neural activity," he says, "but when people do one mental task you won't see a large increase of glucose consumption as a significant percentage of the overall rate. The base level is quite a lot of energy; even in slow-wave sleep with very little activity there is still a high baseline consumption of glucose." Most organs do not require so much energy for basic housekeeping. But the brain must actively maintain appropriate concentrations of charged particles across the membranes of billions of neurons, even when those cells are not firing. Because of this expensive and continuous maintenance, the brain usually has the energy it needs for a little extra work.

Authors of other review papers have reached similar conclusions. Robert Kurzban of the University of Pennsylvania points to studies showing that moderate exercise improves people's ability to focus. In one study, for example, children who walked for 20 minutes on a treadmill performed better on an academic achievement test than children who read quietly before the exam. If mental effort and ability were a simple matter of available glucose, then the children who exercised, and burnt up more energy, should have performed worse than their quiescent peers.

The influence of a mental task's difficulty on energy consumption "appears to be subtle and probably depends on individual variation in effort required, engagement and resources available, which might be related to variables such as age, personality and gluco-regulation," wrote Leigh Gibson of Roehampton University in a review on carbohydrates and mental function.

Both Gibson and Messier conclude that when someone has trouble regulating glucose properly, or has fasted for a long time, a sugary drink or food can improve their subsequent performance on certain kinds of memory tasks. But for most people, the body easily supplies what little extra glucose the brain needs for additional mental effort.

Body and mind
If challenging cognitive tasks consume only a little more fuel than usual, what explains the feeling of mental exhaustion following the SAT or a similarly grueling mental marathon? One answer is that maintaining unbroken focus or navigating demanding intellectual territory for several hours really does burn enough energy to leave one feeling drained, but that researchers have not confirmed this because they have simply not been tough enough on their volunteers. In most experiments, participants perform a single task of moderate difficulty, rarely for more than an hour or two. "Maybe if we push them harder, and get people to do things they are not good at, we would see clearer results," Messier suggests.

Equally important to the duration of mental exertion is one's attitude toward it. Watching a thrilling biopic with a complex narrative excites many different brain regions for a good two hours, yet people typically do not shamble out of the theater complaining of mental fatigue. Some people regularly curl up with densely written novels that others might throw across the room in frustration. Completing a complex crossword or sudoku puzzle on a Sunday morning does not usually ruin one's ability to focus for the rest of the day; in fact, some claim it sharpens their mental state. In short, people routinely enjoy intellectually invigorating activities without suffering mental exhaustion.

Such fatigue seems much more likely to follow sustained mental effort that we do not seek for pleasure, such as the obligatory SAT, especially when we expect that the ordeal will drain our brains. If we think an exam or puzzle will be difficult, it often will be. Studies have shown that something similar happens when people exercise and play sports: a large component of physical exhaustion is in our heads. In related research, volunteers that cycled on an exercise bike following a 90-minute computerized test of sustained attention quit pedaling from exhaustion sooner than participants that watched emotionally neutral documentaries before exercising. Even if the attention test did not consume significantly more energy than watching movies, the volunteers reported feeling less energetic. That feeling was powerful enough to limit their physical performance.

In the specific case of the SAT, something beyond pure mental effort likely contributes to post-exam stupor: stress. After all, the brain does not function in a vacuum. Other organs burn up energy, too. Taking an exam that partially determines where one will spend the next four years is nerve-racking enough to send stress hormones swimming through the blood stream, induce sweating, quicken heart rates and encourage fidgeting and contorted body postures. The SAT and similar trials are not just mentally taxing—they are physically exhausting, too.

A small but revealing study suggests that even mildly stressful intellectual challenges change our emotional states and behaviors, even if they do not profoundly alter brain metabolism. Fourteen female Canadian college students either sat around, summarized a passage of text or completed a series of computerized attention and memory tests for 45 minutes before feasting on a buffet lunch. Students who exercised their brains helped themselves to around 200 more calories than students who relaxed. Their blood glucose levels also fluctuated more than those of students who just sat there, but not in any consistent way. Levels of the stress hormone cortisol, however, were significantly higher in students whose brains were busy, as were their heart rates, blood pressure and self-reported anxiety. In all likelihood, these students did not eat more because their haggard brains desperately needed more fuel; rather, they were stress eating.

Messier has a related explanation for everyday mental weariness: "My general hypothesis is that the brain is a lazy bum," he says. "The brain has a hard time staying focused on just one thing for too long. It's possible that sustained concentration creates some changes in the brain that promote avoidance of that state. It could be like a timer that says, 'Okay you're done now.' Maybe the brain just doesn't like to work so hard for so long."

*Editor's note: The last two sentences of the seventh paragraph were edited after publication for clarity and accuracy.


Summary

Consciousness is our subjective awareness of ourselves and our environment, including bodily sensations and thoughts.

Consciousness is functional because we use it to reason logically, to plan activities, and to monitor our progress toward the goals we set for ourselves.

The French philosopher René Descartes (1596-1650) was a proponent of dualism, the idea that the mind, a nonmaterial entity, is separate from (although connected to) the physical body. In contrast to the dualists, psychologists believe that consciousness (and thus the mind) exists in the brain, not separate from it.

Several philosophical theories of human consciousness inform the present study of behaviour and mental processes. Socrates (c. 470–399 BC) argued that free will is limited, or at least so it seems, after he noticed that people often do things they really do not want to do. He called this akrasia, or a lack of control over oneself.

Several centuries later, the Roman thinker Plotinus (AD 205–270) was presumably the first to allude to the possibility of unconscious psychological processes when he noted that the absence of conscious perception does not necessarily prove the absence of mental activity.

Consciousness has been central to many theories of psychology. Freud’s personality theories differentiated between the unconscious and the conscious aspects of behaviour, and present-day psychologists distinguish between automatic (unconscious) and controlled (conscious) behaviours and between implicit (unconscious) and explicit (conscious) cognitive processes.

Freud introduced the concept of the subconscious to account for things like memory and motivation that remain outside of the realm of consciousness.

The concept of preconscious refers to information that we could pay attention to if we wanted, and where memories are stored for easy retrieval.

Awareness operates on two levels and humans fluctuate between these high and low thinking states.

Subtle and even subliminal influences of which we have low awareness can become conscious as a result of cues.

Cues are stimuli with significant meaning.

High awareness refers to consciousness of what is going on around us.

Mindfulness is a state of heightened awareness, focus, and evaluation of our thoughts.

Attention is a mental resource that can be vigilant and sustained or divided and selective. William James referred to attention as a concentration of consciousness.

Priming studies aim to activate certain concepts and associations in people’s memory below conscious awareness in order to understand the effect on subsequent behaviour.

Researchers can use the Implicit Association Test (IAT) to study unconscious motives and beliefs.

The Flexible Correction Model suggests that humans have the ability to correct or change beliefs and evaluations that have been influenced or biased by an undue outside source.

Because the brain varies in its current level and type of activity, consciousness is transitory. If we drink too much coffee or beer, the caffeine or alcohol influences the activity in our brain, and our consciousness may change. When we are anesthetized before an operation or experience a concussion after a knock on the head, we may lose consciousness entirely as a result of changes in brain activity. We also lose consciousness when we sleep.

Sleep is unique because while we lack full awareness in this state of consciousness, the brain is still active.

Sleep serves the function of mental and physical restoration.

The behaviour of organisms is influenced by biological rhythms, including the daily circadian rhythms that guide the waking and sleeping cycle in many animals.

Sleep researchers have found that sleeping people undergo a fairly consistent pattern of sleep stages, cycling through them roughly every 90 minutes. Each of the sleep stages has its own distinct pattern of brain activity. Rapid eye movement (REM) sleep accounts for about 25% of our total sleep time and is when we dream. Non-rapid eye movement (non-REM) sleep is a deep sleep characterized by very slow brain waves, and is further subdivided into three stages: N1, N2, and N3.

Sleep has a vital restorative function, and a prolonged lack of sleep results in increased anxiety, diminished performance, and, if severe and extended, even death. Sleep deprivation suppresses immune responses that fight off infection, and it can lead to obesity, hypertension, and memory impairment.

Some people suffer from sleep disorders, including insomnia, sleep apnea, narcolepsy, sleepwalking, and REM sleep behaviour disorder.

Dream theories suggest that dreaming is our nonconscious attempt to make sense of daily experience and learning.

According to Freud, dreams represent troublesome wishes and desires. Freud believed that the primary function of dreams was wish fulfilment, and he differentiated between the manifest and latent content of dreams.

Other theories of dreaming propose that we dream primarily to help with consolidation — the moving of information into long-term memory. The activation-synthesis theory of dreaming proposes that dreams are simply our brain’s interpretation of the random firing of neurons in the brain stem.

Hypnosis is a trancelike state of consciousness, usually induced by a procedure known as hypnotic induction, which consists of heightened suggestibility, deep relaxation, and intense focus. Hypnosis is also frequently used in attempts to change unwanted behaviours, such as smoking, overeating, and alcohol abuse.

Sensory deprivation is the intentional reduction of stimuli affecting one or more of the five senses, with the possibility of resulting changes in consciousness. Although sensory deprivation is used for relaxation or meditation purposes and to produce enjoyable changes in consciousness, when deprivation is prolonged, it is unpleasant and can be used as a means of torture.

Meditation refers to techniques in which the individual focuses on something specific, such as an object, a word, or one’s breathing, with the goal of ignoring external distractions. Meditation has a variety of positive health effects.

A trance state involves a dissociation of the self where people are said to have less voluntary control over their behaviours and actions.

In some cases, consciousness may become aversive, and we may engage in behaviours that help us escape from consciousness, through the use of psychoactive drugs, for example.

Some substances can have a powerful effect on perception and on consciousness.

Psychoactive drugs are chemicals that change our states of consciousness, and particularly our perceptions and moods. The use (especially in combination) of psychoactive drugs has the potential to create very negative side effects, including tolerance, dependence, withdrawal symptoms, and addiction.

Depressants, including alcohol, barbiturates, benzodiazepines, and toxic inhalants, reduce the activity of the CNS. They are widely used as prescription medicines to relieve pain, to lower heart rate and respiration, and as anticonvulsants. Toxic inhalants are some of the most dangerous recreational drugs, with a safety index below 10, and their continued use may lead to permanent brain damage.

Stimulants speed up the body’s physiological and mental processes. Stimulants, including caffeine, nicotine, cocaine, and amphetamine, are psychoactive drugs that operate by blocking the reuptake of dopamine, norepinephrine, and serotonin in the synapses of the CNS. Some amphetamines, such as Ecstasy, have very low safety ratios and thus are highly dangerous.
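The passage above invokes a drug's "safety index" or "safety ratio" without spelling out the arithmetic. As usually described, it is the dose likely to be lethal divided by the dose needed to produce the intended effect, so smaller numbers leave less margin for error. The minimal Python sketch below illustrates only that division; the dose figures are invented for the example, not pharmacological data.

  # Illustrative only: the doses are hypothetical, not real pharmacological data.
  def safety_ratio(lethal_dose_mg, effective_dose_mg):
      """Ratio of an (assumed) lethal dose to an (assumed) effective dose."""
      return lethal_dose_mg / effective_dose_mg

  # A drug needing 10 mg for its effect and lethal near 80 mg scores 8,
  # which would fall below the "safety index of 10" threshold mentioned above.
  print(safety_ratio(lethal_dose_mg=80, effective_dose_mg=10))  # -> 8.0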

Opioids, including opium, morphine, heroin, and codeine, are chemicals that increase activity in opioid receptor neurons in the brain and in the digestive system, producing euphoria, analgesia, slower breathing, and constipation.

Hallucinogens, including cannabis, mescaline, and LSD, are psychoactive drugs that alter sensation and perception, and may create hallucinations.

Even when we know the potential costs of using drugs, we may engage in using them anyway because the rewards from using the drugs are occurring right now, whereas the potential costs are abstract and only in the future. And drugs are not the only things we enjoy or can abuse. It is normal to refer to the abuse of other behaviours, such as gambling, sex, overeating, and even overworking, as “addictions” to describe the overuse of pleasant stimuli.


How To Make Learning Stick: Top Tips From Learning Psychology

The "sheep dip" exercises of old just don’t cut it if you’re looking for eLearning to have any long-term impact on behaviors and performance. Getting your head around the basics of learning psychology will help you create brain-friendly learning experiences that genuinely help people retain information and improve and develop their skills in a way that sticks.

In a recent interview, Learning Psychologist Stella Collins shared her advice for creating brain-friendly learning using a great checklist with the acronym "LEARNS." We’ve picked out some of Stella’s top tips that you can try today:

1. Encourage Active Engagement

To truly learn something, you can’t expect to simply read a book or watch a video. Think about how you can encourage learners to actively engage—for example, giving opportunities to practice and gain feedback.

2. Use What People Already Know

When we are learning, the brain tries to link neurons together. If your learning content can hook onto something that people already know, it helps the brain form those links. One way to help create links is to get people to guess the answer. This has been shown to increase their ability to recall the right answer, even if they guessed wrong.

3. Tap Into Emotions

Emotions help the brain make decisions. If you’re trying to help people make better decisions in certain situations, creating emotional ties can help. A problem scenario that is connected to a healthy dose of fear, for example, makes your learner more likely to remove the obstacle and feel good about it afterward.

4. Make Deliberate Links

Use anchors to create links between two things. This can help to trigger memories or encourage people to act. For example, think about how certain songs remind you of a particular TV or radio advertisement—the advertisers have created an anchor in our mind. You can do this with visual prompts, words, sounds, etc.

5. Repeat

To make neural connections stronger and faster, they need to be used repeatedly—think of it like building brain muscle. So, build practice into learning experiences so those connections keep firing.

6. Keep It Fresh

We zone out of things we’re used to seeing. Marketers are always having to think of new ways to attract attention, and learning designers need to do this too. Think about what you can do to surprise people or be different so it becomes memorable and worthy of talking about.

7. Use Storytelling

Great stories are remembered and retold. There’s a reason for that, and it’s because they tick every box of Stella’s "LEARNS" acronym. Check out this great eLearning example that uses immersive storytelling (and shares 4 tips).

The LEARNS Acronym For Brain-Friendly Learning

Stella Collins has provided 6 fantastic devices to help you make learning that sticks, all wrapped up in the memorable anchor of LEARNS. Get more detail on what this stands for and how you can create brain-friendly learning.

Trying to come up with a winning brain-friendly approach for your project? Get our guide to Conceptualizing eLearning ideas.


Playing board games may help protect thinking skills in old age

People who play games -- such as cards and board games -- are more likely to stay mentally sharp in later life, a study suggests.

Those who regularly played non-digital games scored better on memory and thinking tests in their 70s, the research found.

The study also found that a behaviour change in later life could still make a difference.

People who increased game playing during their 70s were more likely to maintain certain thinking skills as they grew older.

Psychologists at the University of Edinburgh tested more than 1000 people aged 70 for memory, problem solving, thinking speed and general thinking ability.

The participants then repeated the same thinking tests every three years until aged 79.

The group were also asked how often they played games like cards, chess, bingo or crosswords -- at ages 70 and 76.

Researchers used statistical models to analyse the relationship between a person's level of game playing and their thinking skills.

The team took into account the results of an intelligence test that the participants sat when they were 11 years old.

They also considered lifestyle factors, such as education, socio-economic status and activity levels.
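The article does not spell out the statistical models, but the general approach of relating later-life test scores to game playing while adjusting for childhood intelligence and lifestyle covariates can be sketched with an ordinary regression. The Python sketch below uses invented variable names and random toy data; it illustrates the kind of adjustment described, not the study's actual analysis.

  # A toy regression in the spirit described above (not the study's real model).
  import numpy as np
  import pandas as pd
  import statsmodels.formula.api as smf

  rng = np.random.default_rng(0)
  n = 200
  df = pd.DataFrame({
      "iq_age11": rng.normal(100, 15, n),       # childhood intelligence score
      "game_playing": rng.integers(0, 5, n),    # frequency of playing games
      "education": rng.integers(9, 18, n),      # years of education
      "activity": rng.normal(0, 1, n),          # lifestyle/activity score
  })
  # Fabricated outcome so the example runs end to end.
  df["cognition_age79"] = 0.5 * df["iq_age11"] + df["game_playing"] + rng.normal(0, 5, n)

  # Association between game playing and later cognition, adjusting for covariates.
  model = smf.ols("cognition_age79 ~ game_playing + iq_age11 + education + activity",
                  data=df).fit()
  print(model.params["game_playing"], model.pvalues["game_playing"])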

People who increased game playing in later years were found to have experienced less decline in thinking skills in their seventies -- particularly in memory function and thinking speed.

Researchers say the findings help to better understand what kinds of lifestyles and behaviours might be associated with better outcomes for cognitive health in later life.

The study may also help people make decisions about how best to protect their thinking skills as they age.

Dr Drew Altschul, of the University of Edinburgh's School of Philosophy, Psychology and Language Sciences, said: "These latest findings add to evidence that being more engaged in activities during the life course might be associated with better thinking skills in later life. For those in their 70s or beyond, another message seems to be that playing non-digital games may be a positive behaviour in terms of reducing cognitive decline."

Professor Ian Deary, Director of the University of Edinburgh's Centre for Cognitive Ageing and Cognitive Epidemiology (CCACE), said: "We and others are narrowing down the sorts of activities that might help to keep people sharp in older age. In our Lothian sample, it's not just general intellectual and social activity, it seems it is something in this group of games that has this small but detectable association with better cognitive ageing. It'd be good to find out if some of these games are more potent than others. We also point out that several other things are related to better cognitive ageing, such as being physically fit and not smoking."

Caroline Abrahams, Charity Director at Age UK, said: "Even though some people's thinking skills can decline as we get older, this research is further evidence that it doesn't have to be inevitable. The connection between playing board games and other non-digital games later in life and sharper thinking and memory skills adds to what we know about steps we can take to protect our cognitive health, including not drinking excess alcohol, being active and eating a healthy diet."

The participants were part of the Lothian Birth Cohort 1936 study, a group of individuals who were born in 1936 and took part in the Scottish Mental Survey of 1947.

Since 1999, researchers have been working with the Lothian Birth Cohorts to chart how a person's thinking power changes over their lifetime. The follow-up times in the Cohorts are among the longest in the world.


How to Work Well with Others: The Psychology Behind Group Work

Do your perceptions change when someone discusses an opinion different from your own? Do you find yourself more engaged when your teammates smile and agree with you, or do you find yourself lost for words when someone is talking over you, uptight, and not listening? Did you know that research shows 5 specific personality traits that reflect positive leadership? That you have a powerful impact on others — and they may have the same impact on you — emotionally or mentally — through imitation?

Two words come into play when dealing with group work: Social contagion.

What exactly does this mean?

Social contagion theory (also known as emotional contagion theory) is a psychological phenomenon indicating that, to a certain degree, people have power and influence over you. You are somewhat of a product of whom you know, such as your close friends.

Your friends can help shape your perceptions, outlook, values, culture, emotions, and behaviors.

It’s based on interpersonal relationships we form, but it’s not just limited to the people we know.

Social contagion happens in many places on the internet, from human rights campaigns to changing your profile picture to the equal sign to support marriage equality. You may be influenced because of how many people are doing it.

[Image: Which of these describes your experience with group work? Photo credits: left, Be-Younger.com; right, Andrew McCluskey]

Social scientific research continues to confirm the idea that attitudes, belief systems, behaviors, and the effects of others can most definitely spread through populations with great speed, like an epidemic.

But isn’t it peculiar to find yourself (or watch someone else) be convinced of something, because another person tends to explain themselves more assertively or dominantly? This spreading of ideas can directly affect groups through imitation and conformity to specific ideas.

Assertiveness is not the same as leadership

In a study conducted at the University of California, Berkeley, individuals with a dominant personality trait consistently upheld significant influence among others in a group setting. Dominance personality traits include assertiveness, independence, confidence, and fearless, original thinking.

However, this does not necessarily mean that people can attain influence by acting only assertively or forcefully, as research shows. Behavior such as bullying and intimidation has not been shown to produce influential success. Rather, leadership skills, interpersonal skills, and emotional intelligence play a huge role alongside traits such as assertiveness, making the dominant-trait individual appear competent — even if they actually lack competence.

Individuals high in a dominance trait tend to:

  • speak more
  • gain more control over group processes
  • hold increased levels of perceived control over group decisions

In contrast, people with high leadership skills:

  • possess social skills that allow them to take lead
  • communicate very well
  • motivate others

Good communication requires listening — and can make a huge difference in group work. For example, by listening to various opinions, you could take the lead by incorporating all of them, or explaining why only one of them will work. Teammates should feel they have a voice, which is very important, but if you want to be more influential, you can be more engaged (talkative), show how motivated you are (while motivating others), and share your own perceptions frequently. Your peers will view you as someone they can look up to and work well with.

This could be a “confidence boost” for the next time you are ill-prepared to discuss something out loud.

You can be more influential by imitating those with influence

Mirror neurons are brain cells that activate when we observe someone performing an action and when we perform the same action—for example, throwing a ball and then watching someone throw a ball back to you. Neuroscientists believe that mirror neurons help people read into the minds of others and empathize with them.

Mirror neurons have helped developmental and evolutionary psychologists understand human behavior, and how the brain can observe the actions of others to help us understand our world. A research study at the University of Washington showed that infants and adults do indeed imitate adults to understand perceptions of themselves and others, and to understand how they might be different or similar.

Another study conducted at the University of California shows that culture can influence mirror neurons. When participants in the study looked at someone who shared their culture, the mirror neurons had higher activity compared to when the participant looked at someone who didn’t share the same culture.

This shows that mirror neurons play an important role in imitation, in learning physical actions, and in understanding other people's behavior.

Do what the best influential novice can do: observe and imitate the social constructs of someone you respect.

What behaviors do influential people typically demonstrate?

  • They express their opinions more frequently.
  • They make more direct eye contact.
  • They use a relaxed, expansive, and welcoming posture.

In school and work, we have the ability to think and speak for ourselves and with one another. We’re in our own personalized groups — such as friends, students, colleagues, teachers, and staff. At Penn State World Campus, we often connect with and influence others through our integrative chat rooms, email, discussion forums, and social networks.

Some personality traits are more effective for group work

A study conducted at Yale University compared the influence of one person who consistently displayed positive or negative emotions. The findings? The group with one member focusing on positive emotions had improved cooperation, less conflict, and higher perception of task performance.

Aside from being positive, what are the top personality traits involved with group work?

Two researchers explored these personality traits: Warren Norman, who in 1963 described what became known as the Five-Factor Model, and Lewis Goldberg, who in 1990 coined the name the Big Five on the basis of these traits. These traits have been linked in research to outcomes such as job performance, job satisfaction, turnover rate, and interpersonal skills among colleagues.

“Big Five” personality traits

  1. Extraversion — talkative, outgoing, and associated with behaviors such as being sociable, gregarious, active, and assertive
  2. Agreeableness — friendly, cooperative, good-natured, flexible, courteous, and tolerant
  3. Conscientiousness — particularly self-disciplined, organized, thorough, hardworking, and achievement-oriented
  4. Emotional Stability — calm, secure, poised, relaxed, aware, and rational
  5. Openness to Experience — imaginative, original, intelligent, artistically sensitive, attentive to inner feelings, and intellectually curious

Conscientiousness, agreeableness, and emotional stability are positively related to job performance, according to a Vanderbilt University study. In addition, emotional stability and agreeableness are strongly related to teamwork that involves working interdependently with other employees, as opposed to working mainly with other people such as customers or clients. This shows that these personality characteristics are important for work and for predicting work outcomes among employees.

Further research in this area examined this question: How does the team’s composition of personalities influence team members’ level of satisfaction? Results found that the more agreeable and emotionally stable team members are, the more satisfied they are with their team.

On the other hand, the more dissimilar team members are from their teammates in regards to conscientiousness, the less satisfied they are with their team. Satisfaction with teamwork increases when teammates are more agreeable and emotionally stable, similarly conscientious, similar in nature, and less extraverted. Overall, agreeableness and emotional stability had the greatest effects on satisfaction within a team.

Final thoughts

It’s okay if you are not a leader or the most influential. Some people like to follow others. People sometimes don’t even realize they can be a leader until something they believe in or are passionate about becomes the center of their universe. Then, their influence can become contagious.


Epigenetics

The term epigenetics in its contemporary usage emerged in the 1990s, but for some years has been used with somewhat variable meanings. [8] A consensus definition of the concept of epigenetic trait as a "stably heritable phenotype resulting from changes in a chromosome without alterations in the DNA sequence" was formulated at a Cold Spring Harbor meeting in 2008, [4] although alternate definitions that include non-heritable traits are still being used. [9]

The term epigenesis has a generic meaning of "extra growth", and has been used in English since the 17th century. [10]

Waddington's canalisation, 1940s

From the generic meaning, and the associated adjective epigenetic, British embryologist C. H. Waddington coined the term epigenetics in 1942 as pertaining to epigenesis, in parallel to Valentin Haecker's 'phenogenetics' (Phänogenetik). [11] Epigenesis in the context of the biology of that period referred to the differentiation of cells from their initial totipotent state during embryonic development. [12]

When Waddington coined the term, the physical nature of genes and their role in heredity was not known. He used it instead as a conceptual model of how genetic components might interact with their surroundings to produce a phenotype; he used the phrase "epigenetic landscape" as a metaphor for biological development. Waddington held that cell fates were established during development in a process he called canalisation, much as a marble rolls down to the point of lowest local elevation. [13] Waddington suggested visualising increasing irreversibility of cell type differentiation as ridges rising between the valleys where the marbles (analogous to cells) are travelling. [14]

In recent times, Waddington's notion of the epigenetic landscape has been rigorously formalized in the context of the systems dynamics state approach to the study of cell-fate. [15] [16] Cell-fate determination is predicted to exhibit certain dynamics, such as attractor-convergence (the attractor can be an equilibrium point, limit cycle or strange attractor) or oscillatory dynamics. [16]

Contemporary

In 1990, Robin Holliday defined epigenetics as "the study of the mechanisms of temporal and spatial control of gene activity during the development of complex organisms." [17] Thus, in its broadest sense, epigenetics can be used to describe anything other than DNA sequence that influences the development of an organism.

More recent usage of the word in biology follows stricter definitions. It is, as defined by Arthur Riggs and colleagues, "the study of mitotically and/or meiotically heritable changes in gene function that cannot be explained by changes in DNA sequence." [18]

The term has also been used, however, to describe processes which have not been demonstrated to be heritable, such as some forms of histone modification; there are therefore attempts to redefine "epigenetics" in broader terms that would avoid the constraints of requiring heritability. For example, Adrian Bird defined epigenetics as "the structural adaptation of chromosomal regions so as to register, signal or perpetuate altered activity states." [5] This definition would be inclusive of transient modifications associated with DNA repair or cell-cycle phases as well as stable changes maintained across multiple cell generations, but exclude others such as templating of membrane architecture and prions unless they impinge on chromosome function. Such redefinitions however are not universally accepted and are still subject to debate. [3] The NIH "Roadmap Epigenomics Project", ongoing as of 2016, uses the following definition: "For purposes of this program, epigenetics refers to both heritable changes in gene activity and expression (in the progeny of cells or of individuals) and also stable, long-term alterations in the transcriptional potential of a cell that are not necessarily heritable." [9] In 2008, a consensus definition of the epigenetic trait, a "stably heritable phenotype resulting from changes in a chromosome without alterations in the DNA sequence", was made at a Cold Spring Harbor meeting. [4]

The similarity of the word to "genetics" has generated many parallel usages. The "epigenome" is a parallel to the word "genome", referring to the overall epigenetic state of a cell, and epigenomics refers to global analyses of epigenetic changes across the entire genome. [9] The phrase "genetic code" has also been adapted – the "epigenetic code" has been used to describe the set of epigenetic features that create different phenotypes in different cells from the same underlying DNA sequence. Taken to its extreme, the "epigenetic code" could represent the total state of the cell, with the position of each molecule accounted for in an epigenomic map, a diagrammatic representation of the gene expression, DNA methylation and histone modification status of a particular genomic region. More typically, the term is used in reference to systematic efforts to measure specific, relevant forms of epigenetic information such as the histone code or DNA methylation patterns.

Developmental psychology

In a sense somewhat unrelated to its use in biological disciplines, the term "epigenetic" has also been used in developmental psychology to describe psychological development as the result of an ongoing, bi-directional interchange between heredity and the environment. [19] Interactive ideas of development have been discussed in various forms and under various names throughout the 19th and 20th centuries. An early version was proposed, among the founding statements in embryology, by Karl Ernst von Baer and popularized by Ernst Haeckel. A radical epigenetic view (physiological epigenesis) was developed by Paul Wintrebert. Another variation, probabilistic epigenesis, was presented by Gilbert Gottlieb in 2003. [20] This view encompasses all of the possible developmental factors acting on an organism and how they not only influence the organism and each other, but also how the organism influences its own development. Likewise, the long-standing notion "cells that fire together, wire together" derives from Hebbian theory, which asserts that synaptogenesis, a developmental process with great epigenetic precedence, depends on the activity of the respective synapses within a neural network. Where experience alters the excitability of neurons, increased neural activity has been linked to increased demethylation. [21]

The developmental psychologist Erik Erikson wrote of an epigenetic principle in his 1968 book Identity: Youth and Crisis, encompassing the notion that we develop through an unfolding of our personality in predetermined stages, and that our environment and surrounding culture influence how we progress through these stages. This biological unfolding in relation to our socio-cultural settings is done in stages of psychosocial development, where "progress through each stage is in part determined by our success, or lack of success, in all the previous stages." [22] [23] [24]

Although empirical studies have yielded discrepant results, epigenetic modifications are thought to be a biological mechanism for transgenerational trauma. [25] [26] [27] [28] [29] [30]

Epigenetic changes modify the activation of certain genes, but not the genetic code sequence of DNA. The microstructure (not code) of DNA itself or the associated chromatin proteins may be modified, causing activation or silencing. This mechanism enables differentiated cells in a multicellular organism to express only the genes that are necessary for their own activity. Epigenetic changes are preserved when cells divide. Most epigenetic changes only occur within the course of one individual organism's lifetime; however, these epigenetic changes can be transmitted to the organism's offspring through a process called transgenerational epigenetic inheritance. Moreover, if gene inactivation occurs in a sperm or egg cell that results in fertilization, this epigenetic modification may also be transferred to the next generation. [31]

DNA damage

DNA damage can also cause epigenetic changes. [32] [33] [34] DNA damage is very frequent, occurring on average about 60,000 times a day per cell of the human body (see DNA damage (naturally occurring)). These damages are largely repaired, but at the site of a DNA repair, epigenetic changes can remain. [35] In particular, a double strand break in DNA can initiate unprogrammed epigenetic gene silencing both by causing DNA methylation as well as by promoting silencing types of histone modifications (chromatin remodeling - see next section). [36] In addition, the enzyme Parp1 (poly(ADP)-ribose polymerase) and its product poly(ADP)-ribose (PAR) accumulate at sites of DNA damage as part of a repair process. [37] This accumulation, in turn, directs recruitment and activation of the chromatin remodeling protein ALC1 that can cause nucleosome remodeling. [38] Nucleosome remodeling has been found to cause, for instance, epigenetic silencing of DNA repair gene MLH1. [18] [39] DNA damaging chemicals, such as benzene, hydroquinone, styrene, carbon tetrachloride and trichloroethylene, cause considerable hypomethylation of DNA, some through the activation of oxidative stress pathways. [40]

Foods are known to alter the epigenetics of rats on different diets. [41] Some food components epigenetically increase the levels of DNA repair enzymes such as MGMT and MLH1 [42] and p53. [43] [44] Other food components can reduce DNA damage, such as soy isoflavones. In one study, markers for oxidative stress, such as modified nucleotides that can result from DNA damage, were decreased by a 3-week diet supplemented with soy. [45] A decrease in oxidative DNA damage was also observed 2 h after consumption of anthocyanin-rich bilberry (Vaccinium myrtillius L.) pomace extract. [46]

Techniques used to study epigenetics

Epigenetic research uses a wide range of molecular biological techniques to further understanding of epigenetic phenomena, including chromatin immunoprecipitation (together with its large-scale variants ChIP-on-chip and ChIP-Seq), fluorescent in situ hybridization, methylation-sensitive restriction enzymes, DNA adenine methyltransferase identification (DamID) and bisulfite sequencing. [47] Furthermore, the use of bioinformatics methods has a role in computational epigenetics. [47]

Several types of epigenetic inheritance systems may play a role in what has become known as cell memory, [48] note however that not all of these are universally accepted to be examples of epigenetics.

Covalent modifications

Covalent modifications of either DNA (e.g. cytosine methylation and hydroxymethylation) or of histone proteins (e.g. lysine acetylation, lysine and arginine methylation, serine and threonine phosphorylation, and lysine ubiquitination and sumoylation) play central roles in many types of epigenetic inheritance. Therefore, the word "epigenetics" is sometimes used as a synonym for these processes. However, this can be misleading. Chromatin remodeling is not always inherited, and not all epigenetic inheritance involves chromatin remodeling. [49] In 2019, a further lysine modification appeared in the scientific literature linking epigenetic modification to cell metabolism, i.e. lactylation. [50]

Because the phenotype of a cell or individual is affected by which of its genes are transcribed, heritable transcription states can give rise to epigenetic effects. There are several layers of regulation of gene expression. One way that genes are regulated is through the remodeling of chromatin. Chromatin is the complex of DNA and the histone proteins with which it associates. If the way that DNA is wrapped around the histones changes, gene expression can change as well. Chromatin remodeling is accomplished through two main mechanisms:

  1. The first way is post translational modification of the amino acids that make up histone proteins. Histone proteins are made up of long chains of amino acids. If the amino acids that are in the chain are changed, the shape of the histone might be modified. DNA is not completely unwound during replication. It is possible, then, that the modified histones may be carried into each new copy of the DNA. Once there, these histones may act as templates, initiating the surrounding new histones to be shaped in the new manner. By altering the shape of the histones around them, these modified histones would ensure that a lineage-specific transcription program is maintained after cell division.
  2. The second way is the addition of methyl groups to the DNA, mostly at CpG sites, to convert cytosine to 5-methylcytosine. 5-Methylcytosine performs much like a regular cytosine, pairing with a guanine in double-stranded DNA. However, when methylated cytosines are present in CpG sites in the promoter and enhancer regions of genes, the genes are often repressed. [51][52] When methylated cytosines are present in CpG sites in the gene body (in the coding region excluding the transcription start site), expression of the gene is often enhanced. Transcription of a gene usually depends on a transcription factor binding to a recognition sequence (10 bases or fewer) at the promoter region of that gene. About 22% of transcription factors are inhibited from binding when the recognition sequence has a methylated cytosine. In addition, presence of methylated cytosines at a promoter region can attract methyl-CpG-binding domain (MBD) proteins. All MBDs interact with nucleosome remodeling and histone deacetylase complexes, which leads to gene silencing. In addition, another covalent modification involving methylated cytosine is its demethylation by TET enzymes. Hundreds of such demethylations occur, for instance, during learning and memory forming events in neurons. (A short toy sketch after this list illustrates locating CpG sites in a DNA sequence.)
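As a rough illustration of what a "CpG site" is in the preceding item, the toy Python sketch below scans a DNA string for CG dinucleotides and reports what fraction carry a methylation mark. The sequence and the marked positions are invented for the example.

  # Toy example: find CpG sites in a DNA string and report how many are methylated.
  sequence = "ATCGGCGTACGATCGCGTA"      # invented sequence
  methylated_cs = {2, 9, 13}            # invented positions of methylated cytosines

  cpg_sites = [i for i in range(len(sequence) - 1) if sequence[i:i + 2] == "CG"]
  methylated_cpgs = [i for i in cpg_sites if i in methylated_cs]

  print("CpG sites at positions:", cpg_sites)
  print("Fraction methylated:", len(methylated_cpgs) / len(cpg_sites))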

Mechanisms of heritability of histone state are not well understood; however, much is known about the mechanism of heritability of DNA methylation state during cell division and differentiation. Heritability of methylation state depends on certain enzymes (such as DNMT1) that have a higher affinity for 5-methylcytosine than for cytosine. If this enzyme reaches a "hemimethylated" portion of DNA (where 5-methylcytosine is in only one of the two DNA strands) the enzyme will methylate the other half.
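A conceptual way to picture the maintenance step just described: after replication, each CpG site is hemimethylated (parental strand methylated, new strand not), and a DNMT1-like activity copies the mark onto the new strand. The Python sketch below is purely illustrative and says nothing about the real enzyme's kinetics.

  # Purely illustrative: copy methylation marks from the parental strand to the
  # newly synthesised strand at hemimethylated CpG sites, in a DNMT1-like way.
  parental = {2: True, 9: False, 13: True}        # CpG position -> methylated?
  daughter = {pos: False for pos in parental}     # new strand starts unmethylated

  def maintain_methylation(parental, daughter):
      """Methylate the daughter strand wherever the parental strand carries a mark."""
      return {pos: parental[pos] or daughter[pos] for pos in parental}

  daughter = maintain_methylation(parental, daughter)
  print(daughter)   # {2: True, 9: False, 13: True} -- the pattern is preserved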

Although histone modifications occur throughout the entire sequence, the unstructured N-termini of histones (called histone tails) are particularly highly modified. These modifications include acetylation, methylation, ubiquitylation, phosphorylation, sumoylation, ribosylation and citrullination. Acetylation is the most highly studied of these modifications. For example, acetylation of the K14 and K9 lysines of the tail of histone H3 by histone acetyltransferase enzymes (HATs) is generally related to transcriptional competence.

One mode of thinking is that this tendency of acetylation to be associated with "active" transcription is biophysical in nature. Because it normally has a positively charged nitrogen at its end, lysine can bind the negatively charged phosphates of the DNA backbone. The acetylation event converts the positively charged amine group on the side chain into a neutral amide linkage. This removes the positive charge, thus loosening the DNA from the histone. When this occurs, complexes like SWI/SNF and other transcriptional factors can bind to the DNA and allow transcription to occur. This is the "cis" model of the epigenetic function. In other words, changes to the histone tails have a direct effect on the DNA itself. [53]

Another model of epigenetic function is the "trans" model. In this model, changes to the histone tails act indirectly on the DNA. For example, lysine acetylation may create a binding site for chromatin-modifying enzymes (or transcription machinery as well). This chromatin remodeler can then cause changes to the state of the chromatin. Indeed, a bromodomain – a protein domain that specifically binds acetyl-lysine – is found in many enzymes that help activate transcription, including the SWI/SNF complex. It may be that acetylation acts in this and the previous way to aid in transcriptional activation.

The idea that modifications act as docking modules for related factors is borne out by histone methylation as well. Methylation of lysine 9 of histone H3 has long been associated with constitutively transcriptionally silent chromatin (constitutive heterochromatin). It has been determined that a chromodomain (a domain that specifically binds methyl-lysine) in the transcriptionally repressive protein HP1 recruits HP1 to K9 methylated regions. One example that seems to refute this biophysical model for methylation is that tri-methylation of histone H3 at lysine 4 is strongly associated with (and required for full) transcriptional activation. Tri-methylation, in this case, would introduce a fixed positive charge on the tail.

It has been shown that the histone lysine methyltransferase (KMT) is responsible for this methylation activity in the pattern of histones H3 & H4. This enzyme utilizes a catalytically active site called the SET domain (Suppressor of variegation, Enhancer of zeste, Trithorax). The SET domain is a 130-amino acid sequence involved in modulating gene activities. This domain has been demonstrated to bind to the histone tail and causes the methylation of the histone. [54]

Differing histone modifications are likely to function in differing ways; acetylation at one position is likely to function differently from acetylation at another position. Also, multiple modifications may occur at the same time, and these modifications may work together to change the behavior of the nucleosome. The idea that multiple dynamic modifications regulate gene transcription in a systematic and reproducible way is called the histone code, although the idea that histone state can be read linearly as a digital information carrier has been largely debunked. One of the best-understood systems that orchestrate chromatin-based silencing is the SIR protein based silencing of the yeast hidden mating-type loci HML and HMR.

DNA methylation frequently occurs in repeated sequences, and helps to suppress the expression and mobility of 'transposable elements': [55] Because 5-methylcytosine can be spontaneously deaminated (replacing nitrogen by oxygen) to thymidine, CpG sites are frequently mutated and become rare in the genome, except at CpG islands where they remain unmethylated. Epigenetic changes of this type thus have the potential to direct increased frequencies of permanent genetic mutation. DNA methylation patterns are known to be established and modified in response to environmental factors by a complex interplay of at least three independent DNA methyltransferases, DNMT1, DNMT3A, and DNMT3B, the loss of any of which is lethal in mice. [56] DNMT1 is the most abundant methyltransferase in somatic cells, [57] localizes to replication foci, [58] has a 10–40-fold preference for hemimethylated DNA and interacts with the proliferating cell nuclear antigen (PCNA). [59]
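The CpG depletion and CpG-island exception mentioned above are often quantified with an observed/expected CpG ratio (in the Gardiner-Garden and Frommer style, islands are commonly described as regions of at least 200 bp with GC content above 50% and an observed/expected ratio above about 0.6). The Python snippet below computes those two numbers for an invented sequence; it is a sketch of the calculation, not a full island-calling tool.

  # Observed/expected CpG ratio and GC fraction for a stretch of DNA.
  seq = "CGGCGCATGCGCGTACGCGG"           # invented sequence
  n = len(seq)
  c, g = seq.count("C"), seq.count("G")
  cpg = sum(1 for i in range(n - 1) if seq[i:i + 2] == "CG")

  obs_exp = (cpg * n) / (c * g) if c and g else 0.0
  gc_fraction = (c + g) / n
  print(f"obs/exp CpG = {obs_exp:.2f}, GC fraction = {gc_fraction:.2f}")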

By preferentially modifying hemimethylated DNA, DNMT1 transfers patterns of methylation to a newly synthesized strand after DNA replication, and therefore is often referred to as the 'maintenance' methyltransferase. [60] DNMT1 is essential for proper embryonic development, imprinting and X-inactivation. [56] [61] To emphasize the difference of this molecular mechanism of inheritance from the canonical Watson-Crick base-pairing mechanism of transmission of genetic information, the term 'epigenetic templating' was introduced. [62] Furthermore, in addition to the maintenance and transmission of methylated DNA states, the same principle could work in the maintenance and transmission of histone modifications and even cytoplasmic (structural) heritable states. [63]

Histones H3 and H4 can also be manipulated through demethylation using histone lysine demethylase (KDM). This recently identified enzyme has a catalytically active site called the Jumonji domain (JmjC). The demethylation occurs when JmjC utilizes multiple cofactors to hydroxylate the methyl group, thereby removing it. JmjC is capable of demethylating mono-, di-, and tri-methylated substrates. [64]

Chromosomal regions can adopt stable and heritable alternative states resulting in bistable gene expression without changes to the DNA sequence. Epigenetic control is often associated with alternative covalent modifications of histones. [65] The stability and heritability of states of larger chromosomal regions are suggested to involve positive feedback where modified nucleosomes recruit enzymes that similarly modify nearby nucleosomes. [66] A simplified stochastic model for this type of epigenetics has been proposed. [67] [68]
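The "simplified stochastic model" referred to above can be pictured as a row of nucleosomes that flip between a modified and an unmodified state, with each nucleosome occasionally adopting the state of another (recruited conversion) and occasionally flipping at random. The Python toy below is written in that spirit only; it is not the published model.

  # Toy positive-feedback simulation: a randomly chosen nucleosome either flips
  # spontaneously (noise) or adopts the state of another randomly chosen
  # nucleosome (recruited conversion). Just an illustration.
  import random

  random.seed(1)
  N = 60
  nucleosomes = [0] * N        # 0 = unmodified, 1 = modified
  NOISE = 0.02                 # probability of a spontaneous flip per update

  for _ in range(20_000):
      i = random.randrange(N)
      if random.random() < NOISE:
          nucleosomes[i] ^= 1                 # spontaneous (de)modification
      else:
          j = random.randrange(N)
          nucleosomes[i] = nucleosomes[j]     # state spreads from a recruiting site

  print(sum(nucleosomes), "of", N, "nucleosomes modified at the end")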

It has been suggested that chromatin-based transcriptional regulation could be mediated by the effect of small RNAs. Small interfering RNAs can modulate transcriptional gene expression via epigenetic modulation of targeted promoters. [69]

RNA transcripts

Sometimes a gene, after being turned on, transcribes a product that (directly or indirectly) maintains the activity of that gene. For example, Hnf4 and MyoD enhance the transcription of many liver-specific and muscle-specific genes, respectively, including their own, through the transcription factor activity of the proteins they encode. RNA signalling includes differential recruitment of a hierarchy of generic chromatin modifying complexes and DNA methyltransferases to specific loci by RNAs during differentiation and development. [70] Other epigenetic changes are mediated by the production of different splice forms of RNA, or by formation of double-stranded RNA (RNAi). Descendants of the cell in which the gene was turned on will inherit this activity, even if the original stimulus for gene-activation is no longer present. These genes are often turned on or off by signal transduction, although in some systems where syncytia or gap junctions are important, RNA may spread directly to other cells or nuclei by diffusion. A large amount of RNA and protein is contributed to the zygote by the mother during oogenesis or via nurse cells, resulting in maternal effect phenotypes. A smaller quantity of sperm RNA is transmitted from the father, but there is recent evidence that this epigenetic information can lead to visible changes in several generations of offspring. [71]

MicroRNAs

MicroRNAs (miRNAs) are members of non-coding RNAs that range in size from 17 to 25 nucleotides. miRNAs regulate a large variety of biological functions in plants and animals. [72] As of 2013, about 2000 miRNAs had been discovered in humans, and these can be found online in a miRNA database. [73] Each miRNA expressed in a cell may target about 100 to 200 messenger RNAs (mRNAs) that it downregulates. [74] Most of the downregulation of mRNAs occurs by causing the decay of the targeted mRNA, while some downregulation occurs at the level of translation into protein. [75]

It appears that about 60% of human protein coding genes are regulated by miRNAs. [76] Many miRNAs are epigenetically regulated. About 50% of miRNA genes are associated with CpG islands, [72] which may be repressed by epigenetic methylation. Transcription from methylated CpG islands is strongly and heritably repressed. [77] Other miRNAs are epigenetically regulated by either histone modifications or by combined DNA methylation and histone modification. [72]

mRNA

In 2011, it was demonstrated that the methylation of mRNA plays a critical role in human energy homeostasis. The obesity-associated FTO gene is shown to be able to demethylate N6-methyladenosine in RNA. [78] [79]

sRNAs

sRNAs are small (50–250 nucleotides), highly structured, non-coding RNA fragments found in bacteria. They control gene expression including virulence genes in pathogens and are viewed as new targets in the fight against drug-resistant bacteria. [80] They play an important role in many biological processes, binding to mRNA and protein targets in prokaryotes. Their phylogenetic analyses, for example through sRNA–mRNA target interactions or protein binding properties, are used to build comprehensive databases. [81] sRNA-gene maps based on their targets in microbial genomes are also constructed. [82]

Prions

Prions are infectious forms of proteins. In general, proteins fold into discrete units that perform distinct cellular functions, but some proteins are also capable of forming an infectious conformational state known as a prion. Although often viewed in the context of infectious disease, prions are more loosely defined by their ability to catalytically convert other native state versions of the same protein to an infectious conformational state. It is in this latter sense that they can be viewed as epigenetic agents capable of inducing a phenotypic change without a modification of the genome. [83]

Fungal prions are considered by some to be epigenetic because the infectious phenotype caused by the prion can be inherited without modification of the genome. PSI+ and URE3, discovered in yeast in 1965 and 1971, are the two best studied of this type of prion. [84] [85] Prions can have a phenotypic effect through the sequestration of protein in aggregates, thereby reducing that protein's activity. In PSI+ cells, the loss of the Sup35 protein (which is involved in termination of translation) causes ribosomes to have a higher rate of read-through of stop codons, an effect that results in suppression of nonsense mutations in other genes. [86] The ability of Sup35 to form prions may be a conserved trait. It could confer an adaptive advantage by giving cells the ability to switch into a PSI+ state and express dormant genetic features normally terminated by stop codon mutations. [87] [88] [89] [90]

Structural inheritance

In ciliates such as Tetrahymena and Paramecium, genetically identical cells show heritable differences in the patterns of ciliary rows on their cell surface. Experimentally altered patterns can be transmitted to daughter cells. It seems existing structures act as templates for new structures. The mechanisms of such inheritance are unclear, but reasons exist to assume that multicellular organisms also use existing cell structures to assemble new ones. [91] [92] [93]

Nucleosome positioning

Eukaryotic genomes have numerous nucleosomes. Nucleosome position is not random, and determines the accessibility of DNA to regulatory proteins. Promoters active in different tissues have been shown to have different nucleosome positioning features. [94] This determines differences in gene expression and cell differentiation. It has been shown that at least some nucleosomes are retained in sperm cells (where most but not all histones are replaced by protamines). Thus nucleosome positioning is to some degree inheritable. Recent studies have uncovered connections between nucleosome positioning and other epigenetic factors, such as DNA methylation and hydroxymethylation. [95]

Genomic architecture

The three-dimensional configuration of the genome (the 3D genome) is complex, dynamic and crucial for regulating genomic function and nuclear processes such as DNA replication, transcription and DNA-damage repair.

Development

Developmental epigenetics can be divided into predetermined and probabilistic epigenesis. Predetermined epigenesis is a unidirectional movement from structural development in DNA to the functional maturation of the protein. "Predetermined" here means that development is scripted and predictable. Probabilistic epigenesis on the other hand is a bidirectional structure-function development with experiences and external molding development. [96]

Somatic epigenetic inheritance, particularly through DNA and histone covalent modifications and nucleosome repositioning, is very important in the development of multicellular eukaryotic organisms. [95] The genome sequence is static (with some notable exceptions), but cells differentiate into many different types, which perform different functions, and respond differently to the environment and intercellular signaling. Thus, as individuals develop, morphogens activate or silence genes in an epigenetically heritable fashion, giving cells a memory. In mammals, most cells terminally differentiate, with only stem cells retaining the ability to differentiate into several cell types ("totipotency" and "multipotency"). In mammals, some stem cells continue producing newly differentiated cells throughout life, such as in neurogenesis, but mammals are not able to respond to loss of some tissues, for example, the inability to regenerate limbs, which some other animals are capable of. Epigenetic modifications regulate the transition from neural stem cells to glial progenitor cells (for example, differentiation into oligodendrocytes is regulated by the deacetylation and methylation of histones). [97] Unlike animals, plant cells do not terminally differentiate, remaining totipotent with the ability to give rise to a new individual plant. While plants do utilize many of the same epigenetic mechanisms as animals, such as chromatin remodeling, it has been hypothesized that some kinds of plant cells do not use or require "cellular memories", resetting their gene expression patterns using positional information from the environment and surrounding cells to determine their fate. [98]

Epigenetic changes can occur in response to environmental exposure – for example, maternal dietary supplementation with genistein (250 mg/kg) can produce epigenetic changes in mice affecting expression of the agouti gene, which affects their fur color, weight, and propensity to develop cancer. [99] [100] [101] Ongoing research is focused on exploring the impact of other known teratogens, such as diabetic embryopathy, on methylation signatures. [102]

Controversial results from one study suggested that traumatic experiences might produce an epigenetic signal that is capable of being passed to future generations. Mice were trained, using foot shocks, to fear a cherry blossom odor. The investigators reported that the mouse offspring had an increased aversion to this specific odor. [103] [104] They suggested that the effect was due to epigenetic changes that increase the expression of M71, a gene that governs the functioning of an odor receptor in the nose that responds specifically to this cherry blossom smell, rather than to changes in the DNA sequence itself. There were physical changes that correlated with olfactory (smell) function in the brains of the trained mice and their descendants. Several criticisms were raised, including that the study's low statistical power was taken as evidence of some irregularity such as bias in reporting results. [105] Due to limits of sample size, there is a probability that an effect will not be demonstrated to within statistical significance even if it exists. The criticism suggested that the probability that all the experiments reported would show positive results if an identical protocol was followed, assuming the claimed effects exist, is merely 0.4%. The authors also did not indicate which mice were siblings, and treated all of the mice as statistically independent. [106] The original researchers pointed out negative results in the paper's appendix that the criticism omitted in its calculations, and undertook to track which mice were siblings in the future. [107]
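The 0.4% figure in that criticism comes from a standard power calculation: if each experiment, analysed independently, has only a modest probability (its statistical power) of reaching significance, the chance that every experiment succeeds is the product of those probabilities. The Python lines below only illustrate the arithmetic; the power values are invented, not the ones used in the actual critique.

  # Illustrative only: probability that all independent experiments reach significance.
  from math import prod

  powers = [0.6, 0.5, 0.55, 0.45, 0.5, 0.6, 0.5, 0.55]   # invented power values
  print(f"P(all {len(powers)} significant) = {prod(powers):.3%}")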

Transgenerational

Epigenetic mechanisms were a necessary part of the evolutionary origin of cell differentiation. [108] Although epigenetics in multicellular organisms is generally thought to be a mechanism involved in differentiation, with epigenetic patterns "reset" when organisms reproduce, there have been some observations of transgenerational epigenetic inheritance (e.g., the phenomenon of paramutation observed in maize). Although most of these multigenerational epigenetic traits are gradually lost over several generations, the possibility remains that multigenerational epigenetics could be another aspect of evolution and adaptation. As mentioned above, some define epigenetics as heritable.

A sequestered germ line or Weismann barrier is specific to animals, and epigenetic inheritance is more common in plants and microbes. Eva Jablonka, Marion J. Lamb and Étienne Danchin have argued that these effects may require enhancements to the standard conceptual framework of the modern synthesis and have called for an extended evolutionary synthesis. [109] [110] [111] Other evolutionary biologists, such as John Maynard Smith, have incorporated epigenetic inheritance into population-genetics models [112] or are openly skeptical of the extended evolutionary synthesis (Michael Lynch). [113] Thomas Dickins and Qazi Rahman state that epigenetic mechanisms such as DNA methylation and histone modification are genetically inherited under the control of natural selection and therefore fit under the earlier "modern synthesis". [114]

Two important ways in which epigenetic inheritance can differ from traditional genetic inheritance, with consequences for evolution, are:

  • rates of epimutation can be much faster than rates of mutation [115]
  • the epimutations are more easily reversible [116]

In plants, heritable DNA methylation mutations are 100,000 times more likely to occur than DNA mutations. [117] An epigenetically inherited element such as the PSI+ system can act as a "stop-gap", good enough for short-term adaptation and allowing the lineage to survive long enough for mutation and/or recombination to genetically assimilate the adaptive phenotypic change. [118] The existence of this possibility increases the evolvability of a species.
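
To put that rate difference in concrete terms, the sketch below treats each heritable change as a per-generation coin flip, so the expected waiting time at a single locus is simply the reciprocal of the rate. The absolute epimutation rate used is an assumed illustrative value; only the 100,000-fold ratio reflects the plant estimate cited above [117].

    # Expected generations until the first heritable change at one locus,
    # modeled as a geometric distribution (mean = 1 / per-generation probability).
    epimutation_rate = 1e-4                     # assumed illustrative value
    mutation_rate = epimutation_rate / 100_000  # 100,000-fold lower, per the plant estimate

    print(f"expected wait for an epimutation: {1 / epimutation_rate:,.0f} generations")
    print(f"expected wait for a DNA mutation: {1 / mutation_rate:,.0f} generations")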

More than 100 cases of transgenerational epigenetic inheritance have been reported in a wide range of organisms, including prokaryotes, plants, and animals. [119] For instance, mourning-cloak butterflies change color through hormone changes in response to experimentally varied temperatures. [120]

The filamentous fungus Neurospora crassa is a prominent model system for understanding the control and function of cytosine methylation. In this organism, DNA methylation is associated with relics of a genome-defense system called RIP (repeat-induced point mutation) and silences gene expression by inhibiting transcription elongation. [121]

The yeast prion PSI is generated by a conformational change of a translation termination factor, which is then inherited by daughter cells. This can provide a survival advantage under adverse conditions, exemplifying epigenetic regulation that enables unicellular organisms to respond rapidly to environmental stress. Prions can be viewed as epigenetic agents capable of inducing a phenotypic change without modification of the genome. [122]

Direct detection of epigenetic marks in microorganisms is possible with single-molecule real-time sequencing, in which the sensitivity of the polymerase to modified bases allows methylation and other modifications to be measured as a DNA molecule is being sequenced. [123] Several projects have demonstrated the ability to collect genome-wide epigenetic data in bacteria. [124] [125] [126] [127]
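
A minimal sketch of the idea behind such kinetic detection, assuming toy inter-pulse-duration (IPD) data rather than output from a real single-molecule sequencing pipeline: positions where the polymerase pauses for much longer than expected are flagged as candidate modified bases.

    # Toy kinetic signal: position -> ratio of observed to expected inter-pulse duration.
    ipd_ratios = {101: 1.1, 102: 4.8, 103: 0.9, 150: 5.2, 151: 1.0}

    def candidate_modified_positions(ipd_ratios, threshold=3.0):
        """Return positions whose kinetic signal exceeds the (assumed) threshold."""
        return sorted(pos for pos, ratio in ipd_ratios.items() if ratio >= threshold)

    print(candidate_modified_positions(ipd_ratios))  # -> [102, 150]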

While epigenetics is of fundamental importance in eukaryotes, especially metazoans, it plays a different role in bacteria. Most importantly, eukaryotes use epigenetic mechanisms primarily to regulate gene expression, which bacteria rarely do. However, bacteria make widespread use of postreplicative DNA methylation for the epigenetic control of DNA-protein interactions. Bacteria also use DNA adenine methylation (rather than DNA cytosine methylation) as an epigenetic signal. DNA adenine methylation is important in bacterial virulence in organisms such as Escherichia coli, Salmonella, Vibrio, Yersinia, Haemophilus, and Brucella. In Alphaproteobacteria, methylation of adenine regulates the cell cycle and couples gene transcription to DNA replication. In Gammaproteobacteria, adenine methylation provides signals for DNA replication, chromosome segregation, mismatch repair, packaging of bacteriophage, transposase activity, and regulation of gene expression. [122] [128] In Streptococcus pneumoniae (the pneumococcus), a genetic switch allows the bacterium to randomly change its characteristics into one of six alternative states, a finding that could pave the way to improved vaccines. Each form is randomly generated by a phase-variable methylation system, and the ability of the pneumococcus to cause deadly infections differs in each of these six states. Similar systems exist in other bacterial genera. [129] In Firmicutes such as Clostridioides difficile, adenine methylation regulates sporulation, biofilm formation and host adaptation. [130]
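
As a concrete illustration of adenine methylation in Gammaproteobacteria: in Escherichia coli, the Dam methyltransferase methylates the adenine within GATC motifs, and the methylation state of those sites helps direct processes such as mismatch repair and replication timing. The Python sketch below simply scans a sequence for candidate Dam target sites; the example sequence is made up for illustration.

    def gatc_sites(seq):
        """Return 0-based start positions of GATC motifs (candidate Dam methylation sites)."""
        seq = seq.upper()
        return [i for i in range(len(seq) - 3) if seq[i:i + 4] == "GATC"]

    print(gatc_sites("ATGGATCCAGATCGTTAGATC"))  # -> [3, 9, 17]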

Epigenetics has many and varied potential medical applications. [131] In 2008, the National Institutes of Health announced that $190 million had been earmarked for epigenetics research over the next five years. In announcing the funding, government officials noted that epigenetics has the potential to explain mechanisms of aging, human development, and the origins of cancer, heart disease, mental illness, as well as several other conditions. Some investigators, like Randy Jirtle, Ph.D., of Duke University Medical Center, think epigenetics may ultimately turn out to have a greater role in disease than genetics. [132]

Twins

Direct comparisons of identical twins constitute an optimal model for interrogating environmental epigenetics. In the case of humans with different environmental exposures, monozygotic (identical) twins were epigenetically indistinguishable during their early years, while older twins had remarkable differences in the overall content and genomic distribution of 5-methylcytosine DNA and histone acetylation. [8] The twin pairs who had spent less of their lifetime together and/or had greater differences in their medical histories were those who showed the largest differences in their levels of 5-methylcytosine DNA and acetylation of histones H3 and H4. [133]

Dizygotic (fraternal) and monozygotic (identical) twins both show evidence of epigenetic influence in humans. [133] [134] [135] DNA sequence differences, which would be abundant in a singleton-based study, do not interfere with the analysis. Environmental differences can produce long-term epigenetic effects, and different developmental monozygotic twin subtypes may differ in their susceptibility to becoming epigenetically discordant. [136]

A high-throughput study (one using technology that assays a large number of genetic markers at once) focused on epigenetic differences between monozygotic twins, comparing global and locus-specific changes in DNA methylation and histone modifications in a sample of 40 monozygotic twin pairs. [133] Only healthy twin pairs were studied, but a wide range of ages was represented, from 3 to 74 years. One of the major conclusions from this study was that there is an age-dependent accumulation of epigenetic differences between the two siblings of a twin pair. This accumulation suggests the existence of epigenetic "drift", the term given to epigenetic modifications that accumulate as a direct function of age. While age is a known risk factor for many diseases, age-related methylation has been found to occur differentially at specific sites along the genome. Over time, this can result in measurable differences between biological and chronological age. Epigenetic changes have been found to reflect lifestyle and may act as functional biomarkers of disease before a clinical threshold is reached. [137]
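
One simple way to summarize such drift, sketched below with hypothetical methylation (beta) values rather than data from the cited study, is the mean absolute difference between co-twins across measured CpG sites; under the drift interpretation this summary tends to be larger in older pairs.

    def methylation_discordance(twin_a, twin_b):
        """Mean absolute difference in methylation beta values (0-1) across CpG sites."""
        return sum(abs(a - b) for a, b in zip(twin_a, twin_b)) / len(twin_a)

    # Hypothetical beta values at three CpG sites for a young and an older twin pair.
    young_pair = ([0.80, 0.10, 0.55], [0.79, 0.11, 0.54])
    older_pair = ([0.80, 0.10, 0.55], [0.62, 0.25, 0.40])

    print(methylation_discordance(*young_pair))  # ~0.01
    print(methylation_discordance(*older_pair))  # ~0.16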

A more recent study, in which 114 monozygotic twins and 80 dizygotic twins were analyzed for the DNA methylation status of around 6,000 unique genomic regions, concluded that epigenetic similarity at the time of blastocyst splitting may also contribute to phenotypic similarities in monozygotic co-twins. This supports the notion that the microenvironment at early stages of embryonic development can be quite important for the establishment of epigenetic marks. [134] Congenital genetic disease is well understood, and it is clear that epigenetics can play a role, for example in the cases of Angelman syndrome and Prader-Willi syndrome. These are otherwise ordinary genetic diseases caused by gene deletions or gene inactivation, but they are unusually common because affected individuals are essentially hemizygous at the relevant locus due to genomic imprinting: a single gene knockout is therefore sufficient to cause the disease, whereas in most cases both copies would need to be knocked out. [138]

Genomic imprinting

Some human disorders are associated with genomic imprinting, a phenomenon in mammals where the father and mother contribute different epigenetic patterns for specific genomic loci in their germ cells. [139] The best-known case of imprinting in human disorders is that of Angelman syndrome and Prader-Willi syndrome – both can be produced by the same genetic mutation, chromosome 15q partial deletion, and the particular syndrome that will develop depends on whether the mutation is inherited from the child's mother or from their father. [140] This is due to the presence of genomic imprinting in the region. Beckwith-Wiedemann syndrome is also associated with genomic imprinting, often caused by abnormalities in maternal genomic imprinting of a region on chromosome 11.

Methyl CpG-binding protein 2 (MeCP2) is a transcriptional regulator that must be phosphorylated before it is released from the BDNF promoter, allowing transcription. Rett syndrome is caused by mutations in the MECP2 gene, even though no large-scale changes in MeCP2 expression are found in microarray analyses. In MECP2 mutants, BDNF is downregulated, resulting in Rett syndrome as well as increased early neural senescence and accumulation of damaged DNA. [141]

In the Överkalix study, paternal (but not maternal) grandsons [142] of Swedish men who were exposed during preadolescence to famine in the 19th century were less likely to die of cardiovascular disease. If food was plentiful, then diabetes mortality in the grandchildren increased, suggesting that this was a transgenerational epigenetic inheritance. [143] The opposite effect was observed for females – the paternal (but not maternal) granddaughters of women who experienced famine while in the womb (and therefore while their eggs were being formed) lived shorter lives on average. [144]

Cancer

A variety of epigenetic mechanisms can be perturbed in different types of cancer. Epigenetic alterations of DNA repair genes or cell cycle control genes are very frequent in sporadic (non-germline) cancers, being significantly more common than germline (familial) mutations in these sporadic cancers. [145] [146] Epigenetic alterations are important in cellular transformation to cancer, and their manipulation holds great promise for cancer prevention, detection, and therapy. [147] [148] Several medications with epigenetic impact are used to treat some of these diseases. These aspects of epigenetics are addressed in cancer epigenetics.

Diabetic wound healing

Epigenetic modifications have given insight into the pathophysiology of a range of disease conditions. Although they are most strongly associated with cancer, their role in other pathological conditions is of comparable importance. It appears that a hyperglycaemic environment can imprint changes at the genomic level such that macrophages are primed towards a pro-inflammatory state and may fail to shift towards the pro-healing phenotype. This altered macrophage polarization is associated with most diabetic complications seen in clinical settings. As of 2018, several reports describe the relevance of different epigenetic modifications to diabetic complications. As biomedical tools advance, the detection of such biomarkers could emerge as an alternative prognostic and diagnostic approach in patients. However, the use of epigenetic modifications as therapeutic targets warrants extensive preclinical and clinical evaluation before use. [149]

Examples of drugs altering gene expression from epigenetic events

Examples include beta-lactam antibiotics, which can alter glutamate receptor activity, and cyclosporine, which acts on multiple transcription factors. Additionally, lithium can impact autophagy of aberrant proteins, and chronic use of opioid drugs can increase the expression of genes associated with addictive phenotypes. [150]

Early life stress

In a groundbreaking 2003 report, Caspi and colleagues demonstrated, in a robust cohort of over one thousand subjects assessed repeatedly from preschool to adulthood, that subjects who carried one or two copies of the short allele of the serotonin transporter promoter polymorphism exhibited higher rates of adult depression and suicidality when exposed to childhood maltreatment, compared with long-allele homozygotes with equal early-life stress (ELS) exposure. [151]
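
Findings like this are usually analyzed as a gene-by-environment interaction: a statistical model includes not just genotype and exposure but also their product term, so the effect of maltreatment is allowed to differ by genotype. The sketch below only shows how such a design matrix row is coded; the variable names and values are hypothetical, not taken from the cited study.

    # One subject's predictors: number of short alleles (0-2) and maltreatment exposure (0/1).
    def design_row(short_alleles, maltreated):
        """Intercept, genotype, environment, and the G x E interaction term."""
        return [1.0, short_alleles, maltreated, short_alleles * maltreated]

    print(design_row(short_alleles=2, maltreated=1))  # -> [1.0, 2, 1, 2]
    print(design_row(short_alleles=0, maltreated=1))  # -> [1.0, 0, 1, 0]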

Parental nutrition, in utero exposure to stress or to endocrine-disrupting chemicals, [152] male-induced maternal effects such as the attraction of mates of differing quality, maternal and paternal age, and offspring sex could all influence whether a germline epimutation is ultimately expressed in offspring and the degree to which intergenerational inheritance remains stable across subsequent generations. [153] However, whether and to what extent epigenetic effects can be transmitted across generations remains unclear, particularly in humans. [154] [155]

Addiction

Addiction is a disorder of the brain's reward system that arises through transcriptional and neuroepigenetic mechanisms and develops over time from chronically high levels of exposure to an addictive stimulus (e.g., morphine, cocaine, sexual intercourse, gambling). [156] [157] [158] [159] Transgenerational epigenetic inheritance of addictive phenotypes has been noted in preclinical studies. [160] [161] However, robust evidence for the persistence of epigenetic effects across multiple generations has yet to be established in humans; an example would be an epigenetic effect of prenatal exposure to smoking observed in great-grandchildren who had not themselves been exposed. [154]

In the meantime, however, evidence has been accumulating that the use of cannabis by mothers during pregnancy, and by both parents before it, leads to epigenetic alterations in neonates that are known to be associated with an increased risk of developing psychiatric disorders later in life, such as autism, ADHD, schizophrenia and addictive behavior, among others. [162] [163] [164] [165]

Depression

Epigenetic inheritance of depression-related phenotypes has also been reported in a preclinical study. [166] Inheritance of paternal stress-induced traits across generations involved small non-coding RNA signals transmitted via the paternal germline.

The two forms of heritable information, namely genetic and epigenetic, are collectively denoted as dual inheritance. Members of the APOBEC/AID family of cytosine deaminases may concurrently influence genetic and epigenetic inheritance using similar molecular mechanisms, and may be a point of crosstalk between these conceptually compartmentalized processes. [167]

Fluoroquinolone antibiotics induce epigenetic changes in mammalian cells through iron chelation, which inhibits α-ketoglutarate-dependent dioxygenases that require iron as a co-factor. [168]

Various pharmacological agents are used to produce induced pluripotent stem cells (iPSCs) or to maintain the embryonic stem cell (ESC) phenotype via epigenetic approaches. Adult stem cells, such as bone marrow stem cells, have also shown the potential to differentiate into cardiac-competent cells when treated with the G9a histone methyltransferase inhibitor BIX01294. [169] [170]

Because epigenetics is still an emerging science and has attracted sensationalism in the public media, David Gorski and geneticist Adam Rutherford have advised caution against the proliferation of false and pseudoscientific conclusions by new-age authors who make unfounded suggestions that a person's genes and health can be manipulated by mind control. Misuse of the scientific term by such authors has produced misinformation among the general public. [2] [171]


Turning Off The Switch

Psychologist Alia Crum and her research team at Yale University discovered that in order to avoid stress taking over your life and dominating you, your thoughts and beliefs must change.

Crum goes on to say that how we view stress, primarily affects how our brains change. Meaning if we only see stress as something horrible and react in that form, then a thought loop will occur and every stressful event that you encounter will send your brain into a frenzy.

Instead, Crum recommends understanding that stress can be helpful in allowing you to grow and develop internally. When research patients were tested with this mindset, all of a sudden, the things that used to cause so much stress, no longer had the impact that they once did.

And so I leave you with this, view the world through rose-colored lenses and you will find that stress may be a killer but can also propel you forward. Its all about how YOU react, not the nature of the stress.


A secret of science: Mistakes boost understanding

Many kids fear making a mistake will label them as a failure. In fact, science is built on a mountain of mistakes, many made by the greatest minds. The trick is to view each mistake as a step along the path to understanding something better.



September 10, 2020 at 6:30 am

This past spring, before COVID-19 turned the world on its head, Anne Smith’s 9th-grade physics class was learning about electric circuits. Smith teaches science at Carmel Catholic High School in Mundelein, Ill. She gave her students paper clips, batteries, tape and a lightbulb. Then she said, “Have a go. See if you can make the bulb light up.”

Students in Anne Smith’s 9th grade chemistry lab had to determine their own methods for deciding which solutions were best for making bubbles. A. Smith

Smith sees value in letting her students experiment. She believes that some of the most powerful learning happens through trial and error. “When students are allowed to struggle through difficult material, they gain confidence,” she says. “They learn that making mistakes is part of the scientific process.”

This isn’t to say that once the task is given, Smith sits back and watches her students fail. Instead, she selects activities that may have more than one answer. Then she encourages students to try multiple approaches. She wants them to think about different ways to solve a problem.

Throughout the lesson, the students engage in group discussions. Their observations and reflections focus on the process, not the outcome.

Smith praises students for working through hard tasks. She wants to highlight how their struggles can reward them with benefits. “The point,” Smith says, “is to explore ideas and evaluate the methods [being] tried.” In doing so, students learn to value mistakes. Indeed, she finds, mistakes are an essential part of learning.

Failing to succeed

“Failure is the most important ingredient in science,” says Stuart Firestein. He studies the biology of the brain at Columbia University in New York City. He also wrote a book called Failure: Why Science is So Successful.

“When an experiment fails or doesn’t work out the way you expected, it tells you there is something you didn’t know,” he says. It suggests you need to go back and rethink: What went wrong? And why? Was there a problem with the idea? With your approach or assumptions? With your measurements? In the environment, such as temperature, lighting or pollution?

Thomas Edison, at left, poses with an early electric car that ran on his storage batteries. It took Edison thousands of failed experiments over many years to develop these batteries. National Park Service

This is the value of failure. It leads us to what Firestein calls “the portal of the unknown.” It is where the deepest and most worthwhile questions come from. And asking those questions can spark new ideas and types of experiments. The best thing a scientist can discover is “a new or better question,” Firestein says. “Failure is where the action is. It propels science forward.”

Thomas Edison is reputed to have said much the same thing, according to a 1910 biography. He wanted to make a better battery. But after working seven days a week for more than five months, he still hadn’t succeeded. He told a friend, Walter S. Mallory, that he had already done more than 9,000 experiments for the project. According to the book, Mallory replied: “Isn’t it a shame that with the tremendous amount of work you have done you haven’t been able to get any results?” The book goes on to say that Edison “with a smile replied: ‘Results! Why, man, I have gotten a lot of results! I know several thousand things that won’t work.’”

Nor did the dogged inventor stop. Eventually, he got the new battery to work. He patented it, too. Although Edison is best known for the light bulb, those batteries eventually became the most commercially successful product of his later life.


Most schools do not encourage students to fail

Smith and Edison’s approach is different from the way science is carried out in most classrooms. Schools tend to focus on covering lots of topics and memorizing countless facts. Many classes rely on textbooks to give students as much information as possible, as quickly as possible. The problem with those books, explains Firestein, is “they have no context.” They state the results of experiments but they don’t tell us why people did them. Nor do they describe experiments that didn’t work. By focusing on successful outcomes, Firestein says, “we leave out 90 percent of science.”

Instead, he suggests, science learning should include details of those failures. This shows the realistic process of getting to an answer. Also, students can discover why specific scientific questions arose and see how people arrived at the answer we have now.

You thought you followed the directions carefully, but the cookies still burned. What went wrong? A scientist would look at and test each of the factors that could have led to the problem. Is the oven thermostat reading the right temperature? Did you misread the baking time? Zhenikeyev/iStock/Getty Images Plus

When we fail, we question thoughts, opinions and ideas. This is what teachers refer to as critical thinking. Through such questions, we connect ideas and challenge reasoning. Both skills are highly valued in a scientist, says Firestein.

Learning about failure also makes science more approachable. Science is not just a sequence of geniuses making one discovery after another. Rather, the history of science is full of mistakes and wrong turns.

Some of the most well-known scientific facts follow a trail of failures. For example, physics icon Isaac Newton was wrong about gravity. Firestein explains that although Newton’s laws of motion are “great for launching satellites and building bridges,” his idea about how gravity works was wrong. It was Albert Einstein, some 200 years later, who corrected it, again as a result of a failed experiment. He studied Newton’s idea until he arrived at his General Theory of Relativity, which changed science’s perception of gravity. The scientific process is one in which you arrive at the truth by making mistakes, Firestein explains, “each a little less of a mistake than the one before.”


Failures have also led to great discoveries. Penicillin, X-rays and insulin are all the results of experiments gone wrong. “Two-thirds of Nobel laureates have announced their winning discovery was the result of a failed experiment,” says Firestein. This explains why Isaac Asimov, an American writer and biochemist, is credited with saying: “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ but ‘That’s funny.’”

The importance of failure is just as prevalent in other fields. Take this observation by professional basketball player Michael Jordan in a 1997 Nike commercial: “I’ve missed more than 9,000 shots in my career. I’ve lost almost 300 games. Twenty-six times, I’ve been trusted to take the game-winning shot — and missed. I’ve failed over and over and over again in my life. And that is why I succeed.”

A changeable brain

Michael Merzenich is a neuroscientist who worked at the University of California, San Francisco. In the 1970s, he found evidence that brains can rewire themselves over time. His work challenged the common idea that people were born with a fixed number of brain cells organized in unchanging paths. Perhaps our potential to think, learn and reason was not set from birth, he proposed.

Merzenich and his team began their research with monkeys. They aimed to map which brain cells fired when the monkeys completed a given task. The resulting “brain maps” amazed the scientific community. But he found an even bigger surprise when he later revisited the maps: The monkeys’ neural pathways had changed. “What we saw,” said Merzenich, “was absolutely astounding. I couldn’t understand it.” The only possible explanation was that the monkeys’ brains had wired new neural pathways, he decided. Norman Doidge recounted the observation in his book The Brain that Changes Itself.

Merzenich’s research pointed to a concept that would come to be known as “brain plasticity.” It’s the ability of the brain to adapt and change in response to experiences. His studies went on to show that when we learn something new, an electrical signal fires and connects cells in different parts of the brain.


The place where these electrical sparks jump between brain cells is called a synapse. Synapses fire when we do things such as read a book, play with toys or have conversations. That firing strengthens connections between brain cells. If we do something only once, synaptic pathways can fade away. But if we practice and learn something deeply, the synaptic activity will form lasting networks in the brain. Indeed, learning rewires the brain.

If learning can cause our brain to adapt and change, what happens when we make a mistake? In 2011, Jason Moser studied how the brain reacts when people make an error. Moser is a psychologist at Michigan State University in East Lansing. He teamed up with four other researchers. They asked 25 participants to complete a test with 480 questions. During the test, each person wore a stretch cap with electrodes that recorded activity in different parts of the brain.

The participants’ brain activity rose when they made a mistake, Moser and his colleagues found. “When a participant experienced conflict between a correct response and an error, the brain was challenged,” he says. “Trying to make sense of this new knowledge was a time of struggle and need for change.” This is when the brain reacted most strongly.

He also found two typical brain responses to a mistake. The first response indicated that something went wrong. The second reaction only came when test-takers treated the mistake as a problem that needed greater attention. Participants who responded to their error by giving it more consideration were able to do better on the test after making their mistake. Moser concluded that “by thinking about what we got wrong, we learn how to get it right.”

Even in math, a right answer isn’t the only goal. Showing classmates why you’re taking certain steps to a solution can help students learn together — or together troubleshoot where something went wrong. SDI Productions/E+/Getty Images

A new view of mistakes

Championing the value of mistakes, Jo Boaler has started a revolution in math. She teaches math education at Stanford University in California. In her 2019 book, Limitless Mind, she said people need to give up the idea that one’s ability to learn is fixed, or unchanging. Instead, she argues, we should view learning as putting us “all on a growth journey.”

She wanted to give students positive messages about mistakes and create a “mistakes-friendly” environment where students celebrate errors. Seeking to bring this idea into the classroom, Boaler established a three-week summer math camp known as Youcubed. (The last in-person session was in 2019. She now offers it as an online course.) The aim of this program is to boost confidence in math among 6th and 7th graders. When kids give answers, they are encouraged to explain their thinking. Discussing the process helped other students analyze their reasoning. This pushed them to keep trying.

At the beginning of camp, students often reported that struggling with math was a sign you weren’t doing well. But by the end of three weeks, most reported feeling more positive about making mistakes. They enjoyed being challenged and described having higher self-esteem. “When students see mistakes as positive,” Boaler says, “it has an incredibly liberating effect.”

Anne Smith’s students designed balloon rockets and then explained why they succeeded or failed using Newton’s laws. A. Smith

Learning through collaboration can also help us see mistakes in a more positive way. Janet Metcalfe studies the effects of errors and how they can benefit learning. A psychologist at Columbia University in New York City, she observed several middle-grade math classes. The most effective learning technique, she found, was giving students a chance to discuss their errors.

They might be asked: What do you think about that? How did you get your answer? Sharing the way they did a problem took much of the focus off of mistakes. Instead they described their theories and ideas. This collaboration with classmates resulted in higher test scores.

“When you connect with someone else’s ideas,” Metcalfe points out, “you go deeper.” Mistakes are simply the starting point for discussion. “You have only got something to learn if you make a mistake,” she concludes.

And that is the message Smith tries to give to her high school physics students. Yet some still come to class with a fear of failure. They believe that a wrong answer means they are not smart. Some give up before they even try because they are so scared of being incorrect.

“It is especially important for these students to see mistakes as a learning opportunity,” Smith says. Getting people to see mistakes as a natural part of learning takes time. We have all been embarrassed by making a mistake in public. But finding success as a result of struggling toward a solution will hopefully help students become more eager to approach future challenges.

To Smith, being able to “approach mistakes with confidence” is more important than anything else a student can learn.

Power Words

battery: A device that can convert chemical energy into electrical energy.

biology: The study of living things. The scientists who study them are known as biologists.

brain plasticity: The ability of the brain to essentially re-wire itself by changing directions in the flow of nerve signals or changing connections between neurons. Such changes allow the brain to grow and mature from infancy through old age, and sometimes to even repair brain injuries.

circuit: A network that transmits electrical signals. In the body, nerve cells create circuits that relay electrical signals to the brain. In electronics, wires typically route those signals to activate some mechanical, computational or other function.

COVID-19: The name given to the disease caused by a coronavirus that triggered a massive outbreak of potentially lethal illness beginning in December 2019. Symptoms include pneumonia, fever, headaches and trouble breathing.

critical thinking: Sometimes described as a scientific attitude of mind, it is the careful and probing consideration of any belief, purported fact, idea or values, based upon the data or experiences available — and then using that assessment to make some conclusion.

develop: To emerge or to make come into being, either naturally or through human intervention, such as by manufacturing.

electric current: A flow of electric charge — electricity — usually from the movement of negatively charged particles, called electrons.

environment: The sum of all of the things that exist around some organism or the process and the condition those things create. Environment may refer to the weather and ecosystem in which some animal lives, or, perhaps, the temperature and humidity (or even the placement of things in the vicinity of an item of interest).

error: (In statistics) The non-deterministic (random) part of the relationship between two or more variables.

fire: (in neuroscience) The activation of a nerve or neural pathway.

focus: (in behavior) To look or concentrate intently on some particular point or thing.

gravity: The force that attracts anything with mass, or bulk, toward any other thing with mass. The more mass that something has, the greater its gravity.

high school: A designation for grades nine through 12 in the U.S. system of compulsory public education. High-school graduates may apply to colleges for further, advanced education.

icon: (adj. iconic) Something that represents another thing, often as an ideal version of it.

insulin: A hormone produced in the pancreas (an organ that is part of the digestive system) that helps the body use glucose as fuel.

Isaac Newton: This English physicist and mathematician became most famous for describing his law of gravity. Born in 1642, he developed into a scientist with wide-ranging interests. Among some of his discoveries: that white light is made from a combination of all the colors in the rainbow, which can be split apart again using a prism; the mathematics that describe the orbital motions of things around a center of force; that the speed of sound waves can be calculated from the density of air; early elements of the mathematics now known as calculus; and an explanation for why things “fall”: the gravitational pull of one object towards another, which would be proportional to the mass of each.

network: A group of interconnected people or things.

neuron: An impulse-conducting cell. Such cells are found in the brain, spinal column and nervous system.

neuroscientist: Someone who studies the structure or function of the brain and other parts of the nervous system.

penicillin: The first antibiotic (although not the first one used on people), it’s a natural product that comes from a mold. In 1928, Alexander Fleming, a British scientist, discovered that it could kill certain bacteria. He would later share the 1945 Nobel Prize in Medicine for it.

perception: The state of being aware of something — or the process of becoming aware of something — through use of the senses.

physics: The scientific study of the nature and properties of matter and energy. Classical physics is an explanation of the nature and properties of matter and energy that relies on descriptions such as Newton’s laws of motion. A scientist who works in such areas is known as a physicist.

plasticity: Adaptable or reshapable. (in biology) The ability of an organ, such as the brain or skeleton to adapt in ways that stretch its normal function or abilities. This might include the brain’s ability to rewire itself to recover some lost functions and compensate for damage.

psychologist: A scientist or mental-health professional who studies the human mind, especially in relation to actions and behaviors.

relativity: (in physics) A theory developed by physicist Albert Einstein showing that neither space nor time are constant, but instead affected by one’s velocity and the mass of things in your vicinity.

satellite: A moon orbiting a planet or a vehicle or other manufactured object that orbits some celestial body in space.

sequence: The precise order of related things within some series.

synapse: The junction between neurons that transmits chemical and electrical signals.

theory: (in science) A description of some aspect of the natural world based on extensive observations, tests and reason. A theory can also be a way of organizing a broad body of knowledge that applies in a broad range of circumstances to explain what will happen. Unlike the common definition of theory, a theory in science is not just a hunch. Ideas or conclusions that are based on a theory — and not yet on firm data or observations — are referred to as theoretical. Scientists who use mathematics and/or existing data to project what might happen in new situations are known as theorists.

trait: A characteristic feature of something.

transmit: (n. transmission) To send or pass along.

wave: A disturbance or variation that travels through space and matter in a regular, oscillating fashion.

X-ray: A type of radiation analogous to gamma rays, but having somewhat lower energy.

Citations

Journal:​ J. Metcalfe. Learning from errors. Annual Review of Psychology. Vol. 68, January 2017, p. 465. doi: 10.1146/annurev-psych-010416-044022.

Book: S. Firestein. Failure: Why Science is So Successful. Oxford University Press. October 2015.

Journal:​ J.S. Moser et al. Mind your errors: Evidence for a neural mechanism linking growth mind-set to adaptive posterror adjustments. Psychological Science. Vol. 22, Oct. 31, 2011, p. 1484. doi: 10.1177/0956797611419520.

Book: F.L. Dyer and T.C. Martin. Edison: His Life and Inventions. Harper and Bros. 1910. Available as free e-book as part of Project Gutenberg.




You just got off the phone with one of your most important clients. The game-changing deal you were trying to close is off. They're not interested.

You've just pitched 10 potential investors. They all say they're "interested" but it's been two weeks. You refresh your inbox hourly, and yet still no word.

How do you react in these situations?

If you're like most people, your mind floods with negativity. "Maybe our product sucks," "Why can't I just get a break?" or "Maybe there's something wrong with me."

Neuroscientists have a name for this automatic habit of the brain: "negativity bias." It's an adaptive trait of human psychology that served us well when we were hunting with spears on the savanna 120,000 years ago.

In modern times, however, this habit of the brain leaves us reacting to a harsh email or difficult conversation as if our life were in danger. It activates a cascade of stress hormones and leaves us fixated on potential threats, unable to see the bigger picture.

Neuroscientist Rick Hanson has a great analogy for this strange quality of the mind. "Your brain," he writes in his book Buddha's Brain, "is like Velcro for negative experiences and Teflon for positive ones." When you lose a client, when the investors don't come calling, or when you face the hundreds of other daily disappointments of life, you're wired to forget all the good things and to instead obsess over the negative.