Thursday, May 21, 2015

Decision-making 101

Teenagers are notorious for poor decision-making. Of course that is inevitable, given that their brains are still developing, and they have had relatively little life experience to show them how to predict what works and what doesn’t. Unfortunately, what doesn’t work may have more emotional appeal, and most of us at any age are more susceptible to our emotions than to cold, hard logic.
Seniors also are prone to poor decision-making if senility has set in. Unscrupulous people take advantage of such seniors because a brain that is deteriorating has a hard time making wise decisions.
Between the teenage years and senility is when the brain is at its peak for good decision-making. Wisdom comes with age, up to a point. Some Eastern cultures venerate their old people as generally being especially wise. After all, if you live long enough, and are still mentally healthy, you ought to make good decisions because you have a lifetime of experience to teach you which future choices are likely to work and which are not.
Much of that knowledge comes from learning from one’s mistakes. On the other hand, some people, regardless of age, can’t seem to learn from their mistakes. Most of the time the problem is not stupidity but a flawed habitual process by which one is motivated to make wise decisions and evaluate options. Best of all is learning from somebody else’s mistakes, so you don’t have to make them yourself.
Learning from your mistakes can be negative, if you fret about it. Learning what you can do to avoid repeating a mistake is one thing, but dwelling on it erodes one’s confidence and sense of self-worth. I can never forget the good advice I read from, of all people, T. Boone Pickens. He was quoted in an interview as saying that he was able to re-make his fortune on multiple occasions because he didn’t dwell on losing the fortunes. He credited that attitude to his college basketball coach who told the team after each defeat, “Learn from your mistakes, but don’t dwell on them. Learn from what you did right and do more of that.”
It would help if we knew how the brain made decisions, so we could train it to operate better. “Decision neuroscience” is an emerging field of study aimed at learning how brains make decisions and how to optimize the process. Neuroscientists seem to have homed in on two theories, both of which deal with how the brain handles the processing of alternate options to arrive at a decision.
One theory is that each option is processed in its own competing pool of neurons. As processing evolves, the activity in each pool waxes and wanes as the pools compete for dominance. At some point, activity in one of the pools reaches a threshold, in winner-take-all fashion, allowing that pool to dominate and issue the appropriate decision commands to the parts of the brain needed for execution. As one possible example, two or more pools of neurons separately receive input that reflects the representation of different options. Each pool sends an output to another set of neurons that feed back either excitatory or inhibitory influences, thus providing a way for competition among pools to select the pool that eventually dominates because it has built up more impulse activity than the others.

The other theory is based on guided gating wherein input to pools of decision-making neurons is gated to regulate how much excitatory influence can accumulate in each given pool. [i] The specific routing paths involve inhibitory neurons that shut down certain routes, thus preferentially routing input to a preferred accumulating circuit. The route is biased by estimated salience of each option, current emotional state, memories of past learning, and the expected reward value for the outcome of each option.
These decision-making possibilities involve what is called “integrate and fire.” That is, input to all relevant pools of neurons accumulates and leads to various levels of firing in each pool. The pool firing the most is most likely to dominate the output, that is, the decision.
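For readers who like to tinker, here is a toy sketch in Python of the "integrate and fire" race between two competing pools. It is entirely my own illustration, not a model from the cited studies; the pool names, evidence values, and parameters are made up, and real neural models are far more elaborate.

import random

def race_to_decision(evidence_a=0.52, evidence_b=0.48, threshold=50.0,
                     inhibition=0.2, noise=0.5, max_steps=10000, seed=1):
    """Toy 'integrate and fire' race: two pools accumulate noisy evidence,
    inhibit each other, and the first to reach threshold wins the decision."""
    random.seed(seed)
    pool_a = pool_b = 0.0
    for step in range(1, max_steps + 1):
        drive_a = evidence_a + random.gauss(0, noise)   # input favoring option A
        drive_b = evidence_b + random.gauss(0, noise)   # input favoring option B
        # Each pool integrates its own drive and is suppressed by its rival.
        new_a = max(0.0, pool_a + drive_a - inhibition * pool_b)
        new_b = max(0.0, pool_b + drive_b - inhibition * pool_a)
        pool_a, pool_b = new_a, new_b
        if pool_a >= threshold or pool_b >= threshold:
            return ("A" if pool_a >= pool_b else "B"), step
    return "undecided", max_steps

print(race_to_decision())   # the pool with slightly stronger evidence usually wins

Because the rival pools inhibit each other, a small early advantage tends to snowball, which is the essence of winner-take-all competition.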
However circuits make decisions, there is considerable evidence that nerve impulse representations for each given choice option simultaneously code for expected outcome and reward value. These value estimates update on the fly.[ii] Networks containing these representations compete to arrive at a decision.
Any choice among alternative options is affected by how much information for each option the brain has to work on. When the brain is consciously trying to make a decision, this often means how much relevant information the brain can hold in working memory. Working memory is notoriously low-capacity, so the key becomes remembering the sub-sets of information that are the most relevant to each option. Humans think with what is in their working memory. Experiments have shown that older people are more likely to hold the most useful information in working memory, and therefore they can think more effectively. The National Institute of Aging began funding decision-making research in 2010 at Stanford University’s Center on Longevity. Results of their research are showing that older people often make better decisions than younger people.
As one example, older people are more likely to make rational cost-benefit analyses. Older people are more likely to recognize when they have made a bad investment and walk away rather than throwing more good money after bad.
A key factor seems to be that older people are more selective about what they remember. For example, one study from the Stanford Center compared the ability of young and old people to remember a list of words. Not surprisingly, younger people remembered more words, but when words were assigned a number value, with some words being more valuable than others, older people were better at remembering high-value words and ignoring low-value words. It seems that older people selectively remember what is important, which should make it easier to make better decisions.
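As a playful illustration of that strategy, here is a toy sketch of my own (not the Stanford task itself, and the words and point values are invented): treat working memory as a handful of slots and fill them with the highest-value words first.

def selective_recall(studied_words, capacity=4):
    """Toy version of value-directed remembering: with limited working-memory
    capacity, hold on to the highest-value words and let the rest go."""
    ranked = sorted(studied_words, key=lambda pair: pair[1], reverse=True)
    return [word for word, value in ranked[:capacity]]

words = [("harbor", 1), ("engine", 10), ("pillow", 3), ("ledger", 8),
         ("cactus", 2), ("violin", 9), ("napkin", 1), ("tunnel", 7)]
print(selective_recall(words))   # ['engine', 'violin', 'ledger', 'tunnel']

The payoff of the strategy is obvious: when capacity is fixed, choosing what to hold matters more than holding more.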
Decision-making skills are important for learning achievement in school. Students need to know how to focus in general, and focus on what is most relevant in particular. They are not learning that skill, and their multi-tasking culture is teaching them many bad habits.
Those of us who care deeply about educational development of youngsters need to push our schools to address the thinking and learning skills of students. "Teaching to the test" detracts from time spent in teaching what matters most. Today's culture of multi-tasking is making matters worse. Children don't learn how to attend selectively and intensely to the most relevant information, because they are distracted by superficial attention to everything. Despite their daily use of Apple computers and smart phones, only one college student out of 85 could draw the Apple logo correctly.[iii]
Memory training is generally absent from teacher training programs. Despite my locally well-publicized experience in memory training, no one in the College of Education at my university has ever asked me to share my knowledge with their faculty or with pre-service teachers. The paradox is that teachers are trained to help students remember curricular answers for high-stakes tests. What could be more important than learning how to decide what to remember and how to remember it? And we wonder why student performance is so poor?


"Memory Medic" is author of Memory Power 101 (Skyhorse) and Better Grades, Less Effort (Benecton).



[i] Purcell, B. A., Heitz, R. P., Cohen, J. Y., Schall, J. D., et al. (2010). Neurally constrained modeling of perceptual decision making. Psychological Review, 117(4), 1113-1143.

[ii] McCoy, A. N., and Platt, M. L. (2005). Expectations and outcomes: decision-making in the primate brain. J. Comp. Physiol A 191, 201-211.

[iii] Blake, Adam B., Nazarian, Meenely, and Castel, Alan D. (2015). The Apple of the mind's eye: Everyday attention, metamemory, and reconstructive memory for the Apple logo. The Quarterly Journal of Experimental Psychology. DOI: 10.1080/17470218.2014.1002798

Thursday, April 30, 2015

Music Effects on Cognitive Function of the Elderly


Whether the music is orchestral, rock, country, or jazz, most seniors like to listen to some kind of music. Music can soothe or energize, make us happy or sad, but the kind we like to hear must be doing something positively reinforcing, or we would not listen to it. As my 80-year-old jazz trumpeter friend, Richard Phelps, recently said at his birthday party, "Where there is life there is music. Where there is music, there is life."
Relatively little research has been done on the effects of music on brain function in older people. But one recent study reported the effects of background music on brain processing speed and two kinds of memory (episodic and semantic) in older adults. The subjects were not musicians and had an average age of 69 years.
The music test conditions were: 1) no music control, 2) white noise control, 3) a Mozart recording, and 4) a Mahler recording. All 65 subjects were tested in counter-balanced order in all four categories. The music was played at modest volume as background before and during performance of the cognitive tasks, a mental processing speed task and the two memory tasks. The episodic memory task involved trying to recall a list of 15 words immediately after a two-minute study period. The semantic memory task involved word fluency in which subjects wrote as many words as they could think of beginning with three letters of the alphabet.
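For readers curious what "counter-balanced order" means in practice, here is a hypothetical sketch of one common way to rotate condition orders across groups of subjects (a rotation-based Latin square). It is my own illustration, not the authors' actual schedule.

def latin_square(conditions):
    """Rotate the condition list so that each condition appears once in each
    serial position across the groups of subjects."""
    n = len(conditions)
    return [[conditions[(row + col) % n] for col in range(n)] for row in range(n)]

for i, order in enumerate(latin_square(["silence", "white noise", "Mozart", "Mahler"]), 1):
    print(f"group {i}: {' -> '.join(order)}")

The point of such rotation is that no single condition systematically benefits from always coming first or last.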
Processing speed performance was faster while listening to Mozart than with the Mahler or white noise conditions. No improvement in the Mahler condition was seen over white noise or no music.
Episodic memory performance was better when listening to either type of music than while hearing white noise or no music. No difference was noted between the two types of music.
Semantic memory was better for both kinds of music than with white noise, and better with Mozart than with no music.
Recognizing that emotions could be a relevant factor, the experimenters analyzed a mood questionnaire comparing the two music conditions with white noise. Mozart generated higher happiness indicators than did Mahler or white noise. Mahler was rated more sad than Mozart and comparable to white noise.
Thus, happy, but not sad, music correlated with increased processing speed. The researchers speculated that happy subjects were more aroused and alert.
Surprisingly, both happy and sad music enhanced both kinds of memory over the white noise or silence condition. But it is not clear if this observation is generally applicable. The authors did mention, without emphasis, that both kinds of music were instrumental and lacked lyrics or loudness that could have been distracting and thus impaired memory. I think this point is substantial. When lyrics are present, the brain is dragged into trying to hear the words and thinking about their meaning. These thought processes would surely interfere with trying to memorize new information or recall previously learned material.
A point not considered at all is personal preference for certain types of music. There are people who don't like classical music, and the data in this study could have been made "noisy" if enough of the 65 people disliked classical music and were actually distracted by it. In other words, the effects noted in this study might have been magnified if the subjects were allowed to hear their preferred music.
My take-home lesson was actually formed over five decades ago when I listened to jazz records while plowing my way through memorizing a veterinary medical curriculum. Then, I thought that the benefit was stress reduction (veterinary school IS stressful and happy jazz certainly reduces stress). Now perhaps I see that frequent listening to music that was pleasurable for me might have actually helped my memory capability. If you still have doubts, you might want to check my latest blog post, "Happy thoughts can make you more competent" (http://thankyoubrain.blogspot.com/2015/01/happy-thoughts-can-make-you-more.html).
Anyway, now that I am in the elderly category, I see there is still reason to listen to the music I like. Music can be therapy for old age.


“People haven't always been there for me but music always has.”
    —Taylor Swift



"Memory Medic's" latest book is "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine." It is available in inexpensive e-book form at Amazon or in all formats at Smashwords.com.


Source:

Bottiroli, Sara et al. (2014). The cognitive effects of listening to background music on older adults: processing speed improves with upbeat music, while memory seems to benefit from both upbeat and downbeat music. Frontiers in Aging Neuroscience. Oct. 15. doi: 10.3389/fnagi.2014.00284.



Saturday, April 25, 2015

What Is the Optimal Spacing for Study?

We have all been told by teachers that learning occurs best when we spread it out over time, rather than trying to cram everything into our memory banks at one time. But what is the optimal spacing? There is no general consensus.
However we do know that immediately after a learning experience the memory of the event is extremely volatile and easily lost. It's like looking up a number in the phone book: if you think about something else at the same time you may have to look the number up again before you can dial it. School settings commonly create this problem. One learning object may be immediately followed by another, and the succession of such new information tends to erase the memory of the preceding ones.
Memory researchers have known for a long time that repeated retrieval enhances long-term retention. This happens because each time we retrieve a memory, it has to be reconsolidated and each such reconsolidation strengthens the memory. Though optimal spacing intervals have not been identified, research confirms the importance of spaced retrieval. No doubt, the nature of the information, the effectiveness of initial encoding, competing experiences, and individual variability affect the optimal interval for spaced learning.
One study revealed that repeated retrieval of learned information (100 Swahili–English word pairs) with long intervals produced a 200% improvement in long-term retention relative to repeated retrieval with no spacing between tests. Investigators also compared spacing schedules that expanded (for example, 15-30-45 min), stayed the same (30-30-30 min), or contracted (45-30-15 min), and found that no one relative spacing pattern was superior to any other.[1]
Another study[2] has revealed that the optimally efficient gap between study sessions depends on when the information will be tested in the future. A very comprehensive study of this matter in 1,350 individuals involved teaching them a set of facts and then testing them for long-term retention after 3.5 months. A final test was given at a further delay of up to one year. At any test delay, increasing the inter-study gap between the first learning and a study of that material at first increased and then gradually reduced final test performance. Expressed as a ratio, the optimal gap equaled 10-20% of the test delay. That is, for example, a one-day gap was best for a test to be given seven days later, while a 21-day gap was best for a test 70 days later. Few if any teachers or students know this, and their study times are rarely scheduled in any systematic way, typically being driven by test schedules for other subjects, convenience, or even the teacher's whim.
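To make that ratio concrete, here is a small illustrative calculation. It is my own sketch applying the 10-20% rule of thumb mentioned above, not a formula taken from the paper, and the test delays chosen are arbitrary examples.

def optimal_review_gap(days_until_test):
    """Rule of thumb from the spacing research described above: the best gap
    between first study and review is roughly 10-20% of the test delay."""
    return 0.10 * days_until_test, 0.20 * days_until_test

for delay in (7, 30, 180):
    low, high = optimal_review_gap(delay)
    print(f"Test in {delay} days: review about {low:.1f} to {high:.1f} days after first study")

In plain terms, the farther away the test, the longer you can afford to wait before your first serious review.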
The bottom line: the optimal time to review a newly learned experience is just before you are about to forget it. Obviously, we usually don't know when this occurs, but in general the vast bulk of forgetting occurs within the first day after learning. As a rule of thumb, you can suspect that a few repetitions early on should be helpful in fully encoding the information and initiating a robust consolidation process. So, for example, after each class a student should quickly remind herself what was just learned—then that evening do another quick review. Before the next class on that subject, the student should review again. Teachers help this process by linking the next lesson to the preceding one.
Certain practices will reduce the amount of time needed for study and increase the degree of long-term memory formation. These include:

• Don't procrastinate. Do it now!
• Organize the information in ways that make sense (outlines, concept maps)
• Identify what needs to be memorized and what does not.
• Focus. Do not multi-task. No music, cell phones, TV or radio, or distractions of any kind.
• Associate the new with things you already know.
• Associate words with mental images and link images to locations, or in story chains
• Think hard about the information, in different contexts
• Study small chunks of material, in short intervals. Then take a mental break.
• Say out loud what you are trying to remember.
• Practice soon after learning and frequently thereafter at spaced intervals.
• Explain what you are learning to somebody else. Work with study groups later.
• Self-test. Don't just "look over" the material. Truly engage with it.
• Never, never, ever CRAM!




[1] Karpicke, J. D., and Bauernschmidt, A. (2011). Spaced retrieval: absolute spacing enhances learning regardless of relative spacing. J. Exp. Psychol.: Learning, Memory, and Cognition, 37(5), 1250-1257.
[2] Cepeda, N. J., et al. (2008). Spacing effects in learning: a temporal ridgeline of optimal retention. Psychological Science, 19(11), 1095-1102.

Friday, April 03, 2015

What Happened to The Wonder of Learning?

I read a lot about educational theory and research so that I can share "best practices" for better ways to teach and learn with my readers. Shared here are the ideas in a most informed and intelligent article on learning, written by Catherine L'Ecuyer, a Canadian lawyer with an MBA now living in Barcelona, Spain.[1] The article explains what is fundamental to motivating children to learn: the sense of wonder.
This notion resonated with me, because I know it to be true from personal experience. To this day, I have vivid memories of the excitement I had as a six-year-old in Fort Myers, Florida, as I walked to my first day of school. Yes, in those days it was safe for kids to walk several blocks to school unattended. And yes, there was, at least for me, no kindergarten, pre-kindergarten, or day care.
Sauntering to school, I became entranced with all the new sights and sounds, stopping several times along the way to savor a new experience. A vivid memory was my stop at a beautiful flower I had never seen before. I physically probed the bloom, astonished at the elegant expression of nature. On that day, the prospect of school was a most joyous opportunity. It did not take long for school's pedantic nature, drills, and drudgery to squelch my sense of wonder. It was only in late middle school that my sense of wonder was resurrected, and that only occurred because I had a crush on my teacher and wanted to impress her with my learning. For many children, their inherent sense of wonder that school stamps out never returns.
Clearly, a child's state of mind affects how learning and the school environment are regarded. Among the more relevant states is stimulus seeking. That is why for example I wanted to explore the innards of that beautiful flower. Then too, there is the basic human responsiveness to positive reinforcement. If a learning experience is perceived as wondrous, it is perceived as good and beneficial, serving as incentive to have other such learning experiences. It obviously helps for a child to be aware of such perceptions.
L'Ecuyer adds the sense of wonder to the list of fundamentals of the motivation to learn. She makes the point that wonder is innate in children, especially when they are young. As a child matures, much of this sense of wonder can fade. For some people, the more they learn, the less wondrous the world seems. Among adults, scientists seem to be an exception (to a biologist, pond scum is beautiful and wondrous).
L'Ecuyer argues that modern educational paradigms are behaviorist and conflict with nurturing the sense of wonder in children. By behaviorist, she means that the guiding principle of teaching is that the environment directs learning with its emphasis on teachers, curriculum, and high-stakes testing. The popular mantra is that learning is better when it is provided earlier and in abundance. Curricula are designed to bombard students with information and testing. Do we really think that is motivating?
The problem is that children can be overwhelmed by too much too soon. Yet government policy increasingly advocates pre-kindergarten. Young developing brains do not respond well to too much stimulus, too much curriculum, and too much high-stakes testing. Children become preconditioned to expect high levels of stimulation, leading to attentiveness disorders. Children become passive and bored. The associated loss of the sense of wonder diminishes a child's motivation to cope with all this stimulus and pressure.
L'Ecuyer cites convincing research showing that children learn at a slower pace than adults. They need more calm and silence. They are more intrigued by mystery. They need to trust in a human attachment figure, most commonly a caring mother. Unfortunately, our educational culture assumes that we don't teach enough curriculum and don't demand enough of children. Children learn to pass tests, not love learning. Our multi-tasking culture only adds to sensory and cognitive overload that interferes with learning and mental performance in general. The family breakdown in our culture diminishes a child's trust in primary caregivers and degrades attachment to them. Schools cannot provide such trust and attachment. Nor can pre-kindergarten or day-care.
These are basic reasons why I push for a reform in education that stresses teaching learning skills to young children, as opposed to the domination of traditional curriculum and excessive high-stakes testing. I am writing such a book now. When children have good learning skills, learning stops being an onerous chore. The "Learning Skills Cycle" that I advocate begins with motivation, and motivation begins with a sense of wonder.[2]
Educational policy makers seem confused about why so many students fall behind. Every year, over 1.2 million students drop out of high school in the United States. That’s a student every 26 seconds – or 7,000 a day. About 25% of high school freshmen fail to graduate from high school on time.[3] At the college level, only 59% of full-time four-year college students graduate within six years.[4] Most college data use a six-year limit because so many college students can't finish in the usual four years.
Over the last 40 years, the educational fads we have tried apparently have not worked. In this time we have had such high-profile government initiatives as "Goals 2000, New Math, Nation at Risk, No Child Left Behind, Race to the Top, Common Core, Next Generation Science Standards, Charter Schools, and Head Start." Where is the evidence that any of this works? SAT scores have not improved, even declining in some years, while funding for education has increased dramatically, on the order of 200%, depending on the state.[5]
Despite much ballyhoo and funding, Head Start's effects wash out within a few years. Nonetheless, many states think Head Start did not start early enough and that what is needed is government funded pre-kindergarten. Nobody considers what this too-soon, too-much, too stressful education does to childhood sense of wonder and motivation to learn.
The key question asked by L'Ecuyer is this: Are today's educational paradigms and policies promoting the sense of wonder and motivation to learn or squelching it? While all our government programs to improve education seem valuable, the results say otherwise. Teaching is now driven by high-stakes testing. While accountability is necessary, when high-stakes testing becomes the focus of education, it poisons the learning atmosphere. The law of unintended consequences applies. Today's educational environment suffocates the wonder and love of learning for its own sake.

# # #

Dr. Klemm is author of two books on learning: Memory Power 101 and Better Grades, Less Effort. Reviews and information can be found at his web site, WRKlemm.com



[1] L'Ecuyer, Catherine. 2014. The wonder approach to learning. Frontiers in Human Neuroscience. Volume 8, October 6. doi:10.3389/fnhum.2014.00764.
[2] Klemm, W. R. 2014. Shift Away from Teaching to the Test: A Better Way to Improve Test Scores. The STATellite, 59 (1): 10-13. http://c.ymcdn.com/sites/www.statweb.org/resource/collection/ED7F18DF-2934-4034-BDF2-CF81CADE155A/WinterSTATellite2014.pdf
[3] https://www.dosomething.org/facts/11-facts-about-high-school-dropout-rates
[4] http://nces.ed.gov/fastfacts/display.asp?id=40
[5] http://www.cato.org/publications/policy-analysis/state-education-trends


Wednesday, March 11, 2015

Excuse-making by School Children

My last column on "Blaming the Victim" was a departure from my usual emphasis on improving learning and memory. But it did set the stage for this current post on the crippling effect of allowing children to make excuses for underperformance in school.
Most of us know how common it is for kids to make excuses ("the dog ate my homework" syndrome). When we adults were young, we also probably made excuses, blaming the textbook, the teacher, the school, and whatever else could serve to avoid facing the real causes of the problems.
Why do kids do that? The main reason is their fragile egos. Confronting personal weakness is especially hard for kids when they are embedded in an adult culture that inevitably reminds them that they are relatively powerless kids.
I remember a recent dinner-table conversation with my competitive 6th grade granddaughter, who was complaining about a test in which some of the questions were not aligned well with the instruction, which itself was deemed confusing. I said, "I understand that others did do better than you on the test. Wasn't everybody facing the same handicap?" No answer. Then I added, "It doesn't matter who the teacher is or what instruction you get. If you are not first in the class, it is your fault." Again, no response.
One approach that parents and teachers use is to bolster children's egos by praising them richly and often. Too much of a good thing is a bad thing. Too much praise makes kids narcissistic. Anybody who is not aware of the raging narcissism in today's youngsters must not be around young people very much. The most obvious sign is the compulsive checking of e-mail and texting, all in an effort by a child to be at the center of attention.
Other professors and I notice narcissism in college students. In a selective college, most students think they are "A" students, and because of low standards in secondary school and grade inflation they are actually told they are A students. If they don't make As in college, it is somebody else's fault (usually the professor).
Scholars are beginning to address this growing narcissism. Eddie Brummelman at the University of Amsterdam in the Netherlands and his colleagues studied 565 children between the ages of 7 and 12. They picked this age group because most other such studies have been in adults, and they believed that this is the age range in which children begin to develop narcissistic traits such as selfishness, self-centeredness, and vanity.
Over 18 months, the children and their parents were given several detailed questionnaires that were designed to measure narcissistic traits and parental behavior. There was a small but significant link at each stage between how much parents praised their children and how narcissistic the children were six months later. Because the effect was only small, it suggests that other things also make people selfish and self-centered. I suspect the effect is larger in the U.S.
Maybe school culture is part of the problem. As in Lake Wobegon, "all kids are above average." For brighter students, the instructional rigor is so low that these kids get a false sense of how smart they are and how easy it is to be an "A" student.
I suspect that another factor is that students are not taught enough about how to be realistically self-aware. They may not even know when they are making excuses unless adults call them on it. Too often, parents side with the student in criticizing a teacher when the real problem is with the child.
Some of the blame shifting comes from biology. It is in human nature to claim ownership of things we do that turn out well, but disown actions that yield negative consequences. Experiments support this conclusion. The most recent experiments had a primary focus on our sense of time in association with voluntary actions. The experimental design was based on prior evidence that the perceived estimate of time lag between when we do something and when we think we did it is an implicit index of our sense of ownership. Investigators asked people to press a key, which was followed a quarter of a second later by negative sounds of fear or disgust, positive sounds of achievement or amusement, or neutral sounds. The subjects were then asked to estimate when they had made the action and when they heard the sound. Timing estimation errors were easily measured by computer. Subjects sensed a longer time lag between their actions and the consequences when the outcome (the sound) was negative than when it was positive.

Teaching Kids to Deal with Failure


There is a common denominator to most self-limiting styles of living. It is a fear of failure. Children express this fear by making excuses, which has the unintended effect of blocking the path to success. Excuses may provide immediate relief of anxiety, but they create a self-limiting learning style that assures continued underachievement.
Whatever one’s station in life, one axiom is paramount: for things to get better for you, you have to get better. This point is well illustrated in an inspiring rags-to-riches success book by A. J. Williams. He points out that a main reason that people do not make the changes they need to is that they are afraid of failure. But, paradoxically, learning from failure is how many people turn their lives around and become happier. Children, I have noticed, are highly resistant to personal change, maybe more so than adults. I am dismayed at how often I show children how to memorize more effectively and they just can't bring themselves to study in a different way. It is as if they don't believe me enough to even try new approaches. Or maybe they have convinced themselves they are mediocre and need the shield of excuses to keep others from detecting their weaknesses.
Louis Armstrong, the famous trumpeter, told an instructive story about fear when he was a boy. One day when his mother asked him to go down to the levee to fetch a pail of drinking water, he came back home with an empty pail. Upon noticing the empty pail, his mother said, “I told you to bring back a pail of water for us to drink. How come your pail is empty?” Louis replied, “There’s an alligator there, and I was scared to death.” His mother then said, “You shouldn’t be afraid. That gator is as afraid of you as you are of him.” To which Louis answered, “If that’s the case, then that water ain’t fit to drink.”
If there is an alligator keeping you away from what you need to do, have faith you will prevail over your demons. But as long as a child lets fear get in the way, her pail will stay empty.
Other kinds of fear are also self-limiting. Many children fear commitment to learning. Commitment exacts an emotional price requiring dedication, passion, and self-discipline. Children fear confusion and difficulty. They fear disapproval.
Kids need to put their under-performance in perspective. Failure and under-achievement are not permanent. They are not pervasive reflections of inadequacy. Children can acquire learning skills that lead to success. Unfortunately, schools don't teach much about learning skills, being focused on teaching to high-stakes tests.
Kids need to recognize their weaknesses and strive to fix them. But to bolster their motivation and general attitude about school, they need to recognize what they have done well and strive to do even more of that. Dwelling on under-performance is counter-productive.

The Most Important Thing Kids Need to Learn


Excuse-making prevents a child from developing the attitude that will best serve them throughout life: a sense of personal efficacy, a state of perceived control over one's life. I explain this more thoroughly in my book, "Blame Game, How to Win It." But a summary here will have to suffice.
How children perceive their personal power determines how much effort they will expend to control their lives. If they lack a genuine sense of power, excuse-making applies salve to their wounded egos. Self-efficacy is not the same as self-esteem. Psychologist Albert Bandura puts it this way: “Perceived self-efficacy is concerned with judgments of personal capability, whereas self-esteem is concerned with judgments of self-worth.” Both are important for happiness, but it is perceived self-efficacy that drives academic achievement.
One practical application where this distinction is apparently not recognized is with school teachers who think the cure for low achievement in school is to foster self-esteem. Teachers should emphasize self-efficacy. Children learn self-efficacy from teachers and parents who enable them to master their environment. Students who are filled with self-doubt do not put much effort into school work. They make excuses. As kids are progressively given the skills to achieve, they develop a sense of confidence in their ability to succeed, which will motivate them to strive for more achievement.
When I was a kid, I only became a good student when I discovered, more or less by accident, that I could make good grades. Discovering that I could make good grades if I tried motivated me to do just that. This sense has to be earned. It does not come from excuses.


Sources:

Brummelman, Eddie, et al. (2015). Origins of narcissism in children. Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.1420870112

Klemm, W. R. (2008). Blame Game. How To Win It. Bryan, TX: Benecton.
           

Yoshie, M., and Haggard, P. (2013). Negative emotional outcomes attenuate sense of agency over voluntary actions. Current Biology. http://dx.doi.org/10.1016/j.cub.2013.08.034

Wednesday, February 25, 2015

Study Smart Beats Study Hard

Keep your "nose to the grindstone" is the advice we often tell young people is an essential ingredient of learning difficult tasks. A joke captures the matter with the old bromide for success, "Keep your eye on the ball, your ear to the ground, your nose to the grindstone, your shoulder to the wheel: Now try to work in that position."


Over the years of teaching, I have seen many highly conscientious students work like demons at their studies yet not seem to learn as much as they should for all the effort they put in. Typically, it is because they don't study smart.
In an earlier post, I described a learning strategy wherein a student spends short periods (say 15-20 minutes) of intense study, each followed immediately by a comparable rest period of "brain-dead" activity in which they don't engage with intense stimuli or a new learning task. The idea is that during brain down-time the memory of just-learned material is more likely to be consolidated into long-term memory because there are no mental distractions to erase the temporary working memory while it is in the process of consolidation.
Now, new research suggests that too much nose-to-the-grindstone can impair learning. Margaret Schlichting, a graduate student researcher, and Alison Preston, an associate professor of psychology and neuroscience at the University of Texas tested the effect of mental rest with a learning task of remembering two sets of a series of associated photo pairs.  Between the two task sets, the participants rested and were allowed to think about whatever they wanted. Not surprisingly, those who used the rest time to reflect on what they had just learned were able to remember more upon re-test. Obviously, in this case, the brain is not really resting, as it is processing (that is, rehearsing) the new learning. But the brain is resting in the sense that new mental challenges are not encountered.
The university press release quotes the authors as saying, "We've shown for the first time that how the brain processes information during rest can improve future learning. We think replaying memories during rest makes those earlier memories stronger, not just impacting the original content, but impacting the memories to come." Despite the fact that this concept has been anointed as a new discovery in a prestigious science journal, the principle has been well-known for decades. I have explained this phenomenon in my memory books under the decades-old "interference theory of memory."
What has not been well understood among teachers is the need to alter teaching practices to accommodate this principle. A typical class period involves teachers presenting a back-to-back succession of highly diverse learning objects and concepts. Each new topic interferes with memory formation of the prior topics. An additional interference occurs when a class period is disrupted by blaring announcements from the principal's office, designed to be loud to command attention (which has the effect of diverting attention away from the learning material). The typical classroom has a plethora of other distractions, such as windows for looking outside and multiple objects like animals, pictures, posters, banners, and ceiling mobiles designed to decorate and enliven the room. The room itself is a major distraction.
Then, to compound the problem, the class bell rings, and students rush out into the hall for their next class, socializing furiously in the limited time they have to get to the next class (on a different subject, by a different teacher, in a differently decorated classroom). You can be sure, little reflection occurs on the academic material they had just encountered.
The format of a typical school day is so well-entrenched that I doubt it can be changed. But there is no excuse for blaring loudspeaker announcements during the middle of a class period. Classrooms do not have to be decorated. A given class period does not have to be an information dump on overwhelmed students. Short periods of instruction need to be followed by short, low-key, periods of questioning, discussion, reflection, and application of what has just been taught. Content that doesn't get "covered" in class can be assigned as homework—or even exempted from being a learning requirement. It is better to learn a few things well than many things poorly. Indeed, this is the refreshing philosophy behind the new national science standards known as "Next Generation Science Standards."
Give our kids a rest: the right kind of mental rest.

Sources:

http://www.nextgenscience.org/

http://scicasts.com/neuroscience/2065-cognitive-science/8539-study-suggests-mental-rest-and-reflection-boost-learning

Schlichting, M. L., and Preston, A. R. (2014). Memory reactivation during rest supports upcoming learning of related content. Proc. Natl. Acad. Sci. Published ahead of print, Oct. 20.


Dr. Klemm's latest book, available at most retail outlets, is "Mental Biology. The New Science of How the Brain and Mind Relate" (Prometheus). See reviews at http://thankyoubrain.com

Saturday, February 07, 2015

How Learning Cursive Might Improve Reading Efficiency and Hand-eye Coordination

When directing writing by hand, the brain has to visually track rapidly changing positions of the pencil and control hand and finger movements. To learn such skills, the brain must improve its control over eye-movement saccades and its processing of visual feedback so that it can make corrections. Both tracking and movement control require much more engagement of neural resources in producing cursive or related handwriting methods than in hand printing, because the movements are more complex and nuanced. Thus, learning cursive is a much greater neural activator, which in turn must engage much more neural circuitry than the less demanding printing.

The key to learning successful handwriting, whether cursive, italics, or calligraphy, is well-controlled visual tracking and high-speed neural responses to the corrective feedback. Scientists are now starting to study the mechanisms, but not yet in the context of education. Two recent reports, seemingly unrelated to each other or to cursive, examined visual tracking and found results that could have profound educational implications for both reading and hand-eye coordination training, as in learning to touch type.

Visual targets are fixated by saccades. One theory is that the eyes scan the target with a linked series of saccades, in this case the changes in cursive letter structure as the letters are being rapidly formed. We already know that the brain predicts eye movements based on what it sees at each saccade fixation. This is how our visual world is made stable, even though the eyes are flicking around; otherwise, the image would jitter back and forth constantly. This suggests that visual image representation is integrated rapidly over many successive saccades. The degree of tracking speed, accuracy, and prediction error must surely influence how well the letters are transcribed during handwriting. The corollary is that the better one learns to write by hand, the better the brain is learning how to track visually.

Scientists used to think that these predictions were the source of error in estimating the position of seen objects. In handwriting, for example, the brain would assess the shape of part of a letter as you draw it and predict how and where the next portion of the letter should be added. Learning how to optimize the drawing then would be a matter of learning how to reduce prediction errors.
However, a new study tested the hypothesis that if localization errors really are caused by faulty predictions, you would also expect those errors to occur if an eye movement, which has already been predicted in your brain, fails to take place at the very last moment in response to a signal to abort the eye movement. The investigators (Atsma et al. 2014) asked test subjects to look at a computer screen and tracked eye movement fixation on a very small ball that appeared at various random positions. During this task, the brain must correctly predict where the eyes have to move to keep the eye on the ball.

The experiment ended with one last ball on the screen, followed by a short flash of light near that ball. The person had to look at the last stationary ball while using the computer mouse to indicate the position of the flash of light. However, in some cases, a signal was sent around the time the last ball appeared, indicating that the subject was NOT allowed to look at the ball. In other words, the eye movement was cancelled at the last moment. The person being tested still had to indicate where the flash was visible.

Subjects did not make any mistakes in fixating the light location on trials in which the saccade was cancelled, even though the brain had already predicted that it needed to fixate on the ball. Most mislocalizations occurred when the flash appeared at the moment the eye movement began. Thus, the errors seemed to be associated with neural commands for eye fixation, not with saccade predictions. The application for handwriting learning is that the neural circuits that control target fixation may be a major factor in learning how to write cursive well. Surely, these circuits would be responsive to training, though that was not done in this experiment. It would seem possible that these circuits might be trained via learning cursive to provide faster and more accurate visual tracking, which should have other benefits, such as reading.

A related study of visual tracking in monkeys reveals parallel processing during visual search (Shen and Paré, 2014). Recordings from neurons in the visual pathway during visual tracking of targets in a distracting field showed that in the untrained state, these neurons had indiscriminate responses to stimuli. However, with training the neuronal function evolved to predict where the moving target should be in advance of the actual saccade. Results also showed that more than half the neurons learned to predict where the next two eye movements (saccades) needed to be, which suggests that accurate tracking can be accelerated without loss of information.

In short, learning cursive should train the brain to function more effectively in visual scanning. Theoretically, reading efficiency could benefit. I predict that new research would show that learning cursive will improve reading speed and will train the brain to have better hand-eye coordination. In other words, schools that drop cursive from the curriculum may lose an important learning-skills development tool. The more that students acquire learning skills, the less will be the need for "teaching to the test."

"Memory Medic's" latest books are 
Mental Biology (Prometheus) and Memory Power 101 (Skyhorse).

Sources:
Atsma, J., et al. (2014). No peri-saccadic mislocalization with abruptly cancelled saccades. Journal of Neuroscience, 15 April 2014. http://www.jneurosci.org/content/34/16/5497.full.html


Shen, Kelly, and Paré, Martin (2014). Predictive saccade target selection in superior colliculus during visual search. The Journal of Neuroscience, 16 April 2014, 34(16): 5640-5648. doi: 10.1523/JNEUROSCI.3880-13.2014

Sunday, January 25, 2015

Health Benefits of Resveratrol: New Plaudits

Joe: My doctor told me to give up drinking, smoking, and fatty foods.
Sam: What will you do?
Joe: I think I’ll give up my doctor.

I try not to get too excited about memory benefits of supplements, because too often the claims are not substantiated by studies that are well controlled and peer reviewed. I now think resveratrol may be one of the few supplements that benefits brain function.

When I wrote my first blog on research on resveratrol benefits for brain function and memory, there were over 2,000 scientific papers.[1] Don't worry; I am only going to tell you about a few studies.

Resveratrol is an active ingredient in red wine. This compound has been credited with explaining why red-wine drinkers in France, who drink more wine than most people, are healthier than would be predicted by their lifestyle of little exercise and eating lots of cheese. The problem is that most studies suggest you would have to drink 100 or more glasses of red wine a day to get much resveratrol effect (and that effect would obviously be negated by a toxic dose of alcohol). A more healthful choice is one of the highly concentrated pill forms of resveratrol that are now on the market.
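A rough back-of-the-envelope comparison makes the point. The per-glass figure below is an assumption on my part (reported resveratrol contents vary widely by wine and are often far lower), not a number from any of the studies cited here.

# Back-of-the-envelope only: the per-glass figure is an assumption
# (reported resveratrol contents vary widely by wine), not a measured value.
ASSUMED_MG_PER_GLASS = 2.0   # assumed resveratrol in one generous glass of red wine, mg
SUPPLEMENT_DOSE_MG = 200.0   # daily capsule dose used in the human trial described later

glasses_needed = SUPPLEMENT_DOSE_MG / ASSUMED_MG_PER_GLASS
print(f"About {glasses_needed:.0f} glasses of wine to match one {SUPPLEMENT_DOSE_MG:.0f} mg capsule")

Even with a generous per-glass estimate, the arithmetic lands near 100 glasses a day, which is why the pill form is the only practical option.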

Most of the protective biological actions of resveratrol have been attributed to its scavenging of free radicals and to the protection it confers against heart disease and diabetes.

One important study comes from a diabetes research group in Brazil, which recently reported a beneficial effect of resveratrol on diabetic rats.[2] Resveratrol (in a modest rat dose of 10 or 20 mg per kilogram per day for 30 days) prevented the impairment of memory induced by diabetes. Resveratrol may be protecting neuron terminals that diabetes can damage. An earlier study by another group showed resveratrol improved glucose metabolism and promoted longevity in diabetic mice.

Another benefit of resveratrol is its antioxidant property. The brain sustains more free-radical damage than other organs because it burns so much oxygen. Compared with other organs, the brain has especially low levels of antioxidant defense enzymes.

One recent study revealed that resveratrol had protective effects against brain damage caused by a chemical that kills acetylcholine neurons. Injection of this toxin into the brain of rats impaired their memory performance in two kinds of maze tasks. The impairment was significantly reduced by repeated injection of resveratrol (10 or 20 mg/kg per day) for 25 days, beginning four days before the toxin injection.[3]

Another recent study examined effects on working memory in mice fed a resveratrol-supplemented diet for four weeks before being injected with a cytokine to induce inflammation and accelerate aging. Resveratrol significantly reduced memory impairment in the aged group, but not in the young adults.[4] The lack of benefit in young adults was a little misleading, however: there was a "ceiling effect," because the young adults were not impaired by the cytokine injection in the first place.

The practical issue for us is whether resveratrol will help cognitive function in humans, especially healthy humans. It seems likely, because other substances that have strong anti-oxidant properties seem to improve memory capability. Because animal studies have shown promise for resveratrol in preventing or treating several different conditions associated with aging, several human clinical trials have been initiated.[5]

An impressive new study of older humans, male and female, has just been reported.[6] Twenty-three healthy but overweight people completed 6 months of daily resveratrol intake (200 mg ― the commercial brand I take has 300 mg/capsule). A paired control group got placebo pills. A double-blind design assured that neither the subjects nor the experimenters knew which individuals were in each group during data processing. Memory tests of word recall revealed significant improvement in the resveratrol group. Resveratrol also increased brain-scan measures of functional connectivity, which identified linked neural activity between the hippocampus and several areas of cerebral cortex.

Because others had shown that resveratrol increased insulin sensitivity in humans, these authors examined several markers important to diabetes. Resveratrol decreased the standing levels of sugar-bound (glycated) hemoglobin, a standard marker for glucose control.

What foods besides red grapes have resveratrol? The most likely other sources you would eat or drink are blueberries, cranberries, and peanuts. It is not likely that you could drink or eat enough of such foods to get enough resveratrol to do much good. Because of the scientifically documented benefits of resveratrol, highly concentrated supplements are now on the market. I haven't given up my two glasses of red wine each day, but I have also been taking one of the supplements for a couple of years. I haven't seen any reports that high doses of resveratrol are toxic.




[2] Schmatz, R., et al. (2009). Resveratrol prevents memory deficits and the increase in acetylcholinesterase activity in streptozotocin-induced diabetic rats. Eur J Pharmacol, 610(1-3), 42-48.
[3] Kumar, A. et al. 2007. Neuroprotective effects of resveratrol against intracerebroventricular colchicine-induced cognitive impairment and oxidative stress in rats. Pharmacology.79 (1): 17-26. DOI: 10.1159/000097511
[4] Abraham, J., and Johnson, R. W. 2009. Consuming a diet supplemented with resveratrol reduced infection-related neuroinflammation and deficits in working memory in aged mice. Rejuvenation research. 12 (6): 445-453.  DOI: 10.1089/rej.2009.0888
[5] Smoliga, J. M. et al. (2011). Resveratrol and health – a comprehensive review of human clinical trials.  Mol. Nutrition Food Res. 55: 1129-1141
[6] Witte, A. V., et al. (2014). Effects of resveratrol on memory performance, hippocampal functional connectivity, and glucose metabolism in healthy older adults. J. Neuroscience, 34(23), 7862-7870.

"Memory Medic's latest book is for seniors (Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine," available in inexpensive e-book format at https://www.smashwords.com/books/view/496252 See also his recent book, "Mental Biology. The New Science of How the Brain and Mind Relate" (Prometheus).