Monday, August 03, 2015

Here's How You Become an Expert

Want to become an expert? It doesn’t matter what the subject is; the principle for developing expertise is the same. My many years of personal experience and of observing students convince me of a learning axiom: the more you know, the more you can know.
A recent research report helps explain what the brain is doing as it acquires expertise. By observing which brain areas are active at the same time, one can conclude that those areas are probably functionally connected, even though they sit at different locations within the brain’s network of circuits. In the reported experiments, researchers collected MRI scans of subjects during rest, immediately after they had memorized a series of face/object pairs, and again during learning of new face/object pairs that either overlapped with or were distinct from the original set. The data indicated that spontaneous activation of hippocampal-neocortical functional connectivity during rest was related to better subsequent learning of the new pairs. Moreover, the degree of functional connectivity during rest predicted the functional connectivity activated during the new learning experience.
The rationale for the experiment includes the well-known fact that the hippocampus is needed to promote storage of explicit memories in the neocortex. Moreover, we know that “off-line” rehearsal of memories occurs during mental rest and even sleep because the participating neural circuitry becomes periodically reactivated. The issue that the researchers pursued was based on an assumption that one purpose of memory is to enhance the learning of future related material. Thus, the hippocampal-neocortex connectivity that occurred during initial learning should also recur during rest and be relevant to new related material.
Spontaneous activation of hippocampal-neocortical functional connectivity in MRI scans is the index of this off-line memory processing. The data showing the relationship between this connectivity during rest and new learning support the authors’ general conclusion that “how our brains capture and store new information is heavily influenced by what we already know.”
This brings me to the real practical relevance of this research: learning to learn. What we see here is scientific evidence for how the brain teaches itself to learn: each round of learning builds the foundation for more learning.
Here is a practical example of what I mean. I just finished attending the Newport Jazz Festival, which included interviews with some of the artists. Jon Faddis, a phenomenal trumpet player who can begin a phrase on high C and go up from there, discussed his experience with his trainees. He tells them what most of them won’t do: “If you are not practicing 4-6 hours a day, every day, you are just wasting your time.” In other words, to become an expert jazz musician, you have to accumulate a large amount of prior knowledge, which of course takes lots of practice. I have noticed in my own career that over time I am becoming more and more competent to move into new areas of neuroscience, even though I am getting older and supposedly have less ability to learn than when I was young.
This brings me to the subject of education. Our educational system is crippled by the apparent assumption that children are good learners because their brains are young. Therefore, curriculum focuses on content and testing. But children don’t have much knowledge to build on to accomplish efficient learning of new content. To compensate, schools need much more emphasis on teaching basic learning skills, which children don’t know much about either, because again they don’t have much experience at learning how to learn. I'm not sure that teachers get enough training for teaching learning skills. 
Just what are these skills that I think should be taught explicitly in the early grades? I am writing a book on that to help parents and teachers. Here, I can only summarize. Learning skills operate in a cycle that begins with motivation–and yes, that is something you can learn, especially grit. Then comes learning how to be attentive and to focus. Next is knowing how to organize learning material coherently to make it easier to master. Material to be learned needs to be understood, not just memorized. There are multiple tactics one can learn to improve the ability to understand complex material. The better you understand a subject, the less you have to memorize, because there is so much you can acquire through reasoning. Memorization skills, however, are far more useful than most teachers realize or know how to teach. Most under-performance of students on high-stakes tests is due to poor memory, which is why teachers go over the same material ad nauseam in preparation for tests. The final steps in the learning skills cycle are problem solving and creativity. And yes, both of those skills are teachable by those who know how.
Regardless of subject matter, the process of acquiring enough knowledge to set the stage to become an expert includes also the implicit learning of how to learn new material in the field. There are no shortcuts to becoming an expert. The process begins with learning how to learn.

Dr. Klemm, a.k.a. “Memory Medic,” teaches teachers about the learning skills cycle. See his recent books, “Memory Power 101” and “Better Grades, Less Effort.”

Source:
Schlichting, Margaret L., and Preston, Alison R. (2015). Memory reactivation during rest supports upcoming learning of related content. Proceedings of the National Academy of Sciences. www.pnas.org/cgi/doi/10.1073/pnas.1404396111


Monday, June 15, 2015

Sleep Away Your Bad Attitudes

Generally speaking, you cannot learn new information from sounds played while you sleep, though sleep learning was a fad several decades ago. But in an earlier post, I discussed a new line of research in which a form of sleep learning can occur. The key is to play sound cues that were associated with learning during the previous waking period. The explanation I posted was that cue-dependent sleep learning can work because a normal function of sleep is to strengthen memories of new information, and presenting relevant cues during sleep increases the retrieval of these memories and makes them accessible for rehearsal and strengthening.
The latest experiment, by a different group, shows that this cuing during sleep can modify bad attitudes and habits. The test involved counter-stereotype training of certain biased attitudes during wakefulness; investigators then reactivated that counter-training during sleep by playing a sound cue that had been associated with the wakefulness training.
In the experiment, 40 white male and female subjects were trained before a 90-minute nap to counter their existing gender and racial biases. A formal survey allowed quantification of each person's level of gender or racial bias before and after counter-training. For example, one bias was that females are not good at math. Subjects were conditioned to have a more favorable attitude about women and math with counter-training that repeatedly associated female faces with science-related words. Similarly, racial bias toward blacks was countered by associating black faces with highly positive words. In each training situation, whenever the subject saw a pairing that was incompatible with their existing bias, they pressed a "correct" button, which yielded a confirmatory sound tone unique to each bias condition. Subjects were immediately tested on their learning: shown a face (female or black) along with the counter-training sound cue, they were to drag that face onto a second screen bearing the positive word. For example, if the first test screen showed a woman, accompanied by the sound cue, the subject dragged the woman's face onto a second screen that said "good at math." Results revealed that this conditioning worked: both kinds of bias were reduced immediately after counter-conditioning.
Then during the nap, as soon as EEG signs indicated the presence of deep sleep, the appropriate sound cue was played repeatedly to reactivate the prior learning. When subjects re-took the bias survey a week later, the social bias was reduced in the sound-cued group, but not in the control group that was trained without sound cues.
Experimenters noted that the long-term reduction of bias was associated with rapid-eye-movement (REM, or dream) sleep, which often followed the deep sleep during early stages of the nap. That is, the beneficial effect was proportional to the amount of nap time spent in both slow-wave sleep and REM sleep, not in either alone. It may be that memories are reactivated by cuing during deep (slow-wave) sleep, but that the actual cell-level storage of memory occurs during REM sleep.
This approach to enhancing learning and memory shows a great deal of promise. Can it be used to enhance learning in school? Can it be used in the rehabilitation of addicts or criminals? But there is a dark side. Now might be a good time to re-read Huxley's Brave New World, wherein he described conditioning values in young children while they slept. Sleep is a state in which people are mentally vulnerable and without conscious control over their thoughts. Malevolent people could impose this kind of conditioning and memory enhancement on others for nefarious purposes. These techniques may have valid social-engineering applications, but they must be guided by ethical considerations.

Dr. Klemm is author of Memory Power 101 (Skyhorse), Better Grades, Less Effort (Benecton), and Mental Biology (Prometheus).

Sources:

Klemm, W. R. (2013). New discoveries on optimizing memory formation. http://thankyoubrain.blogspot.com/2013/05/new-discoveries-on-optimizing-memory.html


Hu, Xiaoqing et al. (2015). Unlearning implicit social biases during sleep. Science, 348(6238), 1013-1015.

Wednesday, June 03, 2015

Nine Steps to Remember What You Learn

The three most important times for learning are: Before, During, and (soon) After.

Before
1. Bring your “A game.” Choose to be positive and interested. Being bored is a choice— a self-defeating choice.
2. Check your foundation. Come prepared.
3. Expect to remember.

During
4. Pay Attention. Ask questions.
5. Take good notes.
6. THINK!

(soon) After

7. Avoid mental interference. Use quiet, uninterrupted reflection during rehearsal.
8. Apply what you just learned
9. Self-test. Really test, don't just "look over." Repeat several times in the next hours and days.



"Memory Medic" is author of Memory Power 101 and Better Grades, Less Effort. Both are available at Amazon.com.

Saturday, May 30, 2015

Decision-making 401

In the previous post, Decision-making 101, I provided evidence that selective attention to items retrieved into working memory is a major factor in making good decisions. This has generally unrecognized educational significance. Instructional material is rarely packaged with foreknowledge of how it can be optimized to reduce the working-memory cognitive load. New research from a cognitive neuroscience group in the U.K. demonstrates the particular importance this has for learning how to correctly categorize new material. They show that learning is more effective when the instruction is optimized ("idealized" in their terminology).

Decisions often require categorizing novel stimuli, such as normal/abnormal, friend/foe, helpful/harmful, right/wrong or even assignment to one of multiple category options. Teaching students how to make correct category assignments is typically based on showing them examples for each category. Categorization issues routinely arise when learning is tested. For example, the common multiple-choice testing in schools requires that a decision be made on each potential answer as right or wrong.

In reviewing the literature on optimizing training, these investigators found reports that one approach that works is to present training material in a specific order. For example, in teaching students how to classify by category, people perform better when a number of examples from one category are presented together, followed by a number of contrasting examples from the other category. In another ordering manipulation, material is learned better if simple, unambiguous cases from either category are presented early in training, while the harder, more confusing cases are presented afterwards. Such training strengthens the contrast between the two categories.

The British group has focused on the role of working memory in learning. Their idea is that ambiguity during learning is a problem. In real-world situations that require correct category identification, naturally occurring ambiguities make correct decisions difficult. Think of these ambiguities as cognitive "noise" that interferes with the training that is recalled into working memory. This noise clutters encoding during learning and impairs the rigorous thinking that may be needed to make a correct distinction. In the real world of youngsters in school, another major source of cognitive noise is the task-irrelevant stimulation that comes from the multi-tasking habits so common in today's students.

The theory is that when performing a learned task, the student recalls what has been taught into working memory. Working memory has very limited capacity, so any "noise" associated with the initial learning may be incompletely encoded and the remembered noise may also complicate the thinking required to perform correctly. Thus, simplifying learning material should reduce remembered ambiguities, lower the working memory load, and enable better reasoning and test performance.


One example of optimizing learning is the study by Hornsby and Love (2014), who applied the concept to training people with no prior medical background to decide whether a given mammogram was normal or cancerous. They hypothesized that learning would be more efficient if students were trained on mammograms that were easily identified as normal or cancerous, without examples where the distinction was not so obvious. The underlying premise is that decision-making involves recalling past remembered examples into working memory and accumulating the evidence for the appropriate category. If the remembered items are noisy (i.e., ambiguous), the noise also accumulates and makes the decision more difficult. Thus, learners will have more difficulty if they are trained on examples across the whole range of possibilities, from clearly evident to obscure, than if they are trained only on examples that clearly belong to one category or the other.

Initially a group of learners was trained on a full-range mixture of mammograms so the images could be classified by diagnostic difficulty as easy or hard or in between. On each trial, three mammograms were shown: the left image was normal, the right was cancerous, and the middle was the test item requiring a diagnosis of whether it was normal or cancerous.

In the actual experiment, one student group was trained to classify a representative set of easy, medium, and hard images, while the other group was trained only on easy samples. During training trials, learners looked at the three mammograms, stated their diagnosis for the middle image, and were then given feedback as to whether they were right or wrong. After completing all 324 training trials, participants completed 18 test trials, which consisted of three previously unseen easy, medium and hard items from each category displayed in a random order. Test trials followed the same procedure as training trials.

When both groups were tested on samples across the full range, the optimized group was better able to distinguish normal from cancerous mammograms for both the easy and medium images. Note that the optimized group had not been trained on medium images. However, no advantage was found for hard test items; both groups made many errors on the hard cases, and there optimized training actually yielded poorer results than regular training.

We need to explain why this strategy does not seem to work on hard cases. I suspect that in easy and medium cases, not much understanding is required. It is just a matter of pattern recognition, made easier because the training was more straightforward and less ambiguous. The learner is just making casual visual associations. For hard cases, a learner must know and understand the criteria needed to make distinctions. The subtle differences go unrecognized if diagnostic criteria are not made explicit in the training. In actual medical practice, many mammograms cannot be distinguished by visual inspection—they really are hard. Other diagnostic tests are needed.

The basic premise of such research is that learning objects or tasks should be pared down to the basics, eliminating extraneous and ambiguous information, which constitutes “noise” that confounds the ability to make correct categorizations.

In common learning situations, a major source of noise is extraneous information, such as marginally relevant detail. Reducing this noise is achieved by focusing on the underlying principle. Actually, I stumbled on this basic premise of simplification over 50 years ago when I was a student trying to optimize my own learning. What I realized was the importance of homing in on the basic principle of what I was trying to learn from instructional material. If I understood a principle, I could use that understanding to think through many of its implications and applications.

In other words, the principle is: don't memorize any more than you have to. Use the principles as a way to figure out what was not memorized. Once core principles are understood, much of the detailed information can be deduced or easily learned. This is akin to the standard practice of moving from the general to the specific; even so, the general ideas presented should emphasize principles.

Textbooks are sometimes quite poor in this regard. Too many texts contain so much ancillary information that they should be thought of as reference books. That is why I have found a good market for my college-level electronic neuroscience textbook, “Core Ideas in Neuroscience,” in which each 2-3 page chapter is devoted to one of 75 core principles spanning the range from membrane biochemistry to human cognition. A typical neuroscience textbook by other authors can run to 1,500 pages.



Source:

Hornsby, Adam, and Love, B. C. (2014). Improved classification of mammograms following idealized training. J. Appl. Res. Memory and Cognition. 3(2):72-76.


Dr. Klemm is a Senior Professor of Neuroscience at Texas A&M. His latest books are Memory Power 101, (Skyhorse) and Mental Biology (Prometheus). He also writes learning and memory blogs for Psychology Today magazine and his own site at thankyoubrain.blogspot.com. His posts have nearly 1.5 million reader views.

Thursday, May 21, 2015

Decision-making 101

Teenagers are notorious for poor decision-making. Of course that is inevitable, given that their brains are still developing, and they have had relatively little life experience to show them how to predict what works and what doesn’t. Unfortunately, what doesn’t work may have more emotional appeal, and most of us at any age are more susceptible to our emotions than cold, hard logic.
Seniors also are prone to poor decision-making if senility has set in. Unscrupulous people take advantage of such seniors because a deteriorating brain has a hard time making wise decisions.
Between the teenage years and senility, the brain is at its peak for good decision-making. Wisdom comes with age, up to a point. Some Eastern cultures venerate their old people as generally being especially wise. After all, if you live long enough and are still mentally healthy, you ought to make good decisions, because you have a lifetime of experience to teach you which future choices are likely to work and which are not.
Much of that knowledge comes from learning from one’s mistakes. On the other hand, some people, regardless of age, can’t seem to learn from their mistakes. Most of the time the problem is not stupidity but a flawed habitual process for making decisions and evaluating options. Best of all is learning from somebody else’s mistakes, so you don’t have to make them yourself.
Learning from your mistakes can be negative, if you fret about it. Learning what you can do to avoid repeating a mistake is one thing, but dwelling on it erodes one’s confidence and sense of self worth. I can never forget the good advice I read from, of all people, T. Boone Pickens. He was quoted in an interview as saying that he was able to re-make his fortune on multiple occasions because he didn’t dwell on losing the fortunes. He credited that attitude to his college basketball coach who told the team after each defeat, “Learn from your mistakes, but don’t dwell on them. Learn from what you did right and do more of that.”
It would help if we knew how the brain makes decisions, so we could train it to operate better. “Decision neuroscience” is an emerging field of study aimed at learning how brains make decisions and how to optimize the process. Neuroscientists seem to have homed in on two theories, both of which deal with how the brain processes alternative options to arrive at a decision.
One theory is that each option is processed in its own competing pool of neurons. As processing evolves, the activity in each pool builds up and down as each pool competes for dominance. At some point, activity builds up in one of the pools to reach a threshold, in winner-take-all fashion, to allow the activity in that pool to dominate and issue the appropriate decision commands to the parts of the brain needed for execution. As one possible example, two or more pools of neurons separately receive input that reflects the representation of different options. Each pool sends an output to another set of neurons that feed back either excitatory or inhibitory influences, thus providing a way for competition among pools to select the pool that eventually dominates because it has built up more impulse activity than the others.

The other theory is based on guided gating wherein input to pools of decision-making neurons is gated to regulate how much excitatory influence can accumulate in each given pool. [i] The specific routing paths involve inhibitory neurons that shut down certain routes, thus preferentially routing input to a preferred accumulating circuit. The route is biased by estimated salience of each option, current emotional state, memories of past learning, and the expected reward value for the outcome of each option.
These decision-making possibilities involve what is called “integrate and fire.” That is, input to all relevant pools of neurons accumulates and leads to various levels of firing in each pool. The pool firing the most is most likely to dominate the output, that is, the decision.
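The pool-competition and integrate-and-fire ideas above can be illustrated with a toy "race" model. This is purely my own illustrative sketch (the function, threshold, and noise values are invented for demonstration), not the constrained models in the cited work: each option's neuron pool accumulates its input plus noise, and the first pool to reach threshold dominates the decision.

```python
import random

def race_model(drifts, threshold=100.0, noise=5.0, seed=None, max_steps=10_000):
    """Toy winner-take-all race model of decision-making.

    Each option has a pool whose activity integrates a mean input
    ('drift') plus Gaussian noise each time step; the first pool to
    reach 'threshold' wins and issues the decision. Illustrative
    only, not a fitted neural model.
    """
    rng = random.Random(seed)
    activity = [0.0] * len(drifts)
    for t in range(1, max_steps + 1):
        for i, d in enumerate(drifts):
            # integrate-and-fire style accumulation, floored at zero
            activity[i] = max(0.0, activity[i] + d + rng.gauss(0, noise))
        winners = [i for i, a in enumerate(activity) if a >= threshold]
        if winners:
            # winner-take-all: the most active crossing pool dominates
            return max(winners, key=lambda i: activity[i]), t
    raise RuntimeError("no pool reached threshold")

# Option 0 gets stronger input (higher salience/reward estimate),
# so it usually, but not always, wins the race.
choice, steps = race_model([2.0, 1.0], seed=42)
print(f"option {choice} won after {steps} time steps")
```

Running the model many times shows the probabilistic character of such competition: the stronger-input pool wins most races, but noise occasionally lets the weaker option through, much as real decisions are biased rather than determined by evidence strength.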
However circuits make decisions, there is considerable evidence that nerve impulse representations for each given choice option simultaneously code for expected outcome and reward value. These value estimates update on the fly.[ii] Networks containing these representations compete to arrive at a decision.
Any choice among alternative options is affected by how much information about each option the brain has to work with. When the brain is consciously trying to make a decision, this often means how much relevant information it can hold in working memory. Working memory is notoriously low-capacity, so the key becomes remembering the subsets of information that are most relevant to each option. Humans think with what is in their working memory. Experiments have shown that older people are more likely to hold the most useful information in working memory, and therefore they can think more effectively. The National Institute on Aging began funding decision-making research in 2010 at Stanford University’s Center on Longevity. Results of that research are showing that older people often make better decisions than younger people.
As one example, older people are more likely to make rational cost-benefit analyses. Older people are more likely to recognize when they have made a bad investment and walk away rather than throwing more good money after bad.
A key factor seems to be that older people are more selective about what they remember. For example, one study from the Stanford Center compared the ability of young and old people to remember a list of words. Not surprisingly, younger people remembered more words, but when words were assigned a number value, with some words being more valuable than others, older people were better at remembering high-value words and ignoring low-value words. It seems that older people selectively remember what is important, which should make it easier to make better decisions.
Decision-making skills are important for learning achievement in school. Students need to know how to focus in general, and how to focus on what is most relevant in particular. They are not learning that skill, and their multi-tasking culture is teaching them many bad habits.
Those of us who care deeply about educational development of youngsters need to push our schools to address the thinking and learning skills of students. "Teaching to the test" detracts from time spent in teaching what matters most. Today's culture of multi-tasking is making matters worse. Children don't learn how to attend selectively and intensely to the most relevant information, because they are distracted by superficial attention to everything. Despite their daily use of Apple computers and smart phones, only one college student out of 85 could draw the Apple logo correctly.[iii]
Memory training is generally absent from teacher training programs. Despite my locally well-publicized experience in memory training, no one in the College of Education at my university has ever asked me to share my knowledge with their faculty or with pre-service teachers. The paradox is that teachers are trained to help students remember curricular answers for high-stakes tests. What could be more important than learning how to decide what to remember and how to remember it? And we wonder why student performance is so poor?


"Memory Medic" is author of Memory Power 101 (Skyhorse) and Better Grades, Less Effort (Benecton).



[i] Purcell, B. A., Heitz, R. P., Cohen, J. Y., Schall, J. D., et al. (2010). Neurally constrained modeling of perceptual decision making. Psychological Review, 117(4), 1113-1143.

[ii] McCoy, A. N., and Platt, M. L. (2005). Expectations and outcomes: decision-making in the primate brain. J. Comp. Physiol A 191, 201-211.

[iii] Blake, Adam B., Nazarian, Meenely, and Castel, Alan D. (2015). The Apple of the mind's eye: Everyday attention, metamemory, and reconstructive memory for the Apple logo. The Quarterly Journal of Experimental Psychology. doi: 10.1080/17470218.2014.1002798

Thursday, April 30, 2015

Music Effects on Cognitive Function of the Elderly


Whether the music is orchestral, rock, country, or jazz, most seniors like to listen to some kind of music. Music can soothe or energize, make us happy or sad, but the kind we like to hear must be positively reinforcing in some way, or we would not listen to it. As my 80-year-old jazz trumpeter friend, Richard Phelps, recently said at his birthday party, "Where there is life there is music. Where there is music, there is life."
Relatively little research has been done on the effects of music on brain function in older people. But one recent study reported the effects of background music on brain processing speed and two kinds of memory (episodic and semantic) in older adults. The subjects were not musicians and had an average age of 69 years.
The music test conditions were: 1) no music control, 2) white noise control, 3) a Mozart recording, and 4) a Mahler recording. All 65 subjects were tested in counter-balanced order in all four conditions. The music was played at modest volume as background before and during performance of the cognitive tasks: a mental processing speed task and the two memory tasks. The episodic memory task involved trying to recall a list of 15 words immediately after a two-minute study period. The semantic memory task involved word fluency, in which subjects wrote as many words as they could think of beginning with each of three given letters of the alphabet.
Processing speed performance was faster while listening to Mozart than with the Mahler or white noise conditions. No improvement in the Mahler condition was seen over white noise or no music.
Episodic memory performance was better when listening to either type of music than while hearing white noise or no music. No difference was noted between the two types of music.
Semantic memory was better with both kinds of music than with white noise, and better with Mozart than with no music.
Recognizing that emotions could be a relevant factor, the experimenters analyzed a mood questionnaire comparing the two music conditions with white noise. Mozart generated higher happiness indicators than did Mahler or white noise. Mahler was rated more sad than Mozart and comparable to white noise.
Thus, happy, but not sad, music correlated with increased processing speed. The researchers speculated that happy subjects were more aroused and alert.
Surprisingly, both happy and sad music enhanced both kinds of memory over the white-noise and silence conditions. But it is not clear whether this observation is generally applicable. The authors did mention, without emphasis, that both kinds of music were instrumental and lacked the loudness or lyrics that could have been distracting and thus impaired memory. I think this point is substantial. When lyrics are present, the brain is dragged into trying to hear the words and think about their meaning. These thought processes would surely interfere with trying to memorize new information or recall previously learned material.
A point not considered at all is personal preference for certain types of music. There are people who don't like classical music, and the data in this study could have been made "noisy" if enough of the 65 subjects disliked classical music and were actually distracted by it. In other words, the effects noted in this study might have been magnified if the subjects had been allowed to hear their preferred music.
My take-home lesson was actually formed over five decades ago, when I listened to jazz records while plowing my way through memorizing a veterinary medical curriculum. Then, I thought that the benefit was stress reduction (veterinary school IS stressful, and happy jazz certainly reduces stress). Now I see that frequent listening to music that was pleasurable for me might actually have helped my memory capability. If you still have doubts, you might want to check my latest blog post, "Happy thoughts can make you more competent" (http://thankyoubrain.blogspot.com/2015/01/happy-thoughts-can-make-you-more.html).
Anyway, now that I am in the elderly category, I see there is still reason to listen to the music I like. Music can be therapy for old age.


“People haven't always been there for me but music always has.”
    —Taylor Swift



"Memory Medic's" latest book is "Improve Your Memory for a Healthy Brain. Memory Is the Canary in Your Brain's Coal Mine." It is available in inexpensive e-book form at Amazon or in all formats at Smashwords.com.


Source:

Bottiroli, Sara et al. (2014). The cognitive effects of listening to background music on older adults: processing speed improves with upbeat music, while memory seems to benefit from both upbeat and downbeat music. Frontiers in Aging Neuroscience. Oct. 15. doi: 10.3389/fnagi.2014.00284.



Saturday, April 25, 2015

What Is the Optimal Spacing for Study?

We have all been told by teachers that learning occurs best when we spread it out over time, rather than trying to cram everything into our memory banks at one time. But what is the optimal spacing? There is no general consensus.
However, we do know that immediately after a learning experience the memory of the event is extremely volatile and easily lost. It's like looking up a number in the phone book: if you think about something else at the same time, you may have to look the number up again before you can dial it. School settings commonly create this problem. One learning task may be immediately followed by another, and the succession of new information tends to erase memories of the preceding material.
Memory researchers have known for a long time that repeated retrieval enhances long-term retention. This happens because each time we retrieve a memory, it has to be reconsolidated and each such reconsolidation strengthens the memory. Though optimal spacing intervals have not been identified, research confirms the importance of spaced retrieval. No doubt, the nature of the information, the effectiveness of initial encoding, competing experiences, and individual variability affect the optimal interval for spaced learning.
One study revealed that repeated retrieval of learned information (100 Swahili–English word pairs) with long intervals produced a 200% improvement in long-term retention relative to repeated retrieval with no spacing between tests. The investigators also compared spacing schedules that expanded (for example, 15-30-45 min), stayed the same (30-30-30 min), or contracted (45-30-15 min), and found that no one relative spacing pattern was superior to any other.[1]
Another study[2] revealed that the optimally efficient gap between study sessions depends on when the information will be tested. This very comprehensive study taught 1,350 individuals a set of facts and then tested them for long-term retention after 3.5 months; a final test was given at a further delay of up to one year. At any test delay, increasing the gap between the first learning and a later restudy of that material at first increased and then gradually reduced final test performance. Expressed as a ratio, the optimal gap equaled 10-20% of the test delay. For example, a one-day gap was best for a test given seven days later, while a 21-day gap was best for a test 70 days later. Few if any teachers or students know this, and their study times are rarely scheduled in any systematic way, typically being driven by test schedules for other subjects, convenience, or even the teacher's whim.
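The 10-20% ratio rule lends itself to a quick calculation. The sketch below is my own illustration of that rule of thumb, not code from the study; the function name and the 15% midpoint default are assumptions.

```python
# Illustrative sketch of the "optimal study gap is roughly 10-20% of the
# test delay" rule of thumb. The function name and default ratio are my
# own choices for illustration, not part of the cited study.

def optimal_gap_days(test_delay_days, ratio=0.15):
    """Suggest when to restudy, as a fraction of the days until the test."""
    return round(test_delay_days * ratio)

if __name__ == "__main__":
    for delay in (7, 70, 350):
        low = round(delay * 0.10)
        high = round(delay * 0.20)
        print(f"Test in {delay:3d} days: restudy after roughly {low}-{high} days")
```

Since the optimal ratio varies with the material, the learner, and the quality of the initial encoding, treat the output as a starting point rather than a prescription.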
The bottom line: the optimal time to review a newly learned experience is just before you are about to forget it. Obviously, we usually don't know when this occurs, but in general the vast bulk of forgetting occurs within the first day after learning. As a rule of thumb, you can suspect that a few repetitions early on should be helpful in fully encoding the information and initiating a robust consolidation process. So, for example, after each class a student should quickly remind herself what was just learned—then that evening do another quick review. Before the next class on that subject, the student should review again. Teachers help this process by linking the next lesson to the preceding one.
Certain practices will reduce the amount of time needed for study and increase the degree of long-term memory formation. These include:

• Don't procrastinate. Do it now!
• Organize the information in ways that make sense (outlines, concept maps).
• Identify what needs to be memorized and what does not.
• Focus. Do not multi-task. No music, cell phones, TV or radio, or distractions of any kind.
• Associate the new with things you already know.
• Associate words with mental images, and link images to locations or in story chains.
• Think hard about the information, in different contexts.
• Study small chunks of material in short intervals. Then take a mental break.
• Say out loud what you are trying to remember.
• Practice soon after learning and frequently thereafter at spaced intervals.
• Explain what you are learning to somebody else. Work with study groups later.
• Self-test. Don't just "look over" the material. Truly engage with it.
• Never, never, ever CRAM!




[1] Karpicke, J. D., and Bauernschmidt, A. 2011. Spaced retrieval: absolute spacing enhances learning regardless of relative spacing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 37(5): 1250-1257.
[2] Cepeda, N. J., et al. 2008. Spacing effects in learning: a temporal ridgeline of optimal retention. Psychological Science, 19(11): 1095-1102.

Friday, April 03, 2015

What Happened to The Wonder of Learning?

I read a lot about educational theory and research so that I can share "best practices" for better ways to teach and learn with my readers. Shared here are the ideas in a most informed and intelligent article on learning by Catherine L'Ecuyer, a Canadian lawyer with an MBA now living in Barcelona, Spain.[1] The article explains what is fundamental to motivating children to learn: the sense of wonder.
This notion resonated with me, because I know it to be true from personal experience. To this day, I have vivid memories of the excitement I had as a six-year-old in Fort Myers, Florida, as I walked to my first day of school. Yes, in those days it was safe for kids to walk several blocks to school unattended. And yes, there was, at least for me, no kindergarten, pre-kindergarten, or day care.
Sauntering to school, I became entranced with all the new sights and sounds, stopping several times along the way to savor a new experience. A vivid memory was my stop at a beautiful flower I had never seen before. I physically probed the bloom, astonished at the elegant expression of nature. On that day, the prospect of school was a most joyous opportunity. It did not take long for school's pedantic nature, drills, and drudgery to squelch my sense of wonder. It was only in late middle school that my sense of wonder was resurrected, and that only occurred because I had a crush on my teacher and wanted to impress her with my learning. For many children, their inherent sense of wonder that school stamps out never returns.
Clearly, a child's state of mind affects how learning and the school environment are regarded. Among the more relevant states is stimulus seeking; that is why, for example, I wanted to explore the innards of that beautiful flower. Then too, there is the basic human responsiveness to positive reinforcement. If a learning experience is perceived as wondrous, it is perceived as good and beneficial, serving as an incentive to seek out other such learning experiences. It obviously helps for a child to be aware of such perceptions.
L'Ecuyer adds the sense of wonder to the list of fundamentals of the motivation to learn. She makes the point that wonder is innate in children, especially when they are young. As a child matures, much of this sense of wonder can fade. For some people, the more they learn, the less wondrous the world seems. Among adults, scientists seem to be an exception (to a biologist, pond scum is beautiful and wondrous).
L'Ecuyer argues that modern educational paradigms are behaviorist and conflict with nurturing the sense of wonder in children. By behaviorist, she means that the guiding principle of teaching is that the environment directs learning with its emphasis on teachers, curriculum, and high-stakes testing. The popular mantra is that learning is better when it is provided earlier and in abundance. Curricula are designed to bombard students with information and testing. Do we really think that is motivating?
The problem is that children can be overwhelmed by too much too soon. Yet government policy increasingly advocates pre-kindergarten. Young developing brains do not respond well to too much stimulus, too much curriculum, and too much high-stakes testing. Children become preconditioned to expect high levels of stimulation, leading to attentiveness disorders. Children become passive and bored. The associated loss of the sense of wonder diminishes a child's motivation to cope with all this stimulus and pressure.
L'Ecuyer cites convincing research showing that children learn at a slower pace than adults. They need more calm and silence. They are more intrigued by mystery. They need to trust in a human attachment figure, most commonly a caring mother. Unfortunately, our educational culture assumes that we don't teach enough curriculum and don't demand enough of children. Children learn to pass tests, not to love learning. Our multi-tasking culture only adds sensory and cognitive overload that interferes with learning and mental performance in general. The family breakdown in our culture diminishes a child's trust in primary caregivers and degrades attachment to them. Schools cannot provide such trust and attachment. Nor can pre-kindergarten or day care.
These are basic reasons why I push for a reform in education that stresses teaching learning skills to young children, as opposed to the domination of traditional curriculum and excessive high-stakes testing. I am writing such a book now. When children have good learning skills, learning stops being an onerous chore. The "Learning Skills Cycle" that I advocate begins with motivation, and motivation begins with a sense of wonder.[2]
Educational policy makers seem confused about why so many students fall behind. Every year, over 1.2 million students drop out of high school in the United States. That’s a student every 26 seconds – or 7,000 a day. About 25% of high school freshmen fail to graduate from high school on time.[3] At the college level, only 59% of full-time four-year college students graduate within six years.[4] Most college data use a six-year limit because so many college students can't finish in the usual four years.
Over the last 40 years, the educational fads we have tried apparently have not worked. In that time we have had such high-profile government initiatives as Goals 2000, New Math, A Nation at Risk, No Child Left Behind, Race to the Top, Common Core, the Next Generation Science Standards, charter schools, and Head Start. Where is the evidence that any of this works? SAT scores have not improved, even declining in some years, while funding for education has increased dramatically, on the order of 200%, depending on the state.[5]
Despite much ballyhoo and funding, Head Start's effects wash out within a few years. Nonetheless, many states think Head Start did not start early enough and that what is needed is government-funded pre-kindergarten. Nobody considers what this too-soon, too-much, too-stressful education does to a child's sense of wonder and motivation to learn.
The key question asked by L'Ecuyer is this: Are today's educational paradigms and policies promoting the sense of wonder and the motivation to learn, or squelching them? While all our government programs to improve education sound valuable, the results say otherwise. Teaching is now driven by high-stakes testing. While accountability is necessary, when high-stakes testing becomes the focus of education, it poisons the learning atmosphere. The law of unintended consequences applies. Today's educational environment suffocates the wonder and love of learning for its own sake.

# # #

Dr. Klemm is author of two books on learning: Memory Power 101 and Better Grades, Less Effort. Reviews and information can be found at his web site, WRKlemm.com



[1] L'Ecuyer, Catherine. 2014. The wonder approach to learning. Frontiers in Human Neuroscience. Volume 8, October 6. doi:10.3389/fnhum.2014.00764.
[2] Klemm, W. R. 2014. Shift Away from Teaching to the Test: A Better Way to Improve Test Scores. The STATellite, 59 (1): 10-13. http://c.ymcdn.com/sites/www.statweb.org/resource/collection/ED7F18DF-2934-4034-BDF2-CF81CADE155A/WinterSTATellite2014.pdf
[3] https://www.dosomething.org/facts/11-facts-about-high-school-dropout-rates
[4] http://nces.ed.gov/fastfacts/display.asp?id=40
[5]http://www.cato.org/publications/policy-analysis/state-education-trends