This Is (Some Scientific Research About) Your Brain On Video Games

Here's a recent talk by Daphne Bavelier (University of Rochester) on brain plasticity and video game playing. I'm not usually a huge TED fan, but this one is actually decent neuroscience, and it has drawn positive press from the online neuroscience community, which is neat. Hopefully the discussion fuels further research on the topic!

How to study better

I gave a talk at Chicago State University yesterday, the inaugural Scheinbuks Lecture (honoring a new endowment that provides scholarships to outstanding students).  In the post-talk Q&A, I got a question about how to study better in order to perform well on an upcoming test.  As a memory guy, I should really be more prepared to answer this question, since it comes up regularly.  But since we focus mostly on implicit learning, most of our direct research has little to say here — I can give advice on skill learning, even cognitive skill learning (practice!), but not much on effective memorization.

At the time, I fell back on the suggestion to use various tricks: mnemonics like acronyms, songs, and memory palace (spatial) strategies.  That advice felt lame, and I think the questioner shared that opinion.  On the way home afterward, I decided I should come up with a standard answer, and I think I have an idea that better connects to our work. I will now recommend two ideas:

  1. Treat memory use like a skill.  When we study, we often practice encoding by repeatedly reading over the material to be memorized.  We should also practice retrieving the information (a la Karpicke/Roediger — this is just a different account of their finding) to train ourselves to more quickly and easily pull up the information at test.
  2. Take advantage of statistical learning in memory through spacing effects.  Per the standard advice, study a little bit every day to create durable, long-lasting memories (good sleep will help, too).  And then cram like crazy right before the exam: the other side of the spacing effect is that massed study is excellent for immediate retrieval, so cramming probably works pretty well for a test the next day.

Both these answers tap into ideas about how cognitive processes are shaped by practice and the statistics of use, which are what we study (but not usually in the context of explicit memory).  And I think both parts of the advice probably work pretty well in actual practice.
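
To make the spacing/recency trade-off concrete, here is a minimal sketch in Python assuming a simple exponential forgetting curve whose time constant ("stability") grows with each study episode, more so when the episodes are spaced. The model form and all parameter values are illustrative assumptions, not fitted data.

```python
# Toy model of the spacing effect: recall probability follows an exponential
# forgetting curve, and the curve's time constant ("stability") grows with
# each study episode, more so when episodes are spaced out. The model form
# and every parameter value are illustrative assumptions, not fitted data.
import math

def recall_probability(hours_since_last_study, stability_hours):
    """Exponential forgetting curve: p = exp(-t / s)."""
    return math.exp(-hours_since_last_study / stability_hours)

def stability_after_reviews(n_reviews, spaced):
    """Assume each review multiplies stability, and spaced reviews help more."""
    base = 4.0                       # hours of stability after one reading
    boost = 4.0 if spaced else 1.3   # per-review multiplier (made-up values)
    return base * boost ** (n_reviews - 1)

massed = stability_after_reviews(5, spaced=False)   # five readings in one sitting
spaced = stability_after_reviews(5, spaced=True)    # five readings across days

# Tested 2 hours after the last study episode (tomorrow's exam)...
print(f"immediate: crammer {recall_probability(2, massed):.2f}, "
      f"spacer {recall_probability(2, spaced):.2f}")
# ...and again a week (168 hours) later with no further study.
print(f"one week:  crammer {recall_probability(168, massed):.2f}, "
      f"spacer {recall_probability(168, spaced):.2f}")
```

Under these made-up parameters, the crammer does fine a couple of hours out but retains almost nothing a week later, while the spaced studier holds up on both tests, which is the advice in a nutshell: space your study for durable memory and cram for tomorrow's exam.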

The dark side of implicit learning

I mentioned this recent study in lab meeting today and decided to collect a few details here.  I saw the NY Times report cited/linked in a couple of other places and it is a good start (http://www.nytimes.com/2012/09/25/science/bias-persists-against-women-of-science-a-study-says.html?ref=science&_r=0).

Key paragraphs:

To avoid such complications, the Yale researchers sought to design the simplest study possible. They contacted professors in the biology, chemistry and physics departments at six major research universities — three private and three public, unnamed in the study — and asked them to evaluate, as part of a study, an application from a recent graduate seeking a position as a laboratory manager.

All of the professors received the same one-page summary, which portrayed the applicant as promising but not stellar. But in half of the descriptions, the mythical applicant was named John and in half the applicant was named Jennifer.

About 30 percent of the professors, 127 in all, responded. (They were asked not to discuss the study with colleagues, limiting the chance that they would compare notes and realize its purpose.)

On a scale of 1 to 7, with 7 being highest, professors gave John an average score of 4 for competence and Jennifer 3.3. John was also seen more favorably as someone they might hire for their laboratories or would be willing to mentor.

The average starting salary offered to Jennifer was $26,508. To John it was $30,328.

The bias had no relation to the professors’ age, sex, teaching field or tenure status. “There’s not even a hint of a difference there,” said Corinne Moss-Racusin, a postdoctoral social psychology researcher who was the lead author of the paper.

The original study was published in PNAS (http://www.pnas.org/content/early/2012/09/14/1211286109.full.pdf+html), in case the Times link doesn't work.  The Times summary appears to accurately capture the main findings.

The connection to our work is the conjecture that the sampled professors in the natural sciences have spent more time around men in these fields (even the women have) and have come to expect men to fit more naturally into the field because of the statistics of their prior experience.  This isn't "sexism" in the normal use of the word; it is an implicit bias that I strongly suspect the professors do not even realize they acquired during the regular activities of daily scientific life.  They have implicitly learned to expect men to be better lab managers and do not even realize the bias is operating.

I put this out as a kind of "dark side" to implicit learning.  If your brain is automatically extracting the statistics of the world around you, it is going to naturally reinforce existing stereotypes and make them hard to change when they are unjust.  An analogy can be made to another kind of dark side: a bad habit is learned through practice just as easily as a good one, and both are equally hard to break.
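
As a toy illustration (not a model of the PNAS study itself), consider a learner that only tracks the statistics of who it has encountered in a role. If its experience is skewed, and the learned association leaks even slightly into evaluation, identical applicants end up with different ratings. The 75/25 exposure split and the "leakage" weight below are invented purely for illustration.

```python
# Toy sketch of the "dark side" argument: a learner that merely tracks the
# statistics of who it has encountered in a role builds a stronger role
# association for the majority category. If that association leaks even a
# little into evaluation, identical applicants receive different ratings.
# The 75/25 exposure skew and the leakage weight are invented for
# illustration; this is not a model of the PNAS study itself.
import random

random.seed(1)

# A career's worth of lab managers encountered, with a made-up 75/25 skew.
observed = ["man" if random.random() < 0.75 else "woman" for _ in range(1000)]

# Implicit association: the proportion of experience in which each category
# has filled the role.
association = {g: observed.count(g) / len(observed) for g in ("man", "woman")}

def rating(true_quality, gender, leakage=0.8):
    """Evaluation = actual quality plus a small pull toward the learned
    category-role association (the assumed leakage of implicit bias)."""
    return true_quality + leakage * association[gender]

# Identical one-page application, two different names.
print("John:    ", round(rating(3.5, "man"), 2))
print("Jennifer:", round(rating(3.5, "woman"), 2))
```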

I had thought that framing this kind of bias as implicit and outside awareness would make it easier to address, by making the expression of bias seem like less of a personal fault.  In my limited experience, however, this is not true at all.  People seem to find the idea that they might be influenced by implicit bias quite upsetting in practice.  Apparently that is not going to stop me from bringing it up here anyway, though.

The analogy between brain training and physical exercise

We often speak colloquially about how cognitive (working memory) training is like a workout for your brain.  I think this analogy can usefully be pushed a bit further to illustrate some underlying principles.

For example, if you go to the gym and train on an upper body exercise (e.g., bench press), the muscle gains will be specific to the exercise — you don't get much transfer to the leg muscles.  Similarly, we expect the primary effect of cognitive training to be improvement on the trained task itself, since training gains are normally specific to what is practiced.

But that's not all that happens with exercise.  Regular and reasonably intense exercise produces a cardiovascular improvement that appears to broadly benefit health, including cognitively healthy aging.  This is essentially a secondary effect of the primary muscle exercise, and a smaller one that is harder to measure.

Our goal in cognitive training is to try to trigger some sort of global improvement analogous to cardio training.  This might show up as increased cognitive reserve (protection from loss due to aging/dementia) or a general increase in processing speed (which declines during aging across a range of domains).  But it's a secondary effect of the training task, so we expect it to be smaller and harder to measure.
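
One way to make "smaller and harder to measure" concrete is a standard power calculation for a two-group comparison. The sketch below uses statsmodels (assuming it is available), with illustrative effect sizes rather than estimates from our own studies.

```python
# Why secondary (transfer) effects are "harder to measure": a standard power
# calculation for a two-group comparison. Requires statsmodels; the effect
# sizes are illustrative values, not estimates from our studies.
import math
from statsmodels.stats.power import TTestIndPower

power_analysis = TTestIndPower()
for label, d in [("trained-task gain (large)", 0.8),
                 ("near transfer (medium)", 0.5),
                 ("far transfer (small)", 0.2)]:
    n_per_group = power_analysis.solve_power(effect_size=d, alpha=0.05,
                                             power=0.80,
                                             alternative='two-sided')
    print(f"{label}: d = {d}, ~{math.ceil(n_per_group)} participants per group")
```

A small far-transfer effect needs an order of magnitude more participants per group than the primary trained-task gain (roughly 400 versus 26 under these assumptions), which is part of why secondary effects are so much harder to detect.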

The current idea is that working memory training may be a good cognitive analog to cardio training in physical exercise.  We use our transfer battery in an attempt to pick up the global improvement in reserve or processing speed.  The specific working memory "muscle" is reasonably likely to be a good one to strengthen, since WM capacity is important for problem solving, reading comprehension, and long-term memory.  But the core question is whether there is a brain analog to cardio training such that a training intervention can provide some of the benefit we know arises from lifetime education or occupational complexity.

It seems that our reviewers are sometimes surprised by the non-specificity of our transfer predictions, or by the lack of a precise argument for why our particular training task is best.  To us, different training tasks are like the difference between running, biking, or Zumba — whatever works best to get you to do your cardio matters more than the specific exercise.  Maybe the physical exercise analogy will help communicate this framing and keep the research moving forward.

George A. Miller

George A. Miller just passed away.  Many people consider his 1956 "7 +/- 2" paper to be the first step in the "cognitive revolution" of the 1950s and 60s, in which cognitive psychology came to be a major branch of psychology (replacing behaviorism to some extent, at least until the emergence of behavioral neuroscience).

However, his role in inspiring the original paradigm for the study of implicit learning is probably not well known.  That story I know courtesy of my father, who told it to me something like this — it's probably mangled by memory (mine) and retelling (his), but the basic outline should be right:

[via A.S. Reber, personal communication]

I was at graduate school at Brown University [mid 60s] and I was bored out of my mind.  None of the research going on around me was remotely interesting to me and I needed to find something.  I met a woman at a party and we hit it off and it turned out she was George Miller’s daughter.  She invited me to come visit the family up in Boston, where George was at Harvard at the time.  The relationship didn’t go anywhere, but George and I hit it off famously.  We talked and I ended up coming to visit him periodically to talk and also to see what research he was working on.

He was trying to study the idea of a Universal Grammar in language learning, probably influenced by Chomsky as a lot of people were.  He had developed a paradigm with a simplified finite-state grammar and was observing people trying to learn it by trial and error, watching patterns and then predicting the next item in the sequence.  I looked at what he was doing and thought he was doing it wrong.  I tried to tell him, but of course he wouldn't listen to me.  So I went back to grad school and ran it myself.  The result was the first Artificial Grammar Learning paper in 1967, where we coined the term "implicit learning."
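
For readers who don't know the paradigm: artificial grammar learning strings are produced by walking a small finite-state machine. Here is a minimal sketch with an illustrative transition table (not the exact grammar from the 1967 paper).

```python
# Minimal sketch of the artificial grammar learning paradigm: letter strings
# are generated by walking a small finite-state machine. The transition table
# here is illustrative, not the exact grammar from the 1967 paper.
import random

random.seed(0)

# Each state maps to its possible (letter emitted, next state) transitions;
# a next state of None ends the string.
GRAMMAR = {
    0: [("T", 1), ("P", 2)],
    1: [("S", 1), ("X", 2)],
    2: [("V", 3), ("T", 1)],
    3: [("V", None), ("P", 2)],
}

def generate_string(max_length=8):
    """Walk the grammar from the start state, emitting letters as we go."""
    state, letters = 0, []
    while state is not None and len(letters) < max_length:
        letter, state = random.choice(GRAMMAR[state])
        letters.append(letter)
    return "".join(letters)

# Participants study strings like these and later judge (or predict) new
# items; they typically perform above chance without being able to state the rules.
print([generate_string() for _ in range(5)])
```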

I found out about his passing through an email from the Cognitive Neuroscience society. That email includes this quote:

“If any person deserves credit for creating the field of cognitive psychology as it has developed in the past roughly 60 years,” the linguist and philosopher Chomsky said in an interview, Dr. Miller is “the one.”

And a link to a NY Times article:

http://www.nytimes.com/2012/08/02/us/george-a-miller-cognitive-psychology-pioneer-dies-at-92.html?pagewanted=all

But I don’t really know how else you get good at something without practicing.

Ta-Nehisi Coates is one of my favorite bloggers. He is listed as the “senior editor for The Atlantic, where he writes about culture, politics, and social issues for TheAtlantic.com and the magazine.” He writes about race better than anybody I have ever read. Plus we have some other areas of overlapping interest.

Today he dug into his personal experience learning a second language, French.  He's reflected on the difficulty of this process before, but today he reports discovering the value of rote repetition for learning vocabulary:

Memorizing various verb forms has required simply writing them over and over again. I think in the past I've given the impression that rote repetition is somehow unconnected to "real" learning.  But I don't really know how else you get good at something without practicing. I was once told that if you want to develop a jump-shot, you need to learn form, and basically shoot a thousand jump-shots a day until the form becomes you. I've found that in French, I'm trying to recreate a similar trick–turning an overtly conscious act into muscle memory.
I’ve come to love the repetition, the constant rhythm of the jump-shot. I like the slow progress. It’s a kind of revelation. I find myself taken by fantasy. I imagine that I am breaking some ancient code. I imagine I am learning the rudiments of plane-walking. I imagine SETI in reverse–like all the teeming life of the Francoverse broadcasts itself to me, and someday I shall hear it all.

He is a good writer and captures something interesting about what we study here.

http://www.theatlantic.com/personal/archive/2012/07/the-book-of-human-language/260016/

Radio

I think there will likely be a clip of me talking briefly about the memory capacity of the human brain on BYU radio tomorrow (Wed, 6/20) in the morning (8:30/10:30?).  The host/producer said they’d try to get me a clip and I’ll link or post it if they do.

We are also preparing a press release on our top-secret, embargoed paper going live this Sunday.  Might be some more media around that next week.

Nature walking

Walking in nature is good for your brain.  Why?  I’m not sure either, but that didn’t stop me from sharing a few ideas on this topic with a health and science reporter for the Medill News Service:

http://news.medill.northwestern.edu/chicago/news.aspx?id=206542

Someday, when our never-ending series of studies on ego depletion and skill learning requires us to un-deplete people, perhaps we can take them for a restorative nature walk around the lagoon outside the lab.

Cup stacking

I'd been meaning to post something on this for a while, forgot, and then was recently reminded.  This has to be a top candidate in the category of arbitrary and useless skills that people get ridiculously good at:

Link if embed doesn’t work: http://www.youtube.com/watch?v=LyU5v0ZYMjI

The point is not to make fun of this kind of skill acquisition but to raise the question of why we do this.  In particular, the first people who started racing through stacking cups had to have been doing it because it provided some intrinsic reward that motivated them.  I think skill learning does carry some intrinsic reward (maybe more for some people than others), possibly related to the involvement of cortico-striatal loops that overlap with, or at least sit near, reward-processing circuitry (and share a dependence on dopamine).

Distant analogies

I haven't been really random in a while, but recently an editor of a science magazine asked me a curious question: how much learning is implicit?

Because I like distant analogies, I'm strongly tempted to claim in response: implicit learning is the dark matter of memory.

To ground the analogy, the idea is that what we normally think of as memory (the visible, regular matter) is our conscious experience of memory retrieval.  Implicit learning and memory are the broad set of invisible memory processes that shape our perceptions, actions, and thoughts but aren't directly observable in our subjective experience.

To stretch the analogy, dark matter is thought to constitute about 83% of the matter in the universe (via Wikipedia).  Is it even possible to quantify the percentage of "memory" that is implicit so that we could compare numbers?

One approach would be based on neurobiology.  Under our "plasticity principle" model of implicit learning, virtually all synapses in the brain have some inherent ability to be modified by experience.  Those changes that depend materially on the medial temporal lobe are thought to contribute to explicit, conscious memory, while all the other changes are reflected in implicit memory phenomena.  Just based on brain size, we might claim that implicit memory should reflect the majority of these plasticity events, perhaps even around 80% of all memory events in the brain.  However, we'd really need an estimate of the rate (or even the magnitude) of change events, and those might be much more common and significant in the MTL, which could reduce the percentage a lot.
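
The dependence on both quantities can be made explicit with a back-of-envelope calculation: the implicit share of plasticity events is the number of non-MTL synapses weighted by their rate of change, divided by the total. A minimal sketch, with the MTL's synapse share and rate multiplier as hypothetical placeholders:

```python
# Back-of-envelope for the "dark matter" fraction under the plasticity-principle
# framing: the implicit share of plasticity events is the non-MTL synapses
# weighted by their change rate, divided by the total. The synapse share and
# rate multiplier below are placeholders, not measured values.
def implicit_fraction(mtl_synapse_share, mtl_rate_multiplier):
    """Fraction of plasticity events occurring outside the MTL, given the
    MTL's share of synapses and how much faster (per synapse) it changes."""
    mtl_events = mtl_synapse_share * mtl_rate_multiplier
    non_mtl_events = (1.0 - mtl_synapse_share) * 1.0
    return non_mtl_events / (mtl_events + non_mtl_events)

# If the MTL held ~5% of synapses changing at the same rate as everywhere
# else, ~95% of plasticity events would be implicit; if MTL synapses changed
# 10x faster, the implicit share would drop to about 65%.
for rate in (1, 10):
    print(f"MTL rate x{rate}: implicit share = {implicit_fraction(0.05, rate):.2f}")
```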

Another approach would be to quantify the behavioral consequences of memory and try to apportion them according to the relative roles of explicit and implicit memory processes.  After brief consideration of that problem, I'm going to recommend the biological approach.

I’m tempted to invite argument for and against the dark matter hypothesis for students of memory.  Probably the concept is a bit too bizarrely abstract for good debate, though.