Responding to the Research

A recent study (Pashler, McDaniel, Rohrer, & Bjork, “Learning Styles: Concepts and Evidence”) argues that there is inadequate scientific evidence to justify the practice of matching instruction to students’ learning styles. ASCD asked us to respond to this study and printed a shortened and modified version of our response in the May 2010 issue of its member newsletter, Education Update. You can view this version online at: http://www.ascd.org/publications/newsletters/education-update/may10/vol52/num05/Responding-to-the-Research.aspx. Our initial response appears below.

Everybody seems to be talking about this study, so we’re glad to have this opportunity to respond. Above all, we want to clarify that the authors only reviewed one approach to learning styles-based instruction—grouping students by style and ‘matching’ each group to one particular style of instruction. Proponents of a differentiated style approach, ourselves included, do not advocate this kind of matching as the optimal form of instruction. We, along with Bernice McCarthy and David Kolb, have long argued that teaching to the full range of styles is far better than confining students to a single style of instruction. Support for a diverse approach can be found in Robert Sternberg’s research, which shows that using a variety of teaching styles leads consistently to higher achievement across grade levels and content areas. Distinguishing between our view of style and the view examined in the study is important because: 1) we’re concerned that the reported lack of evidence for matching is being misinterpreted as a criticism of style-based instruction in general; 2) the authors’ conclusion about the use of style assessments in schools depends on a ‘matching’ view of style that not everyone shares.

Pashler and his team conclude that using learning-style assessments in schools is “unwise” because they find minimal evidence that diagnosing students’ styles and matching them to corresponding instructional treatments improves test performance. While we don’t endorse this kind of matching, we disagree with the contention that test scores are all that matters in determining whether matching—or any intervention—works. A more important point: in reaching their conclusion about style assessments, the authors overlooked the fact that such assessments are used for purposes other than grouping students by style to facilitate matching. In our experience, learning-style assessments have proven to be wonderful tools for promoting conversations about learning, building teachers’ and students’ metacognitive capacities, increasing student engagement, and helping teachers find ways to reach struggling students. We’ve also found that well-designed assessments enhance the effectiveness of a differentiated approach because teachers who assess their own and their students’ styles are typically more willing and able to implement a wide variety of instructional strategies in their classrooms. So while we support the authors’ call for additional research, we feel that determining the true educational value of learning-style assessments will require examining a wider range of benefits.

Clearly, schools face hard choices about where to invest their limited resources. When making these decisions, however, schools need not choose between learning styles and research-based practices for raising achievement. Nor does paying attention to students’ styles get in the way of using content to guide instructional decision making.
In The Strategic Teacher: Selecting the Right Research-Based Strategy for Every Lesson, we present strategies that address style and the principles of effective instruction set forth by Robert Marzano and other researchers; we also identify the type of knowledge each strategy is best suited for (declarative vs. procedural) and provide brief strategy overviews to help teachers decide which strategies best meet their content objectives.

When deciding how to invest resources, it’s also important to remember that the kinds of experiments proposed by Pashler and his team reveal only part of the picture. In our opinion, the best way to evaluate the effectiveness of an instructional intervention is to examine its effects on teaching and learning in real schools over time. And from our work in schools, we can identify dozens of schools that have implemented a styles-and-strategies framework and reported improved instructional and academic outcomes. This styles-and-strategies approach works because it does precisely what Pashler and his team argue good instruction should: it keeps options open for all students, using strategies rooted in research to create “experiences, activities, and challenges that enhance everybody’s learning.”

A final point: style advocates are hardly unique in calling for more diverse instruction. After all, study after study shows that teaching content in a variety of ways improves comprehension and retention. Or consider the NCTM’s position on how math should be taught, or the instructional demands associated with RTI and 21st century learning. All are calls for diverse teaching practices that promote multiple forms of thinking. Our 35 years in schools tell us that a learning-styles framework, linked to a repertoire of research-based strategies, is the single best tool for accomplishing this goal.
