Scientific Reasoning in Physics

For many students, high school or university science courses are their last formal contact with the lofty ideals and approaches of organized science. Educators have traditionally done a poor job equipping such students with the vaunted skills of scientific reasoning. In this post, I’ll explore why, and look at how we might improve our teaching.

The 1977 paper of Fuller, Karplus & Lawson [FKL] was the first in the modern era to combine the concerns of developmental psychology and tertiary-level physics education. FKL identify three “patterns of reasoning” that are more prevalent among physicists:

    1. identifying and focusing on the most important variables
    2. using propositional logic
    3. using proportions

These are not unique to physics: my mother could see a broken window and know to look for the baseball, spot holes in my alibis, and deduce that cooking six-sevenths of a baked omelet that normally requires a dozen eggs means scaling it back to about ten. However, FKL argue, their consistent use is a hallmark of both scientific reasoning and progression to Piaget’s formal operational stage of development. The argument further suggests that physicist-style reasoning is uncommon because formal operation is uncommon (FKL suggest a third of American adults do not employ formal operation patterns), poorly promoted in the public sphere, and more cognitively demanding than concrete operation. Consider the comparison below (from FKL): where do high school students fit?

[Figure: FKL’s comparison of concrete and formal operational reasoning]
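The omelet arithmetic above is exactly the kind of proportional reasoning FKL have in mind, and it can be sketched in a few lines of code (a toy illustration; the numbers are just the ones from the anecdote):

```python
# Proportional reasoning: scale a recipe down to six-sevenths of a batch.
full_batch_eggs = 12   # the full omelet recipe calls for a dozen eggs
fraction = 6 / 7       # portion of the batch being made

scaled_eggs = full_batch_eggs * fraction
print(round(scaled_eggs, 1))  # 10.3 -- i.e., "about ten" eggs
```

The formal-operational move is recognizing that the egg count scales linearly with the batch size, rather than guessing or subtracting a fixed amount.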

Stephens & Clement (2010) identify three patterns of thought associated with the analysis of models in physics. They note that, in group settings, students might either generate an approach or run with one (i.e., follow on from another student’s). Their patterns of thought are:

  1. using analogies
  2. considering extreme cases
  3. performing thought experiments (Gedankenexperimente)

In the American Next Generation Science Standards [NGSS], scientific skills and practices — essentially an expanded vision of Fuller’s view of scientific reasoning — comprise the first of three dimensions, along with “cross-cutting concepts” and the actual content. The “practices” are statements such as:

Apply scientific reasoning to show why the data or evidence is adequate for the explanation or conclusion (for gr. 6-8, p. 27)

Clearly, scientific reasoning is important. But it’s also difficult to teach. Steinberg, Cormier & Fernandez (2009) [SCF] taught a summer enrichment course in astrophysics for New York high school students. Their approach — presenting physical models, providing no direct answers, etc. — provided only modest returns. Book-end tasks asking students to motivate heliocentrism (or geocentrism) saw a dramatic increase in the number of geocentrists (42% in the exit task!) with only anecdotal improvements in reasoning. A second end-task saw that 95% chose “I don’t know” about the existence of black holes, indicating that this inquiry-heavy approach might have swung the pendulum so far as to push most students to adopt a highly positivistic view of science.

At the undergraduate level, Moore & Rubbo (2012) note that non-STEM majors performed worse on Lawson’s Classroom Test of Formal Reasoning [LCTFR (also LCTSR)] than STEM students (54% vs. 75%). Worse, although the non-STEM students made significant gains on content-based tests (38% to 41%), their post-course gains on the LCTFR were marginal (6%). In a class with explicit instruction on scientific reasoning, LCTFR gains were more substantial (68%). Small sample sizes, particularly for the last group (N=14), mean that this conclusion should be considered only a suggestion.

Otero & Gray (2007) note a much smaller effect: in a larger study (N=189) of the PET/PSET curriculum, the students achieved an 8.8% increase on the CLASS assessment.

In addition to the LCTFR (Arizona) and the CLASS (Colorado), there are at least three other recent tools for analyzing student understandings of scientific processes. The MPEX was developed by a team from Maryland and adopted by SCF, above. EBAPS, from Berkeley, has questions that apply to students who do not have experience as physics students. VASS (Arizona) aims to determine views about physics, rather than focusing on reasoning skills.

Another approach to determining students’ thinking is to observe gestures, which Stephens & Clement (2007) have been working on. Among my ESOL students, I’ve seen abundant use of gestures: not only shape-, movement-, and force-indicating gestures, but also gestures representing the universe and direct representations of abstract concepts such as magnetic fields.

