Doing Interviews

I’m working on a paper that is based on a series of interviews my supervisor and I conducted over the past year. The interesting part of that research effort is the content of those interviews, and so that’s what the paper is about. However, I’ve been thinking a fair bit about how to conduct ethnographic research in introductory physics labs, including interviews, and want to share the model we’ve come up with. Here, then, are some considerations for doing interviews in contexts like my own.

Finding Participants

It’s tough to get students to agree to do interviews. It takes time during a busy semester, it’s something completely optional for them, and the financial incentive I can offer ($25, in my case) is a drop in the bucket compared with the cost of college. More significant, though, is fear of the unknown: this “interview” could be all sorts of unpleasant.

Thus, I combine my interviewee selection with organic discussions I have with students in their labs. By the end of the semester, they’ve gotten to know me as a friendly person who regularly visits their room, occasionally offers help with their work, and respects them as learners. At that point, it’s pretty easy for me to say, “I think this is the dial you’re looking for. By the way, would you be willing to sit down with me for an hour and talk about your experiences in the physics lab?”

When I approach students who have gotten to know me, at least a little bit, the response rate for interview invitations is about 75%. But when we approach students out of the blue, fewer than 50% of our invitees respond to email invitations. And even this response rate is impressive compared with the meagre return we get from poster advertisements. I think this shows the tremendous importance of establishing a foundational level of trust and respect before even beginning the interview.

Pseudonyms

When we report quotations in our publications, we use pseudonyms to protect the anonymity of our interview participants (note: participants, not subjects). As part of the introduction sequence for my interviews, which covers, among other things, the ground rules set out by our IRB, I ask my participants to choose their own pseudonym.

The result is that our pseudonyms are sometimes gender-neutral (and so I need to be explicit about the respondent’s gender if it is applicable in my analysis), and they usually don’t meaningfully reflect the ethnic or racial background of the participant. But they do give the participant a little bit more say in how their ideas are being represented, and I think that’s important.

Who Benefits?

I worry a lot about how interviews are a form of resource extraction: I, as a white man, am transcribing and using the ideas and thoughts of the participants (many of whom, by design, are underrepresented minorities) in a way that will benefit me directly, by helping me to achieve a degree and maybe a job. So I make it a point to frame our discussion in terms of how they can benefit future students by helping to redesign the class.

This practical, outcome-guided framing is important for any research in which there are extra incentives for the researcher. Then, too, I think it is important to be clear with the participants about the goals: that this interview will help us improve instruction directly, and might also lead to an academic publication.

Making Meaning Together

It is a mistake to think that the meaning of an interview is created when the researcher and/or their collaborators pore over transcriptions and debate frameworks. Rather, the interview comes with meaning built into it, because the participant has their own frameworks and ways of understanding. Often those ways of understanding are better than the researcher’s.

This stance helps me ask better questions and participate more fully in dialogue. Our interviews are semi-structured: we compiled a list of 30 “questions” (really, topics we want to ask about) and seek to weave them into the discussion as it unfolds.

Transcribing and Writing

I transcribe all our interviews after we finish them; I’m not very good at it, and I don’t know a good solution (let me know if you do!). We also write post-interview reflections, in which we seek to identify key issues. After the interviews, we meet to discuss what we heard. This is when the coherent narratives start to emerge.

The paper I am working on now seeks to allow the participants to tell their own story, by copious use of long quotations. I’m also using a framework to help draw conclusions from what is being said.


Points & Sets: Lab Reasoning

As I pursue my PhD in physics education research, I have found time to explore a number of interesting questions. In this post, I’ll describe an approach to thinking about student reasoning in the introductory physics lab that we decided not to pursue at this time.

The Physics Measurement Questionnaire

Building on more than 20 years of development, use, and research, the Physics Measurement Questionnaire (PMQ) is a staple in introductory physics lab research. Its home is the group of Saalih Allie. This 10-question assessment is built around a single experiment, described at the start of the questionnaire.

Students are asked a series of questions that seek to determine the extent to which they agree with a set of principles about how lab-work is done (at least, in the abstract). The principles don’t appear explicitly in the relevant literature, but seem to be:

  1. More data is better
  2. When collecting data, it is good to get the largest range possible for the independent variable
  3. The average represents a set of repeated trials
  4. Less spread in the data is better
  5. Two sets of measurements agree if the averages are similar compared with the spread in their data (see the sketch just after this list)
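
To make the fifth principle concrete, here is a rough formalization (my own paraphrase and notation; the PMQ never states it as an equation). Writing \bar{x}_A, \bar{x}_B for the two averages and \sigma_A, \sigma_B for the spreads, the two sets of measurements agree when

    \[ | \bar{x}_A - \bar{x}_B | \lesssim \sqrt{\sigma_A^2 + \sigma_B^2} \]

that is, when the difference between the averages is small compared with the combined spread of the two data sets.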

These axioms feel true and, as a teacher, I see value in getting my students to understand fundamentals about the nature of scientific work. However, while they are broadly applicable, the reality of science is that none of these axioms is exactly true. There is also the question of whether science has a universal “method” at all: Paul Feyerabend argues it doesn’t. When I sat down with a distinguished professor to look at the PMQ, the professor could identify the “right” answer, but often didn’t agree with it.

This reminds me of the “Nature of Science” that was brought into the International Baccalaureate (IB) physics curriculum in 2014. It appears in the course guide as a 6-page list of statements about how science works, culminating in a mind-bending image [image: scimethod].

So maybe attempting to define how science works isn’t a productive approach. Fortunately, the PMQ isn’t just a test of whether students agree with certain axioms of experimental physics.

Question-Level Analysis

In addition to evaluating students on their agreement with the above axioms, the PMQ also asks students to justify their reasoning. An example question (shown as an image in the original post) asks students to decide whether more measurements of a quantity are needed.

After “Explain your choice”, the PMQ includes several blank lines for the student response. The instructions suggest that students should spend “between 5 and 10 minutes to answer each question”.

A thorough analysis of student responses is possible by using the comprehensive categorization scheme in Trevor Volkwyn’s MSc thesis. Volkwyn, like the PMQ authors, views the PMQ as an assessment that aims to distinguish two types of student reasoning about experimental data collection:

Point-Like Reasoning is that in which a student makes a single measurement and presumes that this single (“point”) measurement represents the true value of the parameter being measured.

Set-Like Reasoning is that in which a student makes multiple measurements and presumes that the set of these measurements represents the true value of the parameter in question.

Alternatively, we could view the point-like paradigm as that in which students don’t account for measurement uncertainty or random error.

Examples of responses that conform to the point-like reasoning paradigm, for the example above, include:

  • Repeating the experiment will give the same result if a very accurate measuring system is used
  • Two measurements are enough, because the second one confirms the first measurement
  • It is important to practice taking data to get a more accurate measurement

Examples of responses that match the set-like paradigm include:

  • Taking multiple measurements is necessary to get an average value
  • Multiple measurements allow you to measure the spread of the data

Thus, it is possible to conduct a pre/post analysis on a lab course to see whether students’ reasoning shifts from point-like to set-like. For example, Lewandowski et al at CU Boulder have done exactly this, and see small but significant shifts.
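
To sketch what such an analysis might look like in practice (the code and the counts below are my own hypothetical illustration, not Lewandowski’s method), one can tally the fraction of set-like responses on a given item before and after instruction and test whether the shift is statistically significant:

    from math import sqrt

    def two_proportion_z(set_like_pre, n_pre, set_like_post, n_post):
        """Two-proportion z-test for a pre/post shift in the
        fraction of set-like responses on one PMQ item."""
        p_pre = set_like_pre / n_pre
        p_post = set_like_post / n_post
        # Pooled proportion under the null hypothesis of no shift
        p_pool = (set_like_pre + set_like_post) / (n_pre + n_post)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_pre + 1 / n_post))
        return (p_post - p_pre) / se

    # Hypothetical counts: 8/20 set-like before the course, 14/20 after
    z = two_proportion_z(8, 20, 14, 20)
    print(f"z = {z:.2f}")  # |z| > 1.96 corresponds to p < 0.05, two-tailed

(When the same students take both the pre and the post, a matched-pairs test like McNemar’s would be more appropriate; this just shows the shape of the analysis.)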

My Data

I subjected a class of 20 students to the PMQ, and coded the responses to two of the prompts using Volkwyn’s scheme. The categorization was fairly straightforward and unambiguous.

For question 5, which asks students which of two sets of data is better (one has larger spread than the other), 17 students provided a set-like response and 3 gave the point-like response.

For question 6, which asks students whether two results agree (they have similar averages and comparatively-large spreads), only 1 student gave a correct set-like response, and the other 19 provided point-like reasoning.

I think this indicates two things:

  1. Different prompts can elicit very different proportions of set-like reasoning. Similar results are found in the literature, such as Lewandowski’s paper, above. This suggests that the set-like reasoning construct is complex, either with multiple steps to mastery or with multiple facets. Thus, it might not make sense to talk about it as a single entity with multiple probes, but rather as a collection of beliefs, skills and understandings.
  2. Some of the reasoning on question 6 seemed shallow. This suggests, for me, a bigger take-away message: my students aren’t being provoked to think critically about their data collection and analysis.

Going forward, we’ve decided not to use the PMQ as part of our introductory lab reform efforts. However, by trying it out, I was able to see clearly that our new curriculum will need to include time dedicated to getting students to think about the relevant questions (not axioms) for the experiments they conduct:

  1. How much data should I collect?
  2. What range of data should I collect?
  3. How will I represent the results of my data collection?
  4. How will I parameterize the spread in my data, and how will I reduce measurement uncertainties and random error?
  5. For different types of data, how can we know if two results agree?

Interestingly, some of Natasha Holmes’ work on introductory labs starts with the 5th question by asking students to develop a modified t-test, and then uses that tool to motivate cycles of reflection in the lab. That’s another approach we’ve agreed doesn’t quite work for us, but that is likewise a huge source of inspiration.
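
For concreteness, here is a minimal sketch of that comparison statistic as I understand it from Holmes’ work (the function name and the example numbers are my own): the modified t-score divides the difference between two results by their combined uncertainty.

    from math import sqrt

    def t_prime(a, da, b, db):
        """Modified t-score comparing two measured results a and b,
        with uncertainties da and db (names are mine, not Holmes')."""
        return abs(a - b) / sqrt(da**2 + db**2)

    # Hypothetical example: two measurements of a pendulum period
    print(round(t_prime(2.05, 0.03, 1.98, 0.04), 2))  # 1.4

Roughly, t′ ≲ 1 suggests the two results agree within their uncertainties, while t′ ≳ 3 suggests a genuine difference; intermediate values invite students to collect more data, which is what drives the reflection cycles.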


A Trilemma in Progressive Education


This is a quick note about something I keep encountering in practice, but haven’t seen described in The Literature. If you have any references, please send them my way!

Consider a situation in which you are making some changes to educational materials or structures in order to address a bias in the status quo. This could mean re-framing classroom rules, editing materials, or rethinking university admissions. In such a situation, there seem to be three approaches you could take:

  1. (Representational) Equality
  2. Neutrality
  3. Responsivity

Example One: Gender in Physics Problems

This week on Twitter, I mentioned that I was re-writing some physics problems that had been coded (a) exclusively male, and sometimes (b) using inaccurate vocabulary (ie: “spaceman” instead of “astronaut”). An Equality approach would be to make the subjects of the questions 50% male and 50% female; Kate Wilson has explained why this matters (in a tweet quoted in the original post).

Summer C, however, emphasizes that gender isn’t a 50/50 split, and that assuming a binary nature of gender is problematic, including for students who aren’t cisgender.

One such response, then, is to attempt to use gender-neutral language (ie: Neutrality). In some cases this works easily, as Jenn Broekman has pointed out; in other cases, though, it takes a bit more work.

I like gender-neutral pronouns in principle, but we’re probably a generation away from them being practical. An alternative is to adopt the 2nd-person perspective.

While “you” seems like a good solution, it’s not always going to be valid — for example, in a problem with two independent agents. There’s also the issue that this isn’t solving the bigger problem: even if my problems are gender neutral, students are still getting cultural signals about who is and who isn’t a physicist. This brings us to the idea of Responsivity: using these cultural artifacts to push back against the counterproductive messaging and norms students bring to the classroom.

This discussion was enlightening for me — thanks to those who contributed!

Example Two: Latent Colonialism

I’ve been slowly putting together some thoughts on “latent colonialism” in physics: the idea that a large portion of the canon of physics culture is a remnant of the colonial era, and thus carries some problematic attributes with it. For example, textbooks around the world feature the same group of ~20 physicists (Newton, Franklin, Ampere, Maxwell, Einstein, etc), all of them white men of European descent. Or consider that the famous physics equations and principles, although developed over a couple of millennia, carry Western European names. This has been shown to negatively impact students who aren’t white and male.

So, how do we respond? We could:

  1. Seek Neutrality. Simply remove the names from the canon. Instead of Snell’s Law, we would call it the Law of Refraction. This approach has been adopted in several curricula, but I see two problems with it: first, instead of giving an advantage to white boys, the curriculum gives an advantage to no-one; and second, culture is complicated and it’s not clear we’d be successful at expunging all the biased artifacts.
  2. Instead, we could seek out cases in which white women and people of colour contributed to the history of science and highlight them as well. This would aim to present a canon that is representationally Equal. I’m currently reading up on Émilie du Châtelet, for example. This would allow us to re-cast the history of physics as an endeavour that involved white men and women in Western Europe, and that built on the mathematics and early work done in the Middle East and, to a lesser extent, India and China. Putting aside the cries of “revisionist history”, this isn’t clearly going to fix everything, either. For one thing, the truth is that a lot of contemporary physics knowledge, in the form in which we currently teach it, was assembled by white men who refused to allow white women or people of colour in on the proceedings (with some rare exceptions).
  3. Which brings us to the Responsive approach, which is to acknowledge the deliberate exclusion, intellectual theft and, well, colonialism that took place during the construction of the set of ideas we think of today as the canon of physics knowledge. This would be hard: it would involve physics teachers talking about history and racism in their classes.

Likely, the best response is to do all of these, in the contexts in which they work best, to the best of our abilities. For example, I taught for some time in a country in which it was constitutionally prohibited for schoolteachers to talk about gay marriage or other gay rights issues in their classes: that type of thing severely restricted my ability to talk with students about some of these issues. It is my hope that this classification is useful in deciding which response to take.

Intro Physics Labs: Why?

Some of my research these days is focusing on the introductory physics lab. In this post, I will seek to outline some qualitative findings about introductory physics labs at universities and colleges in the USA. I am planning to follow up these conclusions by doing a survey of a representative sample of physics departments — but that will have to wait until late August.

1. Parallel or Separate?

In most programs, the physics lab runs parallel with the physics course. At Lincoln University, for example, students take Phys 105 and 106 as their two-semester introductory physics sequence, and enroll in Phys 105L and 106L at the same time. Here, the goal of the lab tends to be reinforcing core ideas. At Penn State, labs “are designed to provide you with hands on experience with the material being investigated in class”. The key concern here is that recent research suggests that labs provide “no added value to learning course content” (Holmes et al).

Less common is a lab that is run as a separate course, sometimes requiring the first course in the introductory physics sequence as a pre-requisite. At Drexel, for example, students take Phys 128 as a separate course. These separate courses tend to focus more on the development of experimental skills and mindsets. At Carnegie Mellon, the purpose of intro lab course 33-104 is “to become skilled at acquiring, recording, and analyzing data, and drawing conclusions from experimental work”.

2. What is the purpose of labs, anyway?

The AAPT lab guidelines focus on the process of “constructing knowledge” and scientific skills, rather than core ideas. Ideally, I think, labs would meet both aims: helping students to enrich their understanding of core physics ideas, while also learning how scientific knowledge is generated experimentally.

In constructing a lab experience, core ideas and scientific skills will need to be interwoven. Could we say that a lab is successful if students only practice core ideas? Could we say it is successful if students only learn scientific skills? I would say that the former is not acceptable, but the latter might be.

3. What labs are being done?

Cookbook-style lab manuals are nearly ubiquitous. Most of these seem to have been written locally, but have the same format: an overview of the relevant physics (which students rarely read), then step-by-step instructions for what students should do in the lab, and finally some questions.

There is variation in how students are assessed. Often, there is a pre-lab quiz. Written lab reports are common (often 1 per group of 1-3 students), as are worksheets that need to be filled in.

Most labs are 3-hour sessions, with a new experiment each week. In some places, a 1-hour lecture precedes the lab session. In others, experiments stretch over two weeks.

Some manuals seem to try to scaffold students from this mode of highly-structured inquiry (where direct instructions are given) toward guided inquiry (where, instead, students are given goals and broad guidance). I wasn’t able to find any lab programs that aim for open inquiry.

4. How do AP, IB, and Cambridge A-Level lab expectations compare?

For all three of these programs, labs are expected to occupy about 20% of the instructional time. In the AP, IB and A-Level classes, labs are explicitly expected to build toward independent student work (ie: open inquiry). AP labwork is not assessed directly, while IB students submit a lab report for external assessment, and A-Level exams include a substantial practical component.

It seems to me that colleges and universities have substantially lower expectations for student performance in labs than is found in the AP, IB, and Cambridge A-Level courses.

Some Failures

I’m proud of my successes as a STEM teacher working toward equity, but I’ve also had failures. In order to get better, I need to understand, and correct for, those times when I’ve made mistakes. Here are some examples I’m thinking about now.

Irina

I met Irina when she was in 10th grade because she was doing a project in physics and wanted to talk with the school’s physics teacher about it. Her interest in technology and science made it clear to me that engineering would be a great career for her. Nevertheless, she was very reluctant to take physics. On course selection night, she told me repeatedly that she felt she wasn’t intelligent enough. I tried to allay her fears, and I guess I was successful enough that she enrolled in my physics course.

Over the first semester, she did okay — Bs and Cs, mostly. We talked again about her career plans at the end of the term, and she told me she’d decided she wasn’t intelligent enough for engineering, and that her middling grades were evidence of this. Although we talked many times after this, I was never able to change her mind.

Irina needed some extra help in the first semester, to master some essential math and physics skills and build her self-efficacy, but I didn’t see that. In the future, I’m going to be more careful and more supportive with new students who have low self-belief and fixed-mindset attitudes. This is especially relevant for girls and minority students, for whom self-efficacy already tends to be quite low.

Robotics Club

In an after-school robotics club, one student was explaining to two others his idea for using tank treads for the robot. He wasn’t doing a great job of explaining his idea, so I asked him to try again; he did, although with a bit of an exasperated tone. When he finished, I turned to the two students and asked them, “Did that make sense to you?” “Yes, I know what a tank is,” replied one.

The two listeners went off to work on something else. Later, they told me that many of the other students weren’t taking their ideas seriously. They interpreted my question (and their peer’s explanation) as condescension, and that was sort of the last straw. I apologized and tried to explain, but by that point the damage had been done: I restored their trust in me, but not in the robotics club. They came to the next meeting, but interacted with almost no-one, and didn’t return after that.

My microaggression against these two students happened because I briefly stopped focusing on their affective experience. For students in STEM, there are so many threats: my job as a teacher is to remove them and create a safe learning environment. Going forward, I’m trying to be more sensitive to the environment and careful about my own actions.

Joe

Joe, a student in my math class, had a weak academic record. His academic language skills were particularly poor because, as our program coordinator explained it, he’d never developed a first language. Instead, he spoke French early in his life, then German with his mother and friends, and then English when he moved and found a new school and a new social circle.

With Joe, I was insufficiently proactive. I didn’t insist on extra help after school, I didn’t establish strong communication with his mother early in the year, and I didn’t differentiate my instruction or get him alternative resources. By the time his weak grades started to pile up, he was too far behind to properly catch up and he had decided he couldn’t succeed in the course.

Epilogue

These students all went to university, at least for one year. They’re pursuing their dreams and doing alright. I’m pursuing my dreams too: being a better teacher for my next class of students.

Open-Ended Exam Tasks

For a variety of reasons, I’ve been thinking a lot about open-ended tasks for assessment. This style of physics “problem” gives the student the opportunity to use a variety of different approaches to demonstrate their understanding in a semi-authentic context.

Below are the open-ended tasks from the Scottish Qualifications Authority physics exams 2014-2017 (ie: from the N5, Higher, and Advanced Higher exams). My primary purpose in posting these here is to provide a resource for students studying toward their SQA qualifications.*  I think that answering this type of problem effectively requires careful practice, and I hope this collection is useful for that!

First, here’s the detailed rubric. Each open-ended task is scored between 0 and 3 marks.

[image: SQA marking rubric for open-ended questions]

And here are the prompts. A few have been excluded because they are built into the context of a longer problem. A bunch more are available via the link in the original post.

[images: 16 screenshots of open-ended exam prompts]

* I think this falls within the stated allowed use, but please write me if that is not the case!

30 Actual Teachers

Forbes published their “30 under 30” list for education. It’s a list notably lacking actual teachers. This is frustrating because, as exciting as it may be for these young people to create a start-up designed to help their communities, we already have a group of highly-trained, preposterously-dedicated people for whom teaching is a vocation.

So, I’d like to present an alternative “30 under 30”: educators who are genuinely changing the world, and not getting the salary or respect they damned well deserve.
(And I don’t care about their ages).

Kristine Atrens – A top-notch kindergarten teacher who epitomizes dedication and caring.

Peter Bohacek – One of the people behind Direct Measurement Videos and Pivot Interactives, both of which are really effective because of Peter’s ability to draw on his teaching experience to understand what active learning really requires.

Stephen Collins – An expert on the Socratic method and standards-based grading, and a modeling instruction leader.

Oyuntsetseg Durvuljin – The founder and spiritual leader of ‘Hobby’ School, a project that lifted hundreds of students into the upper-middle class through education and providing opportunities.

Cristina and John Julius Fajardo – This teaching couple are the most enthusiastic, energetic, teachers I have ever met. They care about the students so much that the students can’t help but care about learning.

Erdenetsetseg Gombojav – Such a great teacher that she becomes an extra parent and gets the best out of her students by caring for them and maintaining the highest standards.

Chris Hamper – A lifelong physics teacher, textbook author, online resource provider, and workshop leader. The best of IB Physics comes from him.

Megan Hayes-Golding – A superb physics and STEM educator with a passion for helping students take on challenges. Bonus points for bow ties.

Richard Hechter – A scholar and educator who focuses on cultural aspects of science education, especially the intersection between Western and Canadian Aboriginal ways of knowing.

Meghan Hennick – An innovative, creative, and passionate elementary school teacher on the international circuit.

Scott Hovan – Probably the expert on how to get students talking honestly, thoughtfully, and considerately in high school physics.

Sarah Johnson – An educator at Simon Fraser at the forefront of getting girls into physics, and changing the discipline to make it more accessible.

Michael Lerner – A deeply passionate educator who puts his students first, and invests the time and energy needed to create an exceptional learning environment for them.

Joe McIntyre – An educator and policy scholar with strong passion and a deep commitment to education as a tool for social empowerment.

Zanda Medne – An English and German teacher and university lecturer who inspires love from her students by being passionate about their success.

Dan Meyer – Former high school maths teacher, currently developing great tools at Desmos.

Derek Muller – Best known as Veritasium, his Youtube channel is an outgrowth of his doctoral research into how students learn via videos.

Kelly O’Shea – Proof that educational change happens best when it comes from working teachers. Her numerous ideas, shared through the blog, have helped hundreds of teachers move toward more active teaching methods.

L Ozola – An excellent maths teacher and PhD student, she connects with students through humour and inspires them to do exceptional things.

Moses Rifkin – A deeply wise and conscientious man at the forefront of social justice in high school physics education.

Marianna Ruggerio – An enthusiastic and creative physics teacher who has recently started sharing her experiences via Twitter.

Andy Rundquist – A physics professor with a commitment to faculty development, collaboration, and learning communities.

Patrick Savage – A drama and French teacher whose compassion and patience for his students have always inspired them to become better citizens.

Sloane Schubert – A thoughtful, original, caring educator with a broad range of experience and mastery.

Leigh and Scott Simon – A teaching couple whose passion for their students is exceeded only by their willingness to do whatever it takes to provide meaningful, exciting, and effective learning experiences.

Ieva Smits – A dedicated teacher with a strong commitment to education as a way to improve communities, especially through service.

Rebecca Vieyra – The K-12 program manager at the AAPT, she is responsible for many great programs supporting teachers and the teaching of physics.

Evan Weinberg – Evidence, in case any is needed, that great teachers can build the tools they need to conduct great teaching.

(There’s a whole bunch of other people I wanted to include, but had to cut, somewhat arbitrarily, because of the limit of 30.)