
About Danny

Physics educator and researcher. @danny_doucette

Managing Cheating

To my friend M,

Next week, you’ll be teaching as an instructor-of-record for the first time. I am so excited for you, and for your students who will get to learn from a kind and insightful instructor. There’s been a lot of talk about cheating lately and, as someone who has been “around the block” a few times, I wanted to put together a few thoughts that might be helpful for you this semester.

Why Do Students Cheat?

In our large-enrolment introductory physics courses, few of the students are there because they are interested in the subject. Mostly, they’re enrolled because they need Physics 1 & 2 for med school (or another health track) or engineering. So while we see the beauty of physics, our students view the course as a hurdle to overcome — either a checkmark for their med school applications and (maybe) some preparation for the physics portion of the MCAT, or a “weed-out course” required by the school of engineering.

Physics has a reputation as a difficult subject: esoteric, highly mathematical, nearly impossible to master. So when students who would rather see the course in the rearview mirror struggle with physics problems, they have little motivation to seek out help through proper channels: going to office hours, asking their TA for help, checking in at the physics resource room, or collaborating with classmates. After all, why should they struggle to learn something they don’t care about? Later, on high-pressure final exams for which they haven’t properly prepared, students cheat because they need a certain minimum grade in the class and can’t imagine earning that grade without the outside assistance they’ve grown to rely on.

At the same time, resources that feed students answers under the guise of helping them learn (ie: Chegg) are now widely available and growing fast in students’ collective consciousness. Worse still, there is a negative spiral: students who use Chegg feel defensive about it and tell stories about how bad their instructor is, how difficult physics is, how modern work requires resourcefulness, and how everyone is doing it. These stories create feedback when other students hear them: I thought the instructor was okay, but maybe he really isn’t very good? I thought Chegg was cheating but if everyone is doing it then maybe I have to as well, in order to keep up?

Here are some student comments from our university’s subreddit about cheating this past semester in the introductory physics courses:

I’m not outraged nor am I surprised. Cheating happens all the time.

People say “don’t cheat it’s wrong” but I guarantee each and every one of us have done something academically unethical.

It’s college and in these times your GPA is now apart of who you are… So why wouldn’t students use ALL their resources to better themselves?

In summary: students cheat because they see their classes as hurdles to overcome, feel pressure to succeed, and are given permission from their peers since cheating is commonplace and acceptable. So what can we do about it?


Preventing Students From Cheating

Everything in education that is done well needs to start from a place of compassion. How can we help students avoid the allure of cheating? The equation below suggests three starting-points:

[Image: an equation framing cheating in terms of motivation, pressure, and permission]

  1. Help students to see physics as something more than just a hurdle to overcome. Share your joy and enthusiasm with the class. Invest time in developing lectures and homework problems that are genuinely interesting. Go deeper, too: understanding physics fundamentally changes how we view the world. Compare a neophyte’s reaction to the bowling ball pendulum to that of a physicist, for example. Help students understand how their studies in physics are actually vital to their future careers in medicine or engineering by bringing in relevant, real-life examples from those fields.
  2. Reduce the pressure of high-stakes exams by instead having multiple tests. Which do you think students would find more stressful: two 20% midterms and a 30% final, or a 10% test every week for seven weeks? The two approaches do equally well at assessing student learning. If you have the support to do standards-based grading (sadly, we don’t in our department) then invest the time and energy in doing that. Decrease the pressure by giving students plenty of useful, quick feedback about their performance by using formative assessments like quizzes (I like quizzes that have a multiple-choice component that can be graded instantly and a problem component that can be graded overnight and returned 1-2 days later with detailed commentary).
  3. Remove students’ permission to cheat by establishing unambiguous rules about what is, and what isn’t, academic misconduct in your class. Do this in the syllabus, spend 15 minutes in the first class talking about it and giving examples, and explain how you will respond to first and further offences. At our university, further offences are out of your hands, and go straight to the Dean. Don’t establish a rule you can’t or won’t enforce. Have students sign an honor pledge and make sure they understand the university’s Academic Integrity Code (and know it, yourself). Enforce your rules strictly: if you give them an inch, they’ll take a mile.

 

Catching Students who Cheat

Our university has a subscription to Gradescope, a platform that lets students submit their work digitally and streamlines grading. I suggest using it because it makes it easy to store and compare student work. Turnitin is integrated with Gradescope, and it is a good idea to use Turnitin if your students are submitting any longer written work.

Don’t use textbook (or Sapling or LON-CAPA) problems directly as they can be easily found online. If you’re not sure about writing your own problems, re-write textbook problems. Be sure to change all names and places, and rephrase as fully as you can. You might as well be clear about this with students so they don’t waste their time searching.

Where possible, seek ways to give each student a customized problem so they can’t copy someone else’s answers directly (and you’ll be able to notice if they do). Here are some ways to do this:

  • Have students use their student ID number (or the Nth digit in their student ID number) as one of the parameters in a problem
  • Create multiple different versions of the same assignment/test by (a) rearranging the order of questions on the assessment, (b) reordering the answers in multiple-choice problems, (c) changing the values of some key parameters, and/or (d) using similar but non-identical problems
  • Some of the biology instructors have been using scripting in PDF forms to customize assignments. Students type their name / ID number at the top of the page, and the values of key parameters on the page change as a result. You can create such forms using Adobe Acrobat.
  • Ask questions that require students to use their knowledge of physics in a context that requires a written response. Context-rich problems are good for this, if you require that students write a couple sentences explaining their reasoning and results.
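If you want to automate the parameter customization described above, a small script can do it. Here is a minimal sketch (the roster file, column names, and parameter ranges are all hypothetical) that derives a deterministic set of parameter values and a version letter from each student ID, which you could then merge into a document template or hand out as a personalized value sheet:

```python
# Minimal sketch: derive per-student problem parameters from student IDs.
# Assumes a roster.csv with "name" and "student_id" columns; the file name,
# column names, and parameter ranges are hypothetical.
import csv
import random

def parameters_for(student_id: str) -> dict:
    """Deterministically map a student ID to a set of problem parameters."""
    rng = random.Random(student_id)  # same ID always produces the same values
    return {
        "cart_mass_kg": round(rng.uniform(0.2, 1.0), 2),
        "ramp_angle_deg": rng.choice([15, 20, 25, 30]),
        "version": rng.choice("ABCD"),  # which question ordering this student gets
    }

if __name__ == "__main__":
    with open("roster.csv", newline="") as f:
        for row in csv.DictReader(f):
            print(row["name"], parameters_for(row["student_id"]))
```

Because the generator is seeded with the student ID, re-running the script reproduces the same values, so answer keys are easy to regenerate at grading time.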

Even with all the preventative measures in place, you will likely still find student submissions that duplicate the work of others, either online or other students in your class. If you’ve been thorough and deliberate in establishing and explaining your response to academic integrity issues, and given students plenty of timely feedback on their learning, the next steps will be clear and unambiguous, and you will have a good response to the inevitable sob stories.

If you find your original problems on Chegg, you can submit a DMCA takedown request, which typically takes a couple days to process. Chegg also stores the IP addresses of its users. If you suspect students were cheating on a test or exam using Chegg, the Dean can request records as part of their investigation. More information here.

Hope this helps! Your friend, -D.

Taking Labs Online

I want to briefly summarize what we did at the University of Pittsburgh to take our introductory college physics labs online in March and April, 2020, when classes for the last 6 weeks of the semester were interrupted due to Covid-19.

The Lab

We had a bit over 400 students enrolled this semester. In order to better serve health-science track students, we run the introductory physics lab as a separate 2-credit, one-semester offering, distinct from the Physics 1 and 2 lectures. Each section has a maximum enrolment of 24 students, and is led by a graduate student teaching assistant (TA). Typically, students attend a one-hour lab lecture and a 3-hour lab session each week. They get graded based on attendance at the lab lecture, a pre-lab assignment, a digital lab report that is completed in pairs during the lab (and usually finished up and submitted afterward), and a short lab homework assignment. We use Gradescope to streamline the submission of work and the grading.

New Policies

The University of Pittsburgh announced the switch to online learning mid-way through the spring break, and immediately extended the break by a week. That meant we had nearly two weeks to figure everything out and communicate with TAs and students. The lab manager and I met and made a few decisions:

  • We would reduce the number of labs from 12 to 11
  • We would allow students to drop the grades from their two lowest-scoring labs (originally the policy was just one lab)
  • We would develop online versions of the lab reports that could be completed individually
  • The pre-lab and lab homework would stay the same
  • The lab lecture would be dropped, but the slides would be posted online for students to read
  • We would ensure that all deadlines would be flexible for students
  • TAs would continue to play an important role in delivering these courses

Gradescope turned out to be a good technology: by this point, students were comfortable with uploading their own work and assigning pages to questions. Presumably those whose partner always took care of it at least knew someone to turn to for help. The TAs were able to continue their grading responsibilities from home.

New Labs

Our labs are loosely based on Real Time Physics, and have been modified so that there are 20 “chunks” of work (a prediction, an activity, or an in-depth question). Each “chunk” is described on a separate page of a Word document, and students write, insert graphs and pictures, fill in tables, write formulas, etc. to complete their lab report. Since these online labs were being done solo, rather than in pairs, we reduced the number of “chunks” from 20 to 10. That felt like a good amount of work for students to do in 3 hours at home by themselves.

We relied heavily on some of the excellent physics simulations that have been developed over the last few years. We chose only simulations that run in HTML5 or JavaScript, making them functional across nearly all devices. We used simulations from PhET, Physlet Physics, the Physics Aviary, and the excellent Falstad circuit simulator. Each lab used 2-3 simulations, and asked students to collect and analyze data, make and check predictions, develop conceptual understanding, and reflect on some of the bigger questions (eg: should it really be called Snell’s Law?).

Implementation

I was worried that some of our students would be unable to continue their studies after the university went online, but that seems not to have been the case. While 400 students submitted the lab report before the shift to online instruction, 405 submitted the first online lab report, and 393 submitted the second online lab report. I reached out to the 20-or-so students enrolled in the class who had not submitted the first online lab report, and heard back from about half. Most asked for a short extension to finish the work as they moved or adapted to a new schedule. Two students indicated that they were struggling with online learning and expected not to be able to complete their classes (I wrote back with some resources, and heard nothing further).

Although the lab reports and at least one email asked students to send any complaints or criticism about the lab reports to me, I heard from only two students. Both wrote to say thanks for developing the materials and for making the online transition a smooth one.

We asked the TAs to be available to students during the first half of their regularly-scheduled lab sessions via chat on Blackboard Collaborate, and also to aim to respond quickly to emails at this time. The TAs received very few questions via either medium, but I think it was valuable that they made the effort to be available to their students. One downside of the transition to online learning is that we have fallen behind on the grading. I expect that, unfortunately, the delay has resulted in some uncertainty for the students, who can usually see grading as soon as it is done on Gradescope. Most of the TAs for the labs are graduate students in their first or second year of studies, so the switch to online learning was significant for them because it also affected their own classes. Many are also international students (like myself), and that has made the closing of the university extra challenging.

Moving Forward

We will be able to evaluate the effectiveness of our approach to online learning through concept inventories and an attitudinal assessment that we regularly do with this introductory lab. However, looking toward online learning during the summer semester, we’re not sure that these simulation-based online labs are the best approach. We are trying out the iOLab system right now, and might adopt it for the July-August lab offering.

Finally, a few take-away messages based on what worked in our lab and what we’ve seen to be less effective in our class and others:

  1. It’s important to scale back expectations, accommodate lateness, and design work that can be done individually by students without their regular support networks (office hours, after-school help, peers, class)
  2. Early, frequent planning and communication are the key to getting a viable system in place and to getting students to buy-in to your approach to online learning
  3. People don’t like passively watching videos. Simulations are better, and there are lots of them out there.

Lab Reports

Here are the lab reports our students completed:

Report 08 (Online Version)

Report 09 (Online Version)

Report 10 (Online Version)

Report 11 (Online Version)

Doing Interviews

I’m working on a paper that is based on a series of interviews my supervisor and I conducted over the past year. The interesting part of that research effort is the content of those interviews, and so that’s what the paper is about. However, I’ve been thinking a fair bit about how to conduct ethnographic research in introductory physics labs, including interviews, and want to share the model we’ve come up with. Here, then, are some considerations for doing interviews in contexts like my own.

Finding Participants

It’s tough to get students to agree to do interviews. It takes time during a busy semester, it’s something completely optional for them, and the financial incentive I can offer ($25, in my case) is a drop in the bucket compared with the cost of college. More significant, though, is fear of the unknown: this “interview” could be all sorts of unpleasant.

Thus, I combine my interviewee selection with organic discussions I have with students in their labs. By the end of the semester, they’ve gotten to know me as a friendly person who regularly visits their room, occasionally offers help with their work, and respects them as learners. Thus, it’s pretty easy for me to say, “I think this is the dial you’re looking for. By the way, would you be willing to sit down with me for an hour and talk about your experiences in the physics lab?”

When I approach students who have gotten to know me, at least a little bit, the response rate for interview invitations is about 75%. But when we approached students out of the blue, fewer than 50% of our invitees responded to email invitations. And even this response rate is impressive compared with the meagre return we get from poster advertisements. I think this shows the tremendous importance of establishing a foundational level of trust and respect before even beginning the interview.

Pseudonyms

When we report quotations in our publications, we use pseudonyms to protect the anonymity of our interview participants (note: participants, not subjects). As part of the introduction sequence for my interviews, which covers the ground rules set out by our IRB among other things, I ask my participants to choose their own pseudonym.

The result is that our pseudonyms are sometimes gender-neutral (and so I need to be explicit about the respondent’s gender if it is applicable in my analysis), and they usually don’t meaningfully reflect the ethnic or racial background of the participant. But they do give the participant a little bit more say in how their ideas are being represented, and I think that’s important.

Who Benefits?

I worry a lot about how interviews are a form of resource extraction: I, as a white man, am transcribing and using the ideas and thoughts of the participants (many of whom, by design, are underrepresented minorities) in a way that will benefit me directly, by helping me to achieve a degree and maybe a job. So I make it a point to frame our discussion in terms of how they can help future students by helping to redesign the class.

This practical, outcome-guided framing is important for any research in which there are extra incentives for the researcher. Then, too, I think it is important to be clear with the participants about the goals — that this interview will help us improve instruction directly, and might also lead to an academic publication.

Making Meaning Together

It is a mistake to think that the meaning of an interview is created when the researcher and/or their collaborators pore over transcriptions and debate frameworks. Rather, the interview comes with meaning built into it, because the participant has their own frameworks and ways of understanding. Often those ways of understanding are better than the researcher’s.

This stance helps me ask better questions and participate more fully in dialogue. Our interviews are semi-structured: we compiled a list of 30 “questions” (really, topics we want to ask about) and we usually seek to weave them into the discussion.

Transcribing and Writing

I’m not very good at it, and I don’t know a good solution (let me know if you do!), but I transcribe all our interviews after we finish them. We also write post-interview reflections, in which we seek to identify key issues. After the interviews, we meet to discuss what we heard. This is when the coherent narratives start to emerge.

The paper I am working on now seeks to allow the participants to tell their own story, by copious use of long quotations. I’m also using a framework to help draw conclusions from what is being said.

 

Points & Sets: Lab Reasoning

As I pursue my PhD in physics education research, I have found time to explore a number of interesting questions. In this post, I’ll explore an approach to thinking about student reasoning in the introductory physics lab that we decided not to pursue at this time.

The Physics Measurement Questionnaire

Building on more than 20 years of development, use, and research, the Physics Measurement Questionnaire (PMQ) is a staple in introductory physics lab research. Its home is the group of Saalih Allie. This 10-question assessment is based on the following experiment:

Students are asked a series of questions that seek to determine the extent to which they agree with a set of principles about how lab-work is done (at least, in the abstract). The principles don’t appear explicitly in the relevant literature, but seem to be:

  1. More data is better
  2. When collecting data, it is good to get the largest range possible for the independent variable
  3. The average represents a set of repeated trials
  4. Less spread in the data is better
  5. Two sets of measurements agree if the averages are similar compared with the spread in their data
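To make axioms 3-5 concrete, here is one standard way to formalize them (my notation, not the PMQ's): the average and spread of N repeated measurements, and a rough agreement criterion for two data sets A and B.

```latex
\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad
s = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2}, \qquad
\text{agreement: } \left|\bar{x}_A - \bar{x}_B\right| \lesssim \sqrt{\frac{s_A^2}{N_A} + \frac{s_B^2}{N_B}}
```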

These axioms feel true and, as a teacher, I see value in getting my students to understand fundamentals about the nature of scientific work. However, while they are broadly applicable, the reality of science is that none of these axioms are exactly true. There is the question of whether science has a universal “method” — Paul Feyerabend argues it doesn’t. When I sat down with a distinguished professor to look at the PMQ, the professor could identify the “right” answer, but often didn’t agree with it.

This reminds me of the “Nature of Science” that was brought into the International Baccalaureate (IB) physics curriculum in 2014. It appears in the course guide as a 6-page list of statements about how science works, culminating in a mind-bending diagram of the scientific method.

So maybe attempting to define how science works isn’t a productive approach. Fortunately, the PMQ isn’t just a test of whether students agree with certain axioms of experimental physics.

Question-Level Analysis

In addition to evaluating students on their agreement with the above axioms, the PMQ also asks students to justify their reasoning. An example question follows:

After “Explain your choice”, the PMQ includes several blank lines for the student response. The instructions suggest that students should spend “between 5 and 10 minutes to answer each question”.

A thorough analysis of student responses is possible by using the comprehensive categorization scheme in Trevor Volkwyn’s MSc thesis. Volkwyn (and the PMQ authors) view the PMQ as an assessment that aims to distinguish two types of student reasoning about experimental data collection:

Point-Like Reasoning is that in which a student makes a single measurement, and presumes that the single (“point”) measurement represents the true value of the parameter that is being measured

Set-Like Reasoning is that in which a student makes multiple measurements, and presumes that the set of these measurements represents the true value of the parameter in question

Alternatively, we could view the point-like paradigm as that in which students don’t account for measurement uncertainty or random error.

Examples of responses that conform to the point-like reasoning paradigm, for the example above, include:

  • Repeating the experiment will give the same result if a very accurate measuring system is used
  • Two measurements are enough, because the second one confirms the first measurement
  • It is important to practice taking data to get a more accurate measurement

Examples of responses that match the set-like paradigm include:

  • Taking multiple measurements is necessary to get an average value
  • Multiple measurements allows you to measure the spread of the data

Thus, it is possible to conduct a pre/post analysis on a lab course to see whether students’ reasoning shifts from point-like to set-like over the course. For example, Lewandowski et al at CU Boulder have done exactly this, and see small but significant shifts.

My Data

I subjected a class of 20 students to the PMQ, and coded the responses to two of the prompts using Volkwyn’s scheme. The categorization was fairly straightforward and unambiguous.

For question 5, which asks students which of two sets of data is better (one has larger spread than the other), 17 students provided a set-like response and 3 gave the point-like response.

For question 6, which asks students whether two results agree (they have similar averages and comparatively-large spreads), only 1 student gave a correct set-like response, and the other 19 provided point-like reasoning.

I think this indicates two things:

  1. Different types of prompts may have very different response levels. Similar results are found in the literature, such as Lewandowski’s paper, above. This suggests that the set-like reasoning construct is complex, either with multiple steps to mastery or with multiple facets. Thus, it might not make sense to talk about it as a single entity with multiple probes, but rather as a collection of beliefs, skills and understandings.
  2. Some of the reasoning on question 6 seemed shallow. This suggests, for me, a bigger take-away message: my students aren’t being provoked to think critically about their data collection and analysis.

Going forward, we’ve decided not to use the PMQ as part of our introductory lab reform efforts. However, by trying it out, I was able to see clearly that our new curriculum will need to include time dedicated to getting students to think about the relevant questions (not axioms) for the experiments they conduct:

  1. How much data should I collect?
  2. What range of data should I collect?
  3. How will I represent the results of my data collection?
  4. How will I parameterize the spread in my data, and how will I reduce measurement uncertainties and random error?
  5. For different types of data, how can we know if two results agree?

Interestingly, some of Natasha Holmes’ work on introductory labs starts with the 5th question by asking students to develop a modified t-test, and then uses that tool to motivate cycles of reflection in the lab. That’s another approach we’ve agreed doesn’t quite work for us, but that is likewise a huge source of inspiration.
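For concreteness, here is a minimal sketch of the kind of comparison statistic I have in mind when I say "modified t-test": the difference between two results measured in units of their combined uncertainty. This is my own formulation; Holmes' materials may define the statistic somewhat differently.

```python
# Sketch of a modified t-test for comparing two measured results, each with an
# uncertainty (e.g. the standard error of the mean). My own formulation; the
# curricula referenced above may define the statistic differently.
import math

def t_prime(a: float, da: float, b: float, db: float) -> float:
    """Difference between two results in units of their combined uncertainty."""
    return abs(a - b) / math.sqrt(da**2 + db**2)

# Hypothetical example: two measurements of g, in m/s^2
t = t_prime(9.72, 0.08, 9.85, 0.06)
print(f"t' = {t:.2f}")  # roughly: t' < 1 suggests agreement, t' > 3 suggests disagreement
```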

 

A Trilemma in Progressive Education

 

This is a quick note about something I keep encountering in practice, but haven’t seen described in The Literature. If you have any references, please send them my way!

Consider a situation in which you are making some changes to educational materials or structures in order to address a bias in the status quo. This could mean re-framing classroom rules, editing materials, or rethinking university admissions. In such a situation, there seem to be three approaches you could take:

  1. (Representational) Equality
  2. Neutrality
  3. Responsivity

Example One: Gender in Physics Problems

This week on Twitter, I mentioned that I was re-writing some physics problems that had been coded (a) exclusively male, and sometimes (b) using inaccurate vocabulary (ie: “spaceman” instead of “astronaut”). An Equality approach would be to make the subjects of the questions 50% male and 50% female. Kate Wilson explains why this matters:

Summer C emphasizes that gender isn’t a 50/50 split, and that assuming a binary nature of gender is problematic, including for students who aren’t cisgender.

One such response, then is to attempt to use gender-neutral language (ie: Neutrality). In some cases this works easily, as Jenn Broekman explains:

In other cases, though, it takes a bit more work:

I like gender-neutral pronouns in principle, but we’re probably a generation away from them being practical. An alternative is to adopt the 2nd-person perspective:

While “you” seems like a good solution, it’s not always going to be valid — for example, a problem in which there are two independent agents. There’s also the issue that this isn’t solving the bigger problem: even if my problems are gender neutral, students are still getting cultural signals about who is and who isn’t a physicist. This brings us to the idea of Responsivity: using these cultural artifacts to push back against counterproductive messaging and norms students are bringing to the classroom:

This discussion was enlightening for me — thanks to those who contributed!

Example Two: Latent Colonialism

I’ve been slowly putting together some ideas around the idea of “latent colonialism” in physics, the idea that a large portion of the canon of physics culture is a remnant of the colonial era and thus carries some problematic attributes with it. For example, textbooks around the world feature the same group of ~20 physicists (Newton, Franklin, Ampere, Maxwell, Einstein, etc) — all white, male, and Western European. Or consider that the names of the famous physics equations and principles, although developed over a couple millennia, have Western European names. This has been shown to negatively impact students who aren’t white and male.

So, how do we respond? We could:

  1. Seek Neutrality. Simply remove the names from the canon. Instead of Snell’s Law, we would call it the Law of Refraction. This approach has been adopted in several curricula, but I see two problems with it: first, instead of giving an advantage to white boys, the curriculum gives an advantage to no-one; and second, culture is complicated and it’s not clear we’d be successful at expunging all the biased artifacts.
  2. Instead, we could seek out cases in which white women and people of colour contributed to the history of science and highlight them as well. This would aim to present a canon that is representationally Equal. I’m currently reading up on Émilie du Châtelet, for example. This would allow us to re-cast the history of physics as an endeavour that involved white men and women in Western Europe, and built on the mathematics and early work done in the Middle East, and to a lesser extent India and China. Putting aside the cries of “revisionist history”, this isn’t clearly going to fix everything, either. For one thing, the truth is that a lot of contemporary physics knowledge, in the form in which we currently teach it, was assembled by white men who refused to allow white women or people of colour in on the proceedings (with some rare exceptions).
  3. Which brings us to the Responsive approach, which is to acknowledge the deliberate exclusion, intellectual theft and, well, colonialism that took place during the construction of the set of ideas we think of today as the canon of physics knowledge. This would be hard: it would involve physics teachers talking about history and racism in their classes.

Likely, the best response is to do all of these, in the contexts in which they work best, to the best of our abilities. For example, I taught for some time in a country in which it was constitutionally prohibited for schoolteachers to talk about gay marriage or other gay rights issues in their classes: that type of thing severely restricted my ability to talk with students about some of these issues. It is my hope that this classification is useful in deciding which response to take.

Intro Physics Labs: Why?

Some of my research these days is focusing on the introductory physics lab. In this post, I will seek to outline some qualitative findings about introductory physics labs at universities and colleges in the USA. I am planning to follow up these conclusions by doing a survey of a representative sample of physics departments — but that will have to wait until late August.

1. Parallel or Separate?

In most programs, the physics lab runs parallel with the physics course. At Lincoln University, for example, students take Phys 105 and 106 as their two-semester introductory physics sequence, and also enroll in Phys 105L and 106L at the same times. Here, the goal of the lab tends to be focused on reinforcing core ideas. At Penn State, labs “are designed to provide you with hands on experience with the material being investigated in class”. The key concern here is that recent research suggests that labs provide “no added value to learning course content” (Holmes et al).

Less common is a lab that is run as a separate course, sometimes requiring the first course in the introductory physics sequence as a pre-requisite. At Drexel, for example, students take Phys 128 as a separate course. These separate courses tend to focus more on the development of experimental skills and mindsets. At Carnegie Mellon, the purpose of intro lab course 33-104 is “to become skilled at acquiring, recording, and analyzing data, and drawing conclusions from experimental work”.

2. What is the purpose of labs, anyway?

The AAPT lab guidelines focus on the process of “constructing knowledge” and scientific skills, rather than core ideas. Ideally, I think, labs would meet both aims: helping students to enrich their understanding of core physics ideas, while also learning how scientific knowledge is generated experimentally.

In constructing a lab experience, core ideas and scientific skills will need to be interwoven. Could we say that a lab is successful if students only practice core ideas? Could we say it is successful if students only learn scientific skills? I would say that the former is not acceptable, but the latter might be.

3. What labs are being done?

Cookbook-style lab manuals are nearly ubiquitous. Most of these seem to have been written locally, but have the same format: an overview of the relevant physics (which students rarely read), then step-by-step instructions for what students should do in the lab, and finally some questions.

There is variation in how students are assessed. Often, there is a pre-lab quiz. Written lab reports are common (often 1 per group of 1-3 students), as are worksheets that need to be filled in.

Most labs are 3-hour sessions, with a new experiment each week. In some places, a 1-hour lecture precedes the lab session. In others, experiments stretch over two weeks.

Some manuals seem to try to scaffold students from this mode of highly-structured inquiry (where direct instructions are given) toward guided inquiry (where, instead, students are given goals and broad guidance). I wasn’t able to find any lab programs that aim for open inquiry.

4. How do AP, IB, and Cambridge A-Level lab expectations compare?

For all three of these programs, labs are expected to occupy about 20% of the instructional time. In the AP, IB and A-Level classes, labs are explicitly expected to build toward independent student work (ie: open inquiry). AP labwork is not assessed directly, while IB students submit a lab report for external assessment, and A-Level exams include a substantial practical component.

It seems to me that colleges and universities have substantially lower expectations for student performance in labs than is found in the AP, IB, and Cambridge A-Level courses.

Some Failures

I’m proud of my successes as a STEM teacher working toward equity, but I’ve also had failures. In order to get better, I need to understand, and correct for, those times when I’ve made mistakes. Here are some examples I’m thinking about now.

Irina

I met Irina when she was in 10th grade because she was doing a project in physics and wanted to talk with the school’s physics teacher about it. I knew that she was interested in technology and science in such a way that engineering would be a great career for her. Nevertheless, she was very reluctant about taking physics. On course selection night, she told me repeatedly that she felt she wasn’t intelligent enough. I tried to allay her fears, and I guess I was successful enough that she enrolled in my physics course.

Over the first semester, she did okay — Bs and Cs, mostly. We talked again about her career plans at the end of the term, and she told me she’d decided she wasn’t intelligent enough for engineering, and that her middling grades were evidence of this. Although we talked many times after this, I was never able to change her mind.

Irina needed some extra help in the first semester, to master some essential math and physics skills and build her self-efficacy, but I didn’t see that. In the future, I’m going to be more careful and more supportive with new students who have low self-belief and fixed mindset-style attitudes. This is especially relevant for girls and minority students, for whom self-efficacy already tends to be quite low.

Robotics Club

In an after-school robotics club, one student was explaining to two others his idea for using tank treads for the robot. He wasn’t doing a great job of explaining his idea, so I asked him to try again; he did, although with a bit of an exasperated tone. When he finished, I turned to the two students and asked them, “Did that make sense to you?” “Yes, I know what a tank is,” replied one.

The two listeners went off to work on something else. Later, they told me that many of the other students weren’t taking their ideas seriously. They interpreted my question (and their peer’s explanation) as condescension, and that was sort of the last straw. I apologized and tried to explain, but by that point the damage had been done: I restored their trust in me, but not in the robotics club. They came to the next meeting, but interacted with almost no-one, and didn’t return after that.

My microaggression against these two students happened because I briefly stopped focusing on their affective experience. For students in STEM, there are so many threats: my job as a teacher is to remove these and make a safe learning environment. Going forward, I’m trying to be more sensitive to the environment and careful about my own actions.

Joe

Joe, a student in my math class, had a weak academic record. His academic language skills were particularly poor because, as our program coordinator explained it, he’d never developed a first language. Instead, he spoke French early in his life, then German with his mother and friends, and then English when he moved and found a new school and a new social circle.

With Joe, I was insufficiently proactive. I didn’t insist on extra help after school, I didn’t establish strong communication with his mother early in the year, and I didn’t differentiate my instruction or get him alternative resources. By the time his weak grades started to pile up, he was too far behind to properly catch up and he had decided he couldn’t succeed in the course.

Epilogue

These students all went to university, at least for one year. They’re pursuing their dreams and doing alright. I’m pursuing my dreams too: being a better teacher for my next class of students.

Open-Ended Exam Tasks

For a variety of reasons, I’ve been thinking a lot about open-ended tasks for assessment. This style of physics “problem” gives the student the opportunity to use a variety of different approaches to demonstrate their understanding in a semi-authentic context.

Below are the open-ended tasks from the Scottish Qualifications Authority physics exams 2014-2017 (ie: from the N5, Higher, and Advanced Higher exams). My primary purpose in posting these here is to provide a resource for students studying toward their SQA qualifications.*  I think that answering this type of problem effectively requires careful practice, and I hope this collection is useful for that!

First, here’s the detailed rubric. Each open-ended task is scored between 0 and 3 marks.

[Image: SQA marking instructions for open-ended questions]

And here are the prompts. A few have been excluded because they are built into the context of a longer problem. A bunch more are here.

[Images: the open-ended task prompts, excerpted from SQA physics exam papers, 2014-2017]

* I think this falls within the stated allowed use, but please write me if that is not the case!

30 Actual Teachers

Forbes published their “30 under 30” list for education. It’s a list notably lacking actual teachers. This is frustrating because, as exciting as it may be for these young people to create a start-up designed to help their communities, we already have a group of highly-trained, preposterously-dedicated people for whom teaching is a vocation.

So, I’d like to present an alternative “30 under 30”: educators who are genuinely changing the world, and not getting the salary or respect they damned well deserve.
(And I don’t care about their ages).

Kristine Atrens – A top-notch kindergarten teacher who epitomizes dedication and caring.

Peter Bohacek – One of the people behind Direct Measurement Videos and Pivot Interactives, both of which are really effective because of Peter’s ability to draw on his teaching experience to understand what active learning really requires.

Stephen Collins – An expert on the Socratic method and standards-based grading, and a modeling instruction leader.

Oyuntsetseg Durvuljin – The founder and spiritual leader of ‘Hobby’ School, a project that lifted hundreds of students into the upper-middle class through education and providing opportunities.

Cristina and John Julius Fajardo – This teaching couple are the most enthusiastic, energetic, teachers I have ever met. They care about the students so much that the students can’t help but care about learning.

Erdenetsetseg Gombojav – Such a great teacher that she becomes an extra parent and gets the best out of her students by caring for them and maintaining the highest standards.

Chris Hamper – A lifelong physics teacher, textbook author, online resource provider, and workshop leader. The best of IB Physics comes from him.

Megan Hayes-Golding – A superb physics and STEM educator with a passion for helping students take on challenges. Bonus points for bow ties.

Richard Hechter – A scholar and educator who focuses on cultural aspects of science education, especially the intersection between Western and Canadian Aboriginal ways of knowing.

Meghan Hennick – An innovative, creative, and passionate elementary school teacher on the international circuit.

Scott Hovan – Probably the expert on how to get students talking honestly, thoughtfully, and considerately in high school physics.

Sarah Johnson – An educator at Simon Fraser at the forefront of getting girls into physics, and changing the discipline to make it more accessible.

Michael Lerner – A deeply passionate educator who puts his students first, and invests the time and energy needed to create an exceptional learning environment for them.

Joe McIntyre – An educator and policy scholar with strong passion and a deep commitment to education as a tool for social empowerment.

Zanda Medne – An English and German teacher and university lecturer who inspires love from her students by being passionate about their success.

Dan Meyer – Former high school maths teacher, currently developing great tools at Desmos.

Derek Muller – Best known as Veritasium, his Youtube channel is an outgrowth of his doctoral research into how students learn via videos.

Kelly O’Shea – Proof that educational change happens best when it comes from working teachers. Her numerous ideas, shared through the blog, have helped hundreds of teachers move toward more active teaching methods.

L Ozola – An excellent maths teacher and PhD student, she connects with students through humour and inspires them to do exceptional things.

Moses Rifkin – A deeply wise and conscientious man at the forefront of social justice in high school physics education.

Marianna Ruggerio – An enthusiastic and creative physics teacher who has recently started sharing her experiences via Twitter.

Andy Rundquist – A physics professor with a commitment to faculty development, collaboration, and learning communities.

Patrick Savage – A drama and French teacher whose compassion and patience for his students have always inspired them to become better citizens.

Sloane Schubert – A thoughtful, original, caring educator with a broad range of experience and mastery.

Leigh and Scott Simon – A teaching couple whose passion for their students is exceeded only by their willingness to do whatever it takes to provide meaningful, exciting, and effective learning experiences.

Ieva Smits – A dedicated teacher with a strong commitment to education as a way to improve communities, especially through service.

Rebecca Vieyra – The K-12 program manager at the AAPT, she is responsible for many great programs that support teachers and the teaching of physics.

Evan Weinberg – Evidence, in case any is needed, that great teachers can build the tools they need to conduct great teaching.

(There’s a whole bunch of other people I wanted to include, but randomly didn’t because of the limit of 30)

 

Urination and Physics

The Times Educational Supplement (TES) is known as a fairly conservative British publication, focusing on policy news, endorsements of the teaching profession, and op-eds by teachers. So it was surprising to see a click-bait headline relating to physics education research: “Taking the pee out of physics: How boys are getting a leg-up“. Unlike many submitted posts, this one is not identified as being written by a blogger, and comments are disabled — we are intended to treat this as real research news.

The crux of the argument is this: we have a gender gap in physics scores on standardized assessments. That gap seems to be most pronounced on tasks involving 2-dimensional motion. One explanation for the discrepancy is that boys have more experience with balls, rockets, cannons, and so forth because of the social conditioning they experience as children. However, the authors note that female students in the “ultra-masculine environment” of a military school show the same gender gap. Thus, they conclude that ball sports and play-acting war aren’t the explanation. Instead, they propose that boys playfully urinate, and thus have experience with projectile motion in a way that girls don’t.

There is a lot about the article that is objectionable.

1. This article isn’t based on published scientific work, it doesn’t refer to a submitted manuscript, and the authors don’t have any related publications in the literature. This isn’t an idea that has been vetted by peer review. More importantly, it isn’t a mature scientific idea: the authors have proposed a hypothesis, but haven’t actually carried out the experiment.

It would be easy to test: survey men about their childhood urination habits, and about their proficiency with physics. Maybe throw a tricky physics problem at them, too. But the authors didn’t do this, preferring to write about the idea as if it were too obvious to need verification. This sort of speculative science is problematic, and popularizing ideas that haven’t been vetted empirically has been problematic in physics in recent years. It is particularly bad in the field of physics education research, which is struggling to be recognized as proper science by a dubious physics community.

2. Since the authors didn’t conduct a study, I did. I asked 25 people (THANK YOU!!) to answer four questions: were they sports fans as children, did they playfully urinate as children, and were they good at physics in school? I also asked them which angle would optimize the range of a projectile in the real-world case where air friction cannot be neglected — someone familiar with projectile motion either experimentally or theoretically should know that slightly decreasing the angle from 45 degrees (the theoretical optimum) will increase the range when air friction is considered.

The results of the survey show that neither urination nor sports were strong predictors for physics ability. The strongest relationship was between sports and success on the physics problem, but this did not reach an adequate level of confidence*. In short, had the authors actually tested their hypothesis, they would have found it incorrect.
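(As an aside, the physics claim above, that air drag shifts the optimal launch angle below 45 degrees, is easy to check numerically. The sketch below uses made-up, ballpark values for the launch speed and drag constant of a thrown ball, integrates the motion with quadratic air drag, and scans launch angles; the optimum comes out below 45 degrees.)

```python
# Sketch: horizontal range of a projectile with quadratic air drag, scanning
# launch angles. Launch speed and drag constant are made-up, ballpark values
# for a thrown ball; the point is only that the optimum falls below 45 degrees.
import math

def range_with_drag(angle_deg, v0=20.0, k=0.02, g=9.81, dt=1e-3):
    """Return the horizontal range for a launch at angle_deg.

    Drag acceleration is modelled as -k * |v| * v (quadratic drag).
    """
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx += -k * speed * vx * dt
        vy += (-g - k * speed * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

best_angle = max(range(20, 61), key=range_with_drag)
print("optimal launch angle with drag:", best_angle, "degrees")  # below 45
```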

3. The language used in the article makes it clear that this is click-bait rather than a serious attempt to introduce a new idea. Consider the following lines: “those sparkling arcs of urine”, “pee-based-game-playing”, and “…despite the surface layer of toilet humour, and the implication that physics may be little more than a pissing contest, we’re making a serious point.”

Unfortunately, with phrasing like that, the authors are not.

4. Another point is made by Brett Hall: projectile motion isn’t a topic that occurs at the start of the curriculum, yet the gender gap is apparent from early in the physics course. Likewise, the authors suggest focusing on energy conservation first, rather than projectile motion, but this is something that is already done in many classrooms.

5. Research by Zahra Hazari and others points to socio-cultural factors (identity,  home and school support) being the most relevant to explain why girls opt out of physics. I wouldn’t argue that the gender gap is an understood problem, but the authors present it as wholly-unsolved (perhaps to increase the audience’s willingness to accept their unorthodox idea) when it isn’t.

6. [addition 18 September] On further reflection, it is clearer to me that the phrasing and positioning of this idea are damaging and troublesome, in addition to being incorrect and click-bait. A phrase like “why don’t young women perform as well in physics?” presupposes that the cause is a deficiency in the women, rather than the sexist culture in which they are raised and on whose assessments they are being found wanting. I hope no teenage girl hears of this incorrect hypothesis, reads this article, or absorbs the various ripples it is making in the news media.

Lastly, a note about ad hominem rebuttals. I think that most men would look at this idea and disagree because of their personal experience. I’ve seen some rejection of this hypothesis because the primary and secondary authors are female. However, there is value in the perspective of an outsider: we do a lot of things unconsciously, and only an external viewer would be able to make connections we might otherwise miss. Dismissing this work about male urination because the authors are female is incorrect.

I think that’s about all I want to say about this idea. Hopefully we can forget it now.

* The n=24 study I did was enough to show that the urination=physics ability hypothesis cannot be the primary explanation for the gender gap. However, it is possible that there is still a small correlation. As pointed out by Steve Zagieboylo, however, this pathway likely goes boy-sports-physics rather than boy-urination-physics, given the strong social differentiation that boys face. The results from my study suggest this but, since the effect is smaller, I cannot claim to have discovered anything with the small sample I used.