
Latvian Physics Exam

Today was the pilot administration of Latvia’s new 12th-grade physics exam. Next year, students pursuing the academic track of studies will be required to write either the physics or the chemistry exam as part of their secondary school leaving requirements. In this post, I will take a close look at the exemplar, which can be found (in Latvian) here.


Part 1.1 (multiple choice): Assumptions and Memory

The exam opens with a question about the applicability of the accelerated motion model. It’s a promising sign, but all of the answers — a launched javelin, a parachutist in the first minute of fall, a person in an elevator, and a hammer dropped on the moon — are possible, and the question is really about the degree to which the model can be applied to these situations. For example, a javelin has a very narrow cross section, so it would be reasonable to assume it experiences very little drag force. In fact, the aerodynamics of javelins are interesting, and although their motion is non-parabolic, the reasons are not obvious (an example).

The fifth question is identical to one from an IB physics exam from a couple of years ago, in which the motion of a piston is presumed to be uniquely responsible for the temperature increase of a compressed ideal gas. The twelfth question lists four core equations about electricity and asks which explains the need for high voltage in long-distance power lines. Questions like these are frustrating because multiple answers are reasonable, and students are forced to guess what the exam-writer is thinking.
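For what it’s worth, the reasoning the twelfth question presumably wants is the standard transmission-loss argument (my reconstruction, not anything stated in the exemplar):

```latex
% A line delivering power P at voltage V carries current I = P/V,
% so the resistive loss in a line of resistance R is
P_{\text{loss}} = I^{2}R = \left(\frac{P}{V}\right)^{2} R = \frac{P^{2}R}{V^{2}},
% which falls quadratically as the transmission voltage is raised.
```

Of course, a student could plausibly defend other answer choices with equally valid chains of reasoning, which is exactly the problem.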

Another type of question that appears is simple recall. Question 6 requires students to remember that isochoric processes occur at constant volume. Question 13 seems to ask students to remember how inductive (wireless) charging technology works. Question 20 is about the origin of the atoms that make up our bodies.

The third type of common question is about relationships between variables. Students are expected to recall how electric fields depend on the distances from point charges, how photon energy is related to wavelength, and how the energy of photoelectrons is related to the frequency of incident light.
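Assuming the standard introductory-course formulations (the exemplar may phrase these differently), the relationships in question are:

```latex
E = \frac{kq}{r^{2}}
  \qquad \text{(field of a point charge falls off as } 1/r^{2}\text{)}

E_{\gamma} = \frac{hc}{\lambda}
  \qquad \text{(photon energy is inversely proportional to wavelength)}

E_{k,\max} = hf - \phi
  \qquad \text{(photoelectron energy rises linearly with frequency)}
```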

These multiple-choice questions are pretty standard, similar to what appears on IB Physics exams and the GRE. The difficulty level is probably appropriate, in that students’ scores will likely form a nice Gaussian distribution. But as a tool for educational assessment, I think these multiple-choice questions have only marginal utility.

Part 1.2 (short answer): Not For The Meticulous

The second part of the exam consists of 10 short-answer questions. In reality, it is a mix of question types that is sure to frustrate students: 6 multiple-choice-style questions where students must choose all the correct answers, 2 ranking tasks, and 2 calculations. Problematically, some of the distractor options on the multiple-choice-style problems could conceivably be relevant. For example, problem 27 shows possible light paths refracting and reflecting around a glass block. A light path that is not visibly refracted by the block is possible, depending on the index of refraction (and the refraction shown is exaggerated compared with what would be observed in the lab).

The biggest problem with questions like this is that, while they are accessible to very weak students and straightforward for students with a teacher’s level of understanding of physics, they will not do a good job of differentiating between a student who has 90% of a robust understanding and a student who can intelligently guess their way to 10%.

I’m happy that the test-writers are attempting to expand beyond multiple-choice questions, but this section will be hard for the students, and will provide little feedback about students’ ability to use their physics knowledge and skills to meaningfully create knowledge.

Part 2 (long answer): Physics Problems, With a Dash of Discussion

The long-answer section begins with the ominous note that this section will involve judging pace, but with 135 minutes to solve 5-7 problems, the real challenge will likely be sustained concentration rather than pace.

I want to like these problems: they combine calculations (kinetic and gravitational energies for the first, circular motion for the second) with diagrams and comparison of quantities. This fixes the problem with the first half of the exam, and allows students the opportunity to explain their answers.

However, the problems are structured in a way that leaves students little room for individual approaches, knowledge construction, or creativity. The first problem requires that students draw the velocity and acceleration vectors for a pendulum — something that is usually recalled rather than figured out.

The rest of the problems mix standard physics things-that-must-be-remembered with little calculations. There is a clear effort to make the problems meaningful and/or relevant: one concludes by asking students for a conclusion about heat loss in a house, and another is about the laser on a Mars rover. However, the overwhelming sense with these questions is that of standard physics regurgitation: draw a voltage divider, calculate the diffraction angle, identify this range of the EM spectrum, etc.

The final question (at last!) asks that students work with some data, but this is an illusion: after finding the slope from a graph, it’s the same deal as before: find the efficiency of the electric kettle.
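If, as I’d guess, the graph is water temperature against time (an assumption — only the slope-finding step is clear from the exemplar), the expected calculation is presumably something like:

```latex
% The slope of the temperature-time graph gives the heating rate,
\frac{\Delta T}{\Delta t} = \text{slope}, \qquad
P_{\text{useful}} = mc\,\frac{\Delta T}{\Delta t}, \qquad
\eta = \frac{P_{\text{useful}}}{P_{\text{electrical}}}
```

where m and c are the mass and specific heat capacity of the water — a perfectly standard textbook procedure dressed up with a graph.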

By the time you get to the end of these problems, it is clear that they are merely standard physics problems, with just a few “explain why” questions. This is better than the mechanistic rigamarole of the IB physics exams, but not by much. Even here, there is no chance for students to construct their own meaning or use genuine critical thinking, and I suspect that most students will get most of their points by reiterating standard physics solutions to these standard physics problems.


There is a suspicion that the new physics and chemistry exams are an attempt by the Ministry of Education to add more rigour to a national curriculum that recently came under fire when Oxford University decided not to recognize the Latvian grade 12 diploma as adequate preparation for undergraduate studies. Thus, this exam will probably give universities a better way to sort students. I’ve heard that, after promising the results wouldn’t be required, many universities are now asking to see students’ results from this pilot year of the exam.

However, as with all standardized testing of this sort, what we will primarily find is that students attending better-funded schools, students in Rīga, and students from higher socioeconomic classes will do better on these exams. The effects will also be felt backward into grade 11 and earlier, as physics and chemistry teachers come under increasing pressure to prepare students for the exam and are forced to sacrifice good science teaching in favour of test preparation. The quality of meaningful science education will fall, weakening critical thinking and scientific reasoning across society, while universities increasingly rely on these test scores for admissions decisions and disadvantaged students are left behind.

If I’m wrong about some or all of this, or if you have perspective or insight to share, please reply or email me (danny.doucette at gmail).

Mastery vs IB Assessment

Exams make me depressed.

Every December and June (for grade 11) and December and March (for grade 12) I spend a lot of time creating an assessment that follows the objectives of my course and meaningfully assesses student proficiency in the required skills, and knowledge of the required information.

And every December and June, and December and March, the results are disappointing. It doesn’t seem to matter how much we review. I think the Modeling approach is helping with comprehension of ideas, but it isn’t translating into perfect test scores.

I’m clearly not alone in this. In IB science and math courses, the top grade of 7 is awarded to students who score above roughly 70% on their exam (the boundary is adjusted yearly, and internal assessment also contributes). The passing grade, a 4, sits at roughly 40%. According to the IB Statistical Bulletin, only 8% of students earn that elusive 7, and 39% of students don’t even reach the 4.


The problem, I think, is an impedance mismatch between the IB grading system and my own mastery-oriented expectations.

The IB system is designed to assign a rank to students, mainly for the purpose of university applications. A 7 student needs to be clever, clever, clever. The average student, who can satisfy most of the curricular learning goals, is distinguished from the 7 student through trick questions, overly-specific grading rubrics, demands that students memorize definitions, and even a few unsuitably hard questions. Such a system is fine — we expect the SAT and IELTS to deliver such a ranking, for example.

My expectations as a teacher, however, and my sentiments as someone close to the learner, are a bit different. We see the examination as an opportunity to demonstrate the students’ mastery of the curricular content — not as a test to put students in their place. For us, it is heartbreaking when students get less than a 7 — and even when the 7 they get is a 70% instead of a 95%.

I’m not an excellent teacher, but I’m getting better. Hopefully, soon, my students will go into an exam with a full and deep understanding of everything they need to know, and I hope they achieve the scores they deserve when they undertake that examination.