I have become slightly obsessed with multiple-choice questions (MCQs) in recent months. I first considered them properly when I read Daisy Christodoulou’s book Making Good Progress (2017) a couple of years ago, and have learned more about them through the Assessment Lead Programme from Evidence Based Education (both the book and the programme come strongly recommended). When distance learning came along, Elizabeth Mountstevens’ post on remote feedback (2020) got me thinking about whether Google Forms and Loom might offer the ideal tools to harness the potential of MCQs to check and improve understanding, and now I am using them to set and give feedback on short MCQ quizzes for my classes most weeks.
I am far from the only person who has noticed the potential of MCQs at this time, and many will have enjoyed Stuart Kime’s recent ResearchED Durrington Loom presentation on their use as part of distance learning, so I am aware that I am treading a well-worn path here. They act as a form of retrieval practice, enabling students to benefit from the testing effect, and they also provide me with feedback about misconceptions. What I am particularly interested in is trying to use MCQs in history to go beyond assessing memory of facts (although that’s not in any way to denigrate factual memorisation), and unpick the complexities of students’ understanding of concepts, especially when it comes to the subtle interplay of substantive and disciplinary knowledge.
The rest of this post simply takes one short, run-of-the-mill quiz which I used recently with my Year 9 class as a case study, and briefly tries to explain the thinking which lies behind it. To give some context, the enquiry question was, ‘What were the effects of the atomic bombs dropped on Japan?’ This took up a short sequence of 2-3 lessons and formed a bridge between prior learning about World War Two and this term’s main topic of the Cold War.
Designing the quiz
Putting the quiz together in the first place is always the most time-consuming part of the job, because writing decent multiple-choice questions takes quite a bit of thought and I often look back at mine afterwards and spot things which weren’t as good as they could have been. There’s at least one example of this in the quiz I’m describing here, as you’ll see below. If you are interested in this sort of thing, there’s a useful summary of what is considered good and bad practice in the world of MCQs in the Brame paper (2013), which you can find linked in the references below.
I tend to include only three alternatives (options) for each question, for the simple reason that it saves me time. I realise that more would reduce the chances of students guessing correctly and might allow me to identify more misconceptions, but those benefits need to be balanced against the fact that I am very busy and cannot spend all my time designing quizzes. So three alternatives have to suffice.
1. In which year did the USA drop the atomic bombs on Japan?
a. 1943
b. 1944
c. 1945
My first question tests straightforward factual knowledge. However, I have not picked a fact at random. The date of the dropping of the atomic bombs is important information which underpins the concepts I want students to learn about, not least because it helps to build a connection in their minds between prior learning about the end of World War Two and the transition from this conflict to the Cold War. I could have asked about the name of the aeroplane which dropped the first bomb, the temperature within the blast zone or dozens of other things, but I judged that these are less valuable pieces of information in the context of the historical enquiry. By choosing consecutive years as alternatives, and not going beyond the end of World War Two, I have tried to make the distractors (incorrect options) as plausible as possible.
2. Which of these sentences does the best job of explaining why the Americans used the atomic bomb?
a. The Americans wanted to force the Japanese to surrender, but also wanted to demonstrate their power to the USSR.
b. The Americans wanted to get revenge for the Japanese attack on Pearl Harbor, so they were keen to inflict as much suffering as they could.
c. The Americans were afraid that the Japanese were going to invade the USA, so they used the atomic bombs to prevent that from happening.
The second question in my quiz is a more complicated one. It targets more than factual knowledge, asking students to consider the second order concept of causation. Although the enquiry question does not ask why the bombs were dropped, I would argue that students require this information in order to make any sense at all of what happened. Furthermore, the American desire to intimidate the USSR, which co-existed with the aim of defeating Japan, is of absolutely fundamental importance to the development of the Cold War in the late 1940s and the nuclear arms race which spiralled from 1949 onwards. These substantive concepts are at the heart of our enquiry, and therefore the question is well worth asking.
Questions like this one, which go beyond facts and into the realm of historical interpretations, are harder to write, because it is very challenging (indeed I would argue impossible) to identify the ‘right’ cause in history, although ‘wrong’ causes are much easier to spot. In this instance, Option B is arguably correct as well as Option A, because there were doubtless plenty of American people who did want revenge for Pearl Harbor and probably some who wanted to inflict as much suffering as possible on the Japanese as well. Interestingly, several students in my class selected this option, perhaps because they are familiar with Pearl Harbor from previous lessons. I am not an expert on this topic, but I am reasonably confident that few historians would argue that this motivation takes precedence over that described in Option A, although I found it a little hard to explain why in my feedback (see below) and talked myself into a bit of a corner about it. One way I try to get around this problem in the quizzes is by the use of the word ‘best’ rather than ‘correct’ in the stem (question). In doing so I am drawing on the thinking of Paula Lobo (2017), in her excellent blog post on historical multiple-choice questions.
3. Which of these sentences describes an immediate effect of the atomic bombs?
a. The USA invaded Japan.
b. The soil around Hiroshima and Nagasaki became radioactive for years to come.
c. Over 100,000 people were killed.
By asking about effects, the third question in my quiz puts the main second order concept in our enquiry in the foreground. I follow a pattern which I use in many of my multiple-choice questions, by writing one distractor which is factually incorrect (Option A in this case) and one which did happen, but does not conform to the description requested by the stem (Option B). If students select the former in significant numbers, it indicates to me that they are simply confused about historical events, and perhaps I need to revisit the material in subsequent lessons. More commonly they make the latter error, which suggests some disciplinary confusion, in this case around the concept of long-term effects, and I can address this issue with them.
In retrospect I do not think I did a great job of writing this question: although Option C is clearly correct, Option B was arguably an immediate effect too, since I am not aware of any delay between the dropping of the bombs and the contamination of the soil with radioactive material (although I would need a scientist’s help to be sure). The clue that Option B was intended as a long-term effect is, of course, my use of the phrase ‘for years to come’, and I draw my students’ attention to it in my feedback. Arguably this phrase makes it possible for somebody who has not learned the material to work out through logic alone that this is a distractor, so it is an example of a flaw in my question design. The possibility of deducing that it is incorrect may be why only six students in my class chose this option, but I cannot be sure.
4. Which of these sentences describes a long-term effect of the atomic bombs?
a. A nuclear arms race developed between the USA and Japan.
b. A nuclear arms race developed between the USA and the USSR.
c. Japan surrendered.
The next question follows a similar pattern, this time asking for a long-term effect, and including a factually incorrect distractor (Option A) and a factually accurate but unsuitable one, which suggests a conceptual misunderstanding (Option C). Interestingly, over a third of my class selected Option A. This could either mean that they are very confused about what happened subsequently in the Cold War, or simply that they were rushing and did not read the alternatives properly, since the wording of Options A and B is very similar. In my feedback I focussed on the second of these two possibilities, which might be complacency on my part, but if they are genuinely confused about the identity of the USA’s Cold War enemy, I am not overly worried at this stage, since we are only at the start of our study of this topic, and they are unlikely to retain the belief that it was Japan for very long.
5. Which of these sentences describes a feature of American capitalism?
a. Most businesses were owned privately by individuals.
b. Most businesses were owned by the government on behalf of the whole of society.
c. There were very few businesses.
6. Which of these sentences describes a feature of Soviet communism?
a. There were many political parties and they competed with each other in elections.
b. There was only one political party and it chose all candidates in elections.
c. There were no elections because it was a dictatorship.
The final two questions test students’ knowledge of basic differences between capitalism and communism, which we have also covered in recent lessons, since I want them to have an idea of the ideological disagreements between the USA and USSR. I have written the alternatives so that they are all on a common theme (businesses in Question 5 and elections in Question 6), rather than disparate issues. Both questions follow the pattern of having one distractor which was a feature of the superpower NOT being asked about and one which was not true of either the USA or the USSR. It would be all too easy to make the factually incorrect distractors implausible, but I have worked hard to avoid this. For example, in Option C of Question 6, I have deliberately used the word ‘dictatorship’, because students are familiar with the concept from their earlier study of Stalin’s totalitarian regime and I suspected several might fall into the trap of thinking no elections were permitted in the USSR. Pleasingly this was not the case, and over 80% of the class got this question right.
After the quiz
The quiz function of Google Forms enables students to find out immediately which answers they got right and what score they achieved. This is of interest to them as individuals, but I am not under the illusion that it provides me with information from which I can draw valid inferences about how well they understand this topic, not least because MCQs reveal almost nothing about students’ ability to express themselves in writing. In addition, I am aware that they could easily cheat when they are not under my supervision, so I keep the stakes very low to reduce the temptation, and I avoid basing any major conclusions on performance in a quick quiz.
I am mostly interested in looking for patterns of correct and incorrect responses. If something very significant crops up, I might reteach material or set some work to address the misconception, but mostly I give some short audio feedback to talk through the responses of the class, highlighting and correcting common mistakes. I have recently settled on Loom as the most effective way of doing this, with the Google Form’s analysis of the answers on screen and my own voice talking through them. I don’t take the time to write a script, so it’s a bit rough and ready, but that’s exactly how it would be in a normal lesson and I don’t have time to worry about it. If you are not familiar with Loom or can’t picture what I’m describing, you can watch my feedback video for this particular quiz here. Disclaimer: the video is pretty dull and not at all slick; it’s only there to illustrate my point, so please feel free to skip it. With my students, I post the link to the video on Google Classroom for the next lesson and set them the task of listening carefully and making notes to correct anything they got wrong or didn’t know.
So that’s it really. It’s nothing fancy, but for an apparently simple activity which people associate with ‘low level’ factual knowledge, I think you can pack in a considerable amount of sophisticated, conceptual thinking. Multiple-choice quizzes work well in school, but I think they’re especially well-suited to remote learning, when I don’t have all the usual questioning tools and face-to-face clues at my disposal to get a sense of how well students have grasped what I am teaching. Plus there’s no irritating faff of them having to get phones out in class, struggle to get a connection, be distracted by their notifications etc. I’d strongly recommend the use of MCQs in this way and I hope this post encourages you to give them a try if you haven’t yet done so.
Brame, C., (2013) Writing good multiple choice test questions. Retrieved from https://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/.
Christodoulou, D., (2017) Making Good Progress? The Future of Assessment for Learning. Oxford: OUP.
Kime, S., (2020) Guidelines for using multiple-choice questions in distance learning. ResearchED Durrington Loom presentation.
Lobo, P., (2017) “So that’s what you mean, Miss.” Using multiple-choice statements to model source analysis. Retrieved from https://lobworth.com/2017/10/21/so-thats-what-you-mean-miss-sourcework-and-multiple-choice-statements/.
Mountstevens, E., (2020) Feedback from afar. Retrieved from https://catalysinglearning.wordpress.com/2020/04/14/feedback-from-afar-2/.