Multiple choice questions: Are they (a) good, (b) bad or (c) as good as we make them? Editorial Director Richard Stagg discusses where MCQs fit in the future of assessment.
Multiple choice questions (MCQs) are attracting more and more attention. We are often asked by educators and authors alike about their value. Why do students like them? Can they really develop critical thinking? Are they an authentic form of assessment?
As with the best MCQs, there is no easy answer. The more time and resources we devote to the MCQ, the more we learn about its strengths, weaknesses and role in better teaching and learning.
The role of MCQs in feedback
Students always value the opportunity to check their understanding, test their progress and get some feedback when they need it most. For me, this response from a student captures the timely feedback problem that MCQs can help us solve:
"Having to wait for a lecture as well to find out if you've got anything right, you're constantly thinking you've got it right, for like a week, and the next week they're going, 'Oh no you got it wrong', and you don't remember even doing the question."
Of course, the reason students are able to check their progress and get instant feedback is because MCQs can be auto-graded. It's this feature that many educators value, especially those with large classes and growing student expectations of shorter gaps between test and result, answer and feedback.
Assessment at scale in multiple disciplines
Multiple choice questions can be imperfect, but their use has been on the rise as they are the most scalable and efficient form of assessment. They are easy to assign, easy to mark automatically and offer valuable learner data to help educators personalize teaching and target support.
Their use has been most widespread in quantitative or mechanical disciplines, where problem sets are common, but their growing sophistication is enabling their use in more conceptual disciplines where long-form answers have traditionally dominated.
Even in disciplines like law, MCQs could feature more heavily in formative HE assessment as the undergraduate part of the sector responds to the Solicitors Qualifying Examination.
Improving MCQs through learning science
We design our MCQs to be of high quality, backed by learning science. Last month, Digital Content Development Manager Jenny Lee and her team ran a one-day workshop for our assessment authors, focused entirely on better MCQ writing.
The workshop covered topics ranging from writing optimal distractors to promoting critical thinking and 'feeding forward'. The participants were also users of Pearson resources, and had a lot to share about their teaching experiences and their use of formative assessment in and outside the classroom.
Participants left the day with a shared excitement about the potential for stronger MCQs as part of formative assessment, and the team now has revised guidelines and new ideas to help authors write better MCQs.
How MCQs can respond to student needs
When eBook platform Kortext surveyed customers in 2017 on their ideal choice of eBook enhancements, formative feedback came out top. Similarly, in a recent student focus group on eBooks, we asked the students what enhancements to eBooks would be most valuable to them. The most popular option? “Questions with feedback so I can see my progress.”
Some MCQs really are just quizzes, but we believe every MCQ is actually an opportunity for good feedback and better engagement. Our strategy is to develop ever more sophisticated question types and back our questions up with constructive feedback so that MCQs truly accelerate learning and help students to stay motivated.
We believe this strategy makes resources rich in high-quality MCQs and feedback a valuable tool for improving student satisfaction and, in particular, those crucial NSS scores on regular assessment and timely feedback.
That’s why our Mastering and MyLab resources include deep sets of assignable, randomized and varied question types, with multi-layered feedback and sophisticated gradebooks.
We are also bringing the content and the questions closer together, and making them available to students whenever and wherever they need them.
Revel™, for example, is an interactive learning resource that allows students to read, practise, and study in one continuous experience. It brings together trusted textbook content with videos, activities and MCQs, all in one location.
Beyond MCQs: the road to authentic assessment
MCQs are clearly not the only answer. As we build resources designed to develop higher-order learning and skills, we’re creating new content and assessment types that work together to improve the learning experience.
That’s why Revel already has a range of formative assessment types - including decision-simulations and writing tasks - to foster critical thinking through judgement and writing.
These assessments include:
- self-paced journaling prompts throughout the narrative that encourage students to express their thoughts without breaking stride in their reading
- assignable shared writing activities that direct students to share written responses with classmates, fostering peer discussion, and
- essays integrated directly within Revel that allow educators to assign the precise writing tasks they need for the course.
That’s also why we’ve introduced autograded Excel Projects for MyLab Accounting, allowing students to perform assessed tasks in Excel that better reflect the types of activity they will encounter in their future careers. This is just one example of the next generation of formative assessment, which will extend deeper into writing, project-based assignments, group work and collaborative learning.
We believe the high quality MCQ will continue to evolve and play a valuable part in teaching and learning, but that it will be joined in our digital resources by many more dynamic and exciting assessment experiences. Together, they can do more to deliver higher-order learning.