Summary — Writing Good Multiple‑Choice Questions (Dr. Michelle Hardy)
A well‑written multiple‑choice exam should let students show what they’ve learned: it assesses a range of cognitive levels, aligns with course outcomes, and avoids bias and rewards for test‑wiseness.
Main purpose and high‑level points
- Multiple‑choice questions (MCQs) should measure student learning, not guessing or test‑taking skill.
- MCQs can assess a range of cognitive processes (recall through higher‑order thinking) when aligned with course learning outcomes.
- Use Bloom’s Taxonomy (cognitive processes + knowledge types) to plan question levels and select verbs that target intended thinking.
- Avoid biased or inaccessible stems; examples should be inclusive and relatable to the student population.
- Design tests with realistic expectations about timing, difficulty, and whether the exam assesses mastery or progress.
Practical methodology and recommended workflow
- Plan the exam early: draft the test at the start of the unit to determine which items are pure recall and which require higher‑order thinking.
- Mix cognitive levels:
  - Include low (recall), mid (application/analysis), and fewer high (synthesis/evaluation) items.
  - Consider weighting higher‑order items more heavily.
- Scaffold learning and assessment: teach with practice items that step students through the thinking process, then give full synthesis items on the exam.
- Time design: allow roughly two minutes or more per MC question and adjust the number of items to fit the exam duration. For short exams, reduce the MC count and consider short‑answer or matching items.
- Post‑exam item analysis: review item difficulty distributions and cognitive-level balance (e.g., are most items Bloom levels 1–2?). Use results to adjust future tests.
- Check for answer‑pattern artifacts (e.g., long runs of the same letter) and avoid intentionally creating patterns that undermine student trust.
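The planning and post‑exam checks above can be sketched in a few lines of Python. This is a minimal sketch: the two‑minutes‑per‑item figure is the summary's own guideline, while the five‑minute buffer, the run‑length threshold, and the example answer data are illustrative assumptions.

```python
def max_mc_items(exam_minutes, minutes_per_item=2.0, reserve_minutes=5):
    """How many MC items fit in an exam period, leaving a small buffer
    for instructions and review (the buffer value is illustrative)."""
    return max(0, int((exam_minutes - reserve_minutes) // minutes_per_item))

def item_difficulty(responses, key):
    """Classical item difficulty: the proportion of students who answered
    each item correctly. responses holds one answer string per student;
    key is the answer key."""
    n = len(responses)
    return [sum(r[i] == key[i] for r in responses) / n
            for i in range(len(key))]

def longest_run(key):
    """Length of the longest run of identical correct letters in the
    answer key (long runs can look like a deliberate pattern)."""
    best = run = 0
    prev = None
    for letter in key:
        run = run + 1 if letter == prev else 1
        best = max(best, run)
        prev = letter
    return best

key = "BADCB"
responses = ["BADCB", "BACCB", "CADCB", "BABCA"]

print(max_mc_items(50))                 # items fitting a 50-minute period
print(item_difficulty(responses, key))  # flag items most students missed
print(longest_run("ABBBBCAD"))          # flag long same-letter runs
```

Items with very high or very low difficulty values, or an answer key with a long same‑letter run, are the kinds of artifacts worth revisiting before the next offering.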
Rules and guidelines for writing MC items (stem + alternatives)
- Structure: an MC item consists of a stem (the question or problem) and alternatives (one correct answer plus distractors).
- Stem guidelines:
  - Be meaningful and self‑contained — include the full scenario or issue; don’t rely on answer text to complete the stem.
  - Be clear and focused so there is one best answer.
  - Use direct questions rather than sentence completions when possible.
  - Prefer positive phrasing; avoid absolutes (always/never) and double negatives.
- Alternatives (distractors) guidelines:
  - Be plausible and useful — distractors should discriminate between students who understand the content and those who don’t.
  - Be homogeneous and parallel in form (similar length, grammatical structure, and complexity).
  - Be mutually exclusive — avoid two options that could both plausibly be correct.
  - Be concise and clearly stated — avoid long, verbose options that act as mini‑questions.
  - Order options logically (alphabetically or numerically) unless there is a pedagogical reason not to.
  - Avoid grammatical cues that reveal the correct answer; every option should agree grammatically with the stem.
  - Where appropriate, use common student misconceptions as distractors to help diagnose misunderstandings.
- Things to avoid:
  - Frequent use of “All of the above” / “None of the above” — they can make guessing too easy, and some platforms disallow them.
  - Large numbers of options — 3–4 options are common; use 5 only if necessary. More than 5 often indicates a different item type would be more appropriate.
  - Trick questions that test decoding of the wording or test‑wiseness rather than content understanding.
  - Culturally or experientially specific examples that not all students can relate to.
How to design higher‑order thinking MCQs
- Place the concept in context (case studies, graphs, data tables, plots, experimental results).
- Require synthesis of multiple knowledge elements (memory + application) before selecting an answer.
- Use tasks that require interpretation, justification, or discrimination (e.g., choose the best justification for a treatment decision).
- Use multi‑step questions or question pairs: scaffold with one item that walks through steps, then follow with an item that integrates them (different point values possible).
- Include visuals/graphs/diagrams across disciplines when assessing analysis and evaluation (science graphs, physics ray diagrams, literary passages, etc.).
Test administration and logistics
- Give explicit, clear exam instructions (what each section tests, time allowed, and question types).
- Consider online testing behaviors and platform constraints:
- Randomizing answer order is usually fine, but consider implications for question flow.
- Lockdown browsers and institutional platforms (Respondus, ExamSoft) can affect allowed item types and presentation.
- Scoring:
- Weight questions differently if desired (e.g., multi‑part items or higher points for synthesis items).
- Prepare students:
- Provide practice questions and a cheat‑sheet/handout with Bloom’s verbs and item‑writing rules so students know what to expect.
Common pitfalls to watch for
- Long correct alternatives that stand out — make alternatives parallel in length and style.
- Partial stems (fill‑in blanks) that force extra reading and reduce clarity.
- Heterogeneous options or obvious outliers that remove discriminatory power.
- Fluff or irrelevant story text that confuses students — be concise.
- Double negatives and negative phrasing that increase cognitive load and test reading rather than content.
- By contrast, some example contexts work well for higher‑level MCQs when written carefully: protein signal‑peptide plots, clinical/social‑work vignettes, physics lens diagrams, poetry passages.
Practical tips to implement now
- Keep a Bloom’s verbs list handy when writing stems to target desired thinking levels.
- Draft items early and practice them with students; show strategies for approaching synthesis MCQs.
- Review your item bank for grammar, parallel structure, and distractor plausibility.
- Limit options to 3–4 unless there is a clear rationale for more; avoid “all/none” unless pedagogically required and allowed by your platform.
- Quick pre‑final check:
  - Can you answer the item from the stem alone if you cover the alternatives?
  - Are distractors plausible and homogeneous?
  - Is the timing realistic?
Speakers and sources mentioned
- Presenter: Dr. Michelle Hardy — Program Manager for TA Training at the CTE; instructor, School of Earth, Ocean & Environment.
- Participants referenced: Georgina (CTE graduate student), Kelly, Adriana, Chase, Ann, Stephanie, Dr. Catropa / Dr. Catrepo (name appears with several spellings in transcript).
- Organizations/tools/websites: CTE (Center for Teaching Excellence), School of Earth, Ocean & Environment, ExamSoft, NBME, Respondus LockDown Browser, Iowa State University Bloom’s taxonomy interactive resource.
- Key references: Bloom’s Taxonomy (framework for item design); Jeopardy used as an analogy for trivia‑focused testing.