This conference is managed by the American Society for Microbiology


While I’m still trying to solidify my specific research question, my basic SoTL interest is in finding ways to promote student learning in my physiology courses, and more specifically in the physiology labs, which have been the greatest pedagogical challenge for me and the most overwhelming for many physiology students. I’ve been fascinated with cognitive load theory and how minimizing cognitive overload, which is particularly crucial in physiology laboratory curricula, may have significantly positive impacts on student learning and information retention. I’ve come across a veritable gold mine of papers germane to my SoTL interests, and those I’ve selected for this preliminary bibliography have so far been the most inspirational in two ways: (1) in helping me plan novel strategies for developing new, or renovating old, lab activities that more effectively meet course learning objectives, and (2) in providing quantitative evidence for the effectiveness of the teaching strategies.
1. Sweller J, van Merrienboer JJG, and Paas FGWC, “Cognitive architecture and instructional design,” Educational Psychology Review 10(3): 251-296, 1998.
2. Paas F, Renkl A, and Sweller J, “Cognitive load theory and instructional design: recent developments,” Educational Psychologist 38(1): 1-4, 2003.
3. Schnotz W and Kürschner C, “A reconsideration of cognitive load theory,” Educational Psychology Review 19: 469-508, 2007.
4. Michael J, “What makes physiology hard for students to learn? Results of a faculty survey,” Advances in Physiology Education 31: 34-40, 2007.
5. Michael J, Modell H, McFarland J, and Cliff W, “The ‘core principles’ of physiology: what should students understand?” Advances in Physiology Education 33: 10-16, 2009.
6. Michael J, “Where’s the evidence that active learning works?” Advances in Physiology Education 30: 159-167, 2006.
1. Williams JB, “Assertion-reason multiple-choice testing as a tool for deep learning: a qualitative analysis,” Assessment & Evaluation in Higher Education 31(3): 287-301, 2006.
This article concerns a type of MCQ in which the student identifies the reason for the correctness or incorrectness of a statement, and it gives examples illustrating the format. The author’s context is a graduate school of business; in the study, students used an MCQ survey and open-ended questions to give their opinions of the assertion-reason (ARQ) style of question. Among the interesting student perceptions is that these questions are difficult because they require better language skills than rote-memorization MCQs do, a comment I also get from students about conceptually similar MCQs. Having discovered the article in March, I tried the ARQ format in a limited way in one of my classes: my students didn’t like it, both because the format was unfamiliar and because they didn’t like having only one choice for the “because” aspect of the question (in my alternative format, the answers are all possible “becauses”).
2. Wilson RB and Case SM, “Extended matching questions: an alternative to multiple-choice or free-response questions,” J. Veterinary Medical Education 20(3): 7 pages, 1993 (I got this online).
This paper is mainly a description of the “extended matching multiple-choice question” (EMQ) format, which is now appearing more often in medical education references. The format includes a statement of a theme; a list of options (an “extended” list, that is, more than 4 or 5); a lead-in statement (general enough to set up 2 or more individual question stems); and 2 or more item (question) stems. The items can vary in complexity: in one example set given in the paper, the items contain laboratory test results. This method is said to obviate difficulties in the choice of distractors, as well as providing different correct answers for different items in the set. It seems to me this method might be quite successful in medical microbiology courses. The authors compare this format with essay, short-answer, and conventional multiple-choice questions; in their view, the EMQ format may evaluate application of knowledge less well than essay questions but allows much better coverage of course material. Another advantage of the format is that the theme and options list could be reused simply by providing different stems.
3. Beullens J, Struyf E, and Van Damme B, “Do extended matching multiple-choice questions measure clinical reasoning?” Medical Education 39(4): 410-417, 2005.
This article again concerns the EMQ format. The format has been thought effective for testing factual knowledge, and the authors wished to find out whether it also tests clinical reasoning. As they state, there is no “good assessment instrument of clinical reasoning” that they consider a “gold standard.” They evaluated the format against oral responses given by students and observed and rated by professionals, comparing a group of medical students to a group of residents, who are presumed to have better-developed clinical reasoning skills. Examples of the questions are provided. I can see adapting this format for use in medical microbiology especially.
4. Deutch CE, “Using data analysis problems in a large general Microbiology course,” American Biology Teacher 59(7): 396-402, 1997.
This paper describes an investigation using a “setup” that summarizes some simplified results from a technical article, gives one or more data tables from the article, and then poses a set of 2 to 5 MCQs relating to that information. The questions use the setup as a context in which to ask for recall of terms or concepts learned in the class generally (type A), to ask students to show comprehension of how the data are presented in the table(s) or figure (type B), or to ask students to draw conclusions from the data presented (type C). Deutch used such question groups for about 20% of his exam points. I like this format and may adopt it along with the other MCQ formats I favor. Examples of 3 sets of questions are included as an appendix to the paper.
Although I have used individual questions like this (not sets), my current interest is in somewhat similar questions in which, following the setup, the student is asked to choose, from among 5 small data tables, the one predicted to result from the experiment described in the setup. The way we’ve used such questions to date makes it likely that students who have been attentive in recitation will answer correctly, so the ability to form suitable hypotheses predicting expected results may not be tested directly; I’d like to devise a way to investigate this.
So far I haven’t figured out any way to search ERIC that turns up examples of investigations of MCQs of this type.
5. Gronlund NE, Assessment of Student Achievement, Allyn and Bacon. It’s available in an 8th edition (2006), but I’m currently reading the 6th edition (1998) because the newer one is checked out of my library.
There is a huge amount of instructional material “out there” on how to write MCQs. I’m appreciating this source at the moment because it addresses the issues of question design, exam planning, grading, and so on in terms that relate directly to my concerns, and it seems more helpful than the other sources I’ve stumbled across so far.