New Additions (post institute assignment 5)

Beck, C., Butler, A., & Burke Da Silva, K. (2014) Promoting inquiry-based teaching in laboratory courses: Are we meeting the grade? CBE-Life Sci. Educ., 13, 444-452.

In this article the authors present a meta-study of inquiry-based exercises published in peer-reviewed journals. They examine the types of exercises developed and how the effectiveness of the inquiry-based exercises was assessed. 142 reports of inquiry-based exercises were compiled from papers dating from 2005-2012, providing a useful database. Most of the exercises were aimed at upper-level students, leading the authors to argue that more focus is needed to bring inquiry-based labs to introductory and non-major students. The majority of the assessment methods involved qualitative measures and student self-assessment; published, validated assessments were used in a minority of cases. In addition, few studies included control groups or multiple institutions. Overall, the inquiry-based approach appears to lead to increased learning gains. However, the authors caution that more controlled, multi-institutional studies are needed and encourage the use of published assessment methods to facilitate comparisons. They also raise issues that may cloud the picture, such as a possible bias toward publishing studies with positive outcomes of inquiry-based learning. I found this article useful because my project centers on assessing the effects of inquiry-based exercises on student learning gains in experimental design and scientific literacy.

Gormally, C., Brickman, P., & Lutz, M. (2012) Developing a test of scientific literacy skills (TOSLS): measuring undergraduates' evaluation of scientific arguments. CBE-Life Sci. Educ.,11, 364-377.

This article presents a test of scientific literacy skills that the authors developed, validated, and tested. The test is intended to assess the impact of curricular reform in biology on students' scientific literacy skills. I plan to use this assessment in my research project to measure student learning gains in biological science literacy resulting from the inquiry-based approach. Because the course focuses on microbiology, whereas the questions on the test cover more general biology, it will be interesting to see whether students can extend their skills beyond the immediate subject area of the course. The questions cover several different skill sets, some of which are emphasized in the course more directly than others. It will also be of interest to determine whether the skills given more emphasis in class exhibit greater gains than other skills. An optimistic hypothesis is that the course hones students' critical thinking skills overall, not just those directly taught.

Brownell, S.E., Kloser, M.J., Fukami, T., & Shavelson, R.J. (2013) Context matters: Volunteer bias, small sample size, and the value of comparison groups in the assessment of research-based undergraduate introductory biology lab courses. JMBE, 14, 176-182.

The authors of this article examine deficiencies in discipline-based education research on research-based lab courses by focusing on their own studies. Key areas of difficulty are volunteer bias (volunteers represent a specific group that may not be representative of the student population), small sample size, and lack of comparison groups. Using their work in an introductory biology course at Stanford, the authors show how assessed gains in interest in science and confidence in abilities varied depending on whether the students were volunteers or randomized; volunteers showed greater gains. Furthermore, comparison of the randomized group to a matched group in a cookbook lab did not show significant advances for the students from the research-based labs, although students in the research-based course did show gains relative to the cookbook group in other areas. In short, this article provides a gut check for how research design can skew results and a good resource for identifying factors that may be influencing them. Acknowledging the limitations of a study is critical to allow others to interpret the results and judge how generalizable they might be.

Summer Bibliography

Marbach-Ad, G., McAdams, K.C., Benson, S., Briken, V., Cathcart, L., Chase, M., … Smith, A.C. (2010) A model for using a concept inventory as a tool for students’ assessment and faculty professional development. CBE-Life Sci. Educ., 9, 408-416.

In this article the authors discuss the creation and implementation of a concept inventory to evaluate student learning related to host-pathogen interactions for microbiology majors at the University of Maryland. They use the results of the concept inventory to inform curricular reform efforts and for faculty development. The concept inventory was administered as a pre- and post-test in several different courses over several years. The finding most relevant to my interests was that significant learning gains were observed in a few of the courses, in particular the general microbiology course. Scores at the end of the general microbiology course were similar to pre-test scores for upper-level courses, indicating retention of concepts. I would like to develop a similar concept inventory to test student understanding and retention of fundamental concepts before, immediately after, and several semesters after my class. Although students will take other classes, we can compare students from the inquiry-based lab class to those taking a similar course progression who did not take the inquiry-based class.
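
The authors' pre/post design suggests one way I could tabulate my own concept-inventory results. Below is a minimal sketch, assuming each student has a paired pre- and post-test score and a group label; the normalized (Hake) gain formula, the record layout, and all values are my own illustrative assumptions, not data from the paper.

```python
# Illustrative sketch: comparing normalized concept-inventory gains between
# an inquiry-based section and a comparison section. Names and numbers are
# placeholders.
import statistics

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Hake normalized gain: (post - pre) / (max_score - pre)."""
    return (post - pre) / (max_score - pre) if max_score > pre else 0.0

# (student_id, group, pre_score, post_score)
records = [
    ("s01", "inquiry", 45, 78),
    ("s02", "inquiry", 60, 85),
    ("s03", "comparison", 50, 62),
    ("s04", "comparison", 55, 70),
]

gains = {}
for _, group, pre, post in records:
    gains.setdefault(group, []).append(normalized_gain(pre, post))

for group, values in gains.items():
    print(f"{group}: mean normalized gain = {statistics.mean(values):.2f} (n = {len(values)})")
```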

Sirum, K. & Humburg, J.  (2011) The experimental design ability test (EDAT). Bioscene: J College Biology Teaching, 37, 8-16. 

Brownell, S.E., Wenderoth, M.P., Theobald, R., Okoroafor, N., Koval, M., Freeman, S., Walcher-Chevillet, C.L., & Crow, A.J. (2014) How students think about experimental design: novel conceptions revealed by in-class activities. BioScience, 64, 125-137.

These two articles focus on assessing students’ scientific literacy skills. Sirum and Humburg report on the development of a tool to assess scientific literacy called the experimental design ability test (EDAT). The EDAT is content independent and requires students to recognize that an experiment can be done to address a claim and then to design the experiment, explaining their thinking. A rubric was developed along with the EDAT to score the students’ abilities in key areas of experimental design, including both lower-order and higher-order skills. Although the goal of the work was to develop the assessment tool, implementation revealed that students taught through inquiry-based instruction exhibited higher learning gains than students taught through lecture and descriptive labs. A drawback of the EDAT is that, without specific prompts, students may not think to include particular information in their responses that they do actually understand.

The article by Brownell and coworkers uses an expanded version of the EDAT (E-EDAT) to evaluate how two in-class activities impact students’ scientific literacy. One of the activities involves analytical evaluation of data, while the second involves experimental design. The E-EDAT includes prompts addressing some of the shortcomings of the original EDAT. The authors find that the activities do appear to lead to learning gains on the E-EDAT. In addition to the E-EDAT, the authors evaluate student responses to questions on the in-class activities about the role of repetition and sample size in experimental science, uncovering several misconceptions about these concepts. One limitation is that the students’ misconceptions could reflect a misunderstanding of the vocabulary rather than an actual misconception.

I am working on an inquiry-based introductory laboratory course in which experimental design is a huge component. Tools such as the EDAT and E-EDAT might be useful in assessing the development of the students’ abilities to design experiments over the course of the semester. In addition, the worksheets used for the data-analysis activity might be a useful starting point for a more specific tool to examine analytical skills.
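
To make the EDAT/E-EDAT idea concrete for my own planning, here is a minimal sketch of how per-criterion rubric scores could be tallied at the start and end of the semester. The rubric criteria and scores below are illustrative placeholders, not the published EDAT rubric.

```python
# Illustrative sketch: per-criterion rubric tallies at the start and end of
# the semester. Criteria and 0/1 scores are made-up placeholders.
pre_scores = {
    "identifies_variable":   [1, 0, 1, 1],
    "includes_control":      [0, 0, 1, 0],
    "considers_sample_size": [0, 1, 0, 0],
}
post_scores = {
    "identifies_variable":   [1, 1, 1, 1],
    "includes_control":      [1, 0, 1, 1],
    "considers_sample_size": [1, 1, 0, 1],
}

for criterion in pre_scores:
    pre_mean = sum(pre_scores[criterion]) / len(pre_scores[criterion])
    post_mean = sum(post_scores[criterion]) / len(post_scores[criterion])
    print(f"{criterion}: pre = {pre_mean:.2f}, post = {post_mean:.2f}, "
          f"change = {post_mean - pre_mean:+.2f}")
```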

Luckie, D.B., Maleszewski, J.J., Loznak, S.D., & Krha, M. (2004) Infusion of collaborative inquiry throughout a biology curriculum increases student learning: a four-year study of “Teams and Streams.” Adv. Physiol. Educ., 28, 199-209.

In this paper the authors describe a research-based biology lab course that combines student-designed experiments with a strong writing component, in which students write multiple drafts of a paper (formatted for a scientific journal) reporting their research. The effectiveness of this approach was assessed using qualitative and quantitative measures. The qualitative assessment involved analyzing students’ perceptions of the course and their learning gains through their answers to questions on the course evaluation forms, with comments coded as positive or negative. To get a quantitative measure of learning gains, the authors used the MAT test developed by the Association of American Medical Colleges. In comparing the assessments from students taking the new course to those taking the traditional lab, students from the new course had significantly more positive comments, and their MAT scores were also higher. A limitation of the study is that the qualitative measure was the course evaluations; a survey like the CURE from Grinnell College could provide a broader assessment of student self-reported learning gains. I have not been sure whether or how to include responses from standard end-of-semester course evaluations, and this study presents a useful metric. I also found the structure of the new course itself very interesting, and it spurred my thinking about whether a strong writing component would enhance learning gains by having the students think more deeply about their work.
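
If I do code end-of-semester evaluation comments as positive or negative, as Luckie and coworkers did, a simple way to compare two sections would be a two-proportion test. The sketch below is my own illustration with made-up counts; the paper does not prescribe this particular statistic.

```python
# Illustrative sketch: comparing the proportion of positive comments between
# two course sections with a pooled two-proportion z statistic. Counts are
# made-up placeholders.
from math import sqrt

counts = {
    "new_course":  {"positive": 42, "negative": 10},
    "traditional": {"positive": 25, "negative": 22},
}

def positive_rate(c):
    return c["positive"] / (c["positive"] + c["negative"])

n1 = sum(counts["new_course"].values())
n2 = sum(counts["traditional"].values())
p1 = positive_rate(counts["new_course"])
p2 = positive_rate(counts["traditional"])

# Pooled proportion of positive comments across both sections
pooled = (counts["new_course"]["positive"] + counts["traditional"]["positive"]) / (n1 + n2)
z = (p1 - p2) / sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
print(f"positive rate: new = {p1:.2f}, traditional = {p2:.2f}, z = {z:.2f}")
```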

Loike, J.D., Rush, B.S., Schweber, A., & Fischbach, R.L. (2013) Lessons learned from undergraduate students in designing a science-based course in bioethics. CBE-Life Sci. Educ., 12, 701-710.

In this study the authors discuss a course that focuses on bioethics as it relates to emerging biotechnologies. They chose a qualitative assessment methodology to investigate how students were integrating bioethics as a process for examining issues. The assessment tool was a student-written essay describing the student’s personal ethical decision-making strategy for addressing issues in bioethics. The researchers coded the student answers to tally key topics mentioned by the students. One benefit of this approach is that students are not influenced by leading questions from an interviewer in a focus group or on a questionnaire. It would be interesting to combine a self-reflection piece like the authors used with a case study that the students need to work through, i.e., have the students apply their ethical reasoning skills to a relevant problem and then reflect on what strategies they used to arrive at a decision. This paper is relevant to a second project that I am developing. I have integrated ethics into a course that I teach, and student comments show that they really appreciate these units. I am very interested in asking whether the students are developing critical ethical thinking skills, as opposed to relying on “gut instinct,” and are now more cognizant of how ethical issues can arise from scientific research.
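
If I try a similar coding approach with my own students’ reflections, the tallying step itself is straightforward. A minimal sketch follows; the topic labels and coded responses are illustrative placeholders, not data from the paper.

```python
# Illustrative sketch: tallying coded topics across student essays.
from collections import Counter

coded_essays = [
    ["autonomy", "informed consent"],
    ["risk vs. benefit", "autonomy"],
    ["religious framework", "risk vs. benefit", "autonomy"],
]

topic_counts = Counter(topic for essay in coded_essays for topic in essay)
for topic, count in topic_counts.most_common():
    print(f"{topic}: appears in {count} of {len(coded_essays)} essays")
```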
