
1a. Lopatto, D. et al.  (2008).  Genomics education partnership.  Science, 322, 684-685. 

1b. Shaffer, C.D. et al. (2010). The genomics education partnership: successful integration of research into laboratory classes at a diverse group of undergraduate institutions. CBE-Life Sciences Education, 9, 55-69.


The Genomics Education Partnership (GEP) is similar to our project in two ways. First, its goal is to give students a valuable research experience in the context of a course, which allows large numbers of students to participate. Second, students generate novel genomics data that are “published” in a public database, another way that large numbers of students can make a contribution. The assessment described in the Science paper is a multiple-choice survey, given at the end of the course, that addresses attitudes rather than content. The most noteworthy result, and one very relevant to our work, is that students involved in classroom research respond similarly to students in summer research internships, and both groups respond differently from students in non-research courses.


2. Russell, S.H., Hancock, M.P., and McCullough, J.  (2007).  Benefits of undergraduate research experiences.  Science, 316, 548-549.   


This is a survey of a large number (6,600) of science students. The main results are that students who had an undergraduate research experience reported increased confidence in doing research, greater interest in pursuing STEM careers, and a higher expectation of obtaining a Ph.D. The research experiences were traditional internships in a faculty member’s lab rather than in the context of a course. Thus, although the study does not directly address classroom research, it is a good overview of the general effects of research experience on students’ career attitudes. Most of the data and analyses are online, not in the short paper itself.


3. Thiry, H., Weston, T.J., Laursen, S.L., and Hunter, A.-B. (2012). The benefits of multi-year research experiences: differences in novice and experienced students’ reported gains from undergraduate research. CBE-Life Sciences Education, 11, 260-272.


The student research examined in this paper is done in professors’ labs rather than in the classroom, but the relevant aspects of this paper are the assessment methodology and the comparison of novice and experienced undergraduate researchers. The authors describe their method of “triangulation”: using multiple methods of collecting data to validate their results. The biggest hurdle in our assessment is our lack of true control groups, so we need to investigate alternative ways to measure effects. The researchers use an instrument they call the Undergraduate Research Student Self-Assessment (URSSA), which does not seem to be available online (the website states that it is now closed). Their results show that novice and experienced student researchers gain from the research experience in different ways: novice students gain confidence in more basic skills (collecting data), while experienced researchers gain confidence in higher-order scientific thinking (interpreting data or designing a future experiment). We might be able to take these findings into account in our own assessment: some of the students in our courses have no research experience, while others have been working in a lab since freshman year.


4. Dirks, C. and Cunningham, M. (2006). Enhancing diversity in science: Is teaching science process skills the answer? CBE-Life Sciences Education, 5, 218-226.


The authors demonstrated that a freshman “science process skills” class (graphing, data analysis, experimental design, and scientific writing) significantly increased the grades of students in a subsequent sophomore introductory biology course, compared to students who did not take the science process skills class. The authors used an ANCOVA to account for any differences in preparedness, as measured by high school SAT scores. I like this article because the results are impressive, the statistics are rigorous, and the Introduction does a great job of summarizing the status of URM and women students in science. It relates to our work in that it demonstrates the importance of expressly teaching science process skills. We include these skills as part of the students’ original research, but they deserve greater emphasis. More specifically, the authors use an analysis I found compelling. In their course they gave a pre/post test of data interpretation and split the post-course results into students who scored above and below the median on the pre-test. High pre-course scorers showed little improvement, but low pre-course scorers improved significantly on the post-course test (Fig. 3, paired t-test). We have had the problem in our assessments that the pre-course scores, especially on attitudes, are on average quite high, so we are unlikely to see much change. But it may be that we need to do a better job of teasing apart student attitudes and/or skills when they enter the class.
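The median-split analysis described above would be straightforward to reproduce on our own pre/post data. The sketch below uses invented scores (not values from the paper) and assumes NumPy and SciPy are available; it splits students at the pre-test median and runs a paired t-test within each subgroup, mirroring the Fig. 3 analysis.

```python
# Sketch of a median-split pre/post analysis: split students by pre-test
# median, then run a paired t-test on each subgroup. The scores below are
# made-up illustrative data, not values from Dirks & Cunningham.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 40
pre = rng.uniform(2, 9, n)  # hypothetical pre-course data-interpretation scores (0-10)
# Build in the paper's pattern: low pre-test scorers improve more.
gain = np.where(pre < np.median(pre), 2.0, 0.2)
post = np.clip(pre + gain + rng.normal(0, 0.5, n), 0, 10)

low = pre < np.median(pre)  # below-median pre-test scorers
for label, mask in [("low pre-test", low), ("high pre-test", ~low)]:
    t, p = stats.ttest_rel(post[mask], pre[mask])  # paired t-test, post vs. pre
    print(f"{label}: mean gain {np.mean(post[mask] - pre[mask]):.2f}, p = {p:.3g}")
```

With data like these, the below-median group shows a large, significant gain while the above-median group shows little change, which is the pattern the median split is designed to expose.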


5. Hagenbuch, B.E. et al. (2009). Evaluating a multi-component assessment framework for biodiversity education. Teaching Issues and Experiments in Ecology, 6.


The Network for Conservation Educators and Practitioners (NCEP), based at the American Museum of Natural History, is a global initiative to promote conservation education. Toward this end they have developed numerous modules that can be used by educators. The authors of this paper developed a framework to assess the impact of these modules. The assessment included pre- and post-course multiple-choice surveys that examine 1) content knowledge, 2) confidence and interests (a version of SALG), and 3) environmental orientation and worldview (the New Ecological Paradigm, NEP). The content assessments are not included in any of the appendices, but the other two assessments are. One result I have seen in my previous assessments is that students had increased resolve to preserve biodiversity after our classroom intervention. I would like to understand this in more detail, and the NEP is a possible instrument (it seems simplistic, however). The analysis they used is similar to what I have used: a paired t-test comparing individual changes. They also test how a change in content knowledge correlates with a change in confidence, something we have been interested in with our data.
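The two analyses mentioned above, a paired t-test on individual pre/post changes and a correlation between content gains and confidence gains, could be sketched for our own data as follows. All numbers are invented for illustration, and the script assumes NumPy and SciPy.

```python
# Sketch: paired t-test on individual pre/post changes, plus a Pearson
# correlation between content gains and confidence gains. All data here
# are hypothetical, generated only to show the shape of the analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 30
content_pre = rng.normal(5, 1, n)                    # hypothetical content scores
content_post = content_pre + rng.normal(1.0, 0.8, n)  # assumed average gain of 1 point
conf_pre = rng.normal(3.5, 0.5, n)                   # hypothetical confidence ratings
# Assume confidence gains track content gains, plus noise.
conf_post = conf_pre + 0.3 * (content_post - content_pre) + rng.normal(0, 0.3, n)

t, p = stats.ttest_rel(content_post, content_pre)    # paired t-test on changes
r, p_r = stats.pearsonr(content_post - content_pre, conf_post - conf_pre)
print(f"paired t = {t:.2f} (p = {p:.3g}); gain correlation r = {r:.2f}")
```

The paired design is what lets us work without a true control group: each student serves as their own baseline, and the gain-vs-gain correlation then asks whether learning and confidence move together.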
