Mini-Case Study and Peer Group Action Research in Human Physiology (Cronmiller poster)

Reflections on Collaborative Professional Development and the Fellows Program

Glenn McClure, Adjunct Professor of English, SUNY Geneseo

As I look back on my experience as a 2013 CCTE Fellow, I can identify three areas of professional development that emerged from the guidance and collaboration at the heart of this project.

First of all, the collaborative discussions with excellent K12 and higher education teachers were fruitful and inspiring. I have worked in both worlds, and I currently spend more time in higher education. That being said, the rift between these worlds is not constructive for our students. The meta-structure of Western education moves from general to specific: the holistic education of the pre-school classroom gradually gives way to increasingly specialized study that culminates in the narrowly focused PhD thesis. Running parallel to this increasing specificity of instruction is the increasing isolation of disciplines and of the teachers who teach them. We have specializations within our academic departments that discourage many fine teachers from teaching general education. Furthermore, our disciplinary boundaries are reinforced by a professional culture that values specialization over synthesis. All this happens within the higher education world. For many college professors, talking to someone from another department is a “bridge too far,” so we can only imagine the gaping chasm that exists between college professors and K12 teachers. This rift is exacerbated by the assumptions behind the hierarchical nature of professional achievement. Within the K12 world, meanwhile, the daily schedule discourages collaboration between teachers of different disciplines, especially between academic teachers and arts teachers.

However, while we as adult professionals create systems that reinforce and deepen our intellectual isolation, our students must live within those systems and attempt to achieve something in the middle of it all. The math teacher teaches math all day, while the high school student must shift from math to English, social studies, science, music, and so on to get to the end of the day. In direct opposition to institutional education, the digital world encourages a culture of multitasking that combines many activities within the same time frame. While the digital world has its own propensity for isolating ideas and customizing our attention, there is an underlying assumption that connectivity is better than “siloed” knowledge.

If we are going to serve our students better in an increasingly connected world, all of us in the P-16 spectrum need more opportunities to collaborate like those this fellowship provided. We will be better able to respond to the needs of our students by understanding where they come from and where they are going.

Secondly, this fellowship offered new opportunities to work with colleagues I already know. I had worked with Prof. Tracy Peterson for many years prior to her appointment to the education school at SUNY Geneseo, but this fellowship brought us together in a new way. It was a joy to work with Tracy in designing my action research project, and her observations of my classroom were very useful. Furthermore, the observations of Michelle Costello also brought new insights into my teaching. Peer review usually happens only within the context of promotion reviews or grant panels. I found their observations so useful that I intend to coordinate observations with trusted colleagues each semester. While this will be done in an informal setting, it may lay the groundwork for a more formal plan in the future.

Thirdly, the focus of the action research has led to a renewed commitment to use music and multimedia materials in my Humanities classes. I have learned new ways to refine the design of my current lessons and found materials for new ones. The evidence generated by both the quantitative and qualitative measures of the study offers a new way to discuss the effectiveness of this strategy, in the hope that others may choose to incorporate it into their lessons. I come away from this fellowship with enough material to develop a new academic paper and conference presentation, which will be delivered on campus in several formats and at an international education conference in the fall of 2014. Finally, I am most excited about how this year has laid the groundwork for a new multimedia textbook proposal to the OPEN SUNY initiative, a new program that hires professors to write online textbooks to be distributed throughout the many campuses of our state university system.

I look forward to staying in contact with this growing community of practice in the future.  For more about my work in the Fellows Program, please see my research report on Arts Integration in the Teaching of Western Humanities.

Socratic Seminars in Earth Science and AP Environmental Science

Ellen Post, Rochester City School District

Socratic seminars have been shown to increase critical thinking, which is a challenge for many of my students. On state assessments they do fine with basic content, regurgitation, or memory questions; however, they struggle when asked to think critically about the content.

In order to see some success using Socratic seminars, I had to scaffold carefully. I did my best to choose texts that met my students' reading levels (Lexile scores ranging from 800 to 1200) and moved them up over time. I ran units simultaneously, with one class using seminars and one without, and then switched classes for the next unit. Data were collected on two different class sections of earth science and one section of AP environmental science.

To measure student motivation, a survey was given to students, and I recorded student feedback and observations during class work time, seminars, and discussions throughout the units. A comparison of assessments showed an 8.4% increase in scores when Socratic seminars were used during the unit. Student survey results showed an increase in motivation and positive feelings about the class.

I found that preparation for the seminars is time consuming, especially when I first began. However, the class time needed to cover a topic was less than in the previous year, student understanding was deeper, and retention was higher. For more information on this work, please see my action research report.

Collaborative Teaching an Undergraduate Education Methods Course

Michelle Costello, Education and Instructional Design Librarian, SUNY Geneseo          

Kathryn Rommel-Esham, PhD, Education Professor, SUNY Geneseo

CURR 316: Teaching Science and Math to Children

Katie’s thoughts…

Michelle and I have been working together on “one-shot” library instruction sessions for a number of years. A couple of years ago, I asked her if she would be willing to work more closely with me on a specific assignment that my undergraduate preservice teachers undertake in preparation for their work with elementary students in the classroom. This assignment has them find fiction trade books that they can use as a jumping-off point for math and science lessons. Since my students had been consulting with Michelle for several years prior to our “official” collaboration, I thought it was only natural to include her in the process.

Our collaboration started with this one assignment, then grew rapidly! I knew that asking Michelle to take a more active, co-teaching role in my class would energize my own teaching and allow us to implement some activities that I was unable to complete on my own (for example, she debriefs model lessons that I do in class, allowing students to be honestly critical and thoughtful about my teaching). Furthermore, I wanted the students to view Michelle as a partner in the process, not merely as a “resource” into which they could tap. Integrating her into the course has many positive aspects, including allowing the students to experience team-teaching first hand; letting them see that while I am the “expert” in some areas, she is clearly the expert in others; making them aware that she is familiar with all of the assignments (and the methods of assessment of each), and thus increasing her credibility as a trusted advisor when students have questions.

This formal partnership has definitely improved my own classroom teaching! For example, when I know that Michelle will be debriefing a model lesson with our students, I am more careful to include all the elements I want the students to see. Our common planning is a collegial and pleasant experience, and I think that comes through to our students as well. Michelle and I developed a series of class meetings that focused on authentically integrating technology into science instruction, a facet that (I felt) was poorly addressed prior to our collaboration. We are now looking at ways to authentically incorporate technology into math instruction, in less “obvious” ways than by using graphing calculators, for example.

I am grateful to Michelle for being willing to work so closely with me and our students and feel like it has been a successful collaboration for everyone involved. I am excited that we already have plans in the works for new things to add to the course, and are preparing conference presentations and manuscripts detailing the successes of our collaboration!

Michelle’s thoughts…

In the fall of 2013, I was given the opportunity to be fully embedded in an undergraduate education course. Although I have been teaching information literacy (IL) for over six years, the vast majority of those classes have been “one-shots.” This venture would give me the chance to experiment with something I have wanted to try for a long time (I have done a lot of research on the benefits of embedded librarianship). It would allow me to get to know the students (and their assignments) better, and hopefully in turn they would get to know me and feel more comfortable asking for help. It would also give me the chance to team teach with Katie (an education professor) and be fully immersed in subject-specific content.

The collaboration consisted of our co-teaching nine lessons: four focusing on teaching methods and debriefing, two on lesson planning, and three on teaching with technology. Information literacy standards, such as determining the nature and extent of the information needed, accessing needed information effectively and efficiently, and evaluating information and its sources critically, were authentically integrated into the class content. I also met with students outside of class to help them find and evaluate material for their assignments.

Overall, I felt that this collaboration was a success, and one of the most significant things I learned was that personal connections with students and instructors are beneficial to all involved. We instituted a student survey to gauge feedback on our co-teaching efforts, the specific lessons we taught, and meetings with us outside of class. Student responses confirmed our expectation that co-teaching can be beneficial, not only for students but for the instructors as well. Katie and I also recorded our observations throughout the project, which again showcased the benefits of collaborating.

An unexpected result of this project, however, was the profound impact it had on me. In addition to the feedback received from students, I was also given advice and guidance by Katie. I was able to watch her model her expertise in class, and our follow-up discussions were invaluable. We were able to talk in detail about our experiences in the classroom, and she offered many suggestions for improving the lessons. We plan on continuing this collaboration into the future. For more about this project, see my action research report.


Learning Targets and the Use of “Target Trackers” in U.S. History

Katherine Lanning, Rush-Henrietta Senior High School

Are you searching for methods to improve student achievement? Would you like to see your students more engaged and accountable for their own learning? If so, read my action research report to learn more about:
(1) How to integrate daily lesson “targets”
(2) How students can use these targets to monitor their own progress
(3) How self-assessment encourages student accountability and raises achievement
(4) How teachers can use targets to make responsive and informed instructional decisions

As a veteran high school U.S. History teacher, I’ve tinkered with many instructional changes, but I was looking for strategies with a greater “cultural” impact. While my students have typically performed well on their state assessments, I was looking for research-based practices to improve student achievement and “mastery” of content. At the same time, with the implementation of the CCLS, I wanted to develop a classroom approach to instruction that reflected the new standards and the importance of academic vocabulary. A third goal was to increase student engagement and accountability. While my lessons were structured and purposeful, and my students were “busy” and cooperative, I felt that my students needed to become more active partners in the process.

This fall, I made some instructional changes which involved the implementation of “Learning Targets” and the use of a student log sheet known as a “Target Tracker”. “I can” statements were used daily to provide clarity for learning goals, with an emphasis on embedding academic vocabulary within social studies content. Assessments were designed to measure student progress in relation to the learning targets, and instructional decisions were made based upon the results of these assessments.

Analysis of student test scores and a modified student “tripod” survey showed that there was a positive shift in students’ attitudes and a significant increase in “mastery” achievement. As learning goals became more visible and internalized by students, they assumed greater responsibility for their own learning, which in turn seemed to have an impact on their academic performance. While my research was designed primarily to examine the impact of learning targets on students, the impact on my teaching was equally profound.


Developing an Understanding of Statistics Using R

Rob Feissner PhD, Laboratory Instructor in Biology, SUNY Geneseo

For the Fall 2013 semester of my college freshman-level General Biology Laboratory class, I overhauled the way statistics was taught. The reason for the overhaul was twofold: I wanted to move toward an inquiry-based experience to replace a lab-on-rails style tutorial, and I wanted to migrate to the free and open-source ‘R’ statistical programming environment as the platform for data analysis. As with any change in curriculum, questions arose. Would the introduction of a new and complex software environment (R) challenge students to explore HOW statistics work, or simply overwhelm them? Would students be successful in independently using collaborative peer interactions and freely available internet resources to help them solve problems? I spent one semester carrying out an action research project to investigate these questions.

The decision to change software was prompted by the adoption of ‘R’ as the primary tool for data analysis in upper-level courses, as well as the fact that it is free and usable on any computer platform. The old laboratory was a walkthrough experience that led students through the procedures for completing statistical tests. Because the lab simply led students through entering numbers in a spreadsheet-like interface and selecting options from drop-down menus, the reasoning for what was being accomplished was lost to many students.

I designed the new laboratory to be an integrated component of the entire semester rather than a single, one-time activity. The main goals were to emphasize statistical literacy and develop statistical thinking, use real data in an authentic manner, and stress conceptual understanding rather than rote memorization of procedures. The first exposure to statistics was through a lab entitled “Introducing R” that provided a self-guided lesson on how to acquire, install, and become familiarized with the R software itself. A second lab entitled “Using R for Biological Statistical Analysis” was an inquiry-based lab in which students solved progressive problems by searching for help within the internal help system and on the internet to develop an understanding of data analysis using ‘R’ and how to employ statistics to test hypotheses.
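To give a sense of the kind of problem students worked through in the inquiry-based lab, here is a minimal sketch in R. The data and the specific task are invented for illustration; the point is that students were expected to discover functions like these through R's built-in help system rather than follow a recipe.

```r
# Hypothetical lab task: do two groups of plants differ in mean height?
# (Data invented for illustration; real labs used student-collected data.)
control <- c(12.1, 13.4, 11.8, 12.9, 13.0, 12.5)
treated <- c(14.2, 15.1, 13.8, 14.9, 15.3, 14.4)

# Students could find the right tool themselves, e.g. via
# help.search("t test") or ?t.test in the internal help system.
result <- t.test(treated, control)

# A small p-value suggests the group means differ; interpreting this
# output, rather than just producing it, was the goal of the lab.
print(result$p.value)
```

Working from the help system to a result like this, instead of clicking through drop-down menus, is what distinguished the new lab from the old spreadsheet-style walkthrough.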

After switching to the R-based lab, grades from all graded lab assignments (n=248) were tabulated and compared to grades from a similar assignment in the prior four semesters. My hypothesis was that student achievement would increase with an inquiry-based approach as compared to the old recipe-book approach. The median grade, however, dropped upon switching labs. A number of explanations for the drop are discussed in my research report.

Despite the small drop in grades on the statistics assignment, I noticed some positive changes in lab reports through the semester. Whereas students in the past relied on graphs to present data in lab reports, I started seeing students voluntarily using ‘R’ to analyze their data statistically. For their final project, between 50% and 80% of student groups chose to use some sort of statistics to analyze some or all of their data. With one semester of data to draw upon, my impression is that while students were challenged to put together the concepts of a statistical analysis with little guidance, the struggle led to a deeper understanding of why statistics are useful. It will be interesting to follow these students through their college careers to see whether the groundwork established in freshman general biology leads to higher achievement in upper-level classes that depend heavily on ‘R’.

Waiting to Exhale

Patricia Moynihan, Physics Teacher, Rochester City School District

In the myriad of projects, programs, collegial learning circles, classes, and professional development opportunities I have undertaken, I endeavor to achieve one thing – to improve learning experiences for my students.  All too often we “try” something new and then hold our breath, waiting to see the vast improvement in student achievement.

I was thrilled when I received word that I had been chosen to be a 2013 Fellow in the Community Center for Teaching Excellence Action Research Program. All too often in a professional development setting, a cursory overview of a proven teaching strategy is provided, without the time or resources to effectively implement the concept. The CCTE program provided the perfect setting and ample opportunity not only to learn about high-impact teaching strategies but also to design, refine, practice, test, and eventually incorporate a proven pedagogy into my teaching practice. Implementing collaborative learning techniques in my classroom revealed to me the true dedication and commitment required to improve my teaching practice and implement new techniques. I also learned the importance of teaching my students to develop a reflective process of their own, to assist them with their learning objectives.

One final take-away from this extraordinary program is that I now know that improving my teaching practice is a continuous process. I must inhale and exhale through the ups and downs of successes and failures, as there is no end point, just a beautiful ride through the rolling hills and valleys of daily teaching. And I am fortunate for the beautiful view of the children’s lives that I will impact and be touched by. To view my action research report, please visit Using Collaborative Learning Techniques in the Earth Science Classroom.

Exam Wrappers: A Strategy for Promoting Metacognition in Undergraduate STEM Classes

Metacognition and Student Performance in Advanced Physiology, Systems Physiology, and Neurophysiology

Dr. Adam Rich, Associate Professor of Biology, The College at Brockport

The CCTE Fellows Program for K-20 educators appealed to me because I was interested in meeting and working with a group of professionals who share a common goal of student success, who are interested in sharing what works best in their own practice, and who want to study the effects of different teaching methods on student learning.  I am a college professor and have taught for 10 years.  I teach several courses in the Department of Biology, including Anatomy and Physiology (A&P), Systems Physiology, and Neurophysiology.  The A&P course is high enrollment, with over 300 students.  The student population is heterogeneous in their future goals and in their preparation for the course.  The course is required for Pre-Nursing, Health Science, and Kinesiology students and is an elective for Biology majors.  Some students are well prepared and highly motivated.  I spend 30% of my time doing physiology research and have long wanted to contribute to the educational fields in the same way that I contribute to physiology.  My major goal for the past several years has been to identify the best places to spend my energy, and to know what is most important for the largest fraction of the student body when it comes to learning.  I am very sensitive to the possibility that my own interpretation of student learning might be wrong, and therefore I look for evidence in the literature to support my viewpoint that things ‘work’ or that they don’t.  I also read to find new ideas, and for evidence that contradicts my own practice.  I like papers that collect data and use it to support their thesis, and I try to use this approach in my own work.

The CCTE required an action research project, and I teamed up with two other college faculty members, Amanda Lipko-Speed and Theresa Westbay, to examine metacognition and student learning.  The major idea was to learn whether increasing students' awareness of their study habits correlated with exam performance.  This is not a new idea at all and is well described in How Learning Works: Seven Research-Based Principles for Smart Teaching (Ambrose, Bridges, DiPietro, Lovett, and Norman, 2010).  The strategy is to have students review and analyze their results on an exam, reflect on how they prepared for the exam, and use that information to guide preparation for future exams.  We used exam wrappers, a form that is described in many places.  Two online sources are a blog post by Maryellen Weimer at The Teaching Professor (http://www.teachingprofessor.com/articles/student-performance/exam-wrappers) and another by Andrew Wolf at askandrewwolf.com (http://askandrewwolf.com/teaching-metacognition-with-exam-wrappers/).

I was attracted to this project because students often ask, “What is the best way to study for your class?” or say, “I study so much for your class, but it never seems to help.”  I am always surprised that these students rarely reflect on how they study, and they rarely look at missed exam questions and analyze why they missed them.  I usually answer with whatever comes to mind at the moment: study more, work with the material, talk to friends about the material, read the textbook, and look over your exams to try to understand why you missed questions.  The exam wrapper offered a method to explicitly direct students to an activity that could help them now and in the future.

We wrote questions for the wrapper, talked about how we could successfully implement the wrappers in our courses, and wrote an IRB proposal so that we could do the study.  The project was made more interesting by the fact that I teach a large-enrollment course with mostly sophomore- and junior-level students, Dr. Lipko-Speed taught mostly freshmen, and Dr. Westbay taught sophomores.  Each of us teaches science courses (anatomy and physiology, psychology, and microbiology).  Briefly, after an exam students were given the correct answers and an exam wrapper form to fill out.  They were guided to directly examine missed questions and to document their preparation for the exam, their predicted outcome, and finally their actual outcome.  Then they were asked to develop a new strategy for the next exam. This procedure was implemented for several exams.

The first thing we learned was the enormous difficulty of timing: having students do the wrappers during class time, in close proximity to exam time, was very challenging to implement.  Students did seem receptive to the project overall.  We did not see a significant change in student performance on exams after the wrappers.  There were two simple ways to interpret the outcome.  First, metacognition and exam wrappers do not help students improve their performance on exams.  We accept this conclusion.  It might be that by the time students are in college their behavior is firmly set, and that changes in study behavior are very hard to develop.  This idea led to our second interpretation: most likely there is more to metacognition and using exam wrappers than we learned from this simple experiment.  Certainly it is possible that there was an effect and that we missed it because we did not do the correct ‘tests’ to show it.  For example, one semester is probably much too short to show a change in test performance.  We all thought it would be more interesting to follow students for several semesters to learn whether metacognition, and awareness of and reflection on the connection between preparation and outcomes, would significantly enhance student learning over the long run.  We also thought that exam wrappers could be an excellent tool to teach new students, freshmen, how to analyze their own performance in a realistic way, and how to change their own actions to improve future learning in college.  The study continues because we did not have enough time to analyze all of the data.  Each of us plans to do more analysis of the existing data, and to teach metacognition to future students.  Due to the effort involved, we may not use exam wrappers consistently.

Overall this was a really good experience.  I enjoyed meeting regularly with others who were interested in teaching and learning.  Action research, and the usefulness of constant reflection was an important part of this CCTE learning community that I plan to continue.  For my full report, please see Metacognition and Student Performance in Advanced Physiology, Systems Physiology, and Neurophysiology.


Metacognitive Awareness and the Use of Exam Wrappers in Psychology 101

Dr. Amanda Lipko-Speed, Associate Professor of Psychology, The College at Brockport

In the spring of 2013, I began participating as a STEM faculty fellow in a Professional Learning Community hosted by MCC’s Community Center for Teaching Excellence.   It has been a rewarding experience that has allowed me to engage in timely, focused discussions about the current state of our educational world with colleagues from high schools and colleges around the Rochester area.  It has also provided me with an opportunity to complete a pedagogically relevant action research project within my own college course.  I chose to focus my action research on college students’ metacognitive awareness – the reflection on one’s own thinking, knowledge, and abilities.   This choice stemmed from my own research interests as well as my experiences and investment in undergraduate teaching.  Through my experiences as a college professor, I have been routinely asked by students about the best study strategies and techniques for academic success.  In addition to my teaching experiences, I have an active research program focused on the development of young children’s metacognition.  Thus, when given the opportunity to conduct an action research project, studying metacognition was a natural choice for me.

Fortunately, I had the opportunity to collaborate with two gifted colleagues, Dr. Adam Rich and Dr. Theresa Westbay, who were also interested in investigating metacognition in their own action research projects.  Specifically, we chose to evaluate the usefulness and influence of a metacognitive survey tool called an exam wrapper (e.g., Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010).  Personally, I wanted to know whether repeated completion of exam wrappers would improve students’ metacognitive awareness, lead to more strategic self-regulatory academic behaviors (e.g., study strategies), and potentially even influence students’ academic performance on exams.  I chose to administer the exam wrappers in my fall 2013 section of Principles of Psychology, which had 78 students from a range of majors enrolled.  Students completed an exam wrapper approximately one week after each of the first three exams in the course (for logistical reasons, students were unable to complete an exam wrapper after the final exam).  Students were allotted about 25 minutes in class to review their exams and complete the exam wrappers.  The exam wrappers asked them to reflect on their exam preparation, specifically the amount of time they spent studying and the study behaviors and strategies in which they engaged.  The exam wrappers also asked students to review and reflect on possible reasons for incorrectly answered questions.  Finally, the exam wrappers asked students to list any strategies and behaviors they planned to change in preparation for subsequent exams.

Overall, the experience of completing our action research project was meaningful and informative.  The good news is that students do appear to have some metacognitive awareness.  Two of the most commonly reported reasons my students gave on the exam wrappers for answering exam questions incorrectly were metacognitive in nature.  Specifically, students frequently reported that they had missed a question because they had studied the information but could not remember it at the time of the exam, or that they knew the concepts but could not apply them accurately on the exam.  To identify these as possible reasons why a question was answered incorrectly, students needed to reflect on their knowledge.  The first reason indicates that the knowledge was never retained.  The second indicates that the knowledge was retained but was not processed at a deep enough level.  Whether or not students’ metacognitive awareness was increased through the experience of completing an exam wrapper cannot be determined.  But I am encouraged by the finding that students are able to reflect metacognitively when considering why they answered questions incorrectly.

Unfortunately, despite demonstrating some metacognitive awareness, my students did not seem able to translate their awareness into self-regulatory academic behaviors. They reported studying for far less time and for far fewer days than would be recommended. Students also routinely failed to engage in beneficial behaviors such as reading the textbook consistently and meeting or emailing with the professor to ask questions or clarify material.  Most disheartening was that these poor habits persisted consistently across the semester.  The experience of completing the exam wrappers did not change their study behaviors or strategies.  Interestingly, they consistently reported a desire to change their behaviors but did not follow through.  Not surprisingly, I did not see a change in exam performance across the semester.

One unexpected consequence of conducting this action research project was a significant decline in the number of Principles of Psychology students who visited my office hours this semester.  Due to the large enrollment of this course, I typically post exam grades but do not hand back the actual exams to the students.  As a result, many of my office hours are typically dedicated to meeting with students who wish to review their exams.  Because students completed exam wrappers this semester, all of them had the opportunity to review their exams in class.  Thus, far fewer students visited my office hours, which has obvious negative implications but also a few important, unexpected benefits.  For example, students seemed less likely to arrive at my office emotionally charged when they did come.  I believe the process of reviewing one’s exam in a professor’s office comes with an additional amount of pressure not present when one is reviewing an exam in a classroom among peers.  In the past, when students would come to review their exams, they were often highly frustrated and emotional, which would lead them to focus on things like why a question was poorly worded in their opinion, or why their grade was unfair, rather than on more productive ways to improve on future exams.

To see my full report on this project, please visit Metacognitive Awareness and the Use of Exam Wrappers in Psychology 101.