Society for the Teaching of Psychology: Division 2 of the American Psychological Association

E-xcellence in Teaching
Editor: Annie S. Ditta

  • 28 Mar 2025 4:57 PM | Anonymous member (Administrator)

    Serena Zadoorian
    University of California, Riverside

    As a doctoral student, I am always eager to seek out opportunities to teach. When I was hired to teach an introductory-level research methods course at California State University, San Bernardino, I was extremely excited. This lower-division class was designed to provide students with the essential knowledge and skills to conduct and evaluate psychological research, which fosters critical thinking. The topics covered include the philosophy of science, scientific thinking and reasoning, correlation vs. causation, threats to validity, formulating testable research questions and hypotheses, basic concepts of research design, and research ethics. As I prepared to teach this class, my goal was to ensure that, by the end of the semester, students not only had a comprehensive understanding of research but could also apply their knowledge to real-world scenarios. To facilitate this, I incorporated active learning assignments into the curriculum using the flipped classroom method. This method requires students to take responsibility for their learning both inside and outside of the classroom (Prust et al., 2015). Students are expected to engage with the course materials (e.g., recorded lecture videos, chapter readings, quizzes) prior to each class session with minimal support from the instructor. Then, during class hours, students complete activities and discussions that strengthen their understanding of the learning materials (see Al-Samarraie et al., 2020).

    The flipped classroom method offers many benefits over the traditional lecture: students can review the course materials at their own pace and as many times as needed (Goodwin & Miller, 2013), watch lecture segments individually at the most convenient times (Forsey et al., 2013; Jensen, 2011), and engage in active learning during class (Daniel & Braasch, 2013; Freeman et al., 2014; Karpiak, 2011). There are also potential drawbacks to consider. For instance, students may find pre-recorded lecture materials less engaging (Foertsch et al., 2002; Jensen, 2011) or encounter technical difficulties when viewing the materials (especially students with fewer financial or technical resources). Some students may also face challenges in anticipating, scheduling, and completing the out-of-class learning (see Dunning et al., 2003).

    Flipped Classroom in Action

    In a study by Roehling and colleagues (2017), flipped pedagogy was used in an Introduction to Psychology course at a small liberal arts college. The instructors selected four topics, including research methods, to be flipped. Their results revealed that students found the flipped classroom method more interesting than the traditional one. Previous research has also shown that the flipped classroom method can enhance students’ metacognition (Van Vliet et al., 2015) and promote students’ engagement and self-efficacy (see Esson, 2016). Taking these results into account, I used this method when teaching my lower-division research methods class at CSU San Bernardino. I developed the particular flipped model myself, and this was my first time applying it in the classroom.

    The class consisted of a total of 38 students, all of whom used materials from PsycLearn—a digital resource provided by the American Psychological Association. Students were instructed to review the course materials during the five days before each class session. PsycLearn enables students to complete each chapter section by section, which makes it easy to navigate through the modules. Each module consists of reading materials, video lectures, and assessments (e.g., quizzes, short-answer questions) with immediate feedback. Additionally, each chapter includes real-world scenarios that allow students to apply psychological science to real life. For example, when learning about reliability, students were presented with scenario-based examples that demonstrated how researchers collect and analyze data. They were then required to match each type of reliability with the appropriate scenario.

    In addition to completing the online modules, students were asked to post a question related to the course materials and reply to another student’s post on the Canvas discussion board. Pre-class activities took about one and a half to two hours per week. To reward students for their progress, they earned points for completing the online modules and posting questions. Although the majority of students found the class structure easy to follow and said that they benefited from the discussions, some noted in the end-of-year evaluations that the course was time-consuming. Specifically, these students found it challenging to allocate time outside of class to review the course materials.

    The in-class materials were presented through PowerPoint slides and focused on the concepts students found most challenging, as reflected in the questions raised on the discussion board. It is important to note that the concepts students find challenging may vary between classes. For this reason, faculty will likely need to adjust the slides based on the needs of each group. This research methods class met twice a week, with the first day dedicated to reviewing the course materials and encouraging students to ask questions. During the second day, we focused on collaborative work. Students were randomly placed in groups of 4 to 5 to work on active learning assignments designed to help them apply their knowledge to real-world scenarios. For instance, when teaching about ethical guidelines, students were given various research proposals and asked to act as members of the Institutional Review Board committee, deciding whether to approve, reject, or request additional supporting documents from the “researchers.” This activity not only enhanced students’ engagement but also allowed them to put their knowledge into practice. Furthermore, as part of the course requirements, students were asked to learn about APA formatting. To help them better understand APA style, students were provided with texts that included both in-text citations and references. They were instructed to identify any APA style errors (if present) and explain how to correct those errors. Although students were asked to work in groups, they were required to submit their own work after each class session.

    Students were assigned to the same group for the entire semester. This approach was implemented to facilitate communication and encourage students to collaborate more efficiently. To enhance their learning experience, I actively moved around the classroom, checking in with each group to ensure they were progressing in the right direction. For active learning assignments involving challenging concepts, we reviewed the answers as a class, which provided students with additional support and reinforced key concepts. Each week, assignments were submitted via Canvas, graded within five days, and returned with constructive feedback. Students were invited to attend office hours to ask additional questions or to receive support. If a student was absent for a valid reason, they were granted an extension to complete the assignment independently, with the opportunity to receive guidance and support from the instructor as necessary. Finally, it is important to emphasize that this class does not require a final research paper or proposal. Students majoring in psychology at CSU San Bernardino must enroll in an upper-division methods course that primarily focuses on writing research papers and proposals. Considering the importance of mastering APA format and the critical steps in writing papers and proposals, I believe the flipped classroom method may not be the best approach for that course; students may find it challenging and discouraging without a solid foundation. It is essential for students to learn how to write research papers through traditional classroom methods, as academic writing is crucial for success in many upper-division courses and for those pursuing graduate school.

    Conclusions

    As an instructor, I strive to create a cooperative student-faculty environment in the classroom. Teaching a diverse group of students during my graduate career has made me aware of the importance of creating a supportive learning environment to ensure inclusivity for diverse learners. As previous research has shown, the flipped classroom method can improve students’ metacognition and promote engagement. In my lower-division research methods class at CSU San Bernardino, I noticed that students who typically refrained from asking questions during discussion time were actively engaged and communicative with their peers during the active learning assignments.

    Furthermore, the flipped method also provides instructors with more time to address questions during class and allows them to integrate active learning assignments, giving students the opportunity to apply the concepts they have learned to real-world scenarios. In the end-of-year evaluations, students described the class as highly engaging and informative, noting that the class activities were both captivating and enjoyable. Moving forward, my next steps include maintaining the same class structure and curriculum as outlined. However, I also aim to gain deeper insights into students’ prior experiences and preferences regarding flipped vs. traditional classrooms. Collecting these data at the beginning of the semester will enable me to better understand my students’ backgrounds and needs and will help me tailor my teaching strategies to align with my students’ preferences.

    References

    Al-Samarraie, H., Shamsuddin, A., & Alzahrani, A. I. (2020). A flipped classroom model in higher education: a review of the evidence across disciplines. Educational Technology Research and Development, 68(3), 1017-1051.

    Daniel, F., & Braasch, J. L. (2013). Application exercises improve transfer of statistical knowledge in real-world situations. Teaching of Psychology, 40(3), 200-207.

    Dunning, D., Johnson, K., Ehrlinger, J., & Kruger, J. (2003). Why people fail to recognize their own incompetence. Current Directions in Psychological Science, 12(3), 83-87.

    Esson, J. M. (2016). Flipping general and analytical chemistry at a primarily undergraduate institution. In The flipped classroom Volume 2: Results from practice (pp. 107-125). American Chemical Society.

    Foertsch, J., Moses, G., Strikwerda, J., & Litzkow, M. (2002). Reversing the lecture/homework paradigm using eTEACH® web‐based streaming video software. Journal of Engineering Education, 91(3), 267-274.

    Forsey, M., Low, M., & Glance, D. (2013). Flipping the sociology classroom: Towards a practice of online pedagogy. Journal of Sociology, 49(4), 471-485.

    Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410-8415.

    Goodwin, B., & Miller, K. (2013). Research says/evidence on flipped classrooms is still coming in. Educational Leadership, 70(6), 78-80.

    Jensen, S. A. (2011). In-class versus online video lectures: Similar learning outcomes, but a preference for in-class. Teaching of Psychology, 38(4), 298-302.

    Karpiak, C. P. (2011). Assessment of problem-based learning in the undergraduate statistics course. Teaching of Psychology, 38(4), 251-254.

    Prust, C. J., Kelnhofer, R. W., & Petersen, O. G. (2015, June). The flipped classroom: It's (still) all about engagement. In 2015 ASEE Annual Conference & Exposition (pp. 26-1534).

    Roehling, P. V., Root Luna, L. M., Richie, F. J., & Shaughnessy, J. J. (2017). The benefits, drawbacks, and challenges of using the flipped classroom in an introduction to psychology course. Teaching of Psychology, 44(3), 183-192.

    Van Vliet, E. A., Winnips, J. C., & Brouwer, N. (2015). Flipped-class pedagogy enhances student metacognition and collaborative-learning strategies in higher education but effect does not persist. CBE—Life Sciences Education, 14(3), ar26.


  • 04 Mar 2025 12:47 PM | Anonymous member (Administrator)

    Shlomit Flaisher-Grinberg
    Saint Francis University

    Background:

    As a behavioral neuroscientist, I have always loved rats. I have seen them master complicated cognitive tasks, and I wanted my students to have the opportunity to learn, first-hand, just how social, clean, and clever they really are. Thus, I was thrilled when my department chair asked me to teach the “Psychology of Learning” course. After I integrated rat-training experiences into the curriculum, my students used learning methodologies to teach their rats to “play soccer,” ride tiny scooters, and complete agility courses, among other tasks. During the semester, my students also learned to care for their rats, and many grew to love them. In fact, many students decided to adopt their rats at the end of the semester.

    The Transformation:

    Even though the course was effective, and students’ feedback was positive, there was always this one nagging thought I couldn’t get rid of. I was breeding (or purchasing) rats for the course, while animal shelters were overloaded with abandoned, neglected, abused, and betrayed animals. Having created an upper-level shelter dog-training program a few years ago (Flaisher-Grinberg, 2023), and knowing that every year, more than 3 million cats enter animal shelters in the United States (American Society for the Prevention of Cruelty to Animals, n.d.), I wondered if I could replace the course’s focus on rat training with shelter cat training. I assumed that, like the training of rats, the training of cats (and assessment of the training’s efficacy) would contribute to students’ working knowledge of psychology and enhance their scientific reasoning skills. I envisioned that, unlike the training of rats, the training of shelter cats would provide students with an opportunity to apply psychological content to practical, ‘real-world’ problems and empower them to make a difference in their lives and in their communities (American Psychological Association, 2022). Specifically, I wanted my students to use learning concepts to socialize shelter cats and train them toward behaviors that could advance their wellbeing and adoptability (Kogan et al., 2017). I hoped that the ability to improve the quality of life of shelter cats, while supporting animal shelters in their efforts to care for sheltered animals, would also promote my students’ personal and professional development, and therefore produce an impact spanning people and animals alike.

    Course Set-Up:

    My first step in the transition was creating a partnership with a local animal shelter, which was willing to entrust us with its cats and agreed to support the program by providing some of the cats’ needs (crates, blankets, litter boxes, litter, toys, and food). This task was easy, given that we had previously established a reciprocal and mutually beneficial collaboration with a nearby animal shelter, centered around my students’ efforts to train shelter dogs and write grant applications on behalf of the shelter. Next, I secured permission to house shelter cats on campus premises and obtained relevant protocols. This task was more complicated and took almost a year to complete. Specifically, I worked with our Business Office to ensure insurance coverage for our course, with our Office of Risk Management to create appropriate liability agreement documents, with the Student Health Center to generate allergy screening/management procedures, and with the Biology Department to identify a classroom appropriate for the course. I submitted detailed care protocols to our Institutional Animal Care and Use Committee, and an internal grant application ($1,000) to our Office of Advancement. The grant allowed me to purchase cat carriers, leash/harness kits, click sticks, “place” mats, and training treats. Later, I was able to sustain the program financially via the addition of minor lab fees to students’ course enrollment ($20/student). Given that my institution has a small vivarium, I was able to transform the rats’ space into a cat-dedicated room, supplied with cat trees, hiding boxes, and toys. Finally, I recruited a group of TAs, entrusted with the cleaning/feeding protocols (for additional information about the course, see Flaisher-Grinberg, 2024).

    Course Delivery:

    The integration of cats into the curriculum was accomplished via a variety of classroom activities, four lab sessions, and a 3-part research assignment. The labs focused on demonstrating 1) habituation processes, 2) classical conditioning, 3) operant conditioning, and 4) generalization and discrimination procedures, and the research assignment was built around a student-generated cat-training project. During the labs (and the following weeks) students learned to gently handle the cats, respect their personalities, and interact with them in a way that enabled both cats and students to calm down and enjoy each other’s company. The cats learned that the carriers, “place” mats, click sticks, and harnesses predicted the availability of treats, and later, that entering the carriers, sitting on the “place” mats, following the click sticks, or wearing the harnesses rewarded them with treats. Concurrently, the TA team used similar methodologies to accustom the cats to nail trimming, ear cleaning, and teeth brushing. Since many cats associate their carriers with aversive consequences (e.g., a visit to the vet), are wary of wearing a harness (even for a nice walk outdoors), and despise being groomed, I hoped that these tasks would benefit both the cats and their future adopters. With the intent of introducing my students to the growing field of animal-assisted interventions (i.e., the incorporation of animals into human-oriented therapeutic/educational processes; Fine & Weaver, 2018), the final lab session took place within the institution’s clinical educational facility, set up as a simulated hospital. During the lab, the cats generalized previously learned tasks, demonstrated via their ability to walk in a harness into the medical environment, sit on their “place” mats by “mock patients,” follow the click stick to execute entertaining tricks, and spread love and happiness all around.
    For their research assignments, students selected a research question (“Can a cat learn to…?”), designed a training methodology, chose assessment parameters (e.g., latency to complete the task, number of correct responses, etc.) that were evaluated before and after the training, collected and analyzed the data, wrote an APA-style research paper describing their project, and disseminated their findings to their classmates via a 10-minute presentation.

    Findings:

    The course, in its shelter-cats-integrated format, was offered for the first time in the spring of 2023. It was repeated in the spring of 2024 and will be offered again in the spring of 2025. Set as a mandatory curricular item for psychology majors and an elective for students in other majors, the course is well-enrolled, attracting students with interests in the health sciences, biological sciences, and a variety of other disciplines. Each iteration thus far has included 40-48 students (across two sections) and six cats. Depending on the cat’s needs and personality (e.g., outgoing or shy), each cat was assigned to one or two student groups of 3-4 students each. This structure seems suitable: the numbers of students and cats fit the size of the classroom/cat-room, the cats receive adequate (yet not overwhelming) attention, and students are able to coordinate their training time with their group members to maximize the efficiency of their work. Although protocols allow students with allergies to substitute another course from the department’s curriculum, to date, none of the students who took the course had severe cat allergies, and the three with minor allergies chose to enroll while complying with safety regulations (i.e., wearing lab coats, gloves, and face masks during their interactions with the cats). Most of the cats learned most of the tasks (some cats dislike walking in a harness, while some seem to like it), and students were able to design research projects that involved a variety of tasks (giving a ‘high five,’ “playing soccer,” etc.). The students indicate that they truly like their cats, enjoy training them, and are proud of both their cats’ and their own accomplishments throughout the semester.
    Students attest that they recognize the positive impact they make on shelter cats and on the local community, and that they feel the course contributes to their educational, professional, and personal development. Representatives of our shelter partner indicate that they are thrilled with the partnership and wish to see it continue and develop in the future. Various faculty, staff, and administrators also enjoy the presence of cats on campus, and I often find myself engaging in conversations about our cats and about pets and animals in general. Importantly, regardless of age, sex, behavioral tendencies, and health status, the cats seem to prosper in the campus environment. Moreover, to date, all cats in the program have been successfully adopted by their student trainers, other students in the course, or members of the campus and general community.

    New Directions:

    Given the popularity of the cats-integrated course, the success of the collaboration with our local animal shelter, and the cats’ high adoption rate, the fall of 2024 marked the beginning of a new, shelter-cats-integrated program. In the past few months, we identified a small room at the campus library that is isolated from the library’s air circulation, received all necessary permissions, gathered equipment, and brought shelter cats to live in our own “Library Cat Room.” I trained a team of eight students who had previously taken my cat-focused course to clean, feed, and socialize the cats, and created ‘meet-n-greet’ sessions (3-4 hours/week) that enabled the campus community to spend time with the cats (the cats are not allowed outside the room). Although the program is very new, the three cats that spent the past few months living in our library environment have already brought much joy to students, faculty, staff, and administration. All of them were successfully adopted by the end of the fall semester, and it is our hope that the program will continue and develop in future years.

    References:

    American Psychological Association (2022). APA guidelines for the undergraduate psychology major. Version 3.0. APA. www.apa.org/ed/precollege/about/psymajor-guidelines.pdf

    American Society for the Prevention of Cruelty to Animals (n.d.). Pet Statistics. Retrieved January 31, 2023, from https://www.aspca.org/helping-people-pets/shelter-intake-and-surrender/pet-statistics

    Fine, A. H., & Weaver, S. J. (2018). The human-animal bond and animal-assisted intervention. In M. V. D. Bosch, & W. Bird (Eds.), Oxford textbook of nature and public health: The role of nature in improving the health of a population (pp. 132-138). Oxford University Press.

    Flaisher-Grinberg, S. (2023). Community-Engaged Pedagogy in the Psychology Classroom: Shelter Dogs go to College. Teaching of Psychology, 0(0). https://doi.org/10.1177/00986283231191748

    Flaisher-Grinberg, S. (2024). Creating a College-Based Shelter Cats’ Training Program. Journal of the International Association of Animal Behavior Consultants, IAABC. 29(1). Retrieved from: https://journal.iaabcfoundation.org/sfu-cats/

    Kogan, L., Kolus, C., & Schoenfeld-Tacher, R. (2017). Assessment of clicker training for shelter cats. Animals, 7(10), 73. https://doi.org/10.3390/ani7100073


  • 13 Jan 2025 6:58 PM | Anonymous member (Administrator)

    Chelsea Romney
    Brigham Young University


    While attending a teaching seminar at my university on the importance of experiential learning,(1,2) I found myself experiencing academic discipline FOMO (fear of missing out). A dance professor explained, “Instead of having students learn the steps of the cha-cha from their textbook, we use our classroom time to actually dance the cha-cha.” I thought to myself, if only I were in a discipline that lent itself so well to experiential learning. How could I possibly demonstrate concepts like bias, group dynamics, or social cognition in my upcoming social psychology course? Surely, such erudite concepts weren’t meant for experiential learning methods. 

    Later, as I was preparing a lecture on the “Psychology of Law” chapter of the course textbook, I realized that I might have discovered my own version of the cha-cha. I began developing a plan for students to view live courtroom proceedings to see how social psychology influences the practice of law. I knew that court was a public place that could be visited by civilians, and that all arraignments and trials were public records. However, we know from community-based participatory research (CBPR) that groups, especially academic groups, should take care when visiting or studying community organizations to respect the functioning of the organization. Proper CBPR involves an equitable approach in which both the academic group and the community stakeholders benefit from the interactions.(3)

    I met with the local city justice court judge to determine how the court might benefit from having students observe its proceedings. Fortunately, the judge was enthusiastic about having students in the court and was particularly motivated by the idea of collecting data to determine how the court was functioning. Specifically, the judge had already used a validated national court survey(4) to assess the accessibility and fairness of the court from the customers’ perspective and welcomed the idea of students from the local university assisting in collecting a new batch of these surveys. Since another learning outcome of the course was to increase students’ understanding of the research process, collecting data for the court created an additional opportunity for experiential learning in my class.

    The judge and I worked together to determine a series of interactions, assignments, and specific statistical associations we wanted to assess in the outcomes of the validated survey to mutually benefit our organizations. The course courtroom assignment followed this timeline: 

    1. The judge visited our classroom, provided a guest lecture related to course material, described the purpose of the validated court survey, and provided information about proper conduct while visiting the court.

    2. Twenty-five students visited the courtroom in pairs to collect data throughout the semester. During their court visit, they also completed a courtroom observation assignment in which they identified at least three social psychological phenomena occurring in the court in a 2-page paper. This encouraged them to think critically about course concepts and gave their visit structure when they weren’t collecting data.

    3. With instruction, students completed data analyses with the data collected from the validated survey and wrote brief research papers on their findings for course credit.

    4. The judge visited the classroom again, and the students delivered their findings in the form of oral presentations that were graded as part of the course.

    Students reported that observing court in person and engaging with court customers and employees improved their understanding of course concepts. One student wrote,

    “I was able to understand the social psychology vocabulary much better because I could see at the court different concepts like emotional intelligence, intrinsic and extrinsic motivation in the court customers, receptivity, yielding (William McGuire’s model of persuasion), and stigma.”

    Another student made sense of course concepts through observing the Judge,

    “The biggest social psychological principle that I was able to see while watching the Judge’s court was his high emotional intelligence. The book breaks up emotional intelligence into four different parts. The first was the ability to perceive your emotions and the emotions of those around you. This was definitely something that was practiced by the Judge as he was able to recognize the way that others felt and how his emotions were influencing his judgements. To have awareness of how others feel is a great quality to have in the court system because how people feel influences the way that people perceive fairness and furthermore their actions after the court hearing. The second type of emotional intelligence is facilitating thought which is described as the ability to generate an emotion and then regulate that emotion. The third is understanding emotion which explains that a person is able to acknowledge an emotion and recognize when that emotion switches from a simple emotion to a complex emotion. The last type of emotional regulation is called managing emotions which explains that an individual is open to theirs and others emotions and they are able to understand them, along with being able to grow and develop because of them. I saw the Judge demonstrate each portion of emotional intelligence in his courtroom and that is one of the reasons why so many court customers perceive him as fair.”

    Due to the experiential nature of the courtroom project, students also gained additional insights beyond the course assignments. One student shared their initial apprehension about the court environment, stating,

    “I had a lot of preconceived notions about court even though I had never been to a court before in real life. I thought the layout made it easy for people to learn observationally because they are seated in the courtroom until it is their turn to speak. I liked this composition because I feel like it helped them know what to say.”

    Another student noted,

    “While spending time at the court, I was surprised that the people who worked there were normal people. They even laughed and talked and joked around with each other. I wished that other people got to see the human side of the judge, the cops, and the staff who work in the court and not as people who want to give you the harshest sentence that they can.”

    And lastly, students learned about human nature and increased their awareness of the complexity of human behavior. One student reflected,

    “While at court, I saw a case where a son was trying to protect his mother from a violent ex-boyfriend by hitting the man with a skateboard. He was charged with very heavy charges. Through previous meetings and litigation between the Judge and the attorney, they ended up dropping the charges and allowed him to pay a fine of $100 and he got his bail money back. I saw the family hugging and crying in the parking lot. I was very touched by this experience that sometimes people may commit crimes to protect others.”

    Since the courtroom assignment, I have applied the principles of community-based participatory research as a base for experiential learning assignments in other courses. Recently, I taught a Health Psychology course with an emphasis on the health benefits of social inclusion. I targeted on-campus clubs as an effective venue to study student social engagement. I followed the lead of the stakeholder, the campus clubs staff coordinator, to design an assignment that would benefit both the students and the campus clubs. Like the judge, the clubs coordinator was also interested in data to understand how the clubs were functioning. Students selected clubs and administered surveys to assess the effect of club attendance on outcomes like well-being, happiness, and belonging. The assignments followed a similar pattern as the courtroom study, with a guest lecture from the clubs coordinator, instruction on research methods, data collection, and a final presentation to the stakeholder. I believe this format could be applied to courses that cover other types of content, and I plan to implement a similar project in my Psychological Statistics course.

    In conclusion, incorporating community or campus engagement into the classroom provides invaluable experiential learning opportunities for students, fostering a deeper understanding of academic concepts through real-world application. By engaging with community stakeholders, such as local courts or campus clubs, students not only gain practical experience but also contribute to the community, creating a mutually beneficial relationship. These experiences challenge students' preconceived notions, enhance their critical thinking, and provide them with a nuanced perspective on human behavior and social dynamics. Ultimately, integrating experiential learning through community or campus engagement enriches the educational experience, promoting both academic growth and social responsibility, and you don’t even have to dance the cha-cha in front of your students.  

    References

    1 Kuh, G. D. (2012). High-impact educational practices: What they are, who has access to them, and why they matter. Peer Review, 14(3), 29-30.

    2 Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Prentice-Hall.

    3 Fontaine, S. (2006). Journal of Higher Education Outreach and Engagement, 11(2), 45.

    4 National Center for State Courts. CourTools: Trial court performance measures. https://www.ncsc.org/courtools/trial-court-performance-measures

  • 20 Dec 2024 4:39 PM | Anonymous member (Administrator)

    Dina Gohar
    University of Michigan-Ann Arbor

    People with a growth mindset—the belief that intelligence and other abilities can develop through effort—enjoy numerous benefits inside and outside of the classroom (Dweck, 2006, 2017). For example, students with a growth mindset are more likely to persist when challenged and to succeed in college with higher grades and course completion rates (Yeager et al., 2019), and even enjoy greater well-being overall (Tao et al., 2022). A growth mindset can be particularly beneficial for students from historically underrepresented or marginalized groups who are likely to face systemic barriers, such as those related to race, socioeconomic status (SES), or first-generation status (Fink et al., 2018). Such students often encounter additional hurdles that can impact their educational journey, from stereotypes to limited resources, which a growth mindset can help them be resilient enough to overcome. For example, an online intervention encouraging high school students to view intellectual abilities as capable of growth through effort, new strategies, and seeking help significantly improved grades among lower-achieving students, especially when the school environment aligned with the growth mindset message (Yeager et al., 2019).

    However, a growth mindset is NOT a panacea, and not having one is far from the only reason why marginalized students may struggle academically. Overemphasis on individual effort might obscure structural and systemic factors that impact student success, such as discrimination and limited resources. Although many studies suggest that students who are marginalized or from low-SES backgrounds may benefit most from adopting a growth mindset, some studies suggest the opposite (Sisk et al., 2018). For example, Bernardo (2020) found that a growth mindset was positively related to learning outcomes only among higher-SES students, highlighting that social resources must support students’ efforts to improve for the benefits of a growth mindset to be realized. How do we ensure these interventions are successful for the students who stand to benefit from them the most?

    Growth mindset interventions can sometimes backfire for disadvantaged students if they are not implemented carefully enough to avoid encouraging self-blame or shame for failure. As Hoyt and Burnette’s (2020) “double-edged sword model” highlights, growth mindset messaging can yield positive outcomes through increased self-efficacy (and reduced social essentialism), but it can also indirectly predict adverse outcomes through greater self-blame and adopting personal responsibility for problems. Therefore, in stigmatizing contexts, differentiating responsibility for a problem from expectations for managing that problem is integral to positive growth mindset-related outcomes. For instance, frame marginalized students’ academic difficulties within broader societal contexts. Instead of saying “Your grade is a result of your effort,” you could say “Your grade reflects your current performance, which can be negatively impacted by systemic inequalities that we can hopefully address together.” Rather than saying “if you work harder, you will improve,” acknowledge that “improvement often requires both effort and support. Let's explore the resources available to you, such as tutoring services, my office hours, and study groups, that can complement your hard work so it is reflected in your final grade.”

    In addition to promoting a growth mindset in our students without encouraging self-blame or false hope, we should discuss and actively address systemic barriers that can affect their performance to the best of our ability. We can truly empower our students by providing the resources they need to reap the benefits of a growth mindset while advocating for structural changes that support all students.

    Your Course Syllabus Can Cultivate A Growth Mindset

    As educators, we have the power to cultivate a growth mindset in our students without encouraging self-blame for failure, which can facilitate their resilience and success. How can we do so through small changes in our classes so that we don’t burn ourselves out? My online experiment (N = 200) suggests that an easy way to cultivate a growth mindset in the classroom, whether online or in person, is through our course syllabus, the first point of contact with our students. Compared to those randomly assigned to read a conventional syllabus, students who read a growth mindset-based syllabus scored significantly higher on Dweck’s Growth Mindset Scale afterward (Gohar, 2024). Moreover, students perceived a typically dreaded Research Methods course and its instructor more positively if the syllabus was growth mindset-based, much like students perceive a course and its instructor more positively if the syllabus is detailed (Saville et al., 2010), learner-centered (Richmond et al., 2016), inclusive (Fuentes et al., 2020), and warm in tone (Gurung & Galardi, 2021). More specifically, students reported more desire to take, and less anxiety about, a growth mindset-based course, which was also perceived as less challenging. This was especially the case for students who identified as female or lower income, who even expected higher grades in the growth mindset-based course, perhaps helping to mitigate stereotype threat. Students also perceived the growth mindset-based instructor as more qualified, reasonable, and nice, which is an added bonus.

    So, what does a growth mindset-based syllabus look like? Use language that emphasizes the potential for growth when describing your learning outcomes (e.g., “grow in your ability to think critically”), assignments, and course (e.g., “This course is challenging, but I truly believe that every student is capable of succeeding with enough effort and persistence, and it is my job to help everyone do well!”). I even include an explicit message that “studies suggest that the more you challenge yourself to learn, the more your brain actually grows! So, even difficult things like learning how to do research and statistics get easier as you get smarter over time,” which I highly recommend doing at least for research methods and statistics courses that can trigger math anxiety and stereotype threat (Luo & Chen, 2024).

    It is not enough to use growth mindset-facilitating language in your syllabus, though. Your course design needs to reflect growth mindset practices to show students that you really mean it. It can be hard for students to adopt and sustain a growth mindset if they don’t have the chance to make mistakes without hurting their final grades. Therefore, it is critical to use recursive assessment and grading that actually rewards students’ effort and improvement as well as their performance whenever possible. For example, students can earn an “effort score” that is incorporated into their final grades on assignments, as described in the figure below.

    Click here for a link to a copy of the post with figures.

    If you want to make sure that your students actually read the syllabus you put so much time into, you can have them annotate it with their questions or comments as a graded assignment at the beginning of the course, which I highly recommend.

    More Tips for Cultivating a Growth Mindset in Your Students

    How else can you help your students develop a growth mindset to reap the benefits while avoiding potential pitfalls? First, if you tend to believe that ability is fixed (you can find out by taking the short linked Mindset Quiz), work on cultivating more of a growth mindset yourself, because your mindset can impact your students’ mindsets and thus their performance. For example, STEM faculty with a fixed mindset inspire less student motivation and have larger racial achievement gaps in their classes (Canning et al., 2019). For maximal benefit, students’ growth mindsets must also be supported by their teachers’ own growth mindsets (Yeager et al., 2022) and, ideally, by peer norms, which can be influenced by teachers. Model a growth mindset for your students by sharing your mistakes and struggles and how you learned from or overcame them.

    Second, praising effort and process, rather than performance or ability, can effectively foster a growth mindset and improve student performance, especially after setbacks (Mueller & Dweck, 1998). For example, instead of saying things like "you're a really great writer,” say something like "I can tell you put a lot of thought into this paper." Rather than just praising students for performing well, thank them for the hard work they put into the assignment, highlighting the connection between invested effort and improved performance if relevant. Third, it can be helpful to normalize struggle and “failure” as part of learning and emphasize the opportunity for growth, especially in the face of adversity, which can even facilitate post-traumatic growth beyond pre-trauma levels (Tedeschi & Calhoun, 2004). Simply reminding students that “everyone makes mistakes” can go a long way to normalize them. However, remember to acknowledge the barriers faced by disadvantaged students to prevent them from blaming themselves for struggling and work to address those barriers whenever possible.

    Fourth, help your students develop self-compassion, especially if they tend to be overly self-critical when they make mistakes. Neff’s website, selfcompassion.org, has great free exercises for cultivating more self-compassion. Encourage your students to review their mistakes in a nonjudgmental fashion and to learn from them, and reward them for doing so. Fifth, students can be instructed to say “stop” and breathe when their fixed mindset voice is getting out of hand and to add “yet” to the end of their fixed mindset statements (e.g., “I can’t do this… YET!”). Finally, remember that a growth mindset isn’t about having “positive thoughts only” but about embracing challenges and learning from our mistakes because we believe in our capacity to improve.

    Best of luck as you carefully nurture a growth mindset in your students and empower them to achieve success inside and outside of the classroom as lifelong learners!


    References

    Bernardo, A. B. I. (2020). Social dominance goals, perceived socioeconomic status, and the academic achievement of Filipino students. Journal of Applied Social Psychology, 50(5), 269-282.

    Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM faculty who believe ability is fixed have larger racial achievement gaps and inspire less student motivation in their classes. Science Advances, 5(2), eaau4734. doi: 10.1126/sciadv.aau4734

    Dweck, C. S. (2006). Mindset: The new psychology of success. Random House.

    Dweck, C. (2014, November). The power of believing that you can improve [Video]. TED Conferences.

    Dweck, C. S. (2017). Mindset: Changing the way you think to fulfill your potential. Little, Brown Book Group.

    Fink, A., Cahill, M. J., McDaniel, M. A., Hoffman, A., & Frey, R. F. (2018). Improving general chemistry performance through a growth mindset intervention: Selective effects on underrepresented minorities. Chemistry Education Research and Practice, 19(3), 783-806. https://doi.org/10.1039/C7RP00244K

    Fuentes, M. A., Zelaya, D. G., & Madsen, J. W. (2020). Rethinking the course syllabus: Considerations for promoting equity, diversity, and inclusion. Teaching of Psychology, 48(1), 69-79.

    Gohar, D. (2024). A growth mindset-based syllabus improves students’ perceptions of taking and succeeding in research methods. Manuscript in preparation.

    Gurung, R. A. R., & Galardi, N. R. (2021). Syllabus tone, more than mental health statements, influence intentions to seek help. Teaching of Psychology, 49(1), 32-37.

    Hoyt, C. L., & Burnette, J. L. (2020). Growth mindset messaging in stigma-relevant contexts: A double-edged sword? Journal of Social Issues, 76(3), 645-667.

    Luo, Y., & Chen, X. (2024). The impact of math-gender stereotypes on students' academic performance: Evidence from China. Journal of Intelligence, 12(8), 75. https://doi.org/10.3390/jintelligence12080075

    Mueller, C. M., & Dweck, C. S. (1998). Praise for intelligence can undermine children's motivation and performance. Journal of Personality and Social Psychology, 75(1), 33-52.

    Richmond, A. S., Slattery, J. M., Mitchell, N., Morgan, R. K., & Becknell, J. (2016). Can a learner-centered syllabus change students' perceptions of student–professor rapport and master teacher behaviors? Scholarship of Teaching and Learning in Psychology, 2(3), 159-168.

    Saville, B. K., Zinn, T. E., Brown, A. R., & Marchuk, K. A. (2010). Syllabus detail and students' perceptions of teacher effectiveness. Teaching of Psychology, 37(3), 186-189.

    Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., & Macnamara, B. N. (2018). To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses. Psychological Science, 29(4), 549-571.

    Tao, V. Y. K., Li, Y., & Wu, A. M. S. (2022). Growth mindset and psychological well-being: The mediating role of personal growth initiative and self-efficacy. Journal of Happiness Studies, 23(3), 1187-1203.

    Tedeschi, R. G., & Calhoun, L. G. (2004). Posttraumatic growth: Conceptual foundations and empirical evidence. Psychological Inquiry, 15(1), 1-18.

    Yeager, D. S., Carroll, J. M., Buontempo, J., Cimpian, A., Woody, S., Crosnoe, R., ... & Dweck, C. S. (2022). Teacher mindsets help explain where a growth-mindset intervention does and doesn't work. Psychological Science, 33(1), 18-32.

    Yeager, D. S., Hanselman, P., Walton, G. M., Murray, J. S., Crosnoe, R., Muller, C., ... & Dweck, C. S. (2019). A national experiment reveals where a growth mindset improves achievement. Nature, 573(7774), 364-369.


  • 05 Nov 2024 9:33 PM | Anonymous member (Administrator)

    Jennifer Samson
    Queens University of Charlotte

    I love teaching Research Methods. When I accepted my present position, I enthusiastically agreed that this would be my primary teaching responsibility. (I think they might have expected me to run away screaming instead.) Our year-long methods sequence is unusual in that, every year, my 25 or so students propose, design, implement, and report on their own projects in any area of Psychology. Every year there are some unbelievably thoughtful, creative, and interesting projects. But also every year, I have to drag myself through grading the drafts. Even the “good” projects. Especially when I feel like I’m providing the same feedback on draft after draft with little to no improvement.

    How can we help students understand the writing process as iterative instead of “one [draft] and done”? How can we as instructors help students become effective critics of their own writing? How can we keep our grading load manageable and still provide students with the feedback they need? This essay describes trials and errors, lessons learned, and lessons I’m still learning in my search for the elusive answers to these questions.

    Spoiler: There’s not one easy solution.

    Background:

    The year-long research methods sequence I teach is required for all Psychology majors in their junior year at my small, liberal arts university. By the time they get to me, students have completed their first-year writing sequence as well as the specific course prerequisites, including Introduction to Psychology, Statistics, and a class we call Information Literacy – reading and writing in a professional Psychology setting, with a focus on literature review. Ultimately, the majority of my students are relatively well prepared for college-level work but are still learning to write the type of professional academic paper required for the course. It is also worth noting that a sizable minority of students each year take Information Literacy simultaneously with the fall semester of Methods and so need extra support in my course.

    The research methods sequence is centered around students’ individual empirical projects. In the fall, students complete a four-credit class where we delve into the study of research, emphasizing design for association versus cause/effect and critiquing for different types of validity (designing and critiquing research is an explicit goal for undergraduate students set out by the American Psychological Association, APA, 2023). Concurrently with the class, they complete a two-credit-hour lab where they write a proposal to identify a research question and propose methods to answer it. In the spring, students complete a second four-credit class where they collect data, analyze it, and revise/extend their proposal so it becomes a complete, journal-style empirical paper. They also present their work in a poster session at a local undergraduate conference. The completion of not only the proposal but the entire project and the opportunity for every student to present in a conference setting is a hallmark of our program.

    What I Tried:

    At the beginning of last academic year, I knew I needed to do something different. An influx of late transfers caused the class size to swell by almost a third, and I knew that, short of learning to clone myself, there was no way I could keep up with marking everyone’s drafts in a timely manner (see Ambrose et al., 2010, on the importance of constructive, timely feedback for student motivation). Meanwhile, I had already been looking at ways to increase student buy-in for writing as an iterative process and to increase students’ metacognitive skills as evaluators of their own work (see Ambrose et al., 2010; Bain, 2004). Therefore, I implemented the following procedures.

    I cancelled lab at key points in the semester (e.g., as outlines were coming together) to instead conduct 20- to 30-minute oral check-ins with each student individually during the pre-writing process. In these meetings, we discussed how the ideas were going to be organized within the paper. The meetings, and requiring outlines in the first place, hopefully got students thinking about their papers earlier in the semester than many would have otherwise and encouraged them to engage in prewriting organization rather than diving right into drafting, as many are prone to do. I returned minimal written feedback for these preliminary steps and marked primarily for completion; if students did the step thoughtfully and in a timely manner, they earned all the available points.

    So far, what I was trying was not very different from what I’d done previously. But then, after students turned in their drafts of each section (e.g., literature review, methods), instead of marking it and returning it, I asked them to complete a short self-evaluation. The open-ended questions on the self-evaluation prompted them to focus on the areas of content and organization, common mechanical issues, time management completing the draft, and goals for revision. I then met with each student one-on-one to discuss their drafts. In these 30-minute meetings, we read the draft together and marked some key suggestions using Word’s track changes and comments features. I often used their self-evaluation as a starting place, especially if they had identified strengths or areas for improvement similar to those I noticed. I made a point to not mark the whole paper, but targeted examples. For instance, if my suggestion was for the student to use more parenthetical and fewer narrative (“___ found”) citations, we edited one or two paragraphs together to show them what that might look like.

    At the end of these meetings, we completed the grading rubric (separate from the self-evaluation) together. Students left the meeting with their marked-up paper and (after my first round of meetings, where I learned it would be more efficient to record the score and send it on the spot) the completed rubric. Working together to evaluate the drafts not only got them graded more efficiently but also gave students ownership of their learning process and therefore, theoretically, more buy-in for learning (see Doyle, 2011). The assessment was now part of the learning process (see Bain, 2004).

    At the end of the semester, students submitted not only their final paper, but also a revision reflection (similar to a revise and resubmit letter to the editor) in which they described the feedback they received, what they changed, and what they didn’t change (and why). On this revision reflection, I prompted them to describe the feedback they’d received on each section of their paper and how they’d incorporated it (or not).

    Conclusions So Far:

    Overall, I would say my experiments were a success and the approach is moving in a good direction, although it may not be all the way there yet. (Will it ever be perfect? Probably not.) Many of the self-evaluations were thoughtful and, anecdotally, I believe more of the students at least registered and gave some thought to the feedback they received. By the end of the academic year, after meetings for the literature review, methods, and results, I noticed that students were doing more of the evaluating as we discussed the rubric for their discussion drafts, rather than waiting for me to tell them what score they earned. I asked the class for their thoughts and, even on the anonymous evaluations, most of them chose not to comment (I’ll take that as “no complaints”). One student did tell me that they liked having a meeting instead of a returned paper so they could ask clarifying questions.

    From a professor-workload point of view, this approach was exhausting during meeting weeks, when I often had 6-8 meetings per day for several days in a row, but it was generally much more efficient than grading and returning papers. There were a few students (fewer than 5%) who delayed scheduling their meetings and/or with whom I had to make special arrangements for an evening or weekend because athletics, jobs, or other outside commitments took up most of their days, but we made it work. In the future, I should probably be clearer that the onus is on the student to take the initiative and get these scheduled (aka your poor planning is not my emergency; schedule early to have enough choices that will work for you).

    In part because I was forced to stay on schedule, students got their feedback in a timelier manner, even though I spent about the same amount of time on each student (30 minutes to mark a paper vs. a 30-minute meeting). I’m hopeful, although I only have anecdotal evidence thus far, that the feedback was clearer. For example, instead of writing a comment like “be sure to clarify the main idea of this paragraph,” I was able to ask students face to face, “What’s the main idea of this paragraph supposed to be? Yes. Write that.” I am also hopeful that feedback was deeper. It’s easy when I’m reading a paper to get caught up in marking the details, but I found that with a one-on-one conversation, I could focus more on discussing the bigger picture of organization and what points they were trying to make. One area where I saw a marked change was in reference formatting; in a face-to-face situation, I could point to a correctly formatted example and say, “This is correct. How is it different from this other one [that has an error]?”

    In short, I will keep this self-evaluation and oral feedback approach, but with some tweaks. First, I will likely spend some time scaffolding useful self-evaluation, so maybe more students will use the self-evaluation to their best advantage instead of (as I’m sure some did) seeing it as another box to check. For instance, I might do the first self-evaluations in class on the day after drafts are due and maybe show them some of my self-evaluation process on nearly-complete papers. I might also add another meeting, even earlier in the process as students are collecting their potential sources. The biggest change, though, is timing. Last year I cancelled lab on the day the draft was due and held meetings then. This year, I’ll move meetings to the week after the draft due date. I think meeting as a lab on the day the draft is due will allow me to get the students started on next steps more efficiently and having a gap between due date and feedback will allow me to do more skimming ahead and preparation for more effective one-on-one meetings.

    Nothing I’ve written about here is ground-breaking or even particularly innovative. But, sometimes it’s difficult to break from the way we were taught or the way we’re used to doing things. I hope that by sharing my journey so far I might contribute to the conversation as we, as individuals and as a field, strive for that magic solution that will be sustainable for us but still provide our students with the best possible learning experience.

    References:

    Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. Jossey-Bass.

    American Psychological Association [APA]. (2023). APA guidelines for the undergraduate Psychology major. Version 3.0. American Psychological Association. https://www.apa.org/about/policy/undergraduate-psychology-major.pdf

    Bain, K. (2004). What the best college teachers do. Harvard University Press.

    Doyle, T. (2011). Learner-centered teaching: Putting the research on learning into practice. Stylus Publishing.


  • 09 Sep 2024 8:46 PM | Anonymous member (Administrator)

    V. N. Vimal Rao
    University of Illinois at Urbana-Champaign

    “Thinking like a scientist involves more than just reacting with an open mind. It means being actively open-minded.” – Adam Grant

    Towards the end of my graduate schooling, I was at a department party and found myself chatting with a first-year student about their research interests. At one point I noticed a chain that they were wearing, and asked them about it. They reacted shyly, almost embarrassed that I had noticed. The chain included a religious symbol. I asked them about how their beliefs inform their research interests, and they seemed surprised at the question. They simply responded, “I’m not sure religion and science have much in common.”

    As a child of immigrants, I grew up in the United States perceived as a ‘hyphenated’ American. As a child, I often saw two versions of myself – the ‘American’ version that most people saw, and a second almost secret version disclosed with family or at cultural events. As an adult, I’m too tired to pretend to be anything other than fully me in everything that I do.

    It is in this spirit that I firmly disagree with my friend that religion and science do not have much in common. If you are both religious and scientific, then they have everything in common – they have you in common.

    It is from this value that one of my projects draws inspiration. I am Hindu. I enjoy reading and learning about the BhagavadGita and other Vedantic works. In the spirit of being fully me – an educational psychologist and a Hindu – I realized that I could borrow pedagogical insights from the BhagavadGita to help me teach my students statistics.

    It might seem odd at first to think that modern psychological and educational research can learn a thing or two from religion. But let’s think about it as psychologists:

    1. Many religions have existed for centuries;
    2. They typically include something of a benchmark set of attitudes, beliefs, and practices;
    3. They have encoded into their systems various pedagogies to support the propagation of these attitudes, beliefs, and practices; and
    4. These religious instructional systems have existed for far longer than psychological (or statistical) pedagogies.


    Clearly, religions are doing something right with their pedagogical strategies to have lasted centuries. We would be remiss not to at least consider religious pedagogies as potentially viable strategies for our educational objectives.

    In the case of the BhagavadGita, religious instruction is presented as a dialogue between a student (Arjuna) and a teacher (Krishna). Krishna is hailed as jagadguru, meaning ‘teacher of the world,’ a title bestowed to those teachers whose teachings have worldly impact. While I do not teach the same content as Krishna, surely I can study a jagadguru’s pedagogical strategies. 


    The Structure of the BhagavadGita

    The BhagavadGita is set within the great epic Mahabharata. Specifically, it occurs immediately prior to the outbreak of a war between two sides of a ruling family. In Chapter 1 of the BhagavadGita, Arjuna asks Krishna (who is serving as Arjuna’s charioteer) to take him in front of the opposing army. Arjuna then sees his grandfather, his teachers, and many friends and relatives lined up with the opposing army, and has a crisis of conscience. Arjuna finds himself confused, anxious, and hopeless. It is from this despondence that Arjuna bows to Krishna, pleading for Krishna’s help. 

    Chapter 1 of the BhagavadGita is thus titled “Arjuna Vishada Yoga,” or “Arjuna’s despondency.” Throughout this entire chapter, Krishna stays silent. It is only after Arjuna seeks Krishna’s help, at the start of Chapter 2, that Krishna begins his instruction.

    The implication is clear, and is similar to the old English proverb that ‘You can lead a horse to water, but you can’t make it drink.’ Arjuna would not have been ready or willing to receive Krishna’s instruction prior to experiencing confusion and anxiety. Those feelings created the motivation necessary for Arjuna to steadfastly receive Krishna’s instruction, and earnestly imbibe it into his being. For Krishna’s teachings to have had any effect, Arjuna needed to be ready to do three things: devotedly listen, reflect and contemplate on the teachings, and assimilate the knowledge into his being. Krishna knew Arjuna was ready to do these things only after Arjuna asked for Krishna’s help. 


    My Class

    I teach a very large (500-700 students per section) general education introductory-level statistics course. My students are not STEM majors. Most of them take my class because they have to: of the 1,019 students enrolled in Spring 2024, 602 responded to a survey, and 99% of them indicated that they took the course either as a major requirement or to fulfill a general education requirement. If it weren’t for those requirements, I might not have a course to teach.

    Consider these students' motivation. Do they plan to devotedly listen to the content, reflect and contemplate on it, and assimilate statistical thinking into their lives? Do they perceive any need to actually learn statistics? With few exceptions, the answer is no[1]. Prior to Arjuna asking Krishna for help, Krishna stayed silent. My students do not walk into my class because they think they need my help. Following Krishna's example implies that I too should stay silent.

    Despite staying silent, Krishna still played an important role in setting the stage for Arjuna’s desire to learn. When Arjuna asked Krishna to drive closer to the opposing army, Krishna could have driven anywhere. However, Krishna chose to drive right in front of Arjuna’s grandfather. By doing so, Krishna created the setting from which Arjuna’s plight would manifest, thereby increasing Arjuna’s motivation to learn and providing the opportunity for instruction.

    Like Krishna taking Arjuna right in front of his grandfather, I too decided that I could create an environment to set the stage for students to develop motivation to learn statistics. To achieve this, I would eschew discussing the syllabus until Day 2, and instead spend Day 1 telling a series of stories that require statistical thinking to resolve, hoping that students might relate to one, be unable to solve it, and thus develop a desire to learn statistics.

    On Day 1, I tell my students seven different stories, one for each of the content units in the course. Here, I will retell the last of the seven stories I tell my students. This is a true story. 


    My Day 1 Story

    It was a warm August night in Chicago. My grandfather was in the hospital, Day 3 of the current hospitalization due to dizziness and dehydration. The hospital only allowed visitors until 8 p.m., and as I was about to leave, the night nurse came in with some medication for my grandfather.

    The nurse had a pill for my grandfather’s hypertension – he had high blood pressure. I asked what my grandfather’s most recent blood pressure was. “110,” the nurse told me.

    110!? That was not normal for my grandfather. My grandfather measured his blood pressure every morning. He would complete his shower, get dressed, come to the dining room table, comb his hair, pray, measure his blood pressure and pulse, take his medications, and then only begin eating his breakfast. 

    I had spent the last year with him, bearing witness to the BP cuff beeping, inflating, and slowly deflating before giving my grandfather the data for the day, which he recorded in a notebook. Standing in the hospital, talking to the night nurse, I could not remember a single day that my grandfather’s blood pressure was as low as 110.

    I told the nurse that 110 was lower than my grandfather's typical blood pressure. The nurse explained that they always give the medicine if a patient's blood pressure is above 130 and never give it if the blood pressure is below 100. When the blood pressure falls between 100 and 130, it is at the nurse's discretion, and they typically give the medication anyway.

    With a blood pressure of 110, did my grandfather really need the medication? I didn't think so, but the nurse did. If I were wrong and he really did need the medicine, his blood pressure would skyrocket overnight, and he would crash. If, on the other hand, we gave him the medicine but he really didn't need it, his blood pressure would plummet overnight, and he would crash.

    How would you make a decision about whether to give the medication or not? Would you feel confident about the decision you were making? Would you simply relinquish decision making power and allow the nurse to do whatever they think, even though you know your family best?

    I decided to tell the nurse not to give the medicine, and I felt confident my decision was correct. Despite no medical training, I felt confident because I approached the problem by thinking statistically, and statistically, the answer was clear.

    You are in this class to learn how to think statistically. You are in this class to learn how to apply statistical thinking to the decisions you make in your life. You are in this class because statistics is the science of variability and decision making under uncertainty, and by thinking statistically, you will be better able to navigate the uncertainty you will undoubtedly face in your lives.

    You are in this class so that I can teach you how to think just like me if you ever have to face a situation in a hospital room like I did with my grandfather, and do so calmly and confidently.


    Students’ Reactions to the Story

    When I tell this story to my students on Day 1, I do not tell them my solution. My goal is to get my students to imagine themselves in the scenario, and to think about what they would do. I simply tell them that I know statistics, and that statistics allowed me to calmly make a decision without anxiety, without helplessness, and without despondence. If they find the situation stressful or unnerving, then they need to learn statistics, and I will help them learn how to think statistically. On Day 2 (and again in my last lecture), as a summary of the entire course, I do tell them the solution[2] – this is similar to the structure of the Bhagavad Gita, in which Krishna summarizes the entire teaching in both Chapter 2 and the final chapter, Chapter 18.

    To evaluate whether this intervention was successful at setting the stage for students' learning, I conducted three surveys throughout the term – one immediately after the Day 1 lecture, one at the midpoint of the semester, and one immediately before my final lecture. With over 500 responses at each of the three timepoints, I am still in the process of fully analyzing the data. However, it appears that the intervention was indeed successful at motivating at least some students. This is evident from the following example responses to survey items:

    “[The problems] made me want to get an understanding of stats.”

    “It convinced me we need to use statistics for the answers.”

    “[The problems] made me want to learn what statistics does.”

    “I understood I needed to learn stats.”

    Additionally, in the survey data collected prior to the last day of class, this story about my grandfather was by far the one students best remembered and saw as most important. While there was another story that students said they could imagine themselves in at higher rates (a story about making a causal inference on whether compression socks can improve your 5k time, told in the context of my sister and me running together), the fact that students remembered the story about my grandfather's blood pressure nearly three months after the first lecture, and without reinforcement, is, I believe, evidence of the story's efficacy in imparting the necessity and value of learning statistics.

     

    Sources of Pedagogical Inspiration

    This is just one small example of how I strive to draw pedagogical inspiration from anywhere I can, even religion. I do not believe this strategy is unique to a single religion, nor any single source. Another example of pedagogical inspiration I have drawn from religion is from Vedic mahavakyas, i.e., great sayings. These great sayings, such as aham brahmasmi, meaning "I am brahman," serve a role no different from that of great sayings in other religions. Pedagogically, these short sayings are easy to remember but packed with meaning. They serve as a psychological anchor for content knowledge and further inquiry. What, then, are our field's great sayings? From this inspiration I began teaching my students to say "Who's not here?" every time they see a graph, in an attempt to foster a critical statistical literacy habit of mind: questioning information about the sample and sampling strategy, especially regarding its representativeness and appropriateness for generalization.

    It might seem odd to seek pedagogical inspiration from religion, but it does not seem so odd to me to keep an open mind in terms of potential sources of pedagogical inspiration. Who knows from where revolutionary new ideas can come?

    I believe the best way to support new development and innovation in the teaching of psychology is for each of us to be fully ourselves in all contexts and at all times. Draw on all of your funds of knowledge and apply them generously to your work. Who knows where that may lead? Perhaps, and with any luck, it will lead us forward.

    [1] Only 7% of the 602 students who responded (out of 1,019 enrolled) in the Spring 2024 semester indicated that, if Statistics were not a required course, they would take it because they believe it is important to learn how to think statistically. 22% indicated that they would take Statistics even if it were not required because they believe it might help them get a job.

    [2] Based on past data, and assuming my grandfather was in a stable condition consistent with how he usually felt over the previous few months, I predicted that his blood pressure should be around 140 – this is a simple model based on the mean. Accounting for variability, I knew that even if he was in stable condition, his blood pressure wouldn't be exactly 140 – it could be as low as 120 or as high as 160, the typical amount of variation in his blood pressure. Based on this knowledge, I estimated that the RMSE was about 10, and that a middle 95% prediction interval for his blood pressure should run roughly from 120 to 160. If my grandfather was feeling like he normally did, his blood pressure should have been about 140. His actual blood pressure was 110. The prediction error was -30. The z-score for the prediction was -3. The observation was well outside the middle 95% prediction interval for what I expected my grandfather's blood pressure to be. Either my grandfather was feeling completely normal and this measurement was an extraordinary coincidence, or the hypothesis that my grandfather was feeling like he normally does was not a good hypothesis.
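    The arithmetic in this footnote can be sketched in a few lines of code, using exactly the numbers given in the text (this is an illustration of the statistical reasoning, not medical advice):

```python
# Simple mean model for the blood-pressure story.
mean_bp = 140    # typical morning systolic blood pressure (the mean model)
rmse = 10        # typical size of the prediction error (RMSE)
observed = 110   # the reading reported by the night nurse

error = observed - mean_bp                            # prediction error: -30
z = error / rmse                                      # z-score: -3.0
interval = (mean_bp - 2 * rmse, mean_bp + 2 * rmse)   # middle ~95%: (120, 160)

# The observed value falls well outside the 95% prediction interval,
# so "my grandfather is feeling normal" is not a good hypothesis.
outside = not (interval[0] <= observed <= interval[1])
```

    Everything a student needs for the decision is already here: a model (the mean), a measure of typical variability (the RMSE), and a rule for judging whether an observation is surprising.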

  • 16 Aug 2024 5:20 PM | Anonymous member (Administrator)

    Melissa C. Rothstein
    The University of Rhode Island

    Matriculating into a Behavioral Science Psychology PhD program at 21 years old, I eagerly joined the Health and Alcohol Related Problems (HARP) lab to work under the guidance of Dr. Amy Stamates. Fast forward to my third year of the program, and I am not just a student but the instructor of record for an advanced statistics and research methods course at the University of Rhode Island. My passion for research methods and statistics, coupled with my steadfast dedication to ongoing learning, empowers me to connect with students and foster a dynamic and engaging learning environment. Despite still blending in with undergraduates (and, at times, being mistaken for one), I've honed the skill of blazer camouflage: an invisibility cloak for looking my age in academia.

    I currently teach Applied Methods in Psychological Research (a 400-level course), where students undertake the challenge of crafting a psychological manuscript comparable to a published journal article. With a class size typically comprising around 15-20 students, this manageable number enables me to provide personalized attention and facilitate hands-on learning experiences tailored to students' needs and interests. Throughout the semester, students engage in the collection, cleaning, and analysis of empirical data. The culmination of their efforts results in a written manuscript, which is showcased orally as a presentation at the end of the semester. The assignments described below have been personally developed, drawn from my undergraduate experience at SUNY Purchase College, or obtained from past instructors at the University of Rhode Island.

    The Data & The Data Cleaning 

    We utilize Qualtrics to administer a survey to undergraduate students, encompassing various questionnaires chosen by students, covering topics such as happiness and exercise. To enhance the students’ practical skills, I incorporate demonstrations on employing Qualtrics effectively for survey administration. This includes guidance on designing well-structured questionnaires and navigating various features within the Qualtrics platform. Students are explicitly informed that the data collected is solely for educational purposes, as it lacks approval from the Institutional Review Board (IRB) for broader dissemination. Following the initiation of data collection, students undertake cleaning the dataset generated by Qualtrics and coding the questionnaires to prepare for subsequent analyses addressing their research inquiries. Instruction covers data cleaning techniques (e.g., addressing normality, outliers, and missing data) and coding procedures (e.g., sum scores, reverse scoring). Post data cleaning and prior to conducting analyses for their research questions, students are introduced to and explore preliminary analyses, including missing data analysis and reliability analysis.
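    As a concrete illustration of the coding procedures mentioned above, reverse scoring and sum scores for a 5-point Likert scale can be expressed in a few lines of code (the response values and the choice of which item is negatively worded are hypothetical):

```python
def reverse_score(score, low=1, high=5):
    """Reverse-score a Likert response so that 1 <-> 5, 2 <-> 4, etc."""
    return (low + high) - score

# One participant's responses to a three-item scale; suppose the
# second item is negatively worded and must be reverse-scored
# before the items are summed.
responses = [4, 2, 5]
coded = [responses[0], reverse_score(responses[1]), responses[2]]
sum_score = sum(coded)  # 4 + 4 + 5 = 13
```

    Walking through an example like this makes clear why skipping the reverse-scoring step silently corrupts every downstream analysis.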

    The QMRI

    QMRIs (Question, Method, Results, Implications) serve as a valuable tool for conducting literature reviews and crafting the introduction section of a manuscript. I first encountered the QMRI during my undergraduate years at SUNY Purchase College, where it played a pivotal role in my understanding of scientific writing. Now, I incorporate this assignment into my own course. More details can be found here: https://www.purchase.edu/live/files/1244-the-literature-reviewpdf. Students are provided with a template to answer key questions based on the journal articles they read and cite in their introductions. QMRIs aid in paraphrasing and summarizing, proving particularly helpful for students in the manuscript writing process. The template comprises the following components:

    Q: What is the research question/aims? What is the hypothesis?

    M: What is the method (participants, measures, procedures)? What are the independent and dependent variables?

    R: What are the results of the experiment in lay terms?

    I: What are the implications of the results? Why is this experiment important?

    The IRB Protocol Form

    In my experience, students often gain theoretical knowledge about Institutional Review Boards (IRB) and ethics but lack practical immersion in the process of obtaining approval for an empirical study on human subjects. Consequently, a lab assignment in my course requires students to complete an IRB protocol form for their proposed study (even though gaining approval is not required for students to be able to carry out their research in the course). Research suggests that writing protocols has the potential to function as an educational tool in various domains, such as clarifying and refining research questions, conducting literature reviews, enhancing writing clarity, and ensuring adherence to ethical principles in research (Balon et al., 2019). In this assignment, students work in small groups, typically four to five members, to collaboratively fill out a protocol form and submit it for “IRB approval.” I review the protocols and provide feedback (though my feedback is not as thorough as what the IRB would provide). This exercise encourages students to thoughtfully consider the intricacies of their cross-sectional study, providing valuable insights into the steps researchers take to achieve ethical data collection.

    The Peer Review

    Peer review is an integral part of the learning process in my course because it offers students valuable feedback from both their peers and the instructor. This feedback includes constructive criticism, insights, and suggestions aimed at enhancing the quality of students' work. In fact, research shows that students who were more critical of their peers' writing tended to achieve higher grades on their own writing (Yalch et al., 2019). All students receive training on how to provide insightful and professional peer review feedback before this assignment. Upon selecting research topics and receiving instruction on providing peer reviews, students are grouped based on shared interests. In these groups, students review each other's work in two rounds: (1) introduction and method sections and (2) results and discussion sections. Each group consists of three students, with each student reviewing the work of two peers. Written reviews consist of 1-2 pages, encompassing a summary of the research, an impression of the paper, and identification of any major or minor issues. Students are also asked to post comments and use tracked changes in the document while reviewing to provide more direct feedback. Additionally, students evaluate their reviewers based on the timeliness, professionalism, and helpfulness of the feedback received. This evaluation is factored into students' grades to account for the feedback provided and received.

    The Scaffolding

    Using scaffolding for a complex assignment such as writing a psychological manuscript has been beneficial for both students and myself. Scaffolding involves organizing assignments and course materials systematically to align with course learning objectives and ensuring clear communication of goals and processes to students. More information on this approach can be found here: http://www.brooklyn.cuny.edu/web/aca_facultywac/Workshops-AssignmentScaffolding-120412.pdf. At the beginning of the semester, students are tasked with formulating research questions and hypotheses based on survey topics. Subsequently, they develop an outline for their manuscripts before progressing to drafting the content. I divide the manuscript assignment into three parts: (1) a draft of the introduction and method sections, (2) a draft of the results and discussion sections, and (3) the final manuscript encompassing the title page, abstract, introduction, method, results, discussion, references, tables, and figures. Grading is reserved solely for the final manuscript. Feedback is provided on the drafts, concurrent with peer reviews, to guide students in scientific writing and enhance their skills. Training students on manuscript writing is time-consuming and requires a lot of thoughtful effort. My feedback is usually centered around creating coherent writing, using scientific language, and accurately reporting statistical analyses. To further support students in their learning journey, I encourage an iterative process for manuscript development via scaffolding as described previously. After the initial drafts and feedback stages, students are given an opportunity for revisions before submitting the final manuscript.

    Furthermore, the scaffolding approach extends to collaborative learning experiences, where students engage in discussions and workshops focused on key elements of manuscript writing. These collaborative sessions foster a supportive environment for peer learning. By integrating scaffolding and iterative practices, the aim is to empower students not only in producing high-quality manuscripts but also in fostering a comprehensive understanding of the research and writing processes inherent in psychological studies.

    The Statistics: Guess That Test

    Throughout the semester, students are introduced to a range of statistical tests applicable to addressing their research questions. Since students have already acquired the mathematical foundations for these tests in other research methods and statistics courses, the emphasis in this course shifts towards fostering a conceptual understanding. Before taking this course, students would have taken a Quantitative Methods in Psychology course, a Research Methods and Design in the Behavioral Sciences course, and related laboratory classes. The primary focus is on ensuring that students can confidently determine the appropriate statistical test for different research scenarios, emphasizing practical applications and decision-making in test selection. To enhance conceptual understanding, I employ interactive methods such as real-world examples and class discussions to allow students to apply their knowledge to practical situations. This reinforces their ability to discern the most suitable statistical test for a given research context. Here are a couple of example questions:

    A psychologist is interested in assessing whether there is a significant difference in anxiety levels before and after a therapy intervention within the same group of participants. What statistical test should the psychologist use and why?

    Solution: Paired-samples t-test, because it enables comparisons between related measurements (pre- and post-intervention anxiety levels) within the same group of participants, facilitating the assessment of whether the therapy intervention led to a significant change.

    You are investigating the relationship between stress levels (measured on a Likert scale) and satisfaction with life (measured on a Likert scale). What kind of statistical test would you run?

    Solution: Linear regression, because this analysis is suitable for predictive modeling (predicting the value of the dependent variable based on the independent variable) and analyzing relationships between continuous variables. However, given our utilization of cross-sectional data, students commonly employ terms like 'relationship' or 'association' to characterize such connections, rather than using language indicative of prediction or causation.
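    For instructors who want students to verify such answers numerically, the paired-samples t statistic from the first example can be computed from scratch (the pre- and post-intervention anxiety scores below are invented for illustration):

```python
import math

# Hypothetical anxiety scores for six participants, before and
# after the therapy intervention.
pre  = [30, 28, 35, 32, 29, 31]
post = [24, 25, 30, 27, 26, 28]

# The paired-samples t statistic is the mean of the paired differences
# divided by its standard error, with n - 1 degrees of freedom.
diffs = [a - b for a, b in zip(pre, post)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
t = mean_d / math.sqrt(var_d / n)
# A large positive t suggests anxiety dropped after the intervention.
```

    Seeing the formula built from the paired differences reinforces the conceptual point of the exercise: what makes the design "paired" is that the analysis operates on within-person change scores, not on the two groups of raw scores.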

    The Last Class: All About Graduate School 

    The majority of students enrolled in this class are actively in the process of applying to or preparing their applications for graduate school. As part of the curriculum, I dedicate the last class to discussing master’s and PhD programs in psychology to raise awareness and provide insight into the various opportunities available within the field of psychology graduate programs. I delve into key aspects of the application process, including important considerations when choosing between master’s and PhD programs, crafting a compelling personal statement, and securing strong letters of recommendation. The goal is not only to spread awareness but to provide students with the knowledge and resources needed for a successful transition to graduate studies in psychology. 

    In my role as an instructor, I strive to create an inclusive and collaborative learning environment where students feel inspired and empowered to actively engage in the field of psychology. Recognizing the diverse backgrounds and perspectives within the classroom, I encourage open discussions and harness the wealth of collective experiences. In addition to teaching Applied Methods in Psychological Research, I actively mentor students from my class and lab in their individual research endeavors. This mentorship extends beyond the classroom, providing students with personalized support and fostering a sense of community within the psychology department. As I navigate the dual roles of student and instructor, I remain committed to fostering a learning environment where curiosity thrives, critical thinking is encouraged, and each student feels empowered to explore the domains of psychological research.

    References

    Balon, R., Guerrero, A. P. S., Coverdale, J. H., Brenner, A. M., Louie, A. K., Roberts, L. W., … (2019). Institutional review board approval as an educational tool. Academic Psychiatry, 43, 285-289. https://doi.org/10.1007/s40596-019-01027-9

    Yalch, M. M., Vitale, E. M., & Ford, J. K. (2019). Benefits of peer review on students' writing. Psychology Learning & Teaching, 18(3), 317-325. https://doi.org/10.1177/1475725719835070

  • 02 Aug 2024 4:23 PM | Anonymous member (Administrator)

    Alexis Grosofsky, Beloit College
    Jordan R. Wagge, Avila University
    Jared G. Branch, University of Utah


    Empirical research articles are an ideal pedagogical medium for helping teach core methodological and statistical concepts to psychology students. Rather than relying on fabricated descriptions of tools like surveys, experiments, and statistical tests, instructors can use full (but short!) research reports to ground these topics in real-world applications. This essay describes an open education resource (OER) we created called "Psychological Literacy for Undergraduate Methods and Statistics" (PLUMS): a collection of brief empirical articles to teach methodology and statistics to psychology undergraduates. The articles are accompanied by targeted factual and discussion questions about the research and include information about the design(s), analysis(es), and any graphical/tabular displays. The methodological and statistical information is cross-referenced by "tags" (e.g., figures and graphs like bar graphs, statistical analyses like regression analysis, methodologies like convenience sampling, and subfields like social psychology), allowing instructors to select empirical articles to coincide with the topic(s) being covered.

    Research methods and statistics are the heart of psychology. Regardless of which subfield of psychology you select, each involves research and statistics. After all, our discipline is an empirical science. Norcross and colleagues (2016) collected data using their Undergraduate Study in Psychology (USP) questionnaire and found that (as of 2014) almost all baccalaureate programs required courses in research methods (98%) and statistics (96%). Thus, it is very important that we do a good job teaching students about these topics. This is a difficult task, given that many undergraduates find these courses daunting and often try to put them off as long as possible. Students often do not think that research methods and statistics are real psychology. Instead, "real psychology," to many undergraduates, is learned through content courses such as social, clinical, cognitive, or developmental.

    Despite what students may think, the American Psychological Association (APA) definitely believes that research methods and statistics are important. In their "Guidelines for the Undergraduate Psychology Major, Version 3.0," research methods and statistics are covered in two of the five goals:

    • Goal 2 "Scientific Inquiry and Critical Thinking" gives more attention to statistical reasoning than previous versions did;

    and

    • Goal 4 “Communication, Psychological Literacy, and Technology Skills” describes communicating effectively and demonstrating psychological literacy.

    Both of these goals are ones that our Psychological Literacy for Undergraduate Methods and Statistics (PLUMS) project addresses.

    The fact that research methods and statistics are so fundamental to our discipline, coupled with their recognition by the APA, underscores the need to enhance the teaching of research methods and statistics. Students should come away from these classes realizing how important research methods and statistics are to the empirical science of psychology. They should also come away from these courses being (and feeling) competent in their understanding of these vital topics.

    Our idea of having students read real-life examples of research methods and statistics in empirical articles is supported by work done by Lewandowski and colleagues (2017). They describe how they have students read an empirical article covering the design students are learning about before introducing that design to the students. The idea is that students will learn the material better if their interest is captured first, as demonstrated by Sizemore and Lewandowski (2011), who found that lessons about confounds were more successful in capturing students' interest when they were framed around clinical depression rather than memory.

    A book very similar to PLUMS was published by Milinki (2000, 2006). This text organized articles by methodological technique (e.g., survey research, quasi-experimental research). Once a technique has been selected, the instructor then selects from the 2-5 articles within that technique. Both the second and third authors have used articles from Milinki's book when teaching research methods and statistics. They observed that using actual empirical reports resulted in their students showing more engagement than when they did not use such articles. The text does have some limitations. First, it has not been updated since the second edition was published in 2006. Additionally, its organization requires instructors to select only by methodological technique (rather than by statistical technique or other relevant tags).

    We sought to not only update Milinki’s (2000, 2006) work but also to expand upon it. Our project involved the following: first, we selected recent articles for 15 subfields in psychology (see Table 1).

    Table 1

    Subfields included in PLUMS

    _____________________________________________________________________

    Cognitive

    Cross-Cultural

    Development

    Disorders

    Drugs

    Emotion & Motivation

    Learning

    Marketing

    Memory

    Neuroscience

    Personality

    Sensation & Perception

    Sleep

    Social

    Stress & Health

    ______________________________________________________________________

    Each article includes its full reference, making it easy to find the original article. Additionally, we wrote targeted factual and discussion questions about the research for each article. The factual questions are accompanied by the correct answer as well as the page number where the answer is found, and can serve as reading quizzes. For example, "How did the researchers collect data about age preferences?" [they used an Implicit Association Test (IAT), p. 957]. The discussion questions then go beyond simple factual questions and require students to think critically about the reading. For example, "Can you think of another way to conduct this type of research that does not involve using the IAT?" These can be used for classroom or LMS-based discussions. We also include information about the design(s), analyses, and graphical/tabular displays to allow for cross-referencing of information, giving instructors multiple ways to select empirical articles to coincide with whatever topic or technique is being introduced to their class.

    We envision instructors being able to not only complement current class topics but also to have additional options such as:

    • Assigning some of the articles as extra credit activities (e.g., having students answer the factual and/or discussion questions posed).
    • Using the articles to serve as jumping-off points for students to create a research proposal as a capstone project in a research methods and/or statistics course.
    • Enriching content courses with topical empirical articles related to the course’s subfield.

    The first author hand-selected 8-12 articles within each of 15 subfields of psychology, based on presumed undergraduate student readability and recency (publication year). We had undergraduate psychology students read and rate all of the selected articles. The students provided ease-of-reading ratings (on a 5-point Likert scale: 1 = easiest, 5 = hardest; M = 2.0, SD = .48) as well as interest ratings (again on a 5-point Likert scale: 1 = no interest, 5 = most interest; M = 4.5, SD = .71). We were successful in finding 5 articles for each included subfield that were rated as both relatively easy to read and interesting. In cases of conflict between the two ratings, we prioritized articles rated as easier to read over those rated as more interesting.

    We created a system for instructors or students to submit articles (and the corresponding metadata) so that the materials are regularly updated and enriched. Contributions will be reviewed by members of the project, making it a peer-reviewed process. Accepted submissions will be acknowledged as a contribution to the project that can be listed on an instructor's or student's CV.

    Empirical article libraries, such as the one we built, explicitly help improve student competence with methodology and statistics by using real, published data that students may encounter as fledgling producers or consumers of research. Research comparing students who read articles from our project (specifically selected to be brief, readable, and interesting) with students who use a traditional textbook would be relatively easy to conduct; in fact, several faculty members could collaborate on such research. We hope that including some of these articles will also make the topics of research methods and statistics classes (which can be dry) more enjoyable.

    We believe that incorporating empirical research will help to counter the perception that methods and statistics are boring (and isolated) subjects rather than the heart of the science of psychology. As instructors, we should be determined to have our students become better consumers of research/statistics and be more aware of what different research designs can (and cannot) tell us. This is especially important given that about 75% of students do not go on to graduate school (Lewandowski et al., 2017), and therefore must learn these skills as undergraduates. For instance, some of these discussion questions speak to applied issues (e.g., “How might we try to decrease bias against older individuals?”). Being psychologically literate will help all of our students become better citizens and better able to know what questions to ask when confronted with data.

    Our project is available at https://sites.google.com/beloit.edu/plums/home. We hope you find it useful.

    References

    Norcross, J. C., Hailstorks, R., Aiken, L. S., Pfund, R. A., Stamm, K. E., & Christidis, P. (2016). Undergraduate study in psychology: Curriculum and assessment. American Psychologist, 71(2), 89-101. https://doi.org/10.1037/a0040095

    Lewandowski, G. W., Ciarocco, N. J., & Strohmetz, D. B. (2017). Research methods 2.0: A new approach for today's students. In R. Obeid, A. Schartz, C. Shane-Simpson, & P. J. Brooks (Eds.), How we teach now: The GSTA guide to student-centered teaching. Retrieved from the Society for the Teaching of Psychology web site: https://teachpsych.org/ebooks/howweteachnow

    Milinki, A. (2000, 2006). A Cross Section of Psychological Research: Journal Articles for Discussion and Evaluation. Pyrczak Publishing.

    Sizemore, O. J., & Lewandowski, G. W. (2011). Lesson learned: Using clinical examples for teaching research methods. Psychology Learning & Teaching, 10(1), 25-31. https://doi.org/10.2304/plat.2011.10.1.25


  • 16 Jul 2024 5:31 PM | Anonymous member (Administrator)

    Lisa Dierker
    Wesleyan University


    My Story

    I was still in my 20s when I arrived at Wesleyan University, fresh off a 3-year post-doctoral fellowship at the Yale School of Medicine. When asked to teach a research methods course, I had what felt like a brilliant idea driving home from the grocery store one day. I would not use a textbook and I would not deliver lectures. My own classroom training had been ineffective and uninspiring. As I tell my students, I learned 20 different kinds of post hoc tests but didn’t understand when or why to actually use one. So, instead of drowning my own students in information the way I had been drowned, I decided to get them involved with large, real-world data sets and support them in conducting original research. I would teach them what they needed to know when they needed to know it and not before. Their own questions would drive the learning and I would help them to experience the research process from start to finish. Passion-Driven Statistics was born!

    Ten years later, it would become a multidisciplinary introductory statistics course at Wesleyan and a National Science Foundation-funded model serving thousands of students across disciplines and educational environments in the United States and internationally (e.g., Canada, Ghana, Nigeria, the Philippines, Peru, and the United Kingdom, with more still to come). Passion-Driven Statistics is now a widely used project-based curriculum that has been implemented as a statistics course, a research methods course, a data science course, a capstone experience, and a summer research boot camp. Liberal arts colleges, large state universities, regional colleges and universities, medical schools, community colleges, and high schools have all successfully implemented the model.

    The curriculum has been found to attract higher rates of under-represented minority (URM) students compared to a traditional statistics course, and students enrolled in Passion-Driven Statistics are more likely to report increased confidence in working with data and increased interest in pursuing advanced statistics coursework (Dierker et al., 2018). This project-based approach also promotes further training in statistics. Using causal inference techniques to achieve matched comparisons across three different statistics courses, students originally enrolled in Passion-Driven Statistics were significantly more likely to take at least one additional undergraduate course focused on statistical concepts, applied data analysis, and/or the use of statistical software compared to students taking either an activity-based psychology statistics course or a math statistics course (Nazzaro et al., 2020). In more recent research, Passion-Driven Statistics has been associated post-graduation with a higher likelihood of holding a job in which a primary responsibility includes working with data, greater confidence in working with data, and a higher likelihood of earning more than $100K annually (Dierker et al., in press).

    A New Role

    I always thought that I understood the ingredients that make Passion-Driven Statistics so empowering, and if asked, I would have told you about the opportunity to ask your own research questions, or I would have pointed to its just-in-time and need-to-know approach to content knowledge, or even its focus on technical skills in the service of disciplinary content and critical thinking. This year, I stepped back in to teach the course after several years away from it. Seeing it with fresh eyes more than 20 years after that first spark of inspiration made me realize that so much of its power comes from the simple act of new learners teaching newer learners.

    I used to be the “new learner,” understanding exactly what it felt like to encounter and struggle with the abstract concepts, disciplinary jargon, mathematical complexity, and arcane programming syntax involved in authentic research. Two decades later, I find that my role in the course has changed. I am no longer a new learner, and as much as I try to recreate that space and those feelings in myself, the “curse of knowledge” and my hard-won expertise hold me back. Now, I am recognizing an entirely new role in supporting those former Passion-Driven Statistics students who have generously stepped in as peer mentors, warmly guiding our newest generation of students in the same empowering way that I was able to guide students all those years ago. They are now the new learners teaching our newer learners from a place of empathy, passion, patience, high expectations, and mutual support. Every day in class, I see them using their new learners' superpowers to inspire others, to explain concepts by getting to the simpler, more digestible parts faster, and to understand students' perspectives in a deeply genuine way. I have loved watching them hone their skills in listening, adapt to the needs of individual students, and nurture those students in the same ways that meaningfully shaped their own educational trajectories when they played the role of the newer learner.

    Working this semester with some of the current peer mentors, Joyce Sun, Erin Byrne, and Luis Perez, has reminded me that Passion-Driven Statistics is as much a culture as it is a course. It is a space where no one needs to know everything, where we can all bring our best stuff, and where moral support and compassionate engagement allow our students to become the heroes of their own learning. Together, we take students out of their comfort zone and then love them through the fallout by creating an inviting classroom and an experience that gives students a safe and supportive space to get things wrong before they get them right.

    While my role as expert in this space may continue to be necessary and even valuable on rare occasions, it is also wholly insufficient. It is only together with new learners, our newer learners, and expert voices that we hold the necessary and sufficient ingredients to change lives in the data analytics space. I know, it sounds rather dramatic, but it is! 

    And if that were not enough, I am also marveling at the chorus that I have continued to hear from peer mentors across the years: that they “learn more as a peer mentor” than they did when taking the Passion-Driven Statistics course for the first time. Though secondary and post-secondary education continues to resist the power of learning through teaching, it is the most untapped, cost-positive tool that we currently have as educators. I believe that it is stronger even than the current promises of AI. Peer mentors may serve as volunteers, be paid through student work programs or training grants, or receive course credit as teaching assistants or through course designations (e.g., statistics education practicum). It does not have to be a promise for the future. We have everything that we need right now.

    The Next Step

    You might be interested to learn that my time away from teaching Passion-Driven Statistics has been spent designing a new project-based curriculum aimed at reimagining General Education. The goal of this new initiative is to expose students to a wide range of digital skills as they learn traditional disciplinary content. Within our digital “Introduction to Psychology” course, students explore concepts and content in the field of psychology through video storytelling, programming, data visualization, web development, design and more. This novel curriculum is aimed at solidifying new content knowledge, exposing students to modern digital tools, and providing them with the opportunity to create new learning artifacts.

    And with this, I have found myself a new learner again, not just conquering new content outside of my research subdiscipline, but learning new tools, new skills, new design principles and being useful again the way only a new learner can be. All this newness is of course accompanied by uncertainty, vulnerability, and the distinct possibility of utter failure. It is hard and that is what I love about it. I find myself feeling inspired again and eager to bound out of bed in the morning to face new challenges and to find the transformative experience that I first found in the Passion-Driven Statistics classroom all those years ago.

    I am always eager to network with passionate instructors excited about things we have not even imagined yet. Please feel free to reach out at ldierker@wesleyan.edu.

    Resources for Passion-Driven Statistics are available at https://passiondrivenstatistics.com/. Some that you might find particularly useful include a free e-book and translation code aimed at supporting the use of diverse statistical software. Resources for Digital Intro are available at https://digitalintro.wescreates.wesleyan.edu/. I encourage you to take advantage of our introductory psychology lessons and project videos on our YouTube channel. I am also happy to share a new project-sharing platform, OpenLab, where students can get inspired, post learning artifacts, and share their work and learning by creating a free digital portfolio. Follow us on Instagram or check us out on LinkedIn to learn more!

  • 15 Apr 2024 6:42 PM | Anonymous member (Administrator)

    Rachel T. Walker
    University of the Incarnate Word
    Click here for a link to the article with figures

    When I was an undergraduate student in biology, I decided to take a statistics course in psychology. I didn’t realize at that time that I would later be teaching this course in graduate school, and I couldn’t imagine that statistics would become one of my favorite courses to teach. Statistics can be a challenging subject for many students, but effective teaching can make a significant difference in how it's perceived (Pan & Tang, 2005). Over the years, I have taught this course using a variety of teaching strategies based on how the department structured the course. As I progressed in teaching this course, I wanted it to be flexible and responsive to students’ needs and to create an active and effective learning experience for teaching behavioral statistics.

    Over the years, I have continued to ask questions related to effective teaching strategies. What if I embedded videos or journal articles related to the real-world application of statistics? How could formative assessments such as quizzes, discussions, and polls during the course gauge students’ understanding? How can I use hands-on applications to illustrate the concepts of the material? Can I combine traditional lectures with interactive elements? Could I use a technology integration like SPSS (a software package used for the analysis of statistical data) to provide a hands-on project? How can I use scaffolded learning to break down complex statistical concepts? I will share some of the ways I have addressed these questions.

    What if I embedded videos and research related to the real-world application of statistics?

    I embed videos and research materials using a mixture of resources. Here are several examples of how I use short videos within the lecture. I show the videos during class, but students can also access them outside of class to confirm their understanding of the material.

    I incorporate various Crash Course Statistics videos into the semester, offering detailed examples that illustrate the practical applications of specific statistical concepts in our daily lives. Before the start of the semester, I reach out to students and share a Crash Course Statistics video that explains the purpose of statistics; for example, how meteorologists use statistical methods to analyze historical weather data, identify patterns, and make predictions about future weather conditions, and how companies use statistics to analyze consumer behavior, preferences, and trends. I incorporate additional Crash Course Statistics videos to provide a preview of specific statistical concepts, such as central tendency, before diving into the lecture content. For instance, before the central tendency lecture, I share a video that provides an overview of how these statistics can determine the center of both normal and skewed distributions.

    Crash Course Statistics Preview

    https://youtu.be/zouPoc49xbk?si=bBGlQy3SviHhirAH

    Mean, Median, and Mode: Measures of Central Tendency: Crash Course Statistics #3

    https://youtu.be/kn83BA7cRNM?si=arSRn7zQJddDpOhj
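    The central-tendency contrast these videos preview can also be demonstrated live in a few lines of Python; the scores below are invented for illustration, not data from the course:

    ```python
    # Demonstration: in a right-skewed distribution the mean is pulled
    # toward the tail, while the median and mode resist the outlier.
    # (Hypothetical exam scores, invented for illustration.)
    from statistics import mean, median, mode

    scores = [70, 72, 72, 74, 75, 76, 78, 98]  # one high outlier -> right skew

    print(mean(scores))    # 76.875 — dragged upward by the 98
    print(median(scores))  # 74.5   — middle of the ordered scores
    print(mode(scores))    # 72     — most frequent score
    ```

    Running the same three lines on a symmetric dataset shows the three statistics converging, which mirrors the normal-versus-skewed comparison in the video.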

    In one of the classes, I cover the four levels of measurement along with fundamental definitions and a few examples. Following that, I present a brief video offering visual insights into the distinctions among the measurement scales.

    Data Science & Statistics: Levels of measurement

    https://youtu.be/eghn__C7JLQ?si=mOoqzh-k-adUtNz6

    Another instance of using a short video involves the application of bar graphs. Initially, I instruct students on using the X- and Y-axes to depict data. Students acquire the skills to construct histograms and bar charts and to interpret their representations. Once they grasp the fundamentals of bar graphs, I introduce a video that provides real-world instances of commonly shared misleading graphs.

    How to spot a misleading graph

    https://youtu.be/E91bGT9BjYk?si=4Rn8keUpH5yGpVC2

    In addition to videos, I also distribute sections of a journal article, giving students the chance to practice reading and interpreting the results section. I first provide students with the abstract of the article to offer a brief overview, highlighting the main objectives, methods, results, and conclusions of the work. I then share the results section to show how results are structured for the statistic that relates to the lecture. This is usually the first time that students are introduced to reading the results of a scientific article related to psychology. This process helps students understand how statistics are reported in a journal article and the use of APA format. In other psychology courses, students will be required to summarize scientific articles and understand the methods and analyses.

    How could formative assessments such as quizzes, discussions, and polls during the course gauge student understanding?

    Quick quizzes are embedded throughout the lecture to test student understanding after each small section of content. These questions, taken from Cengage’s instructor materials for the textbook (Essentials of Statistics for the Behavioral Sciences, 10th ed., Gravetter et al.), could be multiple choice, true or false, or applied research questions. This process allows students to confirm they understand the course material before we move forward in the chapter.

    I incorporate discussion group assignments in the course to encourage active engagement among students. Throughout the semester, I offer six discussion board opportunities, where students submit their discussion topics and respond to posts from their peers.

    Here are several examples of sources that could be used for creating discussion group assignments:

    1) A majority of Americans have heard of ChatGPT, but few have tried it themselves. Integrate the information from the tables into your overall understanding of the material.

    https://www.pewresearch.org/short-reads/2023/05/24/a-majority-of-americans-have-heard-of-chatgpt-but-few-have-tried-it-themselves/

    2) How to defend yourself against misleading statistics in the news.

    Integrate the information in the video in your overall understanding of misleading statistics.

    https://youtu.be/mJ63-bQc9Xg?si=CqIubt8xxzHLFtx8

    3) Correlating Barriers to Medication Adherence With Trait Anxiety, Social Stigma, and Peer Support in College Students With Chronic Illness

    Integrate the information from the tables and results section into your overall understanding of the material.

    https://www.psichi.org/page/273JNFall2022#.Y8R15hXMK3A

    Directions for Response: Make sure your responses are well thought out and provide at least 3 sentences for each section. Respond to each of the following questions: Describe the topic provided by this resource. What did you find interesting? How would this relate to the real world? What did you find challenging to understand?

    Directions for Replies: Replies to colleagues should be at least 3 sentences as well. Reply to another student's post: replies can include your thoughts about the student's perception of the source or your additional thoughts on the topic related to the source.

    Moreover, I employ Poll Everywhere in diverse ways within a lecture. For example, at the beginning of a lecture on descriptive statistics, students are asked, “What type of social media is used the most in the U.S.?” Once students submit their thoughts, I show them the data on adults' social media use, which did not support most of their responses. However, I then provide data on teens' use of social media, which is closer to their responses. After the discussion, I lecture on descriptive statistics.

    Here are the links I shared from the Pew Research Center when discussing the changes over time.

    https://www.pewresearch.org/internet/2021/04/07/social-media-use-in-2021/

    https://www.pewresearch.org/journalism/fact-sheet/social-media-and-news-fact-sheet/

    https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/

    https://www.pewresearch.org/internet/2023/12/11/teens-social-media-and-technology-2023/

    I also use Poll Everywhere toward the end of a lecture to ensure that students understand the content. For example: “Rate your level of understanding of how to calculate an independent t-test.” If students respond that they are struggling, this provides useful feedback, and students can ask specific questions regarding their issue.

    How can I use hands-on applications to illustrate the concepts of the material?

    Here is an example of how I utilize hands-on applications in class. First, I teach students how to read and understand a research scenario, determining information such as the alternative hypothesis, the alpha level, and the variables provided.

    During a lecture, I present how to use that information in the 4-step process for hypothesis testing.

    1. State null and alternative hypotheses.

    2. Identify the critical region based on the alpha level, a one- or two-tailed hypothesis, and the degrees of freedom.

    3. Compute the statistics by showing all calculations.

    4. Draw out the distribution with the critical value and test statistic. Conclude and report the findings in APA format.
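    As one illustration of these four steps, here is a Python sketch of an independent-samples t-test computed by hand. The two groups and their scores are invented for illustration, not from the course materials; the critical value ±2.101 is the standard two-tailed table value for alpha = .05 and df = 18:

    ```python
    # The 4-step hypothesis-testing process, worked through in code
    # with hypothetical data for two groups of 10 participants each.
    from math import sqrt

    group_a = [5, 7, 6, 8, 7, 9, 6, 7, 8, 7]  # hypothetical scores, n = 10
    group_b = [4, 5, 5, 6, 4, 6, 5, 5, 4, 6]  # hypothetical scores, n = 10

    # Step 1: state hypotheses. H0: mu_a = mu_b; H1: mu_a != mu_b (two-tailed).
    # Step 2: critical region. alpha = .05, df = 10 + 10 - 2 = 18 -> t_crit = ±2.101.
    t_crit = 2.101

    def mean(xs):
        return sum(xs) / len(xs)

    def ss(xs):
        """Sum of squared deviations from the mean."""
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs)

    # Step 3: compute the statistic (pooled-variance independent t-test).
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = (ss(group_a) + ss(group_b)) / (n_a + n_b - 2)
    std_error = sqrt(pooled_var / n_a + pooled_var / n_b)
    t = (mean(group_a) - mean(group_b)) / std_error

    # Step 4: compare to the critical region and report.
    reject = abs(t) > t_crit
    print(f"t(18) = {t:.2f}, {'p < .05' if reject else 'p > .05'}")
    ```

    Having students verify each intermediate quantity (pooled variance, standard error, t) against their hand calculations is one way such a sketch can supplement the 4-step process.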

    After students take notes on this process, I provide them with another research scenario to solve during class. While students work through the 4-step process, I assist them along the way. For example, if a student wants to know whether they are on the correct path, they might ask if their critical region is correct. If the student is incorrect, instead of saying no, I ask them a question: I ask them to show me how they came to that conclusion. This process allows the student to find the correct answer in most cases. Students can then proceed to complete homework questions using the hands-on applications introduced in class. In many real-world scenarios, the use of statistical software and tools has become standard due to their efficiency, accuracy, and ability to handle large datasets. Manual calculation, however, can be more effective at conveying the step-by-step process, contributing to better conceptual understanding. And in some work situations, small datasets and manual calculations can be quicker than setting up and using statistical software.

    Can I combine traditional lectures with interactive elements? Could I use a technology integration like SPSS to provide a hands-on project?

    How can I alter my previous teaching of behavioral statistics? I did something I thought I would never do: I removed some of the content to give students the opportunity to learn how to analyze, interpret, and summarize their results by integrating technology. I use SPSS, but other types of software, such as Microsoft Excel, can be used. I've excluded lectures covering paired t-tests, two-way ANOVA, and regression. While these statistics are referenced in a lecture, students won't receive in-depth information about these subjects. Our department offers an elective course in Advanced Statistics, giving students the opportunity to delve into more intricate statistical concepts. In addition, this change allowed me to use those class times to embed a lab component into the lectures.

    I provide students with a preexisting dataset that I collected earlier, which they use in the lab component of the class. These data are employed for descriptive statistics, independent t-tests, one-way ANOVA, and correlations. I familiarize students with the broader subject of the research they will be examining, which involves personality and social networking. Subsequently, I clarify the variables and their measurements, such as gregariousness and the frequency of social media usage. In the lab, I guide them through the SPSS layout to enhance their understanding of the software's functionality and then provide a lab for each of the four types of statistics that will be analyzed in SPSS. For example, after I teach the independent t-test, I will have a lab focused on how to calculate the independent t-test in SPSS, how to interpret the outcome, and how to write up the findings in APA format. I provide handouts for the lab that include an introduction, the steps to complete in SPSS, an example of the output, and a paragraph describing the findings of the example. As I explain this process, students follow along by mimicking my steps. Subsequently, I task students with forming hypotheses derived from the measured variables. In the assignment, students are required to generate two hypotheses. I review each hypothesis before examining the analysis of the first one in the lab. Afterward, I provide feedback on the results of each student's first hypothesis before the conclusion of the lab session. Throughout the lab, I employ the Socratic method to facilitate learning and guide students in completing the assignment related to the second hypothesis outside of class.
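    For instructors without SPSS access, one of the four lab analyses can be approximated in plain Python. The sketch below computes a Pearson correlation between two hypothetical variables whose names echo the ones mentioned above; the scores are invented for illustration, not the author's dataset:

    ```python
    # Pearson correlation computed from definitional formulas, as a
    # software-free stand-in for the SPSS correlation lab.
    # (Both variables are hypothetical, invented for illustration.)
    from math import sqrt

    gregariousness = [2, 3, 4, 4, 5, 6, 7, 8]      # hypothetical trait scores
    social_media_hours = [1, 2, 2, 3, 4, 4, 6, 6]  # hypothetical weekly hours

    def pearson_r(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sp = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # sum of products
        ssx = sum((x - mx) ** 2 for x in xs)
        ssy = sum((y - my) ** 2 for y in ys)
        return sp / sqrt(ssx * ssy)

    r = pearson_r(gregariousness, social_media_hours)
    df = len(gregariousness) - 2
    print(f"r({df}) = {r:.2f}")  # APA-style report of the coefficient
    ```

    Because every quantity (means, sums of squares, sum of products) is visible in the code, students can cross-check each one against the hand-calculation formulas before comparing the final coefficient to software output.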

    How can I use scaffolded learning to break down complex statistical concepts?

    Teaching a course using scaffolding involves providing structured support to students as they learn new concepts, gradually removing this support as they gain mastery. Here's my step-by-step guide on how I implement scaffolding in a course:

    1) Assess prior knowledge: I use Poll Everywhere at the beginning of a lecture.

    2) Break down the information: I define terms, provide steps for analysis, and utilize quizzes.

    3) Provide guidance: I allow students individual practice in and out of the classroom.

    4) Encourage collaboration: I embed collaboration with the instructor and other students.

    5) Continuous assessment: I assess in-class calculations, Poll Everywhere responses, and quizzes.

    6) Gradual release of responsibility: I utilize the Socratic method in the lecture and lab.

    7) Applications to real-world tasks: I offer discussions on real-world situations and provide students with the opportunity to analyze, interpret, and report on existing data.

    8) Flexibility: I utilize an adaptive approach based on the varying levels of support students need.

    It is essential to teach statistics according to students' needs and to foster an active and effective learning experience. This includes using active learning methods, such as hands-on activities and engaging discussions, to keep students motivated and involved in the learning process. Additionally, enhancing understanding by presenting practical situations, connecting statistical concepts to real-world scenarios, and equipping students with proactive skills and problem-solving abilities are key objectives of this approach.

    In summary, teaching statistics in a way that addresses students' needs and incorporates active learning methodologies enhances the overall learning experience, making the subject more accessible, engaging, and applicable to students' academic and professional pursuits.

    I consistently adapt and modify the course design in response to student feedback and through collaboration with fellow instructors. This ongoing process makes teaching this course a continuous and rewarding experience. This story never ends… which makes this course still one of my favorite courses.

    Reference

    Pan, W., & Tang, M. (2005). Students' perceptions on factors of statistics anxiety and instructional strategies. Journal of Instructional Psychology, 32(3), 205.

