Society for the Teaching of Psychology: Division 2 of the American Psychological Association

E-xcellence in Teaching
Editor: Manisha Sawhney
Associate Editor: Annie S. Ditta

  • 05 Feb 2024

    Brooke O. Breaux
    University of Louisiana at Lafayette

    My department’s Psychological Science course has two primary objectives: 1) for students to start building the underlying knowledge they will need to become producers of psychology, and 2) for students to become more familiar with psychology as a major and a discipline. Psychological Science—designed for second-semester freshmen who have taken only an introductory psychology course—was integrated into my department’s 2020–2021 curriculum. Our intention was for Psychological Science to be taught as a traditional in-person course, but due to the precautions taken by my university in the midst of the COVID-19 pandemic, I taught this course first as a synchronous online course, then as a hyflex course in which students decided whether to attend class in person or online, and finally as a fully in-person course. Setting aside the complexities of teaching the course in formats different from the one we had in mind when developing it, Psychological Science itself is ambitious. At a minimum, students enrolled in this course are required to complete a pre-course and a post-course assessment, to take exams and/or quizzes, to construct an actionable plan for their professional development and career exploration, to earn a research ethics certification (i.e., Undergraduate Training on Human Subjects Research through the Collaborative Institutional Training Initiative [CITI]), to serve as participants in actual psychological research, and to write a brief APA Style research proposal. Faculty assigned to teach this course are required to cover topics ranging from psychology as a discipline—including degrees and careers in psychology—to psychology as a science—including research methods, research ethics, and APA Style writing.
When teaching this course for the first time, I made the incorrect assumption that if my goal was to have my students write quality research proposals, all I needed to do as an instructor was to provide them with the relevant research design concepts and a clear assignment rubric. What I learned that first semester was that such an approach was insufficient for many of my students and that they needed significantly more scaffolding to produce what I would consider to be a quality product.

    I have now taught this course six times and have dramatically changed the way in which I teach research methods. The approach I have developed is highly scaffolded, involving a sequence of three assignments. For each of the assignments, I have constructed explicit instructions, aligned the delivery of course topics with the assignment deadline, and eliminated unnecessary complexity; however, before diving into a more detailed discussion of my efforts to make the writing of an APA Style research proposal a much more integral part of the course, I thought it would be useful to discuss the development of our Psychological Science course, the integral role it plays in my department’s current curriculum, and our efforts to standardize certain elements of the course.

    Curricular-Level Enhancements: How Did We Get Here?

                When I was hired as a faculty member, undergraduate psychology majors did not take our Introduction to Psychology course. They took two courses designed for majors: one focused more on the basic science of psychology and the other focused more on the applied aspects of psychology. After several semesters of teaching the basic science half of this introductory course sequence, I advocated for a change in our curriculum. This change was supported by the faculty members teaching these introductory courses for majors, who agreed that our curriculum lacked a true research methods course, that we could do a better job of preparing students for our Psychological Statistics course, and that the order in which we introduced certain topics and assessed certain learning outcomes in our curriculum could be improved. To illustrate this last point, it is helpful to know that students enrolled in our basic science of psychology course for majors were typically freshmen who were taking the course during their first semester in college. This is the same semester in which the majority of students take their first general education English writing course, which requires them to write papers in MLA Style. Then, during the same semester, our basic science of psychology course for majors introduced students to the discipline of psychology; taught them about some of the major themes, concepts, and findings related to basic science topics, such as biological psychology and cognitive psychology; and required them to write an APA Style literature review. It is no wonder, then, that many students found it difficult to be successful in this course. I knew that our department could provide students with a better introductory learning experience and that such a change could also serve to strengthen our curriculum.

    We opted for a change that would require our majors to take a Psychological Science course, but only after taking our Introduction to Psychology course. The decision to have all students take our Introduction to Psychology course was supported by documents such as “Strengthening the Common Core of the Introductory Psychology Course,” in which the American Psychological Association (2014) explains that there is no evidence in the literature to suggest that having two introductory psychology courses—one for majors and one for nonmajors—is needed. The decision to create a new course for majors was supported by Stoloff et al. (2010), who suggest that departments that want a more robust Introductory Psychology course for their majors can modify other requirements and sequencing. “For example, departments that want to provide more early experiences might be better served by creating another course, such as one that addresses research methods (Stoloff et al., 2010), career preparation (Atchley, Hooker, Kroska, & Gilmour, 2012; Brinthaupt, 2010; Thomas & McDaniel, 2004), preparation for the major (Atchley et al., 2012; Dillinger & Landrum, 2002), or writing in the major (Goddard, 2003)” (American Psychological Association [APA], 2014, p. 20). To this end, we determined that students would benefit from the creation of a required Psychological Science course designed to target these specific objectives.

    Psychological Science is a critical course in our curriculum, providing students with a solid foundation in research methods and serving as a prerequisite for our required Psychological Statistics course. Because of its foundational nature in our curriculum and because it would inevitably be taught by a variety of faculty members, we determined that a minimum standardization of the course would be necessary to ensure similar outcomes across all students. Included in our standardization of this course is the requirement for all students to complete a brief APA Style research proposal, consisting of an APA Style title page, introduction with APA Style citations, method section, and APA Style reference entries; however, what we did not specify was a means by which faculty are to achieve this objective. There are two faculty members who regularly teach Psychological Science, but other faculty members are assigned to teach this course as needed. Everyone who teaches Psychological Science is considered a member of our standardization committee. The role of this committee is to address any issues a faculty member has with the standardization and resolve these issues by updating or changing the standardization.

    Course-Level Enhancements: What Am I Doing?

    Teaching psychological research methods to undergraduates who have only had an introductory psychology course is challenging, and requiring undergraduate students to complete research proposals within such a course can be overwhelming for everyone involved, especially when the class is not small (i.e., around 45 students), does not include a laboratory component, and takes place during a 15-week semester. In the context of research methods courses, project-based learning experiences, such as writing a research proposal, are generally encouraged; however, because the assignments described in the literature tend to focus on more advanced students (e.g., Chamberlain, 1986), I used trial-and-error to develop an approach that enables students to more effectively and efficiently produce quality research proposals. Interestingly, my intuitions ended up aligning with strategies that have been advocated by other instructors, such as reducing unnecessary complexity, especially as it relates to research design (e.g., Yoder, 1979), and offering students the opportunity to practice producing quality writing (e.g., Ishak & Salter, 2017).

    My initial approach to teaching this course was to provide lectures on the relevant topics in the order that they appear in the textbook, expecting students to incorporate this information into their research proposal document. My students found this part of the process exceedingly difficult, and this strategy resulted in research proposals that did not meet my expectations; therefore, I created a three-stage (i.e., Introduction Section, Method Section, and Appendices), step-by-step process for developing a research proposal. The instructions for each section are contained within step-by-step documents that are made available to students on our learning management system. To reduce unnecessary complexity, I reordered the course topics so that the concepts read about in the textbook and discussed in class are directly relevant to the part of the research proposal that students are working on at each stage, and students are explicitly told which step in the step-by-step documents the textbook readings and lecture materials relate to. I also created a grading form that aligns with the step-by-step document, which enables me to provide timely feedback at each stage.

    Anyone interested in how I have aligned lecture topics, APA course objectives, and development of an introduction section, method section, and appendices can access this information in the form of a poster I presented at the APS-STP 2023 Teaching Institute (Breaux, 2023). Actual resources that I used during the Spring 2023 semester, such as the step-by-step guidelines (e.g., “Introduction Section Instructions”) and grading forms (e.g., “Introduction Section Rubric”), can be found in the main folder I created for the APS-STP 2023 Teaching Institute (Breaux, 2023). Readers are invited to use or modify the resources provided for educational purposes only.

    I have also made the literature review portion of the introduction more manageable by requiring students to cite only four empirical research articles. This approach allows students to focus on basic skills, such as integrating information from different sources and using APA Style citations appropriately. It also helps students avoid both accidental plagiarism (often due to insufficient paraphrasing skills) and intentional plagiarism (often due to issues with time management). Another change that I made was to have the whole class focus on the same topic. I always try to select a topic that psychology undergraduates can relate to on a personal level, such as the extent to which college students believe psychological myths (e.g., Hughes et al., 2015) or the extent to which college students engage in self-care (e.g., Zahniser et al., 2017). I have found that topics related to the teaching of psychology and social psychology tend to be more accessible to students at this stage in their academic careers and that topics related to biological psychology and cognitive psychology are the most difficult. Pre-selecting a topic for the semester affords two primary benefits: Students can start reading the empirical literature sooner, and I can address issues specific to the topic during class time. My current approach to teaching Psychological Science shares similarities with Passion Driven Statistics, which is a project-based approach to teaching statistics that focuses on providing students with only as much information as they need to successfully complete the tasks they have currently been assigned.


    These improvements have made teaching psychological research methods to undergraduates who have only had an introductory psychology course feel much more manageable. Even though my evidence is primarily anecdotal, students seem less intimidated by the research proposal process because they are more aware of my expectations and the ways in which I want them to utilize the course materials when working on their research proposal. I hope that my experience can inspire other faculty members not only to continue improving their own courses to meet the needs of students but also to advocate for broader curriculum changes in their own departments, and I hope that what I have learned along the way can be used by others to improve how we teach psychological research methods to undergraduates.


    American Psychological Association. (2014). Strengthening the common core of the introductory psychology course. Washington, DC: American Psychological Association, Board of Educational Affairs.

    Breaux, B. O. (2023, May 23-24). Benefiting from explicit instruction, content alignment, and strategic simplification [Poster presentation]. APS-STP 2023 Teaching Institute, Washington, D.C., United States.

    Chamberlain, K. (1986). Teaching the practical research course. Teaching of Psychology, 13(4), 204–207.

    Hughes, S., Lyddy, F., Kaplan, R., Nichols, A. L., Miller, H., Saad, C. G., Dukes, K., & Lynch, A.-J. (2015). Highly prevalent but not always persistent: Undergraduate and graduate students’ misconceptions about psychology. Teaching of Psychology, 42(1), 34–42.

    Ishak, S., & Salter, N. P. (2017). Undergraduate psychological writing: A best practices guide and national survey. Teaching of Psychology, 44(1), 5–17.

    Stoloff, M., McCarthy, M., Keller, L., Varfolomeeva, V., Lynch, J., Makara, K., Simmons, S., & Smiley, W. (2010). The undergraduate psychology major: An examination of structure and sequence. Teaching of Psychology, 37(1), 4–15.

    Yoder, J. (1979). Teaching students to do research. Teaching of Psychology, 6(2), 85–88.

    Zahniser, E., Rupert, P. A., & Dorociak, K. E. (2017). Self-care in clinical psychology graduate training. Training and Education in Professional Psychology, 11(4), 283–289.

  • 27 Nov 2023

    Amanda W. Joyce
    Murray State University

    Psychological research methods can be a dreaded course for students and instructors alike.  Students report negative emotions about and negative perceptions of research, they struggle to see the relevance of research-related material, and they are concerned about the complexity of the research process, all of which can negatively impact their understanding of the course content (Balloo, 2019; Murtonen et al., 2008; Rancer et al., 2013).  Similarly, instructors broadly report concerns about student tardiness, dishonesty, inattention to material, and lack of preparation (Fazli et al., 2018; Lashley & de Meneses, 2001), concerns which could be exacerbated in challenging courses like research methods.

    Thus, innovative techniques are needed to improve student and instructor experiences in research methods.  Frequently, this innovation comes in the form of applied, active learning that is directly relevant to student experiences—characteristics which have long been touted as beneficial for student learning (Ball & Pelco, 2006; Etengoff, 2023).  In fact, a recent study drawing upon interviews of experienced research methods instructors heavily emphasized the benefits of allowing students to apply what they learned, particularly through hands-on research experiences (Lewthwaite & Nind, 2016).

    Involving students in hands-on research experiences, however, can present still more challenges.  Individual student projects can lead to a heavy grading burden for instructors, and partnered or group projects can be fraught with interpersonal complaints and social loafing.  The purpose of this essay is to explore an option for whole-class collaborative data collection that still allows students individually to propose, analyze, write about, and present data on a project of their own personal choosing.  The collaborative data collection process encourages accountability and teamwork.

    The Project

    Pedagogical Context

    At my university, psychological research methods and statistics are taught in a combined three-course sequence, with the third course focusing on hands-on data collection in what is generally the students’ first research project.  Enrollment for this third course is typically 15 students, all Psychology majors.  The learning objectives for this course require successfully navigating the research process (e.g., “Generate an original research question,” “Conduct a research study in accord with APA’s ethical principles,” etc.).  Thus, the learning objectives of the course, as well as the teaching technique I propose here, encourage students to navigate the research process, from idea generation to final presentation. 

    The Research Project: What Works for Me

                I have personally had great luck with an approach to teaching research methods that intermixes individual and group work while leading students through their first ever quantitative research project.  I have found it to increase individual accountability and teamwork while reducing many of the headaches associated with individual or paired data collection.  I provide a brief overview of the project below.  I am also happy to share course resources with interested readers.

    Students’ experience with hands-on active learning through research occurs through a semester-long research project that occurs in three main phases: (1) individual idea generation, (2) group questionnaire and database creation, and (3) individual data analyses and presentation.

    Individual Idea Generation

                Students begin the semester by individually generating research questions.  Research shows that students have better learning experiences when they work on projects that are personally meaningful (Andresen et al., 2020), and I have found this to be true in my classes as well.  We spend several class periods discussing the contents of a strong research hypothesis that would be testable under the constraints of a semester-long project with data collected from students at their university.  For instance, we discuss how longitudinal hypotheses or hypotheses about overly specific populations whom we are unlikely to recruit on campus (e.g., the elderly, or fraternity members who have been diagnosed with schizophrenia) would not be appropriate.  I also limit students to correlational (as opposed to experimental) research designs, which work best within our collaborative data collection process that emphasizes surveys as the primary data collection method.  During the first week of classes, students submit a list of five research questions that they are interested in exploring, which means that they are generating ideas before they have had the benefit of all class discussions on the topic, but generally one or two of their ideas are appropriate, and I am able to guide them toward those ideas.

    Then (week 3) students submit a final research question for approval before they dive into their topic of interest.  A librarian visits the course to teach students about how to use library resources to find peer-reviewed journal articles on their topics of interest, and students use this information to find five or more articles (week 5), which they summarize and later synthesize into an introduction section for their research paper (week 7).  

    Group Questionnaire and Database Creation

                Students then gather measures relevant to their individual research hypotheses.  They often overlap with their peers in their topics of interest, meaning that there is overlap, too, in the measures that they may choose.  For instance, one student may be interested in anxiety and sleep quality, while another is interested in fraternity and sorority membership and anxiety, and yet another is interested in sleep quality and religiosity.  I encourage students with overlapping topics to work together to find common measures so as to reduce their burden in working with said measures, and I find that they are happy to take this opportunity for reduced workload.  When students happen to not have variables in common with their peers, I encourage them to use brief measures, such as short-form versions of scales rather than full scales, so as to reduce participant burden.

    Students submit their measures (week 6) and, after I have reviewed each of them, we spend a class period gathering each measure into a class-wide shared Google Doc that will later become the questionnaire packet that participants receive.  Combining the measures into a single document during class ensures that everyone has the ability to closely supervise the process and catch any potential errors, like missing items or typographical errors, particularly in overlapping measures, which several students are monitoring very closely.

    Throughout the semester, students learn about the ethical aspects of research, and they have been working through ethical certification (CITI Training).  Thus, as soon as measures are gathered, we are ready to submit our project, as a single application, to our institutional review board (IRB) for approval.  I submit the application on students’ behalf, but I include the measures and hypotheses that they have provided to me, and we spend one class discussing the contents and importance of the IRB application and process.

    In the one to two weeks (usually weeks 7 and 8) needed for IRB approval, the class prepares for data collection.  First, we learn about the data collection process and how to write about it.  Students learn departmental policies for data collection, including how to reserve rooms, how to use our participant management system (SONA), and more, and they write drafts of methods sections for their final paper. 

    When students begin collecting data (usually week 8 or 9), they host research sessions individually, but they administer the full research packet that was approved by the IRB.  In other words, even though students collect data individually, they collect data relevant to everyone.  This means that they have the ability to share research materials, that they can cover each other’s research sessions in case of emergency, and that they feel a personal accountability to the group to do good research.  It also means that they can have a large sample size, typically 100 or more students drawn from our department’s research participant pool.  I emphasize throughout the semester how we are a team working toward a common goal, and I find that students will often organically support one another in ways that I haven’t anticipated, such as offering up suggestions about where to find free or cheap printing for research materials.

    Similarly, we crowdsource data management.  We spend several class periods building a shared class database in Google Sheets.  Students are responsible for creating a key for their individual measures so that everyone knows how data should be entered for all measures.  Again, in a combination of individual and group efforts, each student is responsible for entering all data that they collect, meaning that they are helping to support not only their own research interests but also their peers’.  This shared data entry strategy is another way in which I find students embracing the collaborative nature of this type of work—many will offer to cover data entry for another student when they know the other student is overwhelmed with their participant workload.
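    To make the idea of a data-entry key concrete, such a key might, for example, record which Likert items are reverse-scored so that every participant's responses are entered in a common direction before scale scores are computed. The sketch below is a hypothetical illustration in Python; the measure names and key are invented for demonstration and are not the class's actual materials:

```python
# Hypothetical data-entry key for a 1-5 Likert measure: True marks an item
# as reverse-scored, so a response of 2 is recorded as 4, and so on.
SCALE_MAX = 5
key = {"sleep_q1": False, "sleep_q2": True, "sleep_q3": False}

def recode(response: dict) -> dict:
    """Apply the shared key to one participant's raw responses."""
    return {item: (SCALE_MAX + 1 - score) if key[item] else score
            for item, score in response.items()}

raw = {"sleep_q1": 4, "sleep_q2": 2, "sleep_q3": 5}
print(recode(raw))  # sleep_q2 is reverse-scored: 2 becomes 4
```

Writing the key down once, in a form every student applies identically, is what lets classmates safely enter one another's data.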

    Individual Analyses and Presentation

    When students finish data collection (week 11 or 12), we can begin the data analysis process.  Students are reminded as a group how to run the most common analyses (calculating a scale score from Likert data, determining participant demographics, running a reliability analysis, correlations, and t-tests).  Then there are several in-class workdays during which students can practice these analyses on their own data.  Each student is responsible for analyzing data relevant to their own research hypothesis.  I float around the computer lab to provide support to students with questions, but as there is only one of me, they find additional support in their classmates.  Students often answer one another’s questions and double-check analyses.  This is easily the most rewarding part of the semester, hearing students teaching and encouraging one another, and cheering when they see statistically significant results.
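    As a hypothetical illustration of the four analyses listed above (the essay does not specify which software students use), the following sketch runs them in Python with NumPy and SciPy on synthetic Likert data; all variable names and data are invented for demonstration:

```python
# Sketch of the common analyses: scale score, Cronbach's alpha reliability,
# Pearson correlation, and an independent-samples t-test, on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic responses: 20 participants x 4 Likert items (1-5) on one scale
items = rng.integers(1, 6, size=(20, 4)).astype(float)

# 1. Scale score: each participant's mean across the scale's items
scale_score = items.mean(axis=1)

# 2. Reliability: Cronbach's alpha = k/(k-1) * (1 - sum(item var) / total var)
k = items.shape[1]
alpha = (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                         / items.sum(axis=1).var(ddof=1))

# 3. Correlation of the scale score with a second synthetic variable
sleep = rng.integers(1, 6, size=20).astype(float)
r, p_corr = stats.pearsonr(scale_score, sleep)

# 4. Independent-samples t-test across a synthetic two-level grouping
group = rng.integers(0, 2, size=20)
t, p_t = stats.ttest_ind(scale_score[group == 0], scale_score[group == 1])

print(f"alpha={alpha:.2f}, r={r:.2f} (p={p_corr:.3f}), t={t:.2f} (p={p_t:.3f})")
```

With real class data, each student would substitute the columns for their own measures; the structure of the analyses stays the same.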

    Following analyses, students are responsible for sharing their results in a final research paper.  They previously submitted a draft of an introduction (week 7) and method section (week 9).  The initial method draft was written at a time when they did not know their participants’ characteristics, so in that draft, they left placeholders for these numbers.  Thus, one of their first tasks after data analyses is to write a new draft of their methods section with these placeholders replaced with actual data.  They submit this alongside their results section (week 12) with a discussion section to follow roughly two weeks later.  While writing generally can’t be completed fully in class, students have several in-class writing days so that they can consult with the instructor and their peers when questions arise. 

    Students then learn about data presentation and create a draft poster to be submitted during the last week of class. Again, because students are working on individual research hypotheses, each of these paper and poster drafts is individual, but students have the benefit of receiving feedback from peers and the instructor on drafts at all stages, meaning that final projects are often in phenomenal shape.

    Students submit their finished products early during finals week, and then individually present their research to the class during the final examination period.  This is another very encouraging part of the semester, as students learn more about their peers’ projects and offer encouragement for their hard work.  Furthermore, because the work was approved by the IRB, students are in a very good position to later take their research projects to other venues, such as on-campus undergraduate research conferences and/or regional professional conferences, to share their findings with a broader audience.

    The Outcome

                The structure of the class research project, intermixing group and individual components, is admittedly sometimes overwhelming, particularly if an individual student must miss class frequently, as in the case of student athletes.  In those cases, the student’s lack of attendance has the potential to hinder everyone’s progress on the collaborative project, so a fair amount of instructor foresight and flexibility is necessary to accommodate those absences and ensure that the project can still move forward.  That said, I have found the collaboration to be worthwhile.  Grades, attendance, and course evaluations have increased since I began collaborative data collection, as have student accountability and teamwork.  As students move in and out of group and individual efforts, they see the ways in which their efforts impact themselves and others, and they embrace the process of working toward a common goal.

                More than that, students recognize the ways in which collaboration has allowed them to more effectively manage their time so that they are not duplicating efforts.  For instance, by pooling their data collection, they avoid saturating the research pool and have access to many more participants than they would if they had collected data individually.  Similarly, from the instructor perspective, students’ collaboration allows me to more efficiently work with them (for instance, allowing me to work with one IRB application instead of 15), so that I can free up time to provide more detailed feedback on drafts throughout the semester, which also benefits the students.

    Teamwork makes the dream work.  Gone are the days of spending countless office hours listening to students complain about how their research partner isn’t doing their fair share of the work.  Gone, too, are the days of trying to grade results sections based on data collected from 7 participants.  Instead, I see students working together and holding themselves to a high standard, and I see their efforts resulting in extraordinary outcomes.  I hope that others can find relief and excitement in a similar approach.


    Andresen, L., Boud, D., & Cohen, R. (2020). Experience-based learning. In Understanding adult education and training (pp. 225-239). Routledge.

    Ball, C. T., & Pelco, L. E. (2006). Teaching research methods to undergraduate psychology students using an active cooperative learning approach. International Journal of Teaching and Learning in Higher Education, 17(2), 147-154.

    Balloo, K. (2019). Students’ difficulties during research methods training acting as potential barriers to their development of scientific thinking. In Redefining scientific thinking for higher education: Higher-order thinking, evidence-based reasoning and research skills (pp. 107-137).

    Etengoff, C. (2023). Reframing psychological research methods courses as tools for social justice education. Teaching of Psychology, 50(2), 184-190.

    Fazli, A., Imani, E., & Abedini, S. (2018). Faculty members' experience of student ethical problems: A qualitative research with a phenomenological approach. Electronic Journal of General Medicine, 15(3).

    Lashley, F. R., & de Meneses, M. (2001). Student civility in nursing programs: A national survey. Journal of Professional Nursing, 17(2), 81-86.

    Lewthwaite, S., & Nind, M. (2016). Teaching research methods in the social sciences: Expert perspectives on pedagogy and practice. British Journal of Educational Studies, 64(4), 413-430.

    Murtonen, M., Olkinuora, E., Tynjälä, P., & Lehtinen, E. (2008). “Do I need research skills in working life?”: University students’ motivation and difficulties in quantitative methods courses. Higher Education, 56, 599-612.

    Rancer, A. S., Durbin, J. M., & Lin, Y. (2013). Teaching communication research methods: Student perceptions of topic difficulty, topic understanding, and their relationship with math anxiety. Communication Research Reports, 30(3), 242-251.

  • 19 Jul 2023

    Daniel A. Clark, Madelynn D. Shell, & Andria F. Schwegler
    Texas A&M University–Central Texas

    *Note: For the version with the figure included, please follow this link:

    Learning about research and statistics may be a much-maligned element of any undergraduate psychology program from the perspective of students (Harlow et al., 2009), but it is also widely viewed as an important element in psychological literacy (APA, 2013). On the faculty side, teaching these courses is often cited as challenging due to the amount of material required (Ciarocco et al., 2017). Instead of both faculty and students suffering in silence while engaging in these courses, we decided to take steps to improve how we teach all of our research-oriented undergraduate courses with the goal of distributing some of the content in the research methods course across other courses leading up to it. This redistribution of the workload was intended to ensure that students have equitable preparation for research methods and that students leave the program with equivalent experiences.

    To start the process, full-time faculty in the undergraduate psychology program began meeting regularly to discuss the desired alignment across the research course sequence (i.e., writing in psychology, statistics, and research methods) and to rewrite the course learning outcomes in a manner that captured what we were doing in our individual classes. As academics, we did not always agree on everything, but a shared desire to improve our teaching and our students’ learning inspired us to find common ground. Putting the students’ learning ahead of our own idiosyncratic preferences enabled us to listen to each other’s perspectives, consider multiple ways to achieve a goal, and make decisions based on research across our respective content areas to facilitate learning. Such collaboration acknowledges that each faculty member has the academic freedom to teach using the methodology that they feel is best, but it also recognizes that courses do not exist in a vacuum (for further discussion see Cain, 2014). Courses exist in the context of programs, which requires that faculty members come together at the program level to: 1) articulate the scope and quality of education we are providing to our students and 2) develop alignment across the curriculum so students acquire the same basic skills regardless of instructor, enabling them to graduate from the program with comparable knowledge and experiences. On a personal level, we were also seeking to reduce our own frustrations from teaching the research methods course to students who were not adequately prepared for it.

    Step 1: Start with the End in Mind

    We started by looking at the big picture: the skills students would ultimately need to succeed in the research methods course and in their psychology degree in general, rather than getting bogged down in individual course outcomes and descriptions. Consistent with previous research on teaching research methodology (Ciarocco et al., 2017; Gurung & Stoa, 2020), we found that our end goals for student performance in the course and in the program aligned quite well despite some differences in structure and content. For example, we agreed that we wanted our students to conduct IRB-approved human subjects research and collect real-world data, a high-impact practice (American Association of Colleges & Universities, 2013). The larger goal was for these research projects to provide grist for student conference presentations and graduate school applications. Our discussions regarding how we could set our students up for success led to the articulation of fairly specific skills (see Figure 1) and also clarified some wording in the program learning outcomes. These specific skills fit our needs well, though others might find that broader, more general wording allows for individual variation between faculty.

    Figure 1. Skill alignment across three research-oriented courses

    Step 2: Backtrack to the Beginning

    Our program is housed in a regional, upper-level university that offers only junior- and senior-level courses in partnership with 2-year colleges. The undergraduate psychology degree includes three four-credit-hour research-oriented courses that students take in sequence: writing in psychology, statistics, and research methods. Research methods is a content-heavy class, particularly when designing original research and collecting data as part of the course, so we decided to introduce some of the research methods skills in the prerequisite courses. For example, in many universities, learning APA style starts in introductory or general psychology courses (Fallahi et al., 2006; Gurung et al., 2016). Because our university does not offer introductory-level courses, we added instruction in these skills to the first course in the research sequence, writing in psychology. In addition, we added basic research design to the writing in psychology course, as evidence suggests this can improve scientific reasoning in students at the introductory level (Becker-Blease et al., 2021). These skills prepare students to read research articles critically, not only in the writing in psychology course but across the curriculum.

    In addition to shifting skills to the beginning of the program, we moved some skills to the second course in the sequence, statistics, which students take prior to research methods. For example, students often enter research methods not knowing how to write statistical analyses in APA style, create online surveys, or clean and format data in a spreadsheet. These skills are essential to successfully completing the research project in research methods. Instead of waiting to introduce these skills in research methods, we modified the lab portion of the statistics course to include instruction in these areas. Thus, students come into research methods with an introduction to many of the basic skills they will use.

    Step 3: Ground the Plan in Learning Research

    These revisions have improved consistency and quality across our program because they are aligned with current knowledge about learning. In our discussions, we brought to bear years of research documenting learning effects that should be incorporated into education. We know that prior knowledge improves subsequent learning, likely by reducing cognitive load (Simonsmeier et al., 2021). Spacing and retrieval practice also enhance learning (Latimier et al., 2021). By introducing important skills in earlier courses, we have made more effective use of these known mechanisms to facilitate learning. For example, as can be seen in Figure 1, relevant aspects of APA style are revisited in all three of the research-oriented courses in the curriculum. Although research methods instructors teach APA style, they now know that these skills have been introduced in previous courses and can focus on transfer and application of these skills rather than teaching a brand-new skill. The goal of this explicit attention to introduction/encoding, spacing, interleaving, and retrieval of information is to make subsequent learning in research methods easier and longer lasting for students.

    Step 4: Put it in Writing

    After the end skills and curriculum map were sketched out in the first three steps, it was time to put those changes into writing so we could communicate them clearly to our students. We expanded and rewrote the course learning outcomes and the course descriptions so that they directly aligned to each of the program learning outcomes and reflected the scaffolded structure of the content students were expected to demonstrate. We also reviewed course prerequisites to ensure students were acquiring the material in the order we had designed. Using required prerequisites helped ensure that students enrolled in courses to build up their prior knowledge (Lauer et al., 2006). Finally, we discussed required assessments in each course. Although these were minimized to prioritize faculty academic freedom, we identified some core assessments that needed to be included in our courses. For example, a key outcome in research methods was writing a full research manuscript in proper APA style.  


    By aligning our course learning outcomes with program learning outcomes and identifying exactly where in the program these concepts were introduced and reinforced, we know that students are exposed to basic knowledge before entering research methods. We are also assured that when students graduate from our program, regardless of the section they completed, they are all equipped with the same basic skillset. As a 100% transfer institution, our students come to us with very diverse backgrounds and preparation. Ensuring that every student has the same exposure to essential skills such as APA style, survey development, and statistical analysis before research methods facilitates the data-collection project. Importantly, this plan embeds the high-impact practice of undergraduate research into the required curriculum, creating equitable access and opportunities for all students, which has been a chronic problem in the implementation of these experiences (Zilvinskis et al., 2022). By focusing on broader program and course learning outcomes and using them to align our research-oriented curriculum, we were able to provide our students with a better, more consistent experience without infringing on faculty academic freedom to choose how they teach these outcomes. We found this to be a satisfying blend of faculty subject matter expertise and a collective articulation of expectations and standards that benefitted both our faculty and our students.





    American Association of Colleges and Universities. (2013). High Impact Practices. Retrieved from:

    American Psychological Association. (2013). APA Guidelines for the undergraduate psychology major: Version 2.0. Retrieved from:

    American Psychological Association. (2011). Principles for quality undergraduate education in psychology. Washington, DC: Author. Retrieved from principles.aspx

    Becker-Blease, K., Stevens, C., Witkow, M. R., & Almuaybid, A. (2021). Teaching modules boost scientific reasoning skills in small and large lecture introductory psychology classrooms. Scholarship of Teaching and Learning in Psychology, 7(1), 2–13.

    Cain, T. R. (2014, November). Assessment and academic freedom: In concert, not Conflict. (Occasional Paper #22). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

    Ciarocco, N. J., Strohmetz, D. B., & Lewandowski, G. W. (2017). What’s the point? Faculty perceptions of research methods courses. Scholarship of Teaching and Learning in Psychology, 3(2), 116–131.

    Fallahi, C. R., Wood, R. M., Austad, C. S., & Fallahi, H. (2006). A program for improving undergraduate psychology students’ basic writing skills. Teaching of Psychology, 33(3), 171–175.

    Gurung, R. A. R., Hackathorn, J., Enns, C., Frantz, S., Cacioppo, J. T., Loop, T., & Freeman, J. E. (2016). Strengthening introductory psychology: A new model for teaching the introductory course. American Psychologist, 71(2), 112–124.

    Gurung, R. A. R., & Stoa, R. (2020). A national survey of teaching and learning research methods: Important concepts and faculty and student perspectives. Teaching of Psychology, 47(2), 111–120.

    Harlow, L. L., Burkholder, G. J., & Morrow, J. A. (2009). Evaluating attitudes, skill, and performance in a learning-enhanced quantitative methods course: A structural modeling approach. Structural Equation Modeling.

    Latimier, A., Peyre, H., & Ramus, F. (2021). A meta-analytic review of the benefit of spacing out retrieval practice episodes on retention. Educational Psychology Review, 33, 959–978.

    Lauer, J. B., Rajecki, D. W., & Minke, K. A. (2006). Statistics and methodology courses: Interdepartmental variability in undergraduate majors’ first enrollments. Teaching of Psychology, 33(1), 24–30.

    Simonsmeier, B. A., Flaig, M., Deiglmayr, A., Schalk, L., & Schneider, M. (2021). Domain-specific prior knowledge and learning: A meta-analysis. Educational Psychologist.

    Zilvinskis, J., Kinzie, J., Daday, J., O’Donnell, K., & Vande Zande, C. (2022). Introduction: When done well – 14 years of chasing an admonition. In J. Zilvinskis, J. Kinzie, J. Daday, K. O’Donnell, & C. Vande Zande (Eds.), Delivering on the promise of high-impact practices: Research and models for achieving equity, fidelity, impact, and scale (pp. 1-12). Stylus.

  • 05 Jul 2023 2:29 PM | Anonymous member (Administrator)

    Amanda Mae Woodward
    University of Minnesota Twin Cities

    Open Science, or the practice of making research transparent and accessible, is becoming more prevalent in psychology research (Santoro, 2022; van der Zee & Reich, 2018). Journals, including Developmental Science and Psychological Science, accept registered reports and award authors badges for engaging in transparent research practices. As open science becomes more widely used, educating future researchers about the values and tools available is important. Graduate students, who have prior research knowledge, may benefit from guides and recommendations as they refine their research skills (Kathawalla et al., 2021). Undergraduate students, many of whom are just beginning to learn about research, may benefit from a more structured introduction to open science.  

    Infusing open science into undergraduate courses can be beneficial to students planning to enter the field of psychology because they will be introduced to modern research methods and values. Undergraduate students who learn about open science may gain skills that will make them more competitive for graduate school, including programming and communicating research decisions effectively. Further, students may gain a deeper understanding of research workflows as well as a better appreciation of how to evaluate mixed evidence and the importance of replication.  

    Of course, many undergraduate students do not go on to graduate school (APA, 2016). An introduction to open science can also be beneficial for these students. Activities that introduce undergraduate students to open science can help them refine skills that are beneficial across a wide variety of careers, such as critical and analytical thinking, familiarity with software and databases, and evaluating evidence to make decisions (Naufel et al., 2018). All students, regardless of whether they go to graduate school, will come into contact with research findings in their daily lives. By helping them learn more about transparent and accessible research, we better prepare these students to be informed consumers.

    While introducing students to open science can increase their learning, it may feel like an addition to the instructor's burden (many of us struggle to find space for new material given the requirements and needs of a single course!). To facilitate the inclusion of open science in the classroom, I describe below the methods I have tried in my courses, organized by category, with some reflection on how easily each fits into the semester.


    Introductory Statistics Courses:

    About My Course:

    My introductory statistics course is a 4-credit course with a large lecture (~350 students) and a lab component. Students in this course learn both descriptive and inferential statistics using the R programming language. To introduce students to open science, I include the following: 


    Pre-registration:

    Pre-registrations involve describing your methods and analyses prior to collecting or analyzing your data. There are several platforms for doing this, including the Open Science Framework and AsPredicted. Prior to covering inferential statistics, students in my course are presented with several scenarios, including some where the analyses were planned before data collection, some where data points were removed, and some where they are given no information. Students then discuss which type of evidence they would find more believable and whether they think sharing research plans ahead of time is a good or bad idea. After this discussion, I provide a brief recap of the benefits of and considerations with pre-registration, and students explore the AsPredicted website. Then, I tell students that they will be expected to complete a mock pre-registration for the inferential statistics we cover in class.

    Students base their mock pre-registration on the prompts for the practice problems I provide in class. Specifically, students are asked to 1) identify the research question, 2) identify the variables in the prompt, 3) describe the scale of measurement used, 4) determine the independent and dependent variables, 5) write their hypotheses, 6) identify the correct statistical test and explain why, and 7) explain what information they will base their conclusion on. 
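    For instructors who would like a handout version, the seven prompts above can be laid out as a short fill-in form. The wording below is an illustrative sketch, not the author's actual template:

```text
Mock Pre-Registration (complete before analyzing the practice problem)

1. Research question: _______________________________________________
2. Variables in the prompt: _________________________________________
3. Scale of measurement for each variable: __________________________
4. Independent variable(s) and dependent variable(s): _______________
5. Hypotheses: ______________________________________________________
6. Statistical test, and why it is appropriate: _____________________
7. Information the conclusion will be based on: _____________________
```

    A form like this makes the implicit chain from scenario to scale of measurement to test explicit, which is the point of the activity.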


    Pre-registration Reflection:

    This activity was relatively easy to include in my introductory statistics course. The pre-registration asks about all the information I typically want students to be able to identify; the main difference is that I now explicitly ask them to state these pieces, rather than implicitly expecting them to connect the scale of measurement and scenario to the statistic they calculate. I think pre-registrations help students in the course form connections between the wording of research questions, hypotheses, and analyses. 


    Advanced Statistics Course:

    About My Course:

    I teach a smaller, more advanced course that focuses on using R for statistical analyses. This class has approximately 20 students enrolled who meet twice a week. Students work on a final analysis project using existing data.


    Pre-registration:

    For their secondary analysis project, students complete a pre-registration using the Open Science Framework template, which includes more questions. This step of their project helps them think through their specific research questions, what data they have access to, and what analyses would be appropriate for their project.

    Pre-registration Reflection:

    This pre-registration activity takes more effort and time than the alteration I made to my introductory course. However, it makes grading their final projects much easier and has led to more student meetings about the analyses they choose. Because this step is due before their data analyses, it gives us time to discuss different approaches to analyzing their data. It also makes them think through how to use the data (e.g., what should they do if they have missing values? Do they want to use summed scores or another approach?). By including this activity, I have shifted some of the work required to grade their final project to the beginning of the semester and I have noticed that their final projects tend to be of a higher quality. 

    Data and Code Sharing:

    In this course, there are several activities related to sharing code and data. First, I model sharing code and data by making all the course notes available via GitHub. I post the “blank shell” notes at the beginning of class and “commit” updates of the notes as we complete each learning outcome, so students can see the updates in real time. Students in this course are expected to hand in their weekly assignments on GitHub so that they gain the same experience of using GitHub. As part of their final project, students are expected to share the code they create and the necessary data on either GitHub or the Open Science Framework.

    Beyond assignment submissions, students in this course are also expected to evaluate each other's code and to provide feedback on code posted to GitHub. This helps them think through what is needed for code to be reproducible and consider ways to make their own files more accessible.

    Data and Code Sharing Reflection:

    Teaching students to use GitHub takes more time and requires an understanding of how different operating systems work. However, GitHub provides very helpful documentation for getting started, and there are detailed instructions for syncing RStudio with GitHub. Further, students can adopt GitHub in steps rather than fully integrating their work with it; for instance, they can download files from GitHub and re-upload them through the point-and-click web interface. Overall, students have appreciated the opportunity to learn about GitHub, even when it was challenging.  
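    For instructors setting this up for the first time, the instructor-side commit-as-you-go workflow described above can be sketched with plain git commands. The repository and file names below are hypothetical, not from the actual course:

```shell
# Sketch of the commit-as-you-go notes workflow (names are illustrative).
mkdir course-notes && cd course-notes
git init -q

# Post the "blank shell" notes at the start of a unit.
printf '# ANOVA lab (blank shell)\n' > week03_anova.Rmd
git add week03_anova.Rmd
git -c user.name=Instructor -c user.email=instructor@example.edu \
    commit -q -m "Week 3: post blank shell notes"

# After covering a learning outcome, commit the filled-in notes so
# students can see the notes evolve in the commit history.
printf 'aov(score ~ group, data = d)\n' >> week03_anova.Rmd
git add week03_anova.Rmd
git -c user.name=Instructor -c user.email=instructor@example.edu \
    commit -q -m "Week 3: add worked ANOVA example"

git log --oneline   # lists both commits, newest first
```

    In a real course the repository would live on GitHub, so each step would end with a `git push`, and students would pick up each update with `git pull` or through the web interface.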


    Advanced Research Methods courses:

    About the Course:

    I teach an advanced undergraduate course on open science research methods. This course enrolls approximately 10 students who meet twice a week. In this class, students learn about different aspects of open science and focus on applying what they learn to a replication project across the semester. 

    Data and Code Sharing:

    Students in this course use available materials to assess the methods used in a study. For instance, they can look at the study design and see how the variables were operationalized. Shared materials also allow students to understand how researchers transform raw data into the forms we often use in analyses because they can walk through the data cleaning code and the analysis script.

    Shared materials also facilitate students completing a replication project in their research methods course. My students are currently completing a replication project through Project CREP, which offers a great set of resources on the Open Science Framework to facilitate this process. Students in my course have used these materials to 1) create a pre-registration, 2) develop a Qualtrics survey to collect data, and 3) begin completing analyses in R or JASP using the available data. 


    Admittedly, open science topics are very easy to include in this course because they are its focus. Students in the course have mentioned enjoying the activities described above and have found connections between what they are learning here and in their other courses. As the semester has progressed, I have seen the quality of students' evaluations of open materials improve.


    Content Courses: 

    Though I have not included open science explicitly in my content courses, I believe some activities, like discussing replications and using open data, could be beneficial. Below are two examples related to cognitive development that I plan to use in the future. 

    Discussing Replications:

    Students in this course will read an article describing a failure to replicate (Oostenbroek et al., 2016) and a response from the original study's author (Meltzoff et al., 2016). They will be prompted to think about the evidence presented in each paper and to identify factors that could have led to the different results. The discussion will continue by having students consider what other evidence they might need and how these papers relate to theories of imitation and social learning. 

    Open Data:

    There are several open data sources that include visualizations. For instance, Wordbank is an open data source that allows you to examine children's vocabulary growth around the world. An activity asking students to examine overall trends as well as trends in specific groups would be one way to highlight a benefit of open data.


    There are many ways to introduce students to open science in our courses. Introductions can be short or more in depth, depending on instructor preference and the amount of material covered in a semester. Including these activities across my courses has led to fruitful conversations about cognitive development and about the methods and statistics we use.



    APA (2016, February). By the numbers: How do undergraduate psychology majors fare? Monitor on Psychology, 47(2), 11. 

    Kathawalla, U. K., Silverstein, P., & Syed, M. (2021). Easing into open science: A guide for graduate students and their advisors. Collabra: Psychology, 7(1).

    Naufel, K. Z., Spencer, S. M., Appleby, D., Richmond, A. S., Rudman, J., Van Kirk, J., … Hettich, P. (2019). The skillful psychology student: How to empower students with workforce-ready skills by teaching psychology. Psychology Teacher Network, 29(1). Retrieved from ptn/2019/03/workforce-ready-skills

    Santoro, H. (2022, January). Open science is surging. Monitor on Psychology, 53(1), 1.

    van der Zee, T., & Reich, J. (2018). Open education science. AERA Open.


  • 21 Jun 2023 12:45 PM | Anonymous member (Administrator)

    Lynne N. Kennette(1) & Phoebe S. Lin(2)
    1. Durham College
    2. Framingham State University

    Since its introduction in November 2022, ChatGPT has caused a lot of chatter, especially in educational circles. ChatGPT is a software application that uses artificial intelligence (AI) to simulate human speech and writing. Some see it as a cause to re-think assessments or a risk to academic integrity; others welcome it as a new teaching tool. Regardless of your view, its presence is a good opportunity to re-examine our assessments and to ask whether this new technology might threaten the skills we expect students to demonstrate during their studies. As such, we provide some food for thought about how faculty might reconsider their assessments in the context of this new tool.


    Although an initial reaction to ChatGPT might be one of concern, it's important to take a step back and really focus on the pedagogy. Starting with our course learning outcomes may help to re-focus and/or overcome our concerns. For example, do you have a specific learning outcome tied to writing or critical thinking that you need to ensure students (rather than ChatGPT) are demonstrating? If so, maybe there are other ways to focus on those skills that depart from the standard writing assignment. Or, if the ability to detect AI-written text in students' assignments is relevant to your pedagogical goals, there are several AI detectors that assess the extent to which text is likely to have been written by a human rather than a machine. One example is GPTZero, which provides general scores for the overall product and highlights which sentences are more likely to have been written by AI (this sentence-by-sentence analysis is similar to what you see in plagiarism-detection software like Turnitin). So, if writing is a crucial part of your course, then perhaps detecting AI text will be important. If not, then perhaps writing doesn't need to be so prominent in your assessments, rendering ChatGPT much less useful for students and consequently less concerning for faculty. Below, we provide some examples of how instructors might modify or personalize assessments in ways that make it more challenging for AI to produce useful text for students. Then, we provide some ideas for how ChatGPT can be used to support student learning, rather than fighting against its use.

    Assessment Modifications

    If you’re concerned about your current assessments, you can modify them. One way to circumvent this type of AI is to ask students to write about something it doesn’t know about. The data used by ChatGPT are a couple of years old (though the corpus of text will certainly be updated regularly), so assignments based on something very recent from the news, on students’ direct campus experiences, or on a newly published empirical study will require students to do at least the majority of the work themselves. Additionally, you could make something very specific that you did in class the basis of a writing assignment. For example, an in-class experiment in which you classically conditioned students to salivate to the word Pavlov using sour candy is something AI likely doesn’t know about. Similarly, asking students to summarize the class’s specific talking points during a debate or group discussion would fit the criteria. Another way to prevent the use of ChatGPT is to implement a writing assignment that draws on the self-reference effect and is specific to the institution, for example by requiring students to point to specific student services available at the college or buildings and offices on campus. Having students write from their own perspective also leverages the self-reference effect, making their recall of the course material more likely. For instance, instructors might assign a writing prompt such as: “Our university has recently adopted an anti-racist mission statement to better support student learning. Which specific components of this mission statement do you think will have the greatest impact on the campus culture (e.g., which element(s) are you or other students most likely to be able to act upon)? Describe specific ways in which you think this mission statement will impact your experiences on our campus.” In addition to making the course material more personally relevant, this also gives students the opportunity for their authentic human voices to be heard, providing greater potential for them to grow as critical thinkers. Finally, requiring students to interpret or otherwise write about data collected in class, or about the outcome of a study that was (or was not) replicated in class, may be another way to circumvent the bot by requiring things it doesn’t know anything about. 

    Another way to limit the usefulness of ChatGPT is to require that specific references be included in (or serve as the basis of) the assignment, which will also make it easier for faculty to detect inaccuracies written by AI. Currently, the AI is not very good at including specific sources as in-text citations or references, or at citing its sources in general (with a bit of work and the right prompts it can be done, though still not very effectively). Sometimes ChatGPT even invents references and sources that don't exist at all. Other AI tools, such as Perplexity, do a much better job of referencing sources; however, students will still have some work to do because those sources are websites and may not be scholarly.

    Reflections and personal accounts of feelings are more difficult for ChatGPT to produce, as is integrating life experiences with specific course concepts. With this type of prompt, ChatGPT will generally begin its response by noting that, as a bot, it can't really reflect, but it will then provide some kind of reflection text anyway. Students can still use this output, but it may not be as easy for them to obtain immediately useful text from ChatGPT.

    Producing written work (or at least a draft or outline) in class the old-fashioned way, with pen and paper, is one way to limit the up-front contributions that ChatGPT can make. Encouraging students to collaborate in groups and produce a mind map or other visual representation of what they will be writing will also deter students from using AI to write for them. Further, implementing group work has been shown to increase the quality of communication in the classroom, establish boundaries for expectations of the amount of work each individual should contribute, and establish respectful social norms in which each group member has a valuable contribution to make in the learning process (Aronson, 1978). If these activities serve as the basis of a written assignment, they cannot (at least at the current time) be input into ChatGPT because they are not text, so students wouldn't be able to easily use the bot; it would be less work for them to write the assignment themselves than to convert the mind map into text and somehow feed it to ChatGPT.

    Another example of moving away from traditionally written work would be to include more oral work from students, whether live or pre-recorded, in front of the entire class or only the instructor, as a group or individually. There are many possible permutations such as presenting the content that would traditionally be in a paper, or a less traditional format such as an interview (e.g., with a fake researcher) to learn about the topic in question. Added benefits of this teaching technique might include strengthening oral communication skills and building rapport between the students and the instructor.

    Using ChatGPT

    As the saying goes, “If you can’t beat them, join them.” It is likely that students will be able to (at least somewhat) circumvent anything you try to do to limit their use of ChatGPT, so why not include this AI tool in their assignment? One way to leverage ChatGPT is as a feedback tool that helps students produce higher-quality writing. You might assess students on how they addressed the feedback provided and on their explanation of how they improved their writing: require them to submit their original draft, the feedback they received from ChatGPT, and their improved version, along with a reflection explaining how they used the feedback, which feedback was most helpful, and what was most challenging to address. 

    Since ChatGPT is often inaccurate (especially with references and in-text citations), students could also be tasked with asking it to write something on a particular topic and then tracking down scholarly sources to support the claims made (and, if those claims cannot be supported, reworking the text to reflect that). This gives students a chance to practice using the library, tracking down sources, and the mechanics of proper citation, and to work on their information literacy skills more broadly. It could be especially valuable in a research methods course to emphasize the rigorous process of publishing scientific research, in addition to highlighting the merits of integrity in both writing and research practices.

     ChatGPT can also “grade” assignments, so students could ask it to write an assignment and then use the course rubric to score that output as a starting point. Students would then improve the writing to make it a better and more accurate version of what was produced. Again, asking students to reflect on the process and provide specific examples of where ChatGPT was least accurate according to the rubric (for example) makes use of the tool and gives students practice with their own writing as well. This exercise could help develop stronger reading comprehension skills in addition to writing skills, as it asks students to differentiate between weak and strong writing as well as weak and strong arguments. Similarly, students can use AI to summarize research articles that they then synthesize or otherwise integrate in a paper in some way that is appropriate for the course. Alternatively, students could compare the generated summaries to their own, verify their accuracy, identify errors, and reflect on any differences in focus (e.g., perhaps ChatGPT focused its summary on the results of the study whereas students devoted much of their summary to the methods).

    Transferable Skills

    The other consideration, as we try to wrap our heads around the impact of this tool, should be students’ eventual workplaces. Many workplaces are already using AI to assist with various tasks, whether overtly or covertly (Walia, 2023), and workplace expectations will likely adjust in terms of how long tasks should take employees to complete in light of this new technology. As such, it would be a disservice for faculty not to give students a chance to use this tool and become more efficient with it. Using AI is likely to become a new transferable skill (also known as a “soft skill”), one that should be developed during students’ studies and then applied in whatever workplace they end up in. The transferable skill that current students need in order to be competitive in the workforce may no longer be writing from scratch, but rather critically evaluating what ChatGPT (or a similar tool) creates: assessing its accuracy and quality, and building on that text. Much as the invention of the hand-held calculator still required students to evaluate the answer it gave using their critical thinking skills, a similar skill will likely need to be developed to contend with ChatGPT. Further, with the increased efficiency of AI use, workers will have more time in their daily schedules to devote to more “human” tasks that involve original and creative thinking, such as solving problems or generating new ideas for various projects.

    One somewhat recent assessment trend and best practice has been to use authentic (as opposed to disposable) assessments (Jhangiani, 2017; Seraphin et al., 2019). That is, making students’ assessments something that has an audience and purpose beyond the instructor and the classroom. Examples include asking students in a writing/grammar course to proofread a local business’s website, or tasking psychology students with creating pamphlets on a particular issue (e.g., studying/memory tips) to distribute through a student services office on campus. In this way, these assessments more closely resemble students’ eventual workplaces and have a clear purpose. What will those eventual workplaces look like? That is the million-dollar question.


    Whether friend or foe, ChatGPT’s presence in the education landscape cannot be ignored. Instructors can quickly modify their current assignments in light of its availability by moving away from written assignments or by putting more focus on work completed during class. Although certain approaches to assessment might reduce students’ ability to use ChatGPT to produce work for courses, the technology can perhaps also be used in ways that encourage critical thinking or improve students’ writing skills. Using these types of tools in time-saving ways may be the expectation of workplaces in the not-so-distant future, and students would be well served by learning their functionality in our classrooms.

    Disclosure: Although we did not use ChatGPT in any way to write this article, we did ask it for feedback on our writing once we had finished and it thought our writing was organized, clear, and a well-written piece overall (Paraphrase from OpenAI's ChatGPT AI language model, personal communication, February 12, 2023).


    Aronson, E. (1978). The jigsaw classroom. Sage.

    Jhangiani, R. (2017, February 2). Ditching the ‘‘disposable assignment’’ in favor of open pedagogy. E-xcellence in Teaching Blog.

    Seraphin, S. B., Grizzell, J. A., Kerr-German, A., Perkins, M. A., Grzanka, P. R., & Hardin, E. E. (2019). A conceptual framework for non-disposable assignments: Inspiring implementation, innovation, and research. Psychology Learning & Teaching, 18(1), 84-97.

    Walia, P. (2023, February 9). Study finds more workers using ChatGPT without telling their bosses. TechSpot.
  • 10 Apr 2023 5:01 PM | Anonymous member (Administrator)

    Jason S. McCarley (1), Raechel N. Soicher (2), & Jannah R. Moussaoui (1)
    (1: Oregon State University, 2: Massachusetts Institute of Technology)

    *Note: For the version with figures and additional resources included, please follow this link:

    The Gestalt principles of perceptual organization are a staple of undergraduate psychology. Examples like those in Figure 1 are common in Intro Psych, Cognitive, and Sensation & Perception textbooks, and discussion of the psychology behind them is important for a number of reasons. Historically, the Gestalt movement was an enormously influential school of thought (Rock & Palmer, 1990; Wagemans et al., 2012). Practically, the Gestalt principles are useful for designing displays, graphs, and lecture slides (Kosslyn, 2006; Moore & Fitz, 1993; Wickens et al., 2022). And pedagogically, the Gestalt phenomena reveal perceptual processes that students might normally take for granted.

    Figure 1.  Visual demonstrations of three familiar Gestalt principles.

    Discussion of Gestalt grouping typically focuses on visual processes, like those illustrated in the figure. But perceptual organization isn’t exclusive to vision; Gestalt processes are also necessary to organize messy sensory inputs in other senses, including touch (Gallace & Spence, 2011) and hearing (Bregman, 1990). In hearing, specifically, Gestalt processes help turn soundwaves crashing on the eardrum into a mental representation of the sound sources around us. Bregman (1990) used the term auditory scene analysis to describe the perceptual organization of sound, and auditory streams to denote the output of this analysis. An auditory stream is thus the analogue of a visual object or group.

    To accompany his book on auditory scene perception, Bregman provided demonstrations of auditory Gestalt effects. These are useful classroom demonstrations in two ways. First, they establish a unifying principle, showing students that perceptual organization operates in similar ways across different senses. Second, they make Gestalt phenomena accessible to students with visual disabilities.

    Below, we present three of Bregman’s auditory examples that can be used as classroom demonstrations in discussions of Gestalt grouping. For each one, we describe and illustrate the analogous visual effect, and explain the correspondence between the auditory and visual phenomena. We also provide links to downloadable files, made available by Bregman, that demonstrate the auditory phenomena.

    A larger set of examples is available on Dr. Bregman’s website.

    Grouping by Similarity

    In vision, the Gestalt principle of similarity says that items that look alike tend to group with one another. In theory, for example, we could see the dots in Figure 1A as forming columns, arbitrary clusters, or no pattern at all. Instead, we tend to group the dots by color, into rows.

    In his demonstration of auditory grouping by similarity, Bregman manipulates pitch to gradually segregate a series of notes into two streams. Figure 2 provides a schematic illustration. The stimulus is a well-known melody interleaved with random distractor tones. To begin, the melody and distractors are within the same pitch range, and the melody is camouflaged. As it plays repeatedly, the melody gradually moves into a higher pitch range. Eventually, it segregates from the distractor tones and becomes recognizable. Pitch here plays the same role as color does in Figure 1: sounds of similar pitch are grouped into a distinct auditory stream, standing out from sounds of dissimilar pitch.

    Figure 2.  Auditory grouping by similarity of pitch. A: When a melody is embedded amongst distractor tones from the same pitch range, it is effectively camouflaged. Here, the notes outlined in black represent the melody and the notes without outlines represent the distractors. B: When the melody and distractors are in different pitch ranges, the melody stands out and is easy to recognize.

    Grouping by Proximity

    The principle of proximity holds that items near one another are grouped together. In vision, proximity is spatial. In Figure 1B, for instance, the vertical separation between dots is smaller than the horizontal separation, and as a result, we perceive the dots as forming columns.

    In hearing, proximity is temporal. Bregman’s demonstration, illustrated in Figure 3, interleaves a series of three descending tones with a series of three ascending tones. The descending tones are in a higher pitch range than the ascending tones. To begin, the tones are played slowly, and we hear a series of notes jumping back and forth between pitch ranges. Next, the tones are played quickly. Now, temporal proximity and similarity combine to segregate the ascending and descending series into two distinct streams that seem to run simultaneously. Near enough to one another in time, the tones of similar pitch group.

    Figure 3.  Auditory grouping by proximity. A: When interleaved high and low tones are played slowly, we hear a single sequence that jumps between pitch ranges. B: When the interleaved tones are played quickly, notes of similar pitch group together. We perceive two simultaneous streams, one high-pitched and one low-pitched.
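    Instructors who want to tinker with stimuli like these can synthesize them in a few lines of code. The sketch below, written in Python purely as an illustration (Bregman's downloadable files remain the authoritative demos, and the specific frequencies, durations, and file names here are our own invented values), writes two WAV files containing the same interleaved high and low tone triplets at a slow and at a fast presentation rate.

    ```python
    import math
    import struct
    import wave

    RATE = 44100  # samples per second

    def tone(freq_hz, dur_s, rate=RATE):
        """Return one sine-wave tone as a list of 16-bit samples."""
        n = int(dur_s * rate)
        return [int(32767 * 0.5 * math.sin(2 * math.pi * freq_hz * t / rate))
                for t in range(n)]

    def interleaved_sequence(note_dur_s, repeats=4):
        """Interleave a descending high-pitch triplet with an ascending low one."""
        high = [880, 784, 698]  # descending series, high pitch range (Hz)
        low = [262, 294, 330]   # ascending series, low pitch range (Hz)
        samples = []
        for _ in range(repeats):
            for hi, lo in zip(high, low):
                samples += tone(hi, note_dur_s) + tone(lo, note_dur_s)
        return samples

    def write_wav(path, samples, rate=RATE):
        """Write mono 16-bit samples to a WAV file."""
        with wave.open(path, "wb") as f:
            f.setnchannels(1)
            f.setsampwidth(2)
            f.setframerate(rate)
            f.writeframes(struct.pack("<%dh" % len(samples), *samples))

    # Slow presentation: heard as one sequence jumping between pitch ranges.
    write_wav("proximity_slow.wav", interleaved_sequence(note_dur_s=0.4))
    # Fast presentation: the same notes segregate into two streams.
    write_wav("proximity_fast.wav", interleaved_sequence(note_dur_s=0.1))
    ```

    Playing the two files back to back typically makes the streaming effect vivid: the slow version sounds like a single jumping melody, while the fast version splits into a high-pitched stream and a low-pitched stream.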

    Grouping by Connectedness

    The principle of connectedness (Rock & Palmer, 1990) holds that items connected to one another are grouped together. Figure 1C shows the influence of visual connectedness. Dots alternate color from top to bottom, and are closer together horizontally than vertically. But because they are linked by thin vertical lines, the dots perceptually group to form columns. Here, connectedness overpowers similarity and proximity.

    Bregman’s demonstration of grouping by connectedness shows an equally powerful effect. Figure 4 illustrates. In the unconnected condition, a series of tones alternates between high (“beep”) and low (“boop”) pitch. The impression is of two distinct streams, one high-pitched (“Beep. Beep. Beep…”) and one low-pitched (“Boop. Boop. Boop…”). In the connected condition, a smoothly rising and falling tone is interposed between the low- and high-pitched tones. Now, we hear a single stream of sound, smoothly modulating between high and low (“Beeeeooooeeeeoooo…”). Just as in vision, connectedness transforms isolated fragments into a unified perceptual object.

    Figure 4.  Auditory grouping by connectedness. A: When interleaved high and low tones are unconnected, we hear separate high- and low-pitched streams. B: When high and low tones are connected by a rising and falling tone, we hear a single, undulating auditory stream.


    The Gestalt principles are foundational knowledge for psych undergrads. Teaching them with an exclusive focus on vision, though, can limit their accessibility and give students an unduly narrow view of the role they play in our mental life. Auditory demonstrations give us a way to expand the reach and impact of our lessons on perceptual organization.


    Bregman, A. S. (1990). Auditory scene analysis: The perceptual organization of sound. Cambridge, MA: MIT Press.

    Gallace, A., & Spence, C. (2011). To what extent do gestalt grouping principles influence tactile perception? Psychological Bulletin, 137(4), 538–561.

    Kosslyn, S. M. (2006). Graph design for the eye and mind. New York, NY: Oxford University Press.

    Moore, P., & Fitz, C. (1993). Using Gestalt theory to teach document design and graphics. Technical Communication Quarterly, 2(4), 389–410.

    Rock, I., & Palmer, S. (1990). The Legacy of Gestalt Psychology. Scientific American, 263(6), 84–90.

    Wagemans, J., Elder, J. H., Kubovy, M., Palmer, S. E., Peterson, M. A., Singh, M., & von der Heydt, R. (2012). A century of Gestalt psychology in visual perception: I. Perceptual grouping and figure–ground organization. Psychological Bulletin, 138(6), 1172–1217.

    Wickens, C. D., Helton, W. S., Hollands, J. G., & Banbury, S. (2022). Engineering psychology and human performance (5th edition). New York, NY: Routledge.

  • 02 Mar 2023 4:06 PM | Anonymous member (Administrator)

    Vanessa Woods
    (University of California, Santa Barbara)

    To truly create equity, a university must make sure that every student, regardless of background, can be successful. However, the reality is that students from groups that have been marginalized in higher education are entering a university that is not designed for them (e.g., needing to navigate the “hidden curriculum;” Laiduc & Covarrubias, 2022), and the responsibility for ensuring success begins with the instructor. My overall objective as an educator is to create learning opportunities that are engaging, meaningful, and motivating to students from diverse backgrounds. I strive to create inclusive learning environments in my courses and to make them relevant to students’ lives.

    Combatting the Hidden Curriculum

    There are three primary ways I create inclusive learning environments to promote motivation and engagement for all students. First, I create learning environments that include elements designed to combat the hidden curriculum for students from marginalized groups (Laiduc & Covarrubias, 2022). To combat the hidden curriculum, I use explicit welcome and belonging messages in my syllabus, discuss what office hours are for, and continue messaging throughout the course to convey that this is their space and that I am committed to supporting them in their learning endeavors. The welcome message serves as a statement of community and conveys my appreciation for the strengths students bring into the space, and the belonging message explicitly conveys my belief that they can be successful in the course and major. This includes repeated messaging that:

    1) students belong in the course and major,

    2) students bring important perspectives and ideas to our classroom space,

    3) I believe they can be successful in my challenging high-work courses,

    4) there are mechanisms for growth and improvement in the course structure,

    5) it is normal to sometimes struggle with content, and

    6) I am here to guide and coach them through that process.

    My teaching messages and practices have been informed by the literature on wise interventions, which underscores the importance of attending to students' needs in academic settings in order to support their ability to reframe inferences related to belonging. I embed these needs in messages designed to motivate diverse students to engage effectively in the course (Laiduc & Covarrubias, 2022). Further, recent scholarship demonstrates that the syllabus can be an important tool for communicating instructor support for equity and inclusion, and for highlighting a student/learner-centered design for the course (Fuentes, Zelaya, & Madsen, 2021; Richmond et al., 2019). For example, I include the following welcome message in my syllabus:

    “Our Course Community– As participants in a required pre major course in Psychological and Brain Sciences we all share an interest in the mind and behavior. I am excited to see where you will take your knowledge of methods in Psychology when you write your research proposal for this course. I value a diverse set of viewpoints and I welcome the strengths and talents you bring to the table as part of our community in this research methods course.”

    I also include a belonging message that reads:

    “You as a Researcher–You belong in Psychological and Brain Sciences and you belong here in this class as an undergraduate researcher. We have complete confidence in your ability to be an active capable member of this course. We also have complete confidence in your ability to develop your research and writing skills, and we are committed to guiding you through this process. Please feel free to discuss these things with Dr. Woods.*”

    Additionally, this kind of messaging is woven into my lectures and in the ways I engage the students when talking about the course structure.

    Providing Scaffolding for Student Success

    The second way I support inclusive classroom environments is to ensure that there are appropriate mechanisms to scaffold students toward success in the course. I build in assignment scaffolding, revision opportunities, individual and group exams, and exam retakes during finals to ensure students are supported in their learning. These structures provide different formative opportunities for students to demonstrate their learning and contribute to a collaborative, collegial learning environment (Ambrose et al., 2010; Boothe et al., 2018). I carefully structure the content and pace of my courses so that students’ knowledge can build incrementally. For example, when students write a research proposal, I assign five small assignments that culminate in a draft of the proposal two weeks before it is due. My assignments are designed to guide students through the material, foster autonomy, and reinforce working hard to improve through opportunities for revision (e.g., writing, peer review, feedback).

    I have started recording insight discussions: small groups of students discussing how they approached difficult course concepts, how they overcame the challenging content, and what insights they gained into effective learning strategies. Informal student feedback (e.g., mid-quarter feedback surveys, student discussion posts) suggests students find these insight discussions from their peers very useful; students like knowing that someone else had to struggle to understand a concept, which helps reframe the challenge of the course. On my exams, I ask integrative questions (e.g., authentic assessments) to test students’ ability to put together information they have learned from different sources and to apply this knowledge to novel, real-world situations (Nolan et al., 2020). I design assignments that foster students’ ability to think and practice as psychologists and neuroscientists by using structured peer review (Adler-Kassner & Wardle, 2022; Miller-Young & Boman, 2017; Woods, Safronova, & Adler-Kassner, 2021). Cumulatively, my course practices and assessments give students multiple ways of gaining knowledge and of demonstrating their understanding of course concepts.

    Promoting a Welcoming & Engaging Classroom Environment

    The third way I support inclusive learning environments that promote belonging is to foster a comfortable, welcoming, and engaging classroom climate that encourages students to actively engage with content and learn from each other (Felten, 2020). When I set up group work, I include sections that promote students getting to know each other and assign clear roles within each group (e.g., the person who likes dogs the best scribes for the worksheet; the person who likes Halloween the best ensures that everyone contributed to the discussion). I try to get to know the students through informal polls and by chatting with them before class starts, to build relationships and to get a sense of each unique student. I create an open and warm classroom environment so that students are encouraged to ask questions or express their point of view, using frequent in-class questions that require some discussion. I strive to show cultural competence in the ways I communicate, to ensure that students who take my courses gain confidence in their abilities and learn how to study and organize knowledge in meaningful ways (Tanner & Allen, 2007). I model tolerance and openness to opposing viewpoints so that students can feel safe expressing their ideas and opinions. For example, last year I asked the students to help me stop using the phrase “you guys” so I could work on using non-gendered terms, and the class was very supportive in “catching” me and stopping me so I could reframe my language. This also modeled for students that we can make mistakes while still being active members of the course who belong in the major.

    The strategies I have used to promote inclusive learning environments (combating the hidden curriculum, intentionally structuring courses for learning and improvement, and creating a welcoming class climate) have been developed through many discussions with colleagues who value inclusion. My colleagues work to create classroom structures that foster belonging (Wilton et al., 2019) and use storytelling to engage students and foster knowledge application to real-world situations (Alea Albada, 2022). Further, my inclusive strategies have been influenced by work that emphasizes kindness, affirmation, and communal goals (Estrada et al., 2018), and by the importance of validating students’ experiences (Rendon, 1994). I was also inspired by thoughtful workshops by Kimberly Tanner and Viji Sathy on best classroom practices for equity.

    My understanding of how to create inclusive classroom environments includes thinking about deep teaching practices: self-awareness, empathy, classroom climate, and leveraging the campus student support network (Dewsbury, 2019). The growth and energy I have gained have come in collaborative spaces and in conversation. I have realized how important it is to get to know your students as people. Consider taking an extra five minutes with a student to ask about their hobbies, goals, or passions, or starting a conversation with a colleague about what can be done to create more inclusion in the spaces you occupy. If we all add one or two small things that foster inclusion, we can change our teaching and learning practices to promote equity in higher education and create real change.

    *Feel free to wordsmith the examples to suit your perspectives for your course, while citing appropriately.


    Adler-Kassner, L., & Wardle, E. (2022). Writing Expertise: A Research-Based Approach to Writing and Learning Across Disciplines. Clearinghouse.

    Alea Albada, N. (2022). Try Telling a Story: Why Instructors Share Personal Stories with Students.

    Ambrose, S., Bridges, M., DiPietro, M., Lovett, M., & Norman, M. (2010). How learning works: Seven research-based principles for smart teaching. Jossey-Bass.

    Boothe, K. A., Lohmann, M. J., Donnell, K. A., & Hall, D. D. (2018). Applying the principles of universal design for learning (UDL) in the college classroom. Journal of Special Education Apprenticeship, 7(3), n3.

    Dewsbury, B. M. (2019). Deep teaching in a college STEM classroom. Cultural Studies of Science Education, 15(1), 169–191.

    Estrada, M., Eroy-Reveles, A., & Matsui, J. (2018). The influence of affirming kindness and community on broadening participation in STEM career pathways. Social Issues and Policy Review, 12(1), 258–297.

    Felten, P. (2020). Critically reflecting on identities, particularities and relationships in student engagement. In A handbook for student engagement in higher education (pp. 148-154). Routledge.

    Fuentes, M. A., Zelaya, D. G., & Madsen, J. W. (2021). Rethinking the course syllabus: Considerations for promoting equity, diversity, and inclusion. Teaching of Psychology, 48(1), 69-79.

    Laiduc, G., & Covarrubias, R. (2022). Making meaning of the hidden curriculum: Translating wise interventions to usher university change. Translational Issues in Psychological Science, 8(2), 221.

    Miller‐Young, J., & Boman, J. (2017). Uncovering ways of thinking, practicing, and being through decoding across disciplines. New Directions for Teaching and Learning, 2017(150), 19-35.

    Nolan, S. A., Bakker, H. E., Cranney, J., Hulme, J. A., & Dunn, D. S. (2020). Project assessment: An international perspective. Scholarship of Teaching and Learning in Psychology, 6(3), 185.

    Rendon, L. I. (1994). Validating culturally diverse students: Toward a new model of learning and student development. Innovative Higher Education, 19(1), 33-51.

    Richmond, A. S., Morgan, R. K., Slattery, J. M., Mitchell, N. G., & Cooper, A. G. (2019). Project Syllabus: An exploratory study of learner-centered syllabi. Teaching of Psychology, 46(1), 6-15.

    Tanner, K., & Allen, D. (2007). Cultural competence in the college biology classroom. CBE—Life Sciences Education, 6(4), 251-258.

    Wilton, M., Gonzalez-Niño, E., McPartlan, P., Terner, Z., Christofferson, R. E., & Rothman, J. H. (2019). Improving academic performance, belonging, and retention through increasing structure of an introductory biology course. CBE—Life Sciences Education, 18(ar53), 1-13.

    Woods, V. E., Safronova, M., & Adler-Kassner, L. (2021). Guiding Students Towards Disciplinary Knowledge With Structured Peer Review Assignments. Journal of Higher Education Theory & Practice, 21(4).

  • 23 Jan 2023 9:18 PM | Anonymous member (Administrator)

    Michael Dubois
    (University of Toronto)


    While preparing to teach a course for the first time, I was keenly aware of the little time I had with students, and the consequent limits in how much material I could reasonably cover. Doubtless, many other instructors find it difficult to narrow down which topics can be included in a course, and to what degree of detail. Relatedly, scientific publications are being produced at ever-increasing rates, with the total sum of publications in the hundreds of millions (often languishing unread in piles on desks and desktops). With such a volume of extant literature, it is simply not feasible to cover everything.

    Complicating the dilemma between time and the quantity of facts is the increasing demand for students to learn skills (as found in “Psynopsis: Education Issue,” 2021).

    Skills-based education is critical for supporting students after graduation: more than one million undergraduates take introductory psychology each year (Gurung et al., 2016), but nearly 60% of psychology graduates do not obtain further education. Furthermore, around 50% of those graduates will find careers in sales, marketing, management, and other business roles where there is little need for specific course-related information (Carnevale, Gainer, & Meltzer, 2020) but greater demand for general cognitive and technical skills.

    The American Psychological Association’s Committee on Associate and Baccalaureate Education (CABE) reports that employers are interested in employees with skills across five broad domains: cognitive, communication, personal, social, and technological (Appleby et al., 2019; Hettich, 2021).

    Thus, while instructors need to carefully choose materials relevant to the course, they must simultaneously consider how their teaching addresses the broader needs of students—both within individual courses, and beyond university life. I contend that instructors must creatively find ways to “double dip”—presenting course content via pedagogical methods that concurrently develop students’ professional skillset.

    Specifically, teaching students computer coding skills and giving them the opportunity to practice those skills in class are useful ways of presenting course materials in an engaging way, and fostering the mastery of practical and employable skills. 

    By “computer coding,” I am referring specifically to the use of text-based computer code in collecting, combining, cleaning, analyzing, and visualizing experimental psychology data. In my practice, this takes the form of using the R programming language (R Core Team, 2019): data files (in CSV format) containing behavioural responses for each participant are combined into a single larger file; this larger file is then cleaned and prepared for further analysis; finally, variables are compared (descriptively and inferentially) and presented in graphical format.
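    The same combine/clean/summarize pipeline can be sketched in any language. The minimal illustration below uses Python's standard library (the file names, column names, and cutoff value are invented for the example); in R, the analogous steps would use familiar base functions such as read.csv, rbind, and summary.

    ```python
    import csv
    import os
    import statistics
    import tempfile

    # Invented example data: one CSV per participant, with reaction times (ms).
    raw = {
        "p01.csv": [("congruent", 412), ("incongruent", 530), ("congruent", 9999)],
        "p02.csv": [("congruent", 398), ("incongruent", 488), ("incongruent", 505)],
    }
    workdir = tempfile.mkdtemp()
    for name, rows in raw.items():
        with open(os.path.join(workdir, name), "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["condition", "rt_ms"])
            writer.writerows(rows)

    # 1. Combine: read every participant's file into one list of records.
    combined = []
    for name in sorted(os.listdir(workdir)):
        pid = name.removesuffix(".csv")
        with open(os.path.join(workdir, name), newline="") as f:
            for row in csv.DictReader(f):
                combined.append({"participant": pid,
                                 "condition": row["condition"],
                                 "rt_ms": int(row["rt_ms"])})

    # 2. Clean: drop implausible reaction times (here, anything over 3000 ms).
    clean = [r for r in combined if r["rt_ms"] <= 3000]

    # 3. Describe: mean reaction time per condition.
    means = {}
    for cond in {r["condition"] for r in clean}:
        means[cond] = statistics.mean(r["rt_ms"] for r in clean
                                      if r["condition"] == cond)
    print(means)
    ```

    In a one-class crash course, a worked example like this, with the plotting step added in whatever language the class adopts, covers every stage the paragraph above describes and gives students a template to adapt to their own datasets.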

    How would this look in practice?

    Implementing coding in class can be extremely flexible, and tailored to meet the desires of students and instructors alike. An effective “crash course” can take as little as a single class, with more substantial integration lasting the duration of a course.

    One technique I recommend is allowing students to find data (class-relevant, and personally interesting) to explore themselves. For instance, a student in developmental psychology could find data for child behaviours to compare with adults.

    In all cases, early instruction should cover the particular programming language and syntax that students will use, and how to acquire the relevant software. Students can then be briefly taught how to perform basic operations on data in the service of answering a research question. Finally, instructors should demonstrate how students can use online resources to extend their basic knowledge of coding.

    What pedagogical goals can coding support?

    Including any degree of coding provides at least an initial step toward facilitating students’ mastery of skills and future employability: this includes both the technical skill of coding, and the cognitive skills that coding demands (e.g., developing questions, selecting analysis methods, interpreting results). Such cognitive skills are directly relevant to student learning and performance outcomes (Krain, 2010).

    Importantly, there are also many pedagogical benefits associated with implementing coding in class.

    Here I emphasize three of the APA’s Principles for Quality Undergraduate Education in Psychology (2011):

    Principle #1: Students are responsible for monitoring and enhancing their own learning.

    The approach I have proposed is empowering for students. First, they choose a dataset and determine what variables it contains and what question(s) to answer. Next, they must extend a (presumably) cursory knowledge of coding to answer their question; importantly, errors offer a critical opportunity to reflect and adapt. Finally, students should be encouraged to present their work to peers (or other non-experts). Many programming languages include methods for producing polished “reports” (e.g., R Markdown, Jupyter notebooks); these make an excellent class assignment or final project, and they emphasize communicating knowledge.

    In sum, students are primarily responsible for each step of the coding process, with instructors providing guidance and direction only as needed. By assuming responsibility, students will be more invested and interested in the material they encounter; student interest is a key predictor of many positive outcomes, from motivation and effort (McManus, 1986a, 1986b) to learning and retention (Lester, 2013; Subramaniam, 2009).

    Principle #2: Faculty strive to become scientist–educators who are knowledgeable about and use the principles of the science of learning.

    Like any skill, mastery of coding requires multiple practice sessions. Helpfully, one of the most robust and well-evidenced pedagogical principles is that repeating and distributing learning over time leads to greater learning (Delaney, Verkoeijen, & Spirgel, 2010). Thus, instructors should consider assigning multiple coding exercises, whether different analyses of the same data or explorations of new data altogether, to support students’ mastery of coding as well as their learning of course content.

    The approach I describe has a large overlap with principles of “active learning”—the idea that students learn more when they are participants in learning, and not merely passive recipients (Nelson, 2008; Park et al., 2021). Indeed, active participation is one of the fundamental elements of coding instruction from training organizations such as The Carpentries (Atwood et al., 2019).

    Principle #3: Psychology departments and programs create a coherent curriculum.

    As noted by the APA’s Introductory Psychology Initiative (IPI), the varying domains of psychology are linked by a common foundation of scientific inquiry (The American Psychological Association (APA), 2021). This common foundation is often only taught in research methods classes (or not at all), which can leave students with a tenuous understanding of the higher-order processes that unite the field of psychology. I propose that including coding in multiple psychology classes, even to a minimal degree, can help bridge this conceptual gap.

    Additionally, by implementing coding early and often in a curriculum, students are able to take concepts and skills acquired in lower-level classes and further develop them during upper-level classes (another form of distributed practice).

    Finally, the IPI notes that a high-quality curriculum should include an “integrative/capstone experience,” and it emphasizes the acquisition of knowledge and skills that are directly relevant to students’ lives. I propose that a thorough and rigorous analysis of data is an excellent example of such an activity!

    This has often been (partially) accomplished via honours thesis projects; however, knowledge of coding introduces several alternatives: having students analyze data collected by faculty, partnering with community organizations to analyze their data, or reanalyzing previously published datasets. Learning of both the coding skill and the psychological content benefits when it is related to real-world applications and problems (Yurco, 2014).

    Coding concerns

    Before concluding, I thought it important to address several concerns that instructors and/or students may have.

    Given that this approach to teaching is necessarily technical (in fact, having students acquire the technical skill is one of the primary goals), consideration should be given to both hardware and software.

    First, hardware (computers) is required, but this can take many forms, including student-owned laptops or institutional computers (in libraries, computer labs, etc.). Most data analyses have fairly low graphical and computing demands, meaning that essentially any computer is suitable.

    Second is software. Many extremely popular coding languages (R, Python) run in software that is fully functional and open-source (i.e., free). The only requirements are access to an internet browser and sufficient computer memory to download the program.

    More recently, cloud-based computing environments (Posit Cloud, Jupyter Notebook) have been developed as alternatives to locally stored software. These allow all students and instructors to share a single technical space, avoiding many of the conflicts that arise from different operating systems, software versions, packages, and so on.

    Another potential concern relates to the limited familiarity and expertise in coding that students (and instructors, for that matter) may bring to the class. On one hand, this concern could apply equally well to limited knowledge of course material. Still, different coding languages vary a great deal in their perceived difficulty. As such, when deciding how to integrate coding into classes, instructors should consider what level of difficulty students can reasonably manage, given their abilities.

    Moreover, students and instructors should be reassured that an abundance of online resources will support them in learning to code. Although open-source software does not come with formal support services (like a “help desk”), these tools are often rooted in vibrant communities that share documents, guides, example code, workshop materials, videos, and forums for discussing problems and solutions. Instructors should carefully consider which supports they intend to use (and share with students) in order to provide maximal scaffolding during the learning process.

    Lastly, I will address student assessment—in terms of both the format and content.

    Given the flexibility instructors have in implementing coding, there is similar flexibility in the format of student assessment. For instance, instructors can still use traditional written exams containing standard question formats (multiple-choice, true-or-false, etc.). Alternatively, instructors can emphasize more advanced assessment types such as completing code exercises (fill-in-the-blank or open-ended), giving presentations, or producing analysis reports.
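As one concrete illustration of the fill-in-the-blank format, here is a hypothetical Python exercise; the prompt, the data, and the model solution are my own invention, sketched only to show what such an item might look like.

```python
# A hypothetical fill-in-the-blank code exercise (invented for
# illustration). Students receive the prompt below with a blank
# (____) and must complete it so the scores are standardized.
from statistics import mean, stdev

scores = [72, 85, 91, 68, 77, 88]

# Prompt handed to students:
#     z_scores = [(s - mean(scores)) / ____ for s in scores]

# One acceptable solution fills the blank with the standard deviation:
z_scores = [(s - mean(scores)) / stdev(scores) for s in scores]

# Check understanding: standardized scores should average to roughly 0
# and have a standard deviation of roughly 1.
print(round(mean(z_scores), 6), round(stdev(z_scores), 6))
```

Grading such an item can focus on whether the completed line produces correctly standardized scores rather than on the particular syntax used to fill the blank.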

    Regarding the content of class assessments, I suggest following the APA’s emphasis on transferable skills (Appleby et al., 2019). For instance, communication skills can be assessed by having students give presentations or write reports on their analyses; critical thinking can be assessed by having students explain their decision-making processes when choosing between different analysis options; information management ability can be assessed by having students explain how they interpret their analysis findings.

    That said, coding skill itself should not be the primary focus of assessment. First, many different coding solutions can achieve the same outcome (making grade assignment difficult). Second, given the dynamic and ever-evolving lifecycle of code and packages, it is not important to assess the particular syntax students use (so long as it is of practical utility).

    Importantly, including any of these assessments does not preclude assessing course facts and concepts. Instead, it simply shifts the emphasis of the learning outcomes being measured to provide greater balance between course content and skills.


    Let us return to the original instructor’s dilemma: covering all of the relevant facts and findings is likely impossible, given the finite contact hours with students. This is especially true when considering the need to teach students the cognitive and technical skills they will need later in life.

    I suggest that we can resolve (or at least address) this dilemma by using coding as the fundamental medium through which students engage the key concepts of psychology. This will help students develop their ability to explore and evaluate data (conceptually and practically) while still exposing them to the key ideas and class material. Moreover, this approach leverages multiple pedagogical principles (distributed practice, active learning) that are known to improve student learning outcomes.

    The general nature of coding, and the increasing availability of online data and supporting resources, make this endeavour quite feasible, while still offering significant flexibility to meet the varied needs of institutions and individuals.

    Take a moment to ask yourself this question: “What do I want my students to remember in five years?” Whatever your answer, how can you make that outcome a reality? Including elements of computer coding may be an effective means to that end.


    Appleby, D. C., Young, J., Kirk, J. Van, Rudmann, J., Naufel, K. Z., Spencer, S. M., … Richmond, A. S. (2019). Transferable Skills: The Skillful Psychology Student. American Psychological Association.

    Atwood, T. P., Creamer, A. T., Dull, J., Goldman, J., Lee, K., Leligdon, L. C., & Oelker, S. K. (2019). Joining Together to Build More: The New England Software Carpentry Library Consortium. Journal of EScience Librarianship, 8(1), e1161.

    Carnevale, A. P., Gainer, L. I., & Meltzer, A. S. (2020). Workplace Basics: The Competencies Employers Want.

    Delaney, P. F., Verkoeijen, P. P. J. L., & Spirgel, A. (2010). Spacing and Testing Effects: A Deeply Critical, Lengthy, and At Times Discursive Review of the Literature. Psychology of Learning and Motivation - Advances in Research and Theory (1st ed., Vol. 53). Elsevier Inc.

    Gurung, R. A. R., Hackathorn, J., Enns, C., Frantz, S., Cacioppo, J. T., Loop, T., & Freeman, J. E. (2016). Strengthening introductory psychology: A new model for teaching the introductory course. American Psychologist, 71(2), 112–124.

    Hettich, P. (2021). What Skills Do Employers Seek? Four Perspectives. Eye on Psi Chi Magazine, 26(1), 20–24.

    Krain, M. (2010). The effects of different types of case learning on student engagement. International Studies Perspectives, 11(3), 291–308.

    Lester, D. (2013). A review of the student engagement literature. Focus on Colleges, Universities, and Schools, 7(1), 1–8.

    McManus, J. L. (1986a). “Live” Case Study/Journal Record in Adolescent Psychology. Teaching of Psychology, 13(2), 70–74.

    McManus, J. L. (1986b). Student Composed Case Study in Adolescent Psychology. Teaching of Psychology, 13(2), 92–93.

    Nelson, C. E. (2008). Teaching evolution (and all of biology) more effectively: Strategies for engagement, critical reasoning, and confronting misconceptions. Integrative and Comparative Biology, 48(2), 213–225.

    Park, E. S., Harlow, A., AghaKouchak, A., Baldi, B., Burley, N., Buswell, N., … Sato, B. (2021). Instructor facilitation mediates students’ negative perceptions of active learning instruction. PLoS ONE, 16(12 December), 1–16.

    Psynopsis: Education Issue. (2021). Psynopsis, 43(3).

    R Core Team. (2019). R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing.

    Subramaniam, P. R. (2009). Motivational Effects of Interest on Student Engagement and Learning in Physical Education : A Review. Int J Phys Education, 46(2), 11–20.

    The American Psychological Association (APA). (2011). APA Principles for Quality Undergraduate Education in Psychology. The American Psychological Association (APA).

    The American Psychological Association (APA). (2021). APA Introductory Psychology Initiative (IPI) Student Learning Outcomes for Introductory Psychology. The American Psychological Association, October 2021.

    Yurco, P. (2014). Student-Generated Cases: Giving Students More Ownership in the Learning Process. Journal of College Science Teaching, 43(3), 54–58.

  • 16 Dec 2022 7:56 PM | Anonymous member (Administrator)

    Amanda Cappon & Lynne N. Kennette
    (Durham College, Oshawa, Ontario)

    Every student is unique. They enter our classrooms at varying ages and stages in their lives. Students present with differing learning preferences and motivations, and as educators in post-secondary institutions, we are a front-line point of contact, privy to this diversity in our students. Valuing this diversity, and striving further to provide inclusive learning spaces in the classroom, we believe that educators of any discipline can benefit from using the biopsychosocial model.

    The following article will first describe the model and its origins, link it to a “whole student” perspective, and then apply it to students’ current learning space and eventual workplace. Finally, it will connect the biopsychosocial model to principles of equity, diversity and inclusion (EDI).

    The Biopsychosocial Model

    Each of us is a product of both nature (our innate genetic make-up and personality) and nurture (our experiences and social-cultural exposures), which interact to create the unique combination that makes us, us. The biopsychosocial model originated in the 1970s with George Engel (see Fava & Sonino, 2017). Engel worked in psychiatry at a time when the biomedical model was considered the gold standard, but he saw the medical model as reductionistic, so he sought to expand it by integrating knowledge from the behavioural and social sciences, which led to the inception of the biopsychosocial approach (Fava & Sonino, 2017).

    Below is an adapted chart which readers may find useful in understanding the biopsychosocial model as it might relate to the students in their classrooms. We have based our version, adapted for the focus of this paper, on a previous synthesis from the PsychDB psychiatry reference database (PsychDB, n.d.). Please note that this is not intended for educators to pry into the personal lives of their students. Rather, this chart, and the model itself, can be used to shift perspective in acknowledging the diversity of our students and the internal and external challenges they may be dealing with while in our classrooms.

    Biopsychosocial Model for Students




    Predisposing Factors

    (What are the longstanding factors in the student’s life that may be affecting their functioning?)

    • Genetics
    • Medical conditions
    • Learning or developmental disorders
    • Interactions with peers
    • Cognitive style
    • Self-esteem
    • Poverty
    • Access to mental health care
    • Exposure to conflicts

    Precipitating Factors

    (Was there an acute event in this student’s past?)

    • Medical illness or injury
    • Use of alcohol or drugs
    • Conflicts around identity (common during transitions such as to post-secondary)
    • Psychological distress/ disorder
    • Changes to relationships
    • Recent immigration
    • Loss of home

    Perpetuating Factors

    (What chronic things are going on which might affect learning?) 

    • Chronic illness 
    • Cognitive deficits, or learning disorder
    • Adjustment of medication
    • Ongoing substance use
    • Beliefs about self/others/the world
    • Self-destructive behaviours
    • Poor coping 
    • Personality traits
    • Loss of social support
    • Ongoing transitions
    • Food insecurity
    • Working too many hours at part-time job
    • Isolation
    • Unsafe environment



    Protective Factors

    (What is protecting them and allowing them to learn well?)

    • Overall health
    • Intellectual abilities
    • Resiliency
    • Popularity
    • Positive sense of self
    • Good coping skills
    •  Self-awareness
    • Positive familial relationships
    • Availability of supportive social network
    • Financial support

    Seeing the Whole Student

    As educators, we can apply the biopsychosocial model in our learning environments through how we engage with and respond to our students. If we engage with our students from a “whole human” perspective, we can not only better manage our personal biases (in an ongoing way), but we can also model this approach for our students to apply in their respective lives.

    As an example of how faculty might apply the biopsychosocial model, consider a fictional student who repeatedly requests evaluation extensions. Understandably, this can be frustrating for the professor. It is likely that the student would have provided us with their “precipitating factors” resulting in their need for an extension, but we may not be privy to the longstanding, predisposing, or perpetuating factors that are impacting the student. While the student is not required to disclose this information, and we, as educators, are not required to grant an extension, we can better support students in their learning if we take a step back to consider these additional factors which could be contributing to our student’s struggles. In this way, we may be better able to support the diverse needs of this and other students while also mitigating our own frustration. Planning our courses a priori with the many factors outlined in the biopsychosocial model can help.

    Similarly, designing a course that offers all students choices of assessment, in line with the recommendations of universal design for learning (CAST, 2018), may allow students to demonstrate the same learning outcomes in a format that takes into account their whole life context. For example, students who do not have their own computer at home, or who live in a low-income neighbourhood, might prefer to demonstrate their knowledge by writing a test rather than completing a digital project or group assignment. In both of these examples, the biopsychosocial model may facilitate open dialogue with the student, referral to on-campus supports, and a supportive response that promotes genuine student-teacher engagement. In this way, seeing the whole student begins when designing the course and continues throughout the semester.

    Fostering Student Application of the Biopsychosocial Model

    In the forty years since its inception, the biopsychosocial model has been thoroughly researched in the healthcare field (see Fava & Sonino, 2017) and has made its way into the classroom. In social science and humanities classrooms, the pedagogical application of this model is somewhat more obvious because it connects easily to the content of the course. For example, educating future clinicians to look at the “whole human” (including biological, psychological, social, and cultural aspects) can be accomplished through various methods of self-reflection, by having students engage with their own biopsychosocial development, or by applying the model (individually or in groups) to an imagined client scenario.

    In other disciplines, the application of the biopsychosocial model may be less obvious but is valuable nonetheless, especially since many fields are shifting toward giving more space and importance to equity, diversity, and inclusion practices. For example, Flynn et al. (2022) recently identified a paradigm shift in the field of occupational health and safety toward the biopsychosocial model. This shift is intended to advance “health equity” because the social determinants of health intersect with other social structures that have ultimately led to exclusionary practices in work environments, research findings, and more (Flynn et al., 2022). Flynn et al. (2022) also highlight the importance of using this model to better understand one’s position within the “complex social web” in which we exist. And, perhaps most importantly, this model can heighten our awareness of our own perspective on the world which, if left unattended, can lead to reflexive thoughts and behaviours as well as unexamined assumptions or judgements of others.

    An example of how this model can be included in the classroom of any discipline would be to create a “case scenario” specific to your course content and have students select one case to analyze (independently, in pairs, or in small groups) from a biopsychosocial perspective. That is, ask students to consider various aspects of the person in the scenario and how those aspects might affect behaviour. For example, would Einstein have come up with the theory of relativity had he not encountered some of the hardships in his life (such as World War II)? Alternatively, students can themselves create the descriptions of people related to their field of study (including all aspects of the model) and exchange with another student to examine. Depending on the specific outcomes for your course, you might consider using a celebrity, criminal, researcher/historical figure in your field, a family member, a client encountered during a co-op placement, or even themselves. Regardless of the figure chosen (or if you provide them with a fictitious description of a person), students should consider the influence or role of various aspects of the biopsychosocial model in determining behaviour. Some aspects that the student might consider could be biological (genetic predisposition, underlying medical issue, certain hereditary mental health disorders), psychological (developmental stage, certain mental health disorders, intelligence, attachment style), social (attitudes, social expectations, education, family values), and/or cultural (religion, economic status, sexual identity, ethnicity/race, trauma/crisis, language). Discussions can be quite engaging, especially as students bring in their own unique and diverse lived experiences. Various disciplines can adapt this activity to meet the needs of their curriculum. 
    For example, in a law course, perhaps describe an accused or known criminal; in a social work course, it could be a client; in a nursing course, a patient; in a business course, an applicant for a job; in a literature course, a character from a novel; in a math course, a key historical figure. The purpose of the activity is to engage students in considering the “whole human”: it draws on their awareness of barriers, strengths, and struggles from a holistic perspective, and of how these can impact various outcomes and behaviours. This type of discussion serves to deepen our awareness of diversity as well as social justice and injustice, and it can also promote a sense of cohesion in the classroom.

    One of the primary benefits of exposing students to the biopsychosocial model in their courses is that they gain additional skills (e.g., critical thinking) and become more familiar with the struggles of others, developing empathy. Thinking about all the variables that make people behave a certain way makes students less likely to defer to stereotypes about any particular group or to commit the fundamental attribution error (erroneously attributing others’ behaviours to internal attributes). Ultimately, students will be equipped with enhanced interpersonal skills from their exposure to this model during the course of their studies.

    Linking the Model to Students’ Eventual Workplace

    Students attend post-secondary institutions with the goal of earning a credential and gaining employment in their field of study. Their learning journey is the foundation which will allow them to apply the learning outcomes and so-called “soft skills” which they developed during the course of their studies. The biopsychosocial model is about developing a general ability of looking at the “whole human” and the diversity among us which is beneficial regardless of one’s field of study. In business management, for example, an important biopsychosocial application might be equitable hiring practices. In health sciences (e.g., nursing), it may be particularly important to be able to assess and advocate for medical intervention on behalf of a patient, a skill which requires a holistic approach. An employee in the field of data analytics might be working on social science or humanities research projects where it would be advantageous to apply a biopsychosocial perspective in their representation of the research data. In any field, developing the skills and ways of thinking related to principles of diversity, equity and inclusion will place students in a better position to be competitive in the job market and better employees once hired.

    A Connection with Equity, Diversity, and Inclusion (EDI)

    Perez et al.’s (2020) research revealed an overall lack of engagement with and understanding of equity, diversity, and inclusion (EDI) among post-secondary departments and, as a result, among students as well. And yet, these constructs are overwhelmingly relevant to students’ own identities and ultimate success in their personal and professional lives. Barnett (2020) wrote about “leading with meaning,” qualitatively reviewing 12 empirical articles on equity, diversity, and inclusion in U.S. higher education institutions. He concluded that, while educating students on constructs of equity, diversity, and inclusion is a complicated, context-dependent process, it is imperative that post-secondary institutions focus not only on the specific content that is taught but also on how that content is taught. Overall, engagement at all levels, including administrators, educators, and individual student relationships, is critical to truly infuse practices of inclusion, promote equity, and maintain awareness of diversity. The biopsychosocial model is not the sole solution, but it can definitely aid in a more inclusive pedagogical practice.


    As educators, we can lay a foundation for EDI by including the biopsychosocial model when designing and delivering courses. As we engage with each diverse learner, we should practice our own awareness of biological, psychological, and social/cultural influences for that learner, continuing to model inclusion by welcoming all student contributions in the classroom. We can help foster EDI in our students by helping them to engage with the biopsychosocial model in our courses to enhance our students’ learning and develop their thinking in a way that will promote EDI both in the classroom and in the workplace.

    The classroom is a powerful space to model skills and foster the application of key concepts, and as educators, we can have a lasting impact on our learners. The biopsychosocial model can be used to help us focus on the diversity of our students, both in our pedagogical practices and in the content and skills we help to develop in them.


    Barnett, R. M. (2020). Leading with meaning: Why diversity, equity and inclusion matters in US higher education. Perspectives in Education, 38(2), 20–35.

    CAST (2018). Universal Design for Learning Guidelines. Center for Applied Special Technology.

    Fava, G. A., & Sonino, N. (2017). From the Lesson of George Engel to Current Knowledge: The Biopsychosocial Model 40 Years Later. Psychotherapy and Psychosomatics, 86(5), 257–259.

    Flynn, M. A., Check, P., Steege, A. L., Sivén, J. M., & Syron, L. N. (2022). Health Equity and a Paradigm Shift in Occupational Safety and Health. International Journal of Environmental Research and Public Health, 19(1), 349–352.

    Perez, R. J., Robbins, C. K., Harris, L. W., & Montgomery, C. (2020). Exploring Graduate Students’ Socialization to Equity, Diversity, and Inclusion. Journal of Diversity in Higher Education, 13(2), 133–145.

    PsychDB, Psychiatry DataBase (n.d.). Biopsychosocial model and case formulation.

    Quiros, L., Kay, L., & Montijo, A. (2012). Creating Emotional Safety in the Classroom and in the Field. Reflections: Narratives of Professional Helping, 18(2), 42–47.

  • 03 Oct 2022 9:54 PM | Anonymous member (Administrator)

    Nicole Alea Albada (University of California, Santa Barbara)

    One of the most influential papers that I read in graduate school was one by Alan Baddeley (1988) titled “But, what the hell is it for?” The title, of course, was brave and bold, but so was the content. Baddeley and other cognitive psychologists at the time (e.g., Neisser, 1978) argued for an ecological approach to the study of memory: researchers needed to move outside the confined parameters of the lab to study memory in people’s everyday ecologies. Doing so would move researchers beyond questions about how the memory system works (i.e., the mechanics of memory) to questions about memory’s real-world usefulness or function. I followed in this tradition as an autobiographical memory researcher, asking questions over the years about the functions of remembering and sharing the personal past with others in a variety of ecologies (e.g., life-phase, cultural, and online contexts). In recent years, I have become interested in the functions of remembering and sharing stories of one’s personal past in the classroom ecosystem. Why? Because I noticed that I do it often, so it must be serving a purpose.

    For example, on the first day of my research methods course, I tell students my life story: that I grew up in Key West, Florida, a small island that is the southernmost point in the continental United States; that I come from a family of pharmacists (great-grandfather, dad, sister) but took a different academic path to study psychology; that I stayed in Florida to earn my PhD at the University of Florida so that I could be close to my extended Cuban family; that I met my husband there and that he and our teenage son are obsessive surfers, so I spend most of my free time at the beach; and that we lived on the Caribbean island of Trinidad and Tobago for over ten years, where I taught research methods and statistics at the University of the West Indies before coming to teach them at the University of California, Santa Barbara.

    I am not the only instructor who shares personal stories, like the one above, with students. It is quite common. For instance, in a survey of 100 university psychology professors, Houska and colleagues (2015) found that 91% of teachers reported using stories at least occasionally over the last five years of teaching and, of those stories, 89% were informal personal stories or brief self-disclosures. Why might personal story sharing be so commonplace? My proposition, from a functional approach, is that it must be serving some purpose in the context of the classroom and instructional ecology. What might these functions be?

    Teaching and Learning Function of Personal Stories

    The scholarship of teaching and learning literature is peppered with many and diverse suggestions about why instructors share personal stories with students (e.g., Brakke & Houska, 2015). Perhaps the most common suggestion, which matches well with the objectives of our profession, is that instructors share personal stories to help students better learn and retain information. For instance, the instructors in Houska and colleagues’ (2015) survey said that they tell stories because stories help the course material “come to life for students” and, as such, “stories are what students remember” (p. 22). Landrum and colleagues (2019) also home in on the power of stories to help students learn and retain information. They reviewed work indicating that stories pull students into material for deeper learning because stories are interesting, feel relevant, and come in a form (narrative) that is familiar and easily accessible to students.

    We have found similar evidence for what we have coined the teaching and learning function of personal stories in our own correlational work (Alea & Osfeld, 2022). We surveyed students’ perceptions of my use of personal stories when teaching research methods for psychology. Students reported that the stories I shared with them (like the one about my husband’s very distant fourth-place finish in a swim race to demonstrate an ordinal scale of measurement, or the time when, as an undergraduate research assistant, I caught an older adult writing down a list of vocabulary words that he did not know so that he would get them correct on the next assessment, an example of a blatant practice effect) helped them to understand the material from “quite a bit” to “very much.” Students openly expressed that the stories were helping them learn, with comments like: “She would talk about example[s] related to her son that helped [me] remember experiment designs” and “I liked all the examples [personal stories] because they showed how to apply the topics we were learning in class to real life situations, and it made conceptual topics more concrete and understandable.” Thus, correlational and anecdotal evidence, from both instructors’ and students’ perspectives, suggests that instructors’ personal stories have the power to serve a teaching and learning function.

    Socioemotional Functions of Personal Stories

    Personal stories also seem to have the power to serve other, non-academically oriented but, I would argue, equally important functions for students. For example, through the snippet of my life story shared above, I am hoping that students infer that I come from a small town but made it to a big university; that I value diversity and other cultures; that family is important to me; that it is okay to take your own path and diverge from expectations; and that they should feel confident in my teaching their course because of my experiences. I could have just told my students all of this, but instead I tell them through story, believing that it speaks volumes. Sharing this personal story with my students does not teach them more about the content of research methods, so it is not serving a teaching and learning function, but it is likely serving other socioemotional functions that are relevant to students' experiences as they navigate courses and university.

    To better understand and systematically delineate what these socioemotional functions of personal stories in the instructional context might be, we have developed the Personal Stories in Teaching (PST) Survey (Alea, Adams, & Mohiuddin, 2022). The items for the survey were constructed by pulling ideas from the teaching and learning (e.g., Brakke & Houska, 2015) and autobiographical memory functions literature (e.g., Bluck & Alea, 2011), as well as by asking expert university instructors why they share personal stories with their students. Factor analysis indicated that in addition to the teaching and learning function of personal stories, instructors were telling students about their personal experiences in order to serve four other specific functions:

          The social bonding function, which involves instructors sharing personal stories with students to create connections by letting them know more about us and the ways we may be similar to them, thereby fostering a more positive and communal learning environment overall.

          The directive function, which involves instructors sharing personal stories with students about accomplishments and missteps that we have had, in an effort to help direct students’ pathways.

          The empathic function, which involves instructors sharing personal stories with students to help them feel better when they have not succeeded at something and to provide reassurance that will help students to grow in emotional ways.

          The identity function, which involves instructors sharing personal stories with students as a way to encourage them to explore other cultures and perspectives as a means to promote further self-exploration and understanding. 

    Incorporating Personal Stories into Instruction

    I would very much like to end this post with strong evidence-based suggestions for how to implement personal stories when teaching so that they are functional for students. I would like to answer questions such as: How long should the stories be? How personal should they be? Where in a lecture, at the beginning or the end, might a personal story best serve a teaching and learning function? Are personal stories always functional? This, after all, seems to be what instructors want. In 2014, almost a decade ago now, there was a call from the Society for the Teaching of Psychology's Story Task Force to provide evidence about the efficacy of stories as an instructional tool and a set of guidelines for how best to use stories when teaching. The culmination of this call to action was a free edited book, Telling Stories: The Art and Science of Storytelling as an Instructional Strategy (Brakke & Houska, 2015). The book, and the work that followed, is full of instructors' ideas for how they use stories in their own courses and of quasi-experimental classroom studies on the efficacy of personal stories for teaching.

    I have been thinking recently, however, that it might be time to bring some of the work exploring the functional outcomes of personal stories in the classroom back into lab-based settings so that variables (like content, timing, and outcomes) can be better controlled. This is hard for me to suggest, given my theoretical foundation in the ecological memory movement. However, I feel compelled to do so because two recent, separate lab-based studies (Alea & Osfeld, 2022; Kromka & Goodboy, 2019), with similarly well-controlled methodology in which lecture content was delivered with and without a personal story, showed little to no improvement in student learning when a personal story was included. There are many possible reasons for not finding evidence to support a teaching and learning function of personal stories: perhaps the story manipulation was too weak, or in the wrong location in the lecture to be effective, or perhaps a one-time lecture presentation with a single personal story does not mimic well the story-sharing experience that unfolds in a classroom over the course of an entire term, in which the socioemotional functions of personal stories are also playing a part in learning. These are all questions that remain to be answered, and a nuanced approach with both lab-based and in situ research designs is needed. The end result will give us the full story of the functions of personal stories in instruction. 


    References
    Alea, N., Adams, P., & Mohiuddin, H. (2022, October). The Personal Stories in Teaching (PST) Survey: Exploring why instructors share personal stories with students [Conference presentation]. Society for the Teaching of Psychology's 21st Annual Conference on Teaching, Pittsburgh, PA, USA.


    Alea, N., & Osfeld, M. (2022). The teaching and learning function of personal stories: Correlational and experimental evidence. Teaching of Psychology. Advance online publication.


    Baddeley, A. (1988). But what the hell is it for? In M. M. Gruneberg, P. E. Morris, & R. N. Sykes (Eds.), Practical aspects of memory: Current research and issues (pp. 3–18). Wiley.


    Bluck, S., & Alea, N. (2011). Crafting the TALE: Construction of a measure to assess the functions of autobiographical remembering. Memory, 19(5), 470–486.


    Brakke, K., & Houska, J. A. (2015). Telling stories: The art and science of storytelling as an instructional strategy. Society for the Teaching of Psychology.



    Houska, J. A., Brakke, K., Kinslow, S. L., Zhao, X., Campbell, B., & Clinton, A. (2015). The use of story among teachers of psychology. In K. Brakke & J. A. Houska (Eds.), Telling stories: The art and science of storytelling as an instructional strategy (pp. 14–26). Society for the Teaching of Psychology.


    Kromka, S. M., & Goodboy, A. K. (2019). Classroom storytelling: Using instructor narratives to increase student recall, affect, and attention. Communication Education, 68(1), 20–43.


    Landrum, R. E., Brakke, K., & McCarthy, M. A. (2019). The pedagogical power of storytelling. Scholarship of Teaching and Learning in Psychology, 5(3), 247–253.


    Neisser, U. (1978). Memory: What are the important questions? In M. M. Gruneberg, P. E. Morris, & R. N. Sykes (Eds.), Practical aspects of memory (pp. 3–24). Academic Press.
