E-xcellence in Teaching
Editor: Manisha Sawhney
Associate Editor: Annie S. Ditta

  • 16 Apr 2017 9:18 AM | Anonymous
    OMG RU Really Going to Send That?
    Email Communication with Students

    Andrew Peck, PhD

    The Pennsylvania State University

         Electronic communication plays an important role in both traditional collegiate education and online learning. In 2001, the number of email messages sent surpassed the number of letters delivered by the United States Postal Service (Levinson, 2010). In 2002, Bloch reported that the typed word had begun to establish itself as a primary means of interpersonal communication, mentioning a case in which a student broke up with her boyfriend via email. In fact, email has become the most widely used instructional technology (see Wilson & Florell, 2012). Recognizing this, at least one college tells students that email is the “lifeline of [their] communication with the college” (http://www.gwinnetttech.edu/webmail/, sec. 1). Interestingly, while we are most likely to initiate electronic correspondence to send course announcements or meeting requests, students tend to use their “lifeline” to make appointments, ask questions, and offer excuses (Duran, Kelly, & Keaten, 2005).


          Email can benefit faculty members and students in a variety of ways. Email is a relatively inexpensive way to communicate with many people quickly, it fosters collaboration, file sharing (Hassini, 2004), and group problem solving (Hassini, 2004; Wilson & Florell, 2012), and it provides an electronic record or “paper trail” for later reference (Wilson & Florell, 2012). Email can also increase the accessibility of the instructor (Hassini, 2004; Wilson & Florell, 2012). We can use email to provide feedback, which can foster academic development (Duran, Kelly, & Keaten, 2005), motivation (Duran, Kelly, & Keaten, 2005; Kim & Keller, 2008), and achievement (Kim & Keller, 2008). Some have noted that email can increase student writing (Hassini, 2004), although others have expressed concerns about the quality of students’ electronic correspondence (see Bloch, 2002). Email can increase communication with students who struggle with face-to-face communication, including international, shy, or disabled students (see Bloch, 2002; Duran, Kelly, & Keaten, 2005). Finally, email use can improve students’ perceptions of us, especially when our responses are helpful and prompt (Sheer & Fung, 2007) and include appropriate emotional content (Wilson & Florell, 2012).


          Like other instructional technologies, email is a tool, and misuse can result in unexpected consequences. Although the option to send a message to a large group of people quickly can be helpful, email does not come with “you probably shouldn’t send that” warnings, and sometimes people will send ill-conceived electronic messages to many recipients, as these examples of public Tweets (posts on Twitter) demonstrate:

    “I can't believe my Grandmothers making me take out the garbage   I'm rich f*** this I'm going home I don't need this s***”   - 50 Cent (Note: I’ve added spaces and censored the message to make it more readable and appropriate for readers)

     “With so many Africans in Greece, at least the mosquitoes of West Nile will eat homemade food.”   - Voula Papachristou, Greek triple jumper who was removed from the Greek Olympic team for posting this sarcastic comment

     Although many of us are fortunate enough to have students who don’t send inappropriate mass mailings to classmates regularly, email does provide an avenue for upset students to vent before they’ve fully considered the consequences. Furthermore, while email increases the accessibility of the instructor, it also means that students have increased expectations about our availability and personal attention. Consequently, responding to email seems to have changed the nature of our work.

         Some of us prefer to use email as little as possible because the loss of non-verbal, social, and contextual cues can increase misunderstandings (Hassini, 2004), but many of us seem to treat it as a job requirement (and sometimes it is). Nonetheless, it can be time consuming to respond appropriately to student messages (Hassini, 2004), and sometimes responding becomes “the third shift in an already overcrowded day” (Mason, 2010, para. 3). Sometimes, when it is clear that students did not take the time to read important announcements sent via email, we wonder if sending email is worth the time it takes us to compose the message.

         To make matters worse, sometimes we wonder whether the emails students send were actually written by the students listed as the senders. In our department, my colleagues and I have received messages from student accounts that were actually written by those students’ friends, roommates, and parents. Ironically, some of us might wish students’ parents wrote messages for their children more often, as student messages can be too casual for many educators (see Bloch, 2002). It is not uncommon for electronic messages to lack proper grammar and punctuation, as this example demonstrates:

     “can i come 2 ur office i need 2 meet w u b4 the test i have ?s thx”

     Faculty Member Expectations

          Faculty members vary in their expectations of student email (Biesenbach-Lucas, 2007). To help students understand specific expectations, some of us include a statement about email communication in their syllabus. Here is an excerpt from a sample syllabus that focuses on instructor accessibility and other concerns:

    Email policy: On weekdays, I check my mail once -- in the early morning. If you send me an e-mail after 6 a.m., do NOT expect an answer until the next day. I do NOT check my mail at all on weekends. So if you send me a message anytime after 6 a.m. on Friday, you will not get an answer until Monday morning. I do not open emails with attachments. I do not open emails without subject lines. I do not open emails written in languages I can’t read – so be sure if you have your email set to a non-English format that your name and information come through in English. (http://public.wsu.edu/~mejia/Handbook/Sample_105_Syllabus.htm, para. 2)

    Here is an excerpt from another syllabus that focuses on tone and style:

     …all email communication will follow the guidelines enumerated here.  Email should be composed in formal, professional language, and with attention to the propriety accorded to the position of the writer, and the addressee…(http://www.hist.umn.edu/hist3722/syllabus.html, para. 9)

     Some instructors might worry that including these types of statements in a syllabus will cause students to view them as overly strict, but students may not be aware of how they come across in their email and may appreciate knowing teacher expectations (Martin, 2011).

         While a syllabus statement can help, challenging email messages seem to come with the job. Although there are no recipes or guidelines we can use to construct the perfect email message, people have offered a number of helpful considerations. To help sort out these considerations, I have organized them below using the popular green, yellow, red color-coding scheme to reflect the potential gravity of the student’s message or the educator’s response.

     Code Green Messages

          Fortunately, we sometimes get “Code Green” messages. These messages are complimentary or positive in tone and content (I wanted to thank you for…, I enjoyed your course, are you teaching others…), ask for appropriate information respectfully, or include appropriate requests. Generally, these messages are easy to respond to professionally, so there is little need to offer strategies for responding to these types of messages.

     Code Yellow Messages

          Unfortunately, “Code Green” messages are often outnumbered by “Code Yellow” messages. These messages require us to proceed cautiously, as they might require a considered response. Experience suggests that there are several types of “Code Yellow” messages: those that demonstrate that students misunderstand their own responsibilities, messages containing inappropriate personal information, and messages motivated by students’ anxieties (see Wilson & Florell, 2012, for an excellent review).

         Sometimes students misunderstand their own responsibilities and deflect or request accommodations to compensate (Wilson & Florell, 2012). For example, my colleagues and I get messages from students like these:

    Dr. __ , I didn’t do well on your final exam. I am on the __ team and need an A in your class to get into my major and retain my scholarship. Please help.

     Dr. __ , I didn't realize the ___ was due yesterday. What can I do to make-up those points?

     Dr. __, I won’t be prepared for class discussion and can't do the first reading quiz because I just ordered the book. I apologize for any inconvenience.

     Dr. __ , I didn't make it to class today. Can you please send me the notes I missed?

    Sometimes students will include personal details of their lives inappropriately to justify a request. Sometimes lonely students just write to be friendly, and sometimes students seeking relationship advice confuse us with writers for the Dear Abby column. Consistent with examples provided by Wilson and Florell (2012), here are some example messages my colleagues and I received:

    Dr. __, How are you? I would like to make an appt. to meet with you. I don’t have anything specific to discuss, I just thought I would stop in to say hi and chat. I have two dogs named….

    Dr. __ , Help!…me and my friend hooked up once in the beginning of the semester and I liked her but didn't think she liked me back so I moved on, and……but now...what should I do?

    Sometimes “Code Yellow” messages are sent by conscientious and responsible students whose anxieties get the best of them.  Consistent with examples provided by Wilson and Florell (2012), here are some example messages we received:

    Dr. ___  , I am in your 11:00 am class. I completed the extra credit writing assignment in class today, but I didn't receive credit in the online grade book yet. Please get back to me right away. I really need this credit. [message sent at 1:30 pm]

    Dr. ___ , I wonder if the study guide you gave us is really everything we need to know for the final. We didn’t cover Chapter 11 in class, and it isn’t on the syllabus, but should I study it anyway? I emailed you earlier today, but I didn’t hear back yet.

     Sometimes, students send “Code Yellow” messages requesting information that is outside of the responder’s expertise. In these cases, it is appropriate to redirect the student to the appropriate resource, often an academic advisor or health services professional. However, many “Code Yellow” messages are class specific, requiring us to respond directly. In these cases, we should try to treat these moments as “teachable moments.” We should model professionalism, maintain a professional tone, and offer appropriate content (Wilson & Florell, 2012). Leading by example can help, and one never knows who will read the message, especially when technologies make it easy to share electronic correspondence with others.

         As mentioned above, students appreciate it when we include emotional content in our responses (Sheer & Fung, 2007), but it is important to balance a congenial tone with a professional tone. One way to do that is to express empathy/sympathy when saying “no” (Wilson & Florell, 2012).

    Example: Thanks for letting me know. I appreciate your dilemma. I hope that you can stay on the team and keep your scholarship. I’d really like to accommodate your request, but I have to assign your grade on the basis of merit and abide by the grading policies in our course syllabus or I will…. violate departmental and college policies….create an unfair situation for other students….

    Wilson & Florell (2012) have also recommended that we provide students with perspective and encourage responsible action.

    Example: Unfortunately, you can’t make it up, but it is only worth…you can still do well in the course if you…..

    Example: Yes, you can do that. Please see the syllabus for details.

    They also recommend ending our messages with a positive and sincere tone when possible, but they recognize that a persistent student may struggle to take “no” for an answer. In these cases, it is up to us to end the conversation directly, but not aggressively, and then ignore additional email from the student about the same issue.

    Example: Thanks for following-up and providing more information. I hope you have a good weekend.

    Example: I appreciate your continued concerns, but as I said, there isn’t anything else I can do without violating college/course policies. I consider this matter closed.

    Code Red Messages

    While “Code Yellow” messages require us to slow down and respond cautiously, “Code Red” messages often require us to stop what we’re doing to construct a planned response. “Code Red” messages are highly emotional, highly critical, or aggressive in tone. Examples include pleas for help, student disclosures of abuse or suicidal inclinations, and hostile messages from irate students. While discussing strategies for responding to aggressive behavior, Tunnecliffe (2007) listed a number of potential causes of students’ anger. He noted that some aggression stems from a lack of critical knowledge or inaccurate information, unrealistic expectations, or previous rewards for aggressive behaviors. Research on the development of the teenage brain also suggests that teenagers are more likely than we are to become highly emotional, and that emotion may cloud students’ reasoning abilities (for an example, see Spinks, 2013). Regardless of the factors involved, many aggressive messages seem to be triggered by perceptions of unfairness or inequity.

         Because of the nature of “Code Red” messages, there are a number of things to consider when responding. On many campuses, when faculty members are alerted to imminent threats of harm (including student self-harm), they are required to alert their chairs/department heads and campus or local police. Many campuses have counseling or intervention teams, other student resources, or partnerships with community programs. When appropriate, we should introduce these resources to victimized students and consider facilitating student contact or appointment scheduling. If nothing else, we can encourage victimized students to go to the local hospital, where hospital personnel and case-workers can get involved.

         On some campuses, faculty members are instructed NOT to take on the role of detective or police officer, or to ask the student specific questions about a traumatic experience. Doing so can increase feelings of victimization and make it less likely that the student will share critical details with law enforcement officials, student conduct authorities, or health professionals. Instead, we are advised to take the information the student has provided at face value, ask a few general questions (What happened? When? Where?) so that the information can be passed on to authorities, reassure the student that we will do what we can to help, and then follow campus guidelines for helping.

         Dealing with aggressive students can be challenging and emotional for us. My colleagues and I have found it helpful to walk away from the computer and let some time pass before we respond (usually 12–24 hrs). This gives us time to cool down so that we can respond more professionally, and it gives the student time to cool down, too. Occasionally, students will realize their message contained inappropriate content or had an inappropriate tone, and they will send a follow-up apology. While there isn’t any research on successful strategies for responding to aggressive email, recommendations can be drawn from discussions about the best ways to communicate with angry students to promote de-escalation. It is important to avoid using a reprimanding tone (Tunnecliffe, 2007), which can promote defensiveness and increase perceptions of victimization. It is also important to recognize that anxiety can increase threat perceptions (Craske et al., 2009), and that anxious students are more likely to interpret ambiguous information or references to authority as more threatening than intended. A calm, jargon-free tone might be more successful (Tunnecliffe, 2007; University of Oregon Counseling and Testing Center, 2012). With this in mind, we should avoid using capitalized words or bold text for emphasis, as some students interpret these formatting cues as yelling rather than emphasis (Hassini, 2004). The University of Oregon Counseling and Testing Center recommends acknowledging the student’s emotion, and Larson (2008) recommends using content cues that facilitate an empathetic or sympathetic tone (e.g., I can see this is really important to you). We should use the present tense, focusing on the present situation rather than rehashing the past (Tunnecliffe, 2007), and explain what we can do (Larson, 2008) rather than why we can’t address the student’s concerns, even if that is nothing more than an offer to meet and discuss.

         Some of us might want to respond to criticisms from students directly. We all make mistakes, and sometimes students’ criticisms are legitimate. In those cases, it might be best to agree with what is accurate and share our plan for corrective action (Tunnecliffe, 2007). If criticism is vague, it is fine to ask for clarification (Larson, 2008). Sometimes the initial criticism, or the response to a request for clarification, can be lengthy. In these cases, it might be best to address concerns globally rather than respond to each individually (Tunnecliffe, 2007). If none of these strategies sounds appealing, we can always deflect the criticisms by simply thanking students for sharing their views (Tunnecliffe, 2007).

     Final Thoughts: Maintain Perspective

    Regardless of how you choose to respond to critical email messages, it is important to consider Alexander Pope’s “to err is human; to forgive divine” and to cut ourselves some slack (Tunnecliffe, 2007). It is also important to recognize that, while we can make the most out of “teachable moments,” we can’t get through to everyone (Larson, 2008). Research has shown that readers who are angered by email attribute the tone to the writer’s personality (Levinson, 2010). Student politeness affects our feelings towards the student, our beliefs about the student’s competence, and our motivations to help (Stephens, Houser, & Cowan, 2009; Bolkan & Holmgren, 2012). So, it is critically important to remember and apply the lessons we teach our students about the Fundamental Attribution Error and consider that situational, rather than dispositional, factors can lead the student to send inappropriate email.

         Steve Johnson, a football player for the Buffalo Bills, once famously blamed God on Twitter for a dropped pass.


    So, the next time you read an annoying email message from a student, take a moment to appreciate that you are in good company.


    Biesenbach-Lucas, S. (2007). Students writing emails to faculty: An examination of e-politeness among native and non-native speakers of English. Language Learning & Technology, 11(2), 59-81.

    Bloch, J. (2002). Student/teacher interaction via email: The social context of internet discourse. Journal of Second Language Writing, 11, 117-134.

    Bolkan, S., & Holmgren, J.L. (2012). ‘‘You are such a great teacher and I hate to bother you but...’’: Instructors’ perceptions of students and their use of email messages with varying politeness strategies. Communication Education, 61(3), 253-270.

    Craske, M.G., Rauch, S.L., Ursano, R., Prenoveau, J., Pine, D.S., & Zinbarg, R.E. (2009). What is an anxiety disorder? Depression and Anxiety, 26, 1066–1085.

    Duran, R.L., Kelly, L., & Keaten, J.A. (2005). College faculty use and perceptions of electronic mail to communicate with students. Communication Quarterly, 53(2), 159-176.

    Gwinnett Technical College. (n.d.). Student webmail. Retrieved from http://www.gwinnetttech.edu/webmail/

    Hassini, E. (2004). Student–instructor communication: The role of email. Computers & Education, 47, 29–40.

    Kim, C., & Keller, J.M. (2008). Effects of motivational and volitional email messages (MVEM) with personal messages on undergraduate students’ motivation, study habits and achievement. British Journal of Educational Technology, 39(1), 36–51. doi:10.1111

    Larson, J. (2008). Angry and aggressive students. Principal Leadership, January, 12-15. Retrieved from http://www.nasponline.org/resources/principals/Angry%20and%20Aggressive%20Students-NASSP%20Jan%2008.pdf

    Levinson, D.B. (2010). Passive and indirect forms of aggression & email: the ability to reliably perceive passive forms of aggression over email. (Unpublished doctoral dissertation). Wright Institute Graduate School of Psychology, Berkeley, CA.

    Martin, R.C. (2011, June 21). Avoiding the angry email [Web log post]. Retrieved from http://blog.uwgb.edu/alltherage/avoiding-the-angry-email/

    Martin, R.C. (2012, March 2). Responding to the angry email: A follow-up [Web log post]. Retrieved from http://blog.uwgb.edu/alltherage/responding-to-the-angry-email-a-follow-up/

    Mason, M.A. (2010, July). Email: The third shift. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/E-Mail-the-Third-Shift/66312/

    Mejia, E. (n.d.). Sample English 105 syllabus. Retrieved from http://public.wsu.edu/~mejia/Handbook/Sample_105_Syllabus.htm

    Richtmyer, E. (2007). History 3722 syllabus. Retrieved from http://www.hist.umn.edu/hist3722/syllabus.html

    Sheer, V.C., & Fung, T.K. (2007). Can email communication enhance professor-student relationship and student evaluation of professor?: Some empirical evidence. Journal of Educational Computing Research, 37(3), 289-306.

    Spinks, S. (2013). One reason teens respond differently to the world: Immature brain circuitry. Retrieved from http://www.pbs.org/wgbh/pages/frontline/shows/teenbrain/work/onereason.html

    Stephens, K.K, Houser, M.L., & Cowan, R.L. (2009). R U able to meat me: The impact of students’ overly casual email messages to instructors. Communication Education, 58(3), 303-326.

    Tunnecliffe, M. (2007). Behavioural de-escalation. Retrieved from http://www.education.nt.gov.au/__data/assets/pdf_file/0014/2318/Module7TeacherNotes.pdf

    University of Oregon Counseling and Testing Center. (2012). Strategies for dealing with angry students outside the classroom. Retrieved from http://counseling.uoregon.edu/dnn/FacultyStaff/DisruptiveThreateningStudents/DealingwithAngryStudentsOutsidetheClassroom/tabid/325/Default.aspx

    Wilson, S., & Florell, D. (2012). What can we do about student e-mails? Observer, 25(5), 47-50.

  • 03 Apr 2017 9:00 PM | Anonymous

    Submitted by William S. Altman and Lyra Stein, Editors, E-xcellence in Teaching Essays

    Using media in the classroom: A cautionary tale and some encouraging findings

    Lynne N. Kennette
    Durham College


    Instructors should use caution when implementing new methods of teaching or assessment: just because students like something doesn’t mean their learning necessarily benefits. This was recently revealed to me in one of my classes when I tried a new activity. However, as I discovered through student comments, there is a silver lining (read on!).


    One of the key skills that instructors in psychology try to develop in their students is the identification of independent variables (IVs) and dependent variables (DVs), which form the basis of research design and analysis. The very foundation of the scientific method involves identifying how changes in one variable relate to changes in another. I wondered whether students would show a performance advantage (or any preference) for media clips over written scenarios when identifying IVs and DVs in experiments. So, I presented students with video clips from episodes of the television series MythBusters (Discovery Channel), audio clips from National Public Radio’s Radiolab series, and my traditional written experiment scenarios.

    Burkley and Burkley (2009) reported the benefits of using MythBusters clips to illustrate experimental designs. Students enjoyed the use of these clips in class and performed better on MythBusters-related exam questions (compared to control questions). I suspected that students would prefer the video and audio scenarios for their entertainment value, but wondered whether their performance would actually benefit. Previous research suggested that students might both prefer and benefit from multimedia formats because such formats stimulate interest and thus retention (Nowaczyk, Santos, & Patton, 1998). Media may also be more engaging than a written description, and engaging content leads to better learning of information (Tobias, 1994); as we know, students put more effort into tasks they find interesting (Renninger, 1992).

    However, it is also possible that the additional information provided by audio and video clips could distract students from the relevant information required to complete the task of identifying IVs and DVs (Walker & Bourne, 1961). This distracting information may come from the irrelevant “story-telling” details required to make these media commercially appealing (especially in the case of MythBusters). Additionally, because the learner cannot as easily control the stream of information (i.e., the speed at which information is delivered), students may experience a cost when presented with media compared to the traditional written format.


    In two sections of my advanced cognitive psychology laboratory course (and following a brief review lecture on the topic of IVs and DVs), students were presented with traditional written scenarios, video clips, and audio clips and had to identify IVs and DVs. Students were assessed multiple times: immediately following the IV/DV review lecture (Time 1), during the second to last week of class (Time 2), and on the very last day of class (Time 3; here, I presented previously encountered scenarios to measure retention; however, this timepoint resulted in ceiling effects and was therefore difficult to analyze). At the end of the class, I also asked students (anonymously) some qualitative questions to obtain their perceptions of the three question types (e.g., which of the three they perceived as easier).

    Results and Discussion

    After adjusting for final course grade, I was reassured to find that students improved over the course of the semester, F(2, 252) = 50.87, p < .001, ηp² = .288. Student performance on the three formats also differed, F(2, 252) = 4.01, p = .019, ηp² = .031: students answered the traditional written scenarios more accurately than the Radiolab questions (Mwritten = 78%, MRadiolab = 68%, p = .005), but performance on the written scenarios did not differ from the MythBusters questions (p = .128). What is perhaps even more interesting is that students perceived all three to be of similar difficulty, but indicated a preference for the MythBusters clips over the Radiolab audio clips. In addition, many students provided unsolicited feedback about how much “fun” the video and audio clips were and that these allowed them to finally “get” IV manipulation and DV measurement.
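    For readers who want to interpret these effect sizes, partial eta squared can be recovered from a reported F statistic and its degrees of freedom using the standard identity ηp² = (F × df1) / (F × df1 + df2). A minimal Python sketch (the numbers are the F values reported above; this is a reader's arithmetic check, not part of the original analysis):

    ```python
    def partial_eta_squared(f: float, df1: int, df2: int) -> float:
        """Recover partial eta squared from an F statistic and its dfs."""
        return (f * df1) / (f * df1 + df2)

    # Improvement over the semester: F(2, 252) = 50.87
    print(round(partial_eta_squared(50.87, 2, 252), 3))  # 0.288

    # Difference among the three formats: F(2, 252) = 4.01
    print(round(partial_eta_squared(4.01, 2, 252), 3))   # 0.031
    ```

    Both values match the reported effect sizes, a large effect for improvement over time and a small effect for format.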

    So, does showing students video and audio clips actually benefit learning or performance on assessments? My experience with this activity is particularly interesting because it taught me that using media or multimedia for classroom assessment may not necessarily lead to better understanding, even though students expressed a preference for these formats. Student preference for these formats does, however, suggest that multimedia can be a valuable tool for instructors because it increases student engagement with course material.


    Some of the factors that instructors should consider when contemplating the use of multimedia for teaching and assessment include:

    Familiarity: The written format is a common way to expose students to IV and DV identification, which they may have encountered in previous courses. It is also the most common assessment method (tests and assignments), so students are familiar with this format from high school. Instructors planning to use multimedia for assessments should give students ample time to practice with those less familiar formats.

    Superfluous information: Both types of media clips contained additional details that were not directly relevant to the experiment. The presence of these extraneous details could distract students (especially those not sufficiently proficient in experimental design and unable to suppress the irrelevant information). Walker and Bourne (1961) found a linear decline in performance on a problem-solving task with each added piece of irrelevant information (also see Mayer, Heiser, & Lonn, 2001, for a more recent investigation).

    Entertainment: Students’ previous experience with MythBusters, Radiolab, or both (or perhaps television and radio more generally) as entertainment may make it difficult to focus on the relevant experimental features of the clips (i.e., IVs and DVs), leading to poorer performance than with the written experimental scenarios.

    Concluding remarks

                Instructors should use caution when implementing new technologies and new teaching strategies. As my recent experience demonstrated, just because students like something doesn’t mean they necessarily learn, perform, or retain it better. Similarly, these new techniques or formats (although interesting for students) may not be appropriate to use during assessments. However, it is encouraging to know that they can lead to increased student engagement (e.g., MythBusters), which can lead to increased learning while in class! Because student engagement is so important, instructors should use many tools to encourage student learning in their discipline, while keeping in mind the considerations outlined above.



    Burkley, E., & Burkley, M. (2009). Mythbusters: A tool for teaching research methods in psychology. Teaching of Psychology, 36(3), 179–184. doi:10.1080/00986280902739586

    Mayer, R. E., Heiser, J., & Lonn, S. (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology, 93(1), 187–198. doi:10.1037/0022-0663.93.1.187

    Nowaczyk, R. H., Santos, L. T., & Patton, C. (1998). Student perception of multimedia in the undergraduate classroom. International Journal of Instructional Media, 25(4), 367–382.

    Renninger, K. A. (1992). Individual interest and development: Implications for theory and practice. In K. A. Renninger, S. Hidi, & A. Krapp (Eds.), The role of interest in learning and development (pp. 361–398). Hillsdale, NJ: Erlbaum.

    Tobias, S. (1994). Interest, prior knowledge and learning. Review of Educational Research, 64(1), 37–54. doi:10.3102/00346543064001037

    Walker, C. M., & Bourne, L. E. (1961). The identification of concepts as a function of amounts of relevant and irrelevant information. The American Journal of Psychology, 74(3), 410–417. doi:10.2307/1419747

    Author bio

    Lynne N. Kennette, Ph.D., is a Professor of psychology and program coordinator for General Arts and Sciences programs at Durham College in Oshawa, Ontario (Canada). She is a graduate of Wayne State University (Detroit, Michigan; M.A. and Ph.D.) and the University of Windsor (Windsor, Ontario; B.A.). She teaches primarily general education courses in introductory psychology, and her research focuses on the scholarship of teaching and learning (SoTL) as well as how the mind processes languages. This research was conducted at Wayne State University.

  • 15 Mar 2017 3:57 PM | Anonymous

    Teaching in the Core Curriculum:
    Re-thinking our Approach to Introductory Psychology Courses

    Amie R. McKibban
    University of Southern Indiana

        “I am losing hope, Amie. Our students are being raised in a political system that is guided by economic theory. How can I teach students the value of higher education when they come to college asking ‘what job is this going to get me and how much money am I going to make?’” This was the start of a very long conversation I recently had with a former colleague. Indeed, his concerns are well founded, as higher education has been at the center of a heated debate for the last several years. Political critics and academic administrators alike have given much attention to the idea that we need more college graduates with specialized skill sets as a way to increase graduates’ employability. Harvard English professor James Engell (n.d.) laments, “an emphasis on majors believed to land a good job… appeal to ‘utility,’ to a supposedly clear-sighted appraisal of what the ‘real’ world demands of college graduates” (para. 2). As Engell further discusses, this prevailing narrative in higher education is in conflict with the reality that few entry-level jobs require four years of specialized knowledge.

        In a recent survey, the American Management Association (2012) found that over half of executives felt their employees scored average, at best, in four areas: critical thinking, communication, collaboration, and creativity. Most of the executives surveyed agreed that they need “highly skilled employees to keep up with the fast pace of change” in business (para. 3). Yes, college graduates do need a specialized skill set, but one that focuses on critical thinking and creativity rather than content-specific knowledge. As Engell (n.d.) points out, even professional schools (e.g., law and medicine) want students who have been exposed to a broad range of knowledge; students who can think critically and “look at life as a whole” (para. 3). In other words, we need to begin reemphasizing the value of a liberal arts education and the utility of the core curriculum. As many of us in higher education know, the goal of a liberal arts education is not specialized knowledge or training. Rather, a liberal arts education aims to prepare students to function as productive citizens in a diverse and complex world (Task Force on General Education, 2007). Core curricula at many institutions embrace the same philosophy. This is often asserted in declarations similar to my own institution’s, stating that the core curriculum embraces non-specialized and non-vocational learning, with an emphasis on critical thinking (the ability to analyze and evaluate information) and information processing (the ability to locate, gather, and process information).

        With this in mind, I argue that what the “real” world actually demands of our students is at the very heart of the core curriculum: a curriculum that prepares students for citizenry and productivity, regardless of major. Further, I propose that teaching Introductory Psychology from a core curriculum perspective is a step toward addressing the disconnect Engell so eloquently discusses. Although numerous instructors may currently approach the teaching of Introductory Psychology as a core curriculum class, there are just as many who take a content-based approach; that is, they structure the class with the goal of preparing students to succeed in subsequent psychology courses should they declare a major in psychology. For those of you who fall into this latter category, I encourage you to reconsider the guiding philosophy of the course. In the remainder of this essay, I offer steps (points of consideration) in restructuring the course, and reflect on my own personal experience teaching the class for 13 years, providing insights and examples to help guide you through these considerations. I strongly believe in academic freedom, and therefore these should be taken as general guidelines. You know your students, community, and state requirements best; hence, the content of your actual class should be tailored accordingly.

    Step 1: Develop a course that reaches the majority.

        Although many of us would prefer to receive “graduate-school-bound” students in our classrooms, the reality of teaching is that many students who cross our paths will discontinue their formal educational pursuits after obtaining a bachelor’s degree. Others discontinue before completion of their degree. The majority of students will need to be prepared, as well as possible, for the realities of the working world. A core curriculum approach best meets this reality; I structure my Introductory Psychology course accordingly. Much of my course’s focus is on application of the material to the real world (i.e., making the connections between theory and example) rather than memorization of content. I achieve this largely by telling stories, giving personal anecdotes, discussing clips from popular television shows, and analyzing articles in local and national newspapers.

        My approach is based on fulfilling two tenets of the core curriculum: critical thinking and information processing. Using content from the text to critically evaluate a news article, for example, reinforces the importance of a broad knowledge base for the students. It also models creativity, one of the four skill sets discussed by the American Management Association (2012). By making the course material relevant to their lives, students are better equipped (and more motivated) to actively engage with the content. As one student recently wrote in my evaluations, “Many of the personal anecdotes and stories that were used to help teach the concepts will be with me for a long time.” The point is this: what you do with the content is much more memorable and meaningful than the content itself. This notion brings me to the second step in re-thinking Introductory Psychology as a core curriculum class.

    Step 2: Choose content for your course based on usability.

        Oftentimes we feel pressure to cover as much material as possible. This makes sense if you are preparing students for the AP test in psychology or if the only students required to take Introductory Psychology at your institution are psychology majors. For many of us, however, this course is part of a larger curriculum, and many students (especially freshmen and sophomores) will filter through our classrooms. As such, I argue that it is not the quantity of information we cover that is important, but the quality. Cut content for the sake of experience. Although this may cause some of you to cringe, I offer this: there are many terms, definitions, and facts that we forget along the way (really, how many of you can remember everything from your introductory political science course?); however, we remember the process. That is, our students may not remember the difference between a conditioned and unconditioned stimulus, but if we make the content experiential they will remember the process of classical conditioning.

        Given that many Introductory Psychology students will not become psychology majors, you should choose content by asking yourself “if this were the only course my students took, what would I want them to understand?” That is, what material (theories and concepts) will help students become more productive citizens? What do you feel is most important for them to understand and use in their everyday lives? In other words, what processes are important? For example, I always cover judgmental heuristics when discussing cognitive psychology, using current events in politics and recent findings in medicine. Indeed, understanding how humans make decisions is important for making sound decisions and discovering creative solutions. It is also an important process in becoming a knowledgeable consumer of information and services. Which processes you feel are important to achieving the goals of the core curriculum is up to you. Choose them, and spend time on them in class. The students will remember these things. As a former student recently told me, “Every time I watch the news or read an article on Facebook, I can’t help but think of you and everything we learned in class. I find myself exclaiming ‘Darn it, McKibban!’ all of the time.”

    Step 3: Seek continual feedback from your students.

        Structuring your course in a way that promotes skill development rather than content-specific knowledge (application rather than memorization) requires continual feedback from your students. Waiting for the results of your teacher evaluations is not sufficient. I have found that having someone outside of my department come in for 20 minutes and run a focus group (while I am not there) results in the best feedback. With whatever approach works for you, ask your students, in an anonymous format, what they find effective about your teaching style, what content they have found most applicable and why, what is working for them and what is not. Tailor the questions to the individual class and discuss the results the next class period. This can be done once or twice during the semester. Students will have suggestions, as well as good insights. The one “golden rule” of implementing this feedback is that you do make changes, when reasonable.

        This idea of a continual feedback loop is not only mutually beneficial but also speaks to the goals of the core curriculum. It gives your students decision-making power over their education and provides them with experience in collaborating with an expert in the field when making those decisions. If we are to prepare students for the demands of the world, effectively communicating with others is a skill they must develop, especially when those “others” are people in higher positions. Again, this process is important in developing a course that promotes critical thinking and assists in the development of communication, collaboration, and creativity. Not to mention, you will learn just as much from this process as your students.

    Concluding Remarks

        The steps I have offered are meant to give you a framework for reconsidering the guiding philosophy of Introductory Psychology course development. Given the nature and breadth of the course, we have the unique opportunity to prepare our students for citizenry and productivity; for the challenge of seeing the world as a whole; and for a lifetime of critical thinking and reflection. I encourage you to ask, given your academic environment and situation, whether your students would benefit from a focus on quality over quantity. I challenge all of us to find the best way possible to meet the needs of our introductory students, knowing that many of them may not finish college, or will complete a degree outside of the field of psychology. I ask you to tell your students that “whether or not you stay in college and no matter what major you ultimately choose, I promise that you will use the information learned in this class,” and then live up to that promise. After all, psychology in and of itself embraces the philosophy of a liberal arts education and the goals of a core curriculum, and what better class to demonstrate this with than Introductory Psychology? What better way can we tell students “this is the value of higher education”? I think that those of us who teach this class can relate to Engell’s (n.d.) statement that “the aims [of a liberal arts focus] are at once personal and social, private and public, economic, ethical, and intellectual” (para. 9).


    American Management Association (2012). Executive summary: AMA 2012 critical skills survey. Retrieved from http://www.amanet.org/uploaded/2012-Critical-Skills-Survey.pdf

    Engell, J. (n.d.). Professor of English James Engell offers his reflection on the value of a liberal arts education. Retrieved from http://www.admissions.college.harvard.edu/about/learning/liberal_arts.html

    Task Force on General Education. (2007). The value of a liberal arts education. Retrieved from http://www.admissions.college.harvard.edu/about/learning/liberal_arts.html

    Amie R. McKibban, professor of psychology at the University of Southern Indiana, completed her PhD in community psychology in 2009. She has presented numerous papers and published in diverse areas, including attitudes toward individuals in the LGBT community, sexual health and communication, happiness, community redevelopment, academic dishonesty, and perfectionism. Using a well-known program in a way that mobilizes allies and allows for solutions at each level, she founded and directs a community- and campus-wide Safe Zone program. In the first few years of her tenure at the University of Southern Indiana, she has received the Willie Effie Thomas award and Phenomenal Women of USI award for her work in social justice, as well as the H. Lee Cooper Core Curriculum award for her excellence in the teaching of psychology.

  • 04 Mar 2017 5:29 PM | Anonymous

    Four Simple Strategies from Cognitive Psychology for the Classroom


    Megan A. Smith (Rhode Island College)

    Christopher R. Madan (Boston College)

    Yana Weinstein (University of Massachusetts Lowell)


    Scientists focusing on educational research questions have a great deal of information that can be utilized in the classroom. However, there is often no bidirectional communication between researchers and practitioners in the field of education as a whole (see Roediger, 2013). In this article, we describe the science behind four evidence-based teaching strategies: (1) providing visual examples, (2) teaching students to explain and to do, (3) spaced practice, and (4) frequent quizzing. Below, we provide a concise overview of these strategies and examples of how they can be implemented in the classroom before describing the science behind each strategy:


    1.      Providing visual examples
    • Relevant cognitive concepts: Dual coding
    • Description: Combining pictures with words.
    • Application examples (using social psychology topics):
      • Students can draw examples of factors determining liking or loving. For example, two people who are close vs. far away, two people who are similar vs. different, or a visual depiction of reciprocity
      • Instructors can make sure to provide video depictions of experiments where available to go with verbal descriptions (e.g., Milgram, misattribution of arousal)
    2.      Teaching students to explain and do
    • Relevant cognitive concepts: Elaborative interrogation; Levels of processing; Enactment effect
    • Description: Asking and explaining why a factor or concept is true; asking students to perform an action.
    • Application examples (using social psychology topics):
      • Students can ask and explain what factors contribute to whether one person helps another person.
      • Instructors can provide students with example scenarios of a person in need of help and ask students to describe and explain why they think a passerby may or may not help.
    3.      Spaced practice
    • Relevant cognitive concepts: Spacing; Interleaving; Distributed practice; Optimal lag
    • Description: Creating a study schedule that spreads study activities out over time.
    • Application examples (using social psychology topics):
      • Students can block off time to study for 30 minutes each day rather than only studying right before a test or exam.
      • Instructors can assign online quizzes that interleave questions from various chapters.
    4.      Frequent quizzing
    • Relevant cognitive concepts: Testing effect; Retrieval practice; Retrieval-based learning
    • Description: Bringing learned information to mind from long-term memory.
    • Application examples (using social psychology topics):
      • Students can practice writing out everything they know about a topic, for example conformity, obedience, and bystander effects.
      • Instructors can give frequent low-stakes quizzes in the classroom or online to encourage retrieval practice.


    Instructors can find free teaching materials for each of these strategies on the Learning Scientists website (www.learningscientists.org/downloadable-materials).

    We focus on these strategies because they were highlighted in a recent policy report from the National Council on Teacher Quality (Pomerance, Greenberg, & Walsh, 2016), which identified key teaching strategies based on evidence from the science of learning. The report found that few of the 48 teacher-training textbooks examined cover any of these learning principles well, and that none covered more than two of them (but see Thomas & Goering, 2016). These strategies also reiterate recommendations made in an earlier guide commissioned by the U.S. Department of Education (Pashler, Bain, Bottge, Graesser, Koedinger, McDaniel, & Metcalfe, 2007; also see Dunlosky, Rawson, Marsh, Nathan, & Willingham, 2013). Thus, there seems to be a gap between the research (converging evidence from controlled laboratory studies and classroom studies) and practical use of the strategies in education. While there are in-depth reviews on each of these strategies, here we provide a concise, teacher-ready overview of these strategies and how they could be applied in the classroom.


    1. Providing visual examples

    Learning can be substantially enhanced if verbal information is accompanied by visual examples. This coupling of verbal and visual information is supported by the ‘dual-coding theory’ (Paivio, 1986). This theory attributes the mnemonic benefits of providing visual examples to different cognitive processes associated with processing words and images, or even words that describe concrete ideas. This can be particularly useful when teaching abstract concepts (see Figure 1 for an example, http://www.learningscientists.org/dual-coding-example), as associating concrete and abstract terms can improve memory for the abstract information (Madan, Glaholt, & Caplan, 2010).

    Additionally, there is clear evidence that memory for pictures is superior to memory for words (Paivio & Csapo, 1969, 1973). However, this effect is fundamentally distinct from the notion of “learning styles”, where information to be learned is presented in a learner’s preferred modality. This type of differentiation is not supported by cognitive research (Rohrer & Pashler, 2012) and has often been described as a myth or urban legend (Coffield, Moseley, Hall, & Ecclestone, 2004; Hattie & Yates, 2014; Kirschner & van Merriënboer, 2013). Rather than diagnosing each student’s style and matching instruction for each individual, teachers can couple visual examples with text for all students.


    2. Teaching students to explain and to do

    One of the most effective methods to improve learning of information is to have students engage with the material more ‘deeply’, also known as elaboration (Craik & Lockhart, 1972; also see Lockhart & Craik, 1990). Elaboration has been defined in many ways, but most simply it involves connecting new information to pre-existing knowledge. Perhaps William James said it best: “The art of remembering is the art of thinking [...] our conscious effort should not be so much to impress or retain [knowledge] as to connect it with something already there. The connecting is the thinking; and, if we attend clearly to the connection, the connected thing, will certainly be likely to remain within recall” (James, 1899, p. 143). Two forms of elaboration are readily applicable to classroom learning: having students explain why something is the case, and having students perform actions.

    Elaborative processing can be fostered by having students question the material that they are studying; for instance, by asking them to produce their own explanations for why a fact is true, rather than just presenting them with a complete explanation (Pressley, McDaniel, Turnure, Wood, & Ahmad, 1987). This elaboration technique is flexible enough to work in a variety of different learning situations (e.g., for students working alone or in groups, Kahl & Woloshyn, 1994). However, work on elaborative interrogation outside of the lab is just beginning (Smith, Holliday, & Austin, 2010) and we need stronger evidence from the classroom before we can confidently claim that this technique is helpful (Dunlosky et al., 2013). Another relevant technique is that of self-explanation, where students walk themselves through the steps they take during learning. This technique is helpful both when students engage in it spontaneously (Chi, Bassok, Lewis, Reimann, & Glaser, 1989), and also when teachers prompt students to produce the self-explanations (Chi, De Leeuw, Chiu, & LaVancher, 1994).

    When feasible, the most elaborative way to process information is by ‘doing’. When information can be learned by hearing about an action, watching someone else perform the action, or performing the action oneself, retention is best when the student performs the action themselves (Cohen, 1981; Engelkamp & Cohen, 1991). This action component can build upon the previously described dual-coding theory (Engelkamp & Zimmer, 1984; Madan & Singhal, 2012). In the classroom, this type of learning could be supported by hands-on activities (e.g., science experiments, or getting students to draw their own diagrams; Wammes et al., 2016) or field trips to museums or nature sites.


    Read Part II at: http://teachpsych.org/E-xcellence-in-Teaching-Blog/4648286

  • 04 Mar 2017 5:24 PM | Anonymous

    3. Spaced practice

    We often tell our students that cramming “doesn’t work”. That is good advice, but it is not entirely true. As many students have discovered, “cramming” (an intense study period that occurs shortly before one’s memory is to be tested) sometimes does work. Cramming often produces adequate performance on an imminent exam (Roediger & Karpicke, 2006), unless the cramming is done instead of sleep, in which case the sleep deprivation outweighs any gains from cramming (Gillen-O’Neel, Huynh, & Fuligni, 2013). The information learned through cramming, however, will subsequently be rapidly forgotten (Bjork & Bjork, 2011). In order for information to be retained more sustainably and over longer periods of time, it needs to be revisited on multiple occasions spaced out over time. This is known as distributed practice, or the spacing effect, which has been in the literature since Ebbinghaus first discovered it in the late 19th century (Ebbinghaus, 1885/1913). Despite much converging evidence over the past 100 years (see Cepeda, Pashler, Vul, Wixted, & Rohrer, 2006), this practice has not made its way into mainstream education (Kang, 2016).

    In the cognitive literature, a distinction is made between spacing and interleaving, i.e., switching back and forth between different topics or question types within a topic (Rohrer & Taylor, 2007). Indeed, Storm, Bjork, and Storm (2010) showed that interleaving produces benefits that cannot entirely be accounted for by spacing. However, in practice, it is hard to imagine an educationally relevant situation in which spacing and interleaving would be dissociated. We propose, then, that the theoretical distinction between spacing and interleaving may not be critical in terms of practical applications. Instead, teachers can focus more generally on trying to provide students with opportunities to space their studying.

    One implementation issue is that spacing hurts performance in the short-term, which makes it less appealing. Students typically feel overconfident when they cram, while spacing out learning leads them to feel relatively less confident (Bjork, 1999); but this is a “desirable difficulty”, which helps learning in the long-term (Bjork, 1994). When making predictions about future performance based on different study schedules, students tend to underestimate the benefits of spacing (Logan, Castel, Haber, & Viehman, 2012). Another reason why spacing might not be used by students as often as we’d like was recently suggested by Kang (2016): this strategy may require more advance planning than simply studying one topic until a saturation point is reached. More research is necessary to fine-tune implementation of spaced study schedules, and would preferably involve teachers in classrooms.


    4. Frequent quizzing

    The use of retrieval practice to aid learning has been a major focus of the applied cognitive literature in the past decade. As with spacing, the finding that testing strengthens memory is not new (Gates, 1917). However, the message that testing helps learning is somewhat politically charged and often lost when teachers hear the word “testing” because this activates ideas related to high-stakes standardized testing. It’s important to note that frequent testing does not have to be presented as a formal quiz; any activity that promotes retrieval of target information should help (e.g., Karpicke, Blunt, Smith, & Karpicke, 2014).

    Although the mechanisms behind the retrieval practice effect are not yet fully understood, the findings are quite clear: when preparing for a test, practicing retrieving information from memory is a much more effective strategy than restudying that information (Roediger & Karpicke, 2006). This is true even when there is no opportunity to receive feedback on the quiz (Smith, Roediger, & Karpicke, 2013), as long as performance on the practice quiz is not too low (Kang, McDermott, & Roediger, 2007). The only notable exception to the retrieval practice effect is when the final test occurs immediately after study, in which case restudying can sometimes be more effective than testing (Smith et al., 2013). However, unless students are reviewing their notes before walking into the exam room, it is quite rare for students to be anticipating an immediate test situation while studying. Thus, in regular exam preparation situations, a strong recommendation can be made from the literature: students ought to practice retrieval.

    A good way to integrate quizzes into regular teaching is to provide opportunities for retrieval practice during learning; quiz questions interspersed during learning produce the same benefit to long-term retention as quiz questions presented at the end of a learning episode such as a lecture (Weinstein, Nunes, & Karpicke, 2016). In addition to providing retrieval practice, this method also boosts learning by maintaining test expectancy throughout the learning experience (Weinstein, Gilmore, Szpunar, & McDermott, 2014). A combined benefit of retrieval practice and spacing can be gained from engaging in retrieval practice multiple times. Creating the specific spacing schedule for a particular educational situation is tricky because it depends on how strong the original memory is, and how quickly that information will be forgotten (Cepeda, Vul, Rohrer, & Wixted, 2008). Without the use of sophisticated software to schedule spacing, a more practical suggestion may be for teachers to include quiz questions from previous topics throughout the semester, in order to facilitate a reasonable amount of spaced practice.




    There is an unending supply of suggestions on how students can learn information more effectively. Here we draw from established cognitive psychology research and distill four simple strategies to enhance classroom learning. These four strategies are: (1) providing visual examples, (2) teaching students to explain and to do, (3) spaced practice, and (4) frequent quizzing. More specifically: (1) Try to present information with both text and pictures; (2) Get students to explain the information they are learning, or if possible, have them act things out; (3) Create opportunities to revisit information over the course of a semester; and (4) Include low-stakes quizzes throughout learning to provide retrieval practice. Critically, each of these strategies is strongly supported by extant research and can be readily implemented in the classroom.




    Bjork, R. A. (1994). Memory and metamemory considerations in the training of human beings. In J. Metcalfe and A. Shimamura (Eds.), Metacognition: Knowing about knowing (pp. 185–205). Cambridge, MA: MIT Press.

    Bjork, R. A. (1999). Assessing our own competence: Heuristics and illusions. In D. Gopher and A. Koriat (Eds.), Attention and Performance XVII. Cognitive regulation of performance: Interaction of theory and application (pp. 435-459). Cambridge, MA: MIT Press.

    Bjork, E. L., & Bjork, R. A. (2011). Making things hard on yourself, but in a good way: Creating desirable difficulties to enhance learning. In M. A. Gernsbacher, R. W. Pew, L. M. Hough, & J. R. Pomerantz (Eds.), Psychology and the real world: Essays illustrating fundamental contributions to society (pp. 56-64). New York: Worth Publishers.

    Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132, 354-380. doi: 10.1037/0033-2909.132.3.354

    Chi, M. T., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13, 145-182. doi: 10.1207/s15516709cog1302_1

    Chi, M. T., De Leeuw, N., Chiu, M. H., & LaVancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive Science, 18, 439-477. doi: 10.1016/0364-0213(94)90016-7

    Coffield, F., Moseley, D., Hall, E., & Ecclestone, K. (2004). Learning styles and pedagogy in post-16 learning: A systematic and critical review. London: Learning & Skills Research Centre.

    Cohen, R. L. (1981). On the generality of some memory laws. Scandinavian Journal of Psychology, 22, 267–281.

    Craik, F. I. M., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671–684.

    Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14, 4-58. doi: 10.1177/1529100612453266

    Ebbinghaus, H. E. (1885/1913). Memory: A contribution to experimental psychology. New York: Teachers College, Columbia University.

    Engelkamp, J., & Cohen, R. L. (1991). Current issues in memory of action events. Psychological Research, 53, 175-182. doi: 10.1007/BF00941384

    Engelkamp, J., & Zimmer, H. D. (1984). Motor programme information as a separable memory unit. Psychological Research, 46, 283–299. doi: 10.1007/BF00308889

    Gates, A. I. (1917). Recitation as a factor in memorizing. New York: The Science Press.

    Gillen-O’Neel, C., Huynh, V. W., & Fuligni, A. J. (2013). To study or to sleep? The academic costs of extra studying at the expense of sleep. Child Development, 84, 133-142. doi: 10.1111/j.1467-8624.2012.01834.x

    Hattie, J., & Yates, G. (2014). Visible learning and the science of how we learn. New York: Routledge.

    James, W. (1899). Talks to teachers on psychology: And to students on some of life's ideals. New York: Henry Holt and Company. Accessible from https://ebooks.adelaide.edu.au/j/james/william/talks/.

    Kahl, B., & Woloshyn, V. E. (1994). Using elaborative interrogation to facilitate acquisition of factual information in cooperative learning settings: One good strategy deserves another. Applied Cognitive Psychology, 8, 465-478. doi: 10.1002/acp.2350080505

    Kang, S. H. (2016). Spaced repetition promotes efficient and effective learning: Policy implications for instruction. Policy Insights from the Behavioral and Brain Sciences, 3, 12-19. doi: 10.1177/2372732215624708

    Kang, S. H., McDermott, K. B., & Roediger III, H. L. (2007). Test format and corrective feedback modify the effect of testing on long-term retention. European Journal of Cognitive Psychology, 19, 528-558. doi: 10.1080/09541440601056620

    Karpicke, J. D., Blunt, J. R., Smith, M. A., & Karpicke, S. S. (2014). Retrieval-based learning: The need for guided retrieval in elementary school children. Journal of Applied Research in Memory and Cognition, 3, 198-206. doi:10.1016/j.jarmac.2014.07.008

    Kirschner, P. A., & van Merriënboer, J. J. (2013). Do learners really know best? Urban legends in education. Educational Psychologist, 48, 169-183. doi: 10.1080/00461520.2013.804395

    Lockhart, R. S., & Craik, F. I. M. (1990). Levels of processing: A retrospective commentary on a framework for memory research. Canadian Journal of Psychology, 44, 87–112. doi: 10.1037/h0084237

    Logan, J. M., Castel, A. D., Haber, S., & Viehman, E. J. (2012). Metacognition and the spacing effect: the role of repetition, feedback, and instruction on judgments of learning for massed and spaced rehearsal. Metacognition and Learning, 7, 175-195. doi: 10.1007/s11409-012-9090-3

    Madan, C. R., Glaholt, M. G., & Caplan, J. B. (2010). The influence of item properties on association-memory. Journal of Memory and Language, 63, 46-63. doi:10.1016/j.jml.2010.03.001

    Madan, C. R., & Singhal, A. (2012). Using actions to enhance memory: Effects of enactment, gestures, and exercise on human memory. Frontiers in Psychology, 3, 507. doi:10.3389/fpsyg.2012.00507

    Moscovitch, & H. L. Roediger (Eds.), Perspectives on human memory and cognitive aging: Essays in honour of Fergus I. M. Craik (pp. 28-47). Philadelphia: Psychology Press.

    Paivio, A. (1986). Mental representations: A dual coding approach. New York: Oxford University Press.

    Paivio, A., & Csapo, K. (1969). Concrete image and verbal memory codes. Journal of Experimental Psychology, 80, 279-285. doi: 10.1037/h0027273

    Paivio, A., & Csapo, K. (1973). Picture superiority in free recall: Imagery or dual coding? Cognitive Psychology, 5, 176-206. doi: 10.1016/0010-0285(73)90032-7

    Pashler, H., Bain, P. M., Bottge, B. A., Graesser, A., Koedinger, K., McDaniel, M., & Metcalfe, J. (2007). Organizing instruction and study to improve student learning (NCER 2007-2004). Washington, DC: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ncer.ed.gov.

    Pomerance, L., Greenberg, J., and Walsh, K. (January 2016). Learning About Learning: What Every New Teacher Needs to Know. Washington, D.C.: National Council on Teacher Quality. Retrieved from http://www.nctq.org/dmsView/Learning_About_Learning_Report.

    Pressley, M., McDaniel, M. A., Turnure, J. E., Wood, E., & Ahmad, M. (1987). Generation and precision of elaboration: Effects on intentional and incidental learning. Journal of Experimental Psychology: Learning, Memory, and Cognition, 13(2), 291-300. doi: 10.1037/0278-7393.13.2.291

    Roediger, H. L. (2013). Applying cognitive psychology to education: Translational educational science. Psychological Science in the Public Interest, 14,1-3. doi: 10.1177/1529100612454415

    Roediger, H. L., & Gallo, D. A. (2002). Levels of processing: Some unanswered questions. In M. Naveh-Benjamin, M. Moscovitch, & H. L. Roediger (Eds.), Perspectives on human memory and cognitive aging: Essays in honour of Fergus I. M. Craik (pp. 28-47). Philadelphia: Psychology Press.

    Roediger, H. L., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17, 249-255. doi: 10.1111/j.1467-9280.2006.01693.x

    Smith, B. L., Holliday, W. G., & Austin, H. W. (2010). Students' comprehension of science textbooks using a questionbased reading strategy. Journal of Research in Science Teaching, 47, 363-379. doi: 10.1002/tea.20378

    Smith, M. A., Roediger, H. L., & Karpicke, J. D. (2013). Covert retrieval practice benefits retention as much as overt retrieval practice. Journal of Experimental Psychology: Learning, Memory, and Cognition, 39, 1712-1725. doi: 10.1037/a0033569

    Storm, B. C., Bjork, R. A., & Storm, J. C. (2010). Optimizing retrieval as a learning event: When and why expanding retrieval practice enhances long-term retention. Memory & Cognition, 38, 244-253. doi: 10.3758/MC.38.2.244

    Rohrer, D., & Pashler, H. (2012). Learning styles: Where's the evidence? Medical Education, 46, 34-35. doi: 10.1111/j.1365-2923.2012.04273.x

    Rohrer, D., & Taylor, K. (2007). The shuffling of mathematics practice problems improves learning. Instructional Science, 35, 481-498. doi: 10.1007/s11251-007-9015-8

    Thomas, P. L., & Goering, C. Z. (2016, March). Review of learning about learning: What every new teacher needs to know. Retrieved from http://nepc.colorado.edu/thinktank/review-teacher-education

    Wammes, J. D., Meade, M. E., & Fernandes, M. A. (2015). The drawing effect: Evidence for reliable and robust memory benefits in free recall. Quarterly Journal of Experimental Psychology, 69, 1752-1776. doi: 10.1080/17470218.2015.1094494

    Weinstein, Y., Gilmore, A. W., Szpunar, K. K., & McDermott, K. B. (2014). The role of test expectancy in the build-up of proactive interference in long-term memory. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40, 1039-1048. doi: 10.1037/a0036164

    Weinstein, Y., Nunes, L. D., & Karpicke, J. D. (2016). On the placement of practice questions during study. Journal of Experimental Psychology: Applied, 22, 72-84. doi: 10.1037/xap0000071

    Megan Smith is an Assistant Professor at Rhode Island College. She received her Master’s in Experimental Psychology at Washington University in St. Louis and her PhD in Cognitive Psychology from Purdue University. Megan’s area of expertise is in human learning and memory, and applying the science of learning in educational contexts. Megan is passionate about bridging the gap between research and practice in education. In an effort to promote more conversations between researchers and practitioners, she co-founded The Learning Scientists (www.learningscientists.org). Her research program focuses on retrieval-based learning strategies, and the way activities promoting retrieval can improve meaningful learning in the classroom. Megan addresses empirical questions such as: What retrieval practice formats promote student learning? What retrieval practice activities work well for different types of learners? And, why does retrieval increase learning?


    Christopher Madan is a Postdoctoral Fellow at Boston College. He received his PhD in Psychology from the University of Alberta. Chris’ area of expertise is in human memory and decision making, particularly in factors that can make some information more memorable. He studies the role of factors intrinsic to the to-be-remembered information, such as emotion and reward, as well as mnemonic strategies, particularly the Method of Loci. His research program is particularly interested in how biases in memory encoding and retrieval can manifest in other cognitive domains. Chris uses a variety of methodological approaches, including cognitive psychology, neuroimaging, and computational modeling to investigate ‘what makes memories last’.


    Yana Weinstein is an Assistant Professor at University of Massachusetts, Lowell. She received her PhD in Psychology from University College London and had 4 years of postdoctoral training at Washington University in St. Louis. The broad goal of her research is to help students make the most of their academic experience. Yana's research interests lie in improving the accuracy of memory performance and the judgments students make about their cognitive functions. Yana tries to pose questions that have direct applied relevance, such as: How can we help students choose optimal study strategies? Why are test scores sometimes so surprising to students? And how does retrieval practice help students learn? She recently co-founded The Learning Scientists (www.learningscientists.org) with Megan Smith.



  • 15 Feb 2017 7:19 PM | Anonymous

    Enhancing Student Learning with Podcasting and Screencasting

    David B. Miller, Ph.D.
    Professor, Department of Psychology
    University of Connecticut

        Portable devices for media consumption became prominent in the 1950s and 1960s with the growing popularity of the transistor radio (Schiffer, 1991). Since then, there has been a cultural shift fostered by the invention of newer technologies such as the Sony Walkman in the 1980s, and in the current century, the Apple iPod and similar personal listening devices. A vast ecosystem of accessories that facilitate portability has co-evolved with these technologies (Darlin, 2006). While these devices were originally intended for listening to musical recordings, other media such as books, newspapers, magazines, movies, and podcasts have since gained popularity in the portable media market.

        Podcasts are digital recordings that can be downloaded from the Internet or from another source, such as Apple’s iTunes Store, from which they are also available for subscription, usually at no cost. Once downloaded, they can be accessed directly on a computer or transferred to a portable digital media player, such as an iPod, iPhone, or any other mobile device capable of playing audio files. (Despite the name “podcast,” one does not need an Apple iPod to use these recordings.)

        When podcasts were first introduced around 2004, they were audio recordings. While audio has remained the primary format, other formats have since emerged. An “enhanced” podcast contains not only audio but also a visual component, typically a series of static (i.e., non-animated) Microsoft PowerPoint or Apple Keynote screens. Enhanced podcasts also contain a navigation menu: when accessed on a computer via iTunes, a new menu item called “Chapters” appears. Clicking on it reveals a list of chapters, each with a small icon, corresponding to the screens that compose the podcast. Users can navigate to whichever chapter they want to hear, or simply let the podcast play sequentially. Enhanced podcasts can be created in a variety of ways, but the most popular software packages are Apple GarageBand, which comes bundled with every Macintosh computer as part of the iLife software suite, and a shareware package from Humble Daisy called ProfCast (http://www.profcast.com). For non-iTunes users, enhanced podcasts can be saved as .mov files playable on the Internet.
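        The chapter navigation described above rests on chapter markers embedded in the audio file’s metadata. As a rough sketch of the underlying mechanism (this is not the GarageBand or ProfCast workflow, and the chapter titles, timings, and file names are hypothetical), a chapter listing in FFmpeg’s FFMETADATA format can be generated like so:

```python
# Sketch: building a chapter listing in FFmpeg's FFMETADATA format.
# The chapter titles, start times, and episode length are hypothetical.
chapters = [
    ("Introduction", 0),          # (title, start time in seconds)
    ("Classical conditioning", 120),
    ("Operant conditioning", 480),
]
episode_length = 900              # assumed total length in seconds

lines = [";FFMETADATA1"]          # required header line for an FFMETADATA file
for i, (title, start) in enumerate(chapters):
    # Each chapter ends where the next one begins (or at the end of the episode).
    end = chapters[i + 1][1] if i + 1 < len(chapters) else episode_length
    lines += [
        "[CHAPTER]",
        "TIMEBASE=1/1000",        # timestamps below are in milliseconds
        f"START={start * 1000}",
        f"END={end * 1000}",
        f"title={title}",
    ]
metadata = "\n".join(lines)
```

        Saved to a file (say, chapters.txt), the listing can be attached without re-encoding via ffmpeg -i episode.m4a -i chapters.txt -map_metadata 1 -codec copy chaptered.m4a; chapter-aware players can then display it as a navigable menu.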

        Finally, actual video podcasts have become more prevalent. They are best used only when a video component is essential, because video can greatly increase the file size depending on how it is encoded, its dimensions, and other factors. For example, if the university cancels class because of bad weather, I upload a video podcast of that day’s lecture to keep my class on schedule. In this case, video is essential. Some podcasts, such as speeches by notable individuals, are available either as audio-only or as video. The visual aspect is appealing in such cases, but the audio alone can suffice.

    iCube: Issues In Intro   

        I began my first podcast series in the Fall of 2005, in connection with my 315-student General Psychology course. The main component of iCube: Issues In Intro is a weekly discussion of course material that I conduct with a small group of up to 20 students. The discussions, which typically last 40-50 minutes, are primarily student-driven (Sener, 2007). They ask questions and I respond. Nothing is scripted. These casual discussions take place in a seminar room near my office in which I set up eight microphones connected to an audio mixer, which, in turn, is connected to my laptop computer for capturing the audio.

        Students who participate receive no extra credit for doing so. Some students return every week, and others stop by only a few times in the semester. Because I have to identify a time when both I and a seminar room are available, there are usually many students who would like to participate but cannot due to schedule conflicts. I encourage students to send in questions via email if they are unable to attend, and we address those items in the podcast.

        The participants are highly motivated and willing to invest the extra time. Interestingly, the majority are not psychology majors, but many of them become very engaged in the course content via our podcast discussions and end up either switching majors, incorporating psychology as a double major, or pursuing a minor in psychology. As an added benefit, I’ve become the academic advisor of former podcast participants. In large classes, students and professors often have difficulty getting acquainted with one another, but podcasting greatly facilitates the kind of scholarly interactions that might otherwise not occur in large classroom settings. Having podcast participants as my advisees enables me to better serve them, and, of course, there are additional benefits to the students in terms of having at least one professor who can write somewhat detailed letters of recommendation in the years that follow.

        Perhaps most importantly, these weekly discussions provide a means of personalizing the course, making it seem psychologically “smaller.” The large class sessions are lectures with minimal opportunity for discussion; but students who participate in the podcast recordings have an opportunity to interact with me (and me with them) in a relatively informal context. Students who routinely listen to the podcasts also report a sense of having a more personal connection with me and with the student participants. While I prefer lecturing with computerized multimedia in my courses, podcasting provides an important means to incorporate active learning for those students seeking such an opportunity (McLoughlin & Lee, 2007).

        In addition to the weekly discussion, there are two other components of iCube: Precasts and Postcasts. Precasts are short, enhanced podcasts (5-15 minutes long) that I record twice weekly (because I lecture twice each week). They’re intended to provide students with important points that I’ll cover in the next lecture. I also play the Precasts before class begins for students who arrive early, which gives them yet another way of accessing the material and also provides a mechanism for “setting up” the lecture that immediately follows.

        The third component of iCube is the Postcasts, which I create sporadically. Postcasts are content modules that I record to clarify difficult concepts, or items that I feel I didn’t cover clearly in class. In recent years, I have uploaded video screencasts (see below) of full lectures to keep the class on track when school is cancelled.

        iCube is accessible via iTunes for free subscription. As is the case with participating in the recording sessions, listening to the podcasts is entirely optional. I make it available as one of several course enhancements to aid in student learning.

        Every semester, I add items to the University course evaluations to ascertain how many students are listening to iCube and whether they believe that these podcasts help them learn the material. Data gathered over eight semesters, beginning in Fall 2005, indicate that approximately 40% of the class listen more than occasionally to the podcasts. Of that 40%, 76% of the students report that the podcasts enhance their learning; most of the remaining 24% report that the podcasts are only marginally helpful. The reason that most of the non-listeners give for not accessing the podcasts is that they don’t feel they have enough time to do so.
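        To make those percentages concrete, here is the implied headcount, sketched in Python for a single hypothetical class of 315 students (the General Psychology enrollment mentioned earlier). The figures are purely illustrative, since the survey actually spanned eight semesters of varying enrollment:

```python
# Illustrative arithmetic only: applies the reported survey percentages
# to one hypothetical class of 315 students.
class_size = 315
listeners = round(class_size * 0.40)   # ~40% listen more than occasionally
helped = round(listeners * 0.76)       # 76% of listeners report enhanced learning
marginal = listeners - helped          # the rest find the podcasts marginally helpful
```

        Under these assumptions, roughly 126 students would be regular listeners, about 96 of whom would report that the podcasts enhance their learning.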

    Animal Behavior Podcasts

        In the Fall of 2006 (one year after launching iCube), I began a second podcast series for my upper-division Animal Behavior course. This course, which used to have a capacity of 50 students, now has a capacity of 150, and is also taught as a lecture. Among the 150 students, there are typically about 10 who are in the University Honors Program. Honors students at UConn may, with an instructor’s permission, convert a non-Honors course to obtain Honors credit. (Students in the upper-division Honors Scholars Program need 12 Honors credits to graduate with Honors, along with other requirements.)

        My Animal Behavior Podcasts series provides an opportunity to earn Honors credit in this course. It’s based on the iCube discussion model, but Honors students who participate are expected to attend regularly. In these 40-50-minute sessions, we discuss animal behavior course content. Like iCube, these discussions are informal and are distributed on iTunes. In recent years, there have been about 14 Honors students each semester earning Honors credit by participating in these podcasts.

    Interactive Discussion vs. Coursecasts

        In higher education, podcasting gained popularity as a means of recording and distributing entire lectures (what I refer to as “coursecasts”). Lecture recording has been around at least since the invention of affordable, portable cassette tape recorders; today’s coursecasts are simply much easier to distribute because of their digital format. At some universities, any professor can create a coursecast at the flick of a switch in classrooms outfitted with recording equipment. But one wonders whether such ease of recording is matched by forethought about how the recordings actually enhance a course.

        Some professors fear students might skip class if coursecasts are readily available (Young, 2008). To minimize attendance problems, some professors who do coursecasting have developed counter-strategies, such as giving regular in-class assessments, recording only a portion of each lecture, waiting a week or longer before uploading the recordings, or even eliminating coursecasting altogether if attendance drops significantly.

        My own experience at UConn with both General Psychology and Animal Behavior podcasts is that students not only view these podcasts as genuine enhancements over and above the classroom experience, but also that the podcasts help the students understand the material and become further engaged with course content. Nevertheless, coursecasting appears to dominate higher education podcasts (certainly those available via iTunes U).

        Coursecasting can also be helpful on religious holidays when observant students will not be in class, and when weather conditions are not threatening enough to deter some (but not all) commuting students, yet not bad enough to result in cancelled classes. The result is that students who have legitimate reasons for being absent from a particular lecture will still have the opportunity to access the course content.

        A major issue for coursecasting is the inclusion of copyrighted material in these distributed lectures. Materials that may have been used legally in a classroom through the “fair use” provision of the Copyright Law of the United States should not be distributed in downloadable podcasts. Instructors who record and then distribute lectures are legally required to edit out such materials prior to distribution. Unfortunately, some of the automated recording systems installed in lecture halls make this difficult because the files are immediately uploaded to a server. In situations where coursecasts are editable, instructors need to acquire expertise in editing as well as a willingness to devote the time for such post-production following each lecture. Thus, routine coursecasts not only have questionable value as an educational enhancement but also potentially have legal consequences.

        Coursecasts might provide an enhancement if approached differently. For example, instead of recording in-class lectures, the actual course content could be delivered by recordings of the professor for students to access online on a regular basis. Class time could then be used for discussion, clarification, demonstrations, examples and applications that weren’t included in the recorded podcasts, and student presentations. Perhaps a better way to conceptualize the application of such media for classroom use is to use the term “coursecast” in reference to a recording of a live classroom lecture, and “screencast” as a recording intended to substitute for a live lecture, thereby providing a basis for what has come to be known as a “hybrid” or “flipped” course.


        In a sense, a screencast can be viewed as an evolutionary advance relative to podcasts and coursecasts. Screencasts are dynamic in the sense that they are produced by recording all activity on one’s computer screen with added narration, editing the result with post-production tools, and exporting it as video to be uploaded to the Internet for viewing. Software programs such as ScreenFlow (http://telestream.net) and Camtasia Studio (http://techsmith.com) offer powerful but user-friendly interfaces for producing screencasts.

        Screencasts can range from simple tutorials (e.g., instructions to be followed in a laboratory course) to elaborations of points made in class, or even entire lectures and entire courses, as would be the case with a hybrid or flipped course.

        In 2009, I used ScreenFlow to convert my large Animal Behavior lecture course to a hybrid course in which most of the content was delivered online via streaming video. Students were able to access the videos anytime on a password-protected server, and we met once weekly for discussion, questions, and additional course content not covered in the screencasts. The post-production editing tools enabled me to focus students’ attention on particular screen elements, which is not easily done in a live lecture. Additionally, students were able to pause the videos, replay parts if they so desired, and take thorough, high-quality notes.

        The time that it took (well over 400 hours) to produce the screencasts paid off in terms of student engagement in course material and learning. Almost half of the class of 140 students earned course grades of “A,” and not a single student failed the course the first time it was offered in Fall 2009. It’s been offered in this format every Fall since then with similar results.

        What is clear is that technology (podcasts, coursecasts, screencasts, and other innovations), when used properly, can serve as pedagogical enhancements. However, technology should not be used just for the sake of using it, or simply because it happens to be available. Pedagogy must always precede technology.


    Darlin, D. (2006, February 3). The iPod ecosystem. The New York Times, C1.

    McLoughlin, C., & Lee, M. J. W. (2007). Listen and learn: A systematic review of the evidence that podcasting supports learning in higher education. In: C. Montgomerie & J. Seale (Eds.), Proceedings of ED-MEDIA 2007 World Conference on Educational Multimedia, Hypermedia & Telecommunications (pp. 1669-1677). Vancouver, Canada, June 25-29, 2007.

    Schiffer, M. B. (1991). The portable radio in American life. Tucson: The University of Arizona Press.

    Sener, J. (2007). In search of student-generated content in online education. Retrieved February 15, 2013, from http://www.e-mentor.edu.pl/artykul/index/numer/21/id/467

    Young, J. R. (2008). The lectures are recorded, so why go to class? The Chronicle of Higher Education, 54, A1.


    David Miller is a Professor of Psychology, Associate Department Head, and Coordinator of Undergraduate Studies at the University of Connecticut at Storrs.  He received his Ph.D. at the University of Miami in 1973, and his research has focused on animal behavior, both in the field and in the laboratory.  He was a Postdoctoral Fellow at the North Carolina Division of Mental Health, where he did field research on parent-offspring auditory interactions of several avian species.  In 1977, he became an Alexander von Humboldt Fellow at the University of Bielefeld (Germany) in the Department of Ethology and a participant in a nine-month interdisciplinary conference on “Behavioral Development in Animals and Man” at the Center for Interdisciplinary Research. He returned to the North Carolina Division of Mental Health in 1978 as a Research Associate, where he began a long series of studies on alarm call responsivity of mallard ducklings, which continued when he joined the faculty at the University of Connecticut in 1980.  Beginning around 1990, his long-standing interest in the effective use of multimedia in the classroom expanded and has continued to evolve.  He has received several awards for teaching excellence at the University of Connecticut and, in 1989, was the recipient of The National Psi Chi/Florence L. Denmark Faculty Advisor Award “for outstanding contributions to Psi Chi and psychology.”  He received the high honor of University of Connecticut Teaching Fellow (1997–1998), and, in 1999, his work in multimedia instructional design and classroom implementation was recognized with the Chancellor’s Information Technology Award.  In 2005, he received the University of Connecticut Alumni Association Faculty Excellence Award in Teaching at the Undergraduate Level, as well as the 2005–2006 University of Connecticut Undergraduate Student Government Educator of the Year Award.  
In 2007, he received the University of Connecticut Outstanding Student Advisement and Advocacy Award, and his efforts in podcasting were recognized by the national publication, Campus Technology, which awarded him the 2007 Outstanding Innovator Award in Podcasting.  In 2011, he received the Frank Costin Memorial Award from the National Institute on the Teaching of Psychology for promoting quality teaching methods, as illustrated in a poster on screencasting, and, in 2012, the Animal Behavior Society Distinguished Teaching Award.  He has served on several editorial boards and was Editor-in-Chief of the scholarly journal Bird Behavior for 15 years.  In recent years, Dr. Miller has devoted considerable time to creating computerized, multimedia versions of his animal behavior and introductory psychology courses.  Multimedia production of university-level educational material is one of his foremost activities.  His most recent multimedia project involved a major transformation of his Animal Behavior course into 90 screencast movies, an effort that was also featured in Campus Technology magazine.
  • 02 Feb 2017 9:07 AM | Anonymous

    Ditching the “Disposable Assignment” in Favor of Open Pedagogy

    Rajiv S. Jhangiani

    Kwantlen Polytechnic University

    Ever since George Miller’s (1969) famous APA presidential address, many others have called upon our field to “give psychology away” (e.g., Epstein, 2006; Goldman, 2014; Klatzky, 2009; Lilienfeld, Ammirati, & Landfield, 2009; Tomes, 2000; Zimbardo, 2004). There is arguably no better way to achieve this than by adopting open pedagogy to place the knowledge base of our discipline in as many hands as possible.

    With open pedagogy, students are not just consumers of educational resources but also producers of them. A key aspect of open pedagogy therefore involves replacing “disposable assignments” with “renewable assignments” (Wiley, 2013). Disposable assignments are those that are typically seen only by the instructor. Students often see little point in them (and rarely revisit them), and many instructors despise grading them. David Wiley, an open education pioneer, describes them bluntly:

    They’re assignments that add no value to the world – after a student spends three hours creating it, a teacher spends 30 minutes grading it, and then the student throws it away. Not only do these assignments add no value to the world, they actually suck value out of the world. Talk about an incredible waste of time and brain power (and a potentially huge source of cognitive surplus)! (2013, para. 5)

    By contrast, renewable assignments are those in which the students’ energy and efforts are repurposed by having them generate materials and resources for the “commons,” including future students taking their course and other formal and informal learners around the world. The materials produced might include tutorials, wiki entries, or even videos posted online.

    Incorporating openness into pedagogy is simultaneously liberating and terrifying. It challenges instructors to reflect on their practices and move away from the traditional top-down model of pedagogy by assigning open-ended problems and empowering students to act as co-creators (Rosen & Smale, 2015). It takes a degree of courage to untether oneself from the security and predictability of the staid research essay, but once accomplished, the benefits to the learning process are sizable. For one, students and instructors work collaboratively towards creating resources for public consumption, adding tangible value to the world outside of their classroom. Second, students tend to invest more effort and care more deeply about the product when they know that their work has a larger potential audience than just their instructor (Farzan & Kraut, 2013). Third, open pedagogy unleashes the students’ creative potential, allowing them to ascend the rungs of the cognitive process dimension in Bloom’s revised taxonomy (Anderson & Krathwohl, 2001). Here they generate, plan, and produce instead of merely recognizing and recalling, in the process acquiring higher-order cognitive and metacognitive skills that will serve them throughout their university education and career. Fourth, depending on the specific nature of the assignment, the resource produced may serve as an enduring electronic portfolio of their academic work that can be shared with others, including potential employers. In this fashion they may showcase their writing skills (e.g., blogs, wiki entries, etc.), multimedia skills (e.g., videos, websites, etc.), or even their ability to integrate and apply research findings (e.g., policy proposals or briefs).
And finally, “because any one of these remixes might end up helping next semester’s students finally grasp the concept that has proven so difficult in the past, faculty are willing to invest in feedback and encouragement at a different level” (Wiley, 2013, para. 16).

    Instructors interested in experimenting with open pedagogy might, for example, design course assignments that require students to create a guide for parents on the use of rewards and punishments with young children based on principles from learning theory, design a public service announcement for a local nonprofit organization based on principles from social psychology, build and edit a wiki that might serve as an instructional resource for future students, write questions for an in-class practice quiz ahead of midterm examinations, or publish blog posts that critically analyze depictions of psychological phenomena in popular films. On a larger scale, an excellent example of an organized open pedagogy initiative is the Association for Psychological Science’s (APS) Wikipedia Initiative.

    APS Wikipedia Initiative

    Wikipedia is a free, online encyclopedia, written and edited collaboratively by those who use it. Its English language edition includes about 4.7 million articles and is the sixth most popular website in the world, with nearly 500 million unique visitors every month (“Wikipedia,” n.d.). Its incredible popularity among students, for whom it is often the first resource accessed when looking up background information for a term paper (Head & Eisenberg, 2009; Lim, 2009), is matched only by its equal unpopularity among faculty, who strongly caution against citing its articles or even penalize their students for doing so (Waters, 2007). Some instructors may work with librarians to better instruct their students on how (and why) to access refereed articles from research databases, but this strategy is merely a weak left jab at the problem. The APS Wikipedia Initiative (APSWI), on the other hand, presents a creative and pragmatic right hook.

    Born out of a desire to “deploy the power of Wikipedia to represent scientific psychology as fully and as accurately as possible and thereby to promote the free teaching of psychology worldwide” (“APS Wikipedia Initiative,” n.d.), the APSWI serves to improve the very resource whose use psychology faculty routinely rail against.

    For context, there are currently more than 8,500 articles on Wikipedia devoted to topics in psychology. At the time of this writing, only 63% of these have been assessed through Wikipedia’s peer assessment system. Far more terrifyingly, only 9% of these have achieved “good article” status while the remaining lower quality articles are viewed in excess of 64,000 times every six months (“APS Wikipedia Initiative,” n.d.).

    These sorts of numbers are why, in 2011, then-APS President Mahzarin Banaji called upon psychology faculty to participate in the APSWI as contributors and reviewers, and especially by adopting open pedagogy:

    The likely most effective way to generate contributions, in my opinion, is to include writing for Wikipedia as part of college and graduate-level courses. In this way, professors and students in a class can begin to populate Wikipedia on the topic of the course, taking advantage of the built-in expertise that is contained in that collective, in a semester long time frame. Writing Wikipedia entries from scratch, editing entries, or evaluating them can be a worthwhile learning experience in a standard classroom. Such work can teach students so much — that even the simplest ideas are hard to communicate to general audiences; that logic, strength of argument, flow and clarity of writing, citations of the appropriate literature, and, above all, accuracy need to be mastered in order to be a member of this guild. My request is that for any course that you are about to teach this semester and beyond, that you consider adding contribution to Wikipedia as part of the course’s requirements. (para. 8)

    Many faculty have since responded to Banaji’s call. During the Fall 2011 and Spring 2012 semesters alone, 640 students across 36 classes participated in the APSWI. Collectively, they edited 840 articles – “the rough equivalent of writing a 1,200 page textbook in psychology” (Farzan & Kraut, 2013, p. 5). Participating instructors have ranged from those completely new to Wikipedia (e.g., Hoetger & Bornstein, 2012) to those with extensive experience (e.g., Marentette, 2014), and the classes enrolled have ranged from small seminars (e.g., Karney, 2012) to an enormous 1,700-student section (Joordens, 2012). The APSWI has also been incorporated into courses at all levels, displacing a research paper in an introductory psychology course (Ibrahim, 2012), a literature review in a 200-level cognitive psychology course (Munger, 2012), a research article review in an upper level course on memory (Hoetger & Bornstein, 2012), an essay for a fourth-year course on the history of psychology (Reynolds, 2011), a 15-page paper in a graduate seminar in social psychology (Karney, 2012), and a traditional final paper in a graduate course on clinical neuropsychology (Silton, 2012).

    Naturally, appropriate instruction and support must be provided, and the specific assignment (e.g., adding citations, writing or revising articles, or working toward “good article” status, which the Wikipedia community grants on the basis of the quality of writing, neutrality, and appropriate sourcing) must be tailored to the level and ability of the class. For example, introductory psychology students might be best served by working in teams and focusing their efforts on a small number of articles, adding citations, images, and links where necessary, tagging them appropriately when problems are located, and incorporating feedback from their peers and the Wikipedia community. The potential benefits to students from participating in the APSWI include achieving a deeper understanding of the topic (Farzan & Kraut, 2013), learning to evaluate and defend the credibility of their sources (Marentette, 2014), learning to write more concisely and think more critically (Farzan & Kraut, 2013), collaborating with students from other universities and around the world (Karney, 2012), learning to provide as well as receive constructive feedback (Ibrahim, 2012), enhancing digital literacy (Silton, 2012), and learning how to communicate ideas to a general audience (Association for Psychological Science, 2013).

    Although some students begin a little wary of the assignment, they go on to derive excitement, meaning, and even pride from the open nature of their work, as the following instructor testimonials indicate:

    The students also realized they were a valuable asset to Wikipedia. Their thinking and writing skills as well as their access to an extensive academic library were not broadly shared. As knowledge translators, they could also provide a service to the general public by clearly communicating basic concepts about language acquisition. They wondered who their readers might be: parents? teachers? students in developing countries? One thing that the students uniformly loved about this project was the possibility of other people seeing and recognizing their work. (Marentette, 2014, p. 37)

    They felt their work was meaningful because their contributions are shared with the entire world, rather than just their instructor. They liked that their contributions will not end up in a drawer after the semester ends, but will continue to be available to many people as a useful resource. Some students even noted with pride that their contributions might have wider use than some articles published in academic journals. (Ibrahim, 2012, p. 29)

    Of course, participating in the APSWI is not without its challenges, which include developing an appropriate rubric for grading (Silton, 2012), learning the writing style and referencing standards of Wikipedia (Reynolds, 2011), managing the time frame of the assignment (Marentette, 2014), and maintaining flexibility with the assignment guidelines (Hoetger & Bornstein, 2012). Some practical strategies for instructors considering participating in the APSWI include providing a list of topics not yet covered on Wikipedia, gaining experience with posting an article, looking through the sample Wikipedia assignments provided by the APS, making use of the many articles and step-by-step guides for editing Wikipedia articles and participating in the APSWI, and enlisting the help of a campus Wikipedia Ambassador (Hoetger & Bornstein, 2012; Ibrahim, 2012).

    Concluding Thoughts

    Adopting open pedagogy can seem daunting at first but does not have to mean designing an entirely new assignment or working with new media. All that is required is for the students to work towards producing a resource that others will find useful. This could include literature reviews, evidence-based policy recommendations, or practical guides for the application of psychological knowledge (e.g., promoting environmentally responsible behavior, parenting, etc.). However, if an assignment requires students to develop and exercise a new skill, instructors will need to plan to provide instruction and support throughout the process (e.g., it takes some practice to learn how to properly edit Wikipedia articles). Depending on the nature of the assignment, instructors may also have to develop or locate an appropriate grading rubric.

    As mentioned earlier, adopting open pedagogy is simultaneously liberating and terrifying. With traditional (closed) assignments, vague guidelines, a poor design, unclear rubrics, and insufficient support remain hidden, with student evaluations and perhaps a few grey hairs being the only enduring record. With open pedagogy, on the other hand, both successes and failures with the assignment are much more public. But while this opens the instructor to more criticism, it is also an opportunity to share, collaborate, and receive constructive feedback. More importantly, it creates a foundation for our students to begin to invest more deeply, think more critically, work more collaboratively, and communicate more accessibly—exactly the skills needed to be able to “give psychology away.”


    References
    Anderson, L. W., & Krathwohl, D. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.

    APS Wikipedia Initiative. (n.d.). Retrieved from http://www.psychologicalscience.org/apswi

    Association for Psychological Science [PsychologicalScience]. (2013, May 23). 2013 APS convention video: The benefits of traditional vs. Wikipedia research assignments [Video file]. Retrieved from https://www.youtube.com/watch?v=6YBdQH0eIEQ&t=66

    Banaji, M. (2011). Harnessing the power of Wikipedia for scientific psychology: A call to action. Observer, 24(2). Retrieved from http://www.psychologicalscience.org/index.php/publications/observer/2011/february-11/harnessing-the-power-of-wikipedia-for-scientific-psychology-a-call-to-action.html

    Epstein, R. (2006). Giving psychology away: A personal journey. Perspectives on Psychological Science, 1(4), 389-400. doi:10.1111/j.1745-6916.2006.00023.x  

    Farzan, R., & Kraut, R. E. (2013). Wikipedia classroom experiment: Bidirectional benefits of students’ engagement in online production communities. CHI'13: Proceedings of the ACM conference on human factors in computing systems (pp. 783-792). New York: ACM Press. doi:10.1145/2470654.2470765

    Goldman, J. G. (2014). Giving psychological science away online. Observer, 27(3), 9-10.

    Head, A. J., & Eisenberg, M. B. (2009, December 1). Lessons learned: How college students seek information in the digital age. Project Information Literacy Progress Report. Retrieved from the Project Information Literacy Website at the University of Washington: http://projectinfolit.org/pdfs/PIL_Fall2009_Year1Report_12_2009.pdf

    Hoetger, L., & Bornstein, B. H. (2012). Enliven students’ assignments with Wikipedia. Observer, 25(4), 44-45.

    Ibrahim, M. (2012). Reflections on Wikipedia in the classroom. Observer, 25(1), 29-30.

    Joordens, S. (2012). Using Wikipedia in a mega classroom: A 1,700 student case study. Wikipedia Symposium.

    Karney, B. (2012). Feedback from the whole world. Observer, 25(3), 45-46.

    Klatzky, R. L. (2009). Giving psychological science away: The role of applications courses. Perspectives on Psychological Science, 4(5), 522-530. doi:10.1111/j.1745-6924.2009.01162.x  

    Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving debiasing away: Can psychological research on correcting cognitive errors promote human welfare? Perspectives on Psychological Science, 4(4), 390-398. doi:10.1111/j.1745-6924.2009.01144.x 

    Marentette, P. (2014). Achieving “good article” status in Wikipedia. Observer, 27(3), 25, 37.

    Munger, M. (2012). Improving students’ writing with Wikipedia. Observer, 25(5), 43-45.

    Reynolds, M. (2011). Wikipedia in the classroom. Observer, 24(7). Retrieved from http://www.psychologicalscience.org/index.php/publications/observer/2011/september-11/wikipedia-in-the-classroom.html

    Rosen, J. R., & Smale, M. A. (2015, January 7). Open digital pedagogy = critical pedagogy. Hybrid Pedagogy. Retrieved from http://www.hybridpedagogy.com/journal/open-digital-pedagogy-critical-pedagogy/

    Silton, R. (2012). More than just a grade. Observer, 25(2). Retrieved from http://www.psychologicalscience.org/index.php/publications/observer/2012/february-12/more-than-just-a-grade.html

    Tomes, H. (2000). Giving psychology away. Monitor on Psychology, 31(6). Retrieved from http://www.apa.org/monitor/jun00/itpi.aspx

    Waters, N. (2007). Why you can’t cite Wikipedia in my class. Communications of the ACM, 50(9), 15-17. doi:10.1145/1284621.1284635

    Wikipedia. (n.d.). In Wikipedia. Retrieved January 14, 2015, from

    Wiley, D. (2013). What is open pedagogy? Retrieved from http://opencontent.org/blog/archives/2975

    Zimbardo, P. G. (2004). Does psychology make a significant difference in our lives? American Psychologist, 59(5), 339-351. doi:10.1037/0003-066X.59.5.339 


    Biographical Sketch

    Dr. Rajiv Jhangiani is the Open Studies Teaching Fellow and Psychology Faculty at Kwantlen Polytechnic University in Vancouver, BC, where he conducts research on open education and the scholarship of teaching and learning. A recipient of the Robert E. Knox Master Teacher Award from the University of British Columbia and the Dean of Arts Teaching Excellence award at KPU, Dr. Jhangiani serves as the Senior Open Education Advocacy & Research Fellow with BCcampus, an Associate Editor of Psychology Learning and Teaching, and a faculty workshop facilitator with the Open Textbook Network. Along with the other members of the STP ECP committee, he recently co-edited the e-book A Compendium of Scales for Use in the Scholarship of Teaching and Learning. His forthcoming book is titled Open: The Philosophy and Practices that are Revolutionizing Education and Science (Ubiquity Press).


  • 18 Jan 2017 1:35 PM | Anonymous


    In Pursuit of Teaching Outcroppings:

    Engaging Students with Emotionally Involving Current Events


    Christie Cathey

    Ozarks Technical Community College

              Most of us can likely remember the one experience that caused us to first fall in love with psychology and made us think, “This is the stuff I want to do forever.”  For me, that experience happened 21 years ago this spring, when I took Ralph McKenna’s Advanced Social Psychology class at Hendrix College.  The thing about that class that really hooked me on the discipline was how enjoyable the research process became for me.  Dr. McKenna encouraged original, creative research designs (he would have nothing to do with canned research projects) and our class meetings were these ridiculously fun and engaging brainstorming sessions.

            Dr. McKenna taught us to look to the world around us for unique opportunities to examine human social behavior, encouraging us to be on the constant lookout for “research outcroppings.”  This term, originally coined by Webb, Campbell, Schwartz, Sechrest, and Grove (1981), is an apt metaphor.  Just as geologic outcroppings, such as the rock faces exposed where highways cut through hillsides, allow us to observe aspects of Earth’s strata that would normally remain hidden from view, a research outcropping results when an atypical event in the world exposes normally hidden aspects of human behavior.  For example, the semester I was enrolled in Advanced Social Psychology happened to coincide with the LA riots (sparked by the acquittal of police officers in the Rodney King case). I remember my classmates and me excitedly considering the possible new vantage points into aspects of social thought and behavior this atypical event may have opened up.

            I’ve always been a fan of this outcropping metaphor, and now, as a teacher of psychology, I like to use what I refer to as “teaching outcroppings.”  These are unexpected or infrequent events that are inherently involving for students and that give us an opportunity to truly engage students by helping them see the immediate application of course concepts to the world around them.  Sometimes these teaching outcroppings are difficult to spot, but other times, they appear without effort.

            Early last September, an unexpected (but, in hindsight, obvious) outcropping revealed itself. I had my Social Psychology class planned out for the entire fall semester and had no intention of making changes. However, one morning, I happened to overhear a campaign ad for a local election playing in the next room, and it really ticked me off.  The ad’s message was in direct opposition to my own values, and it so enraged me that I wondered how I would survive eight more weeks of listening to that garbage.  I then had one of those “when life gives you lemons…” realizations, and it occurred to me what a potentially rich teaching outcropping the 2012 election season might be.  I knew then that I needed to quickly plan a new project for my Social Psychology class to take advantage of this fleeting opportunity. 

            The election season provided the perfect teaching outcropping for four distinct reasons.  First, as we all know, the 2012 elections were particularly contentious and emotionally laden.  I knew that if I could find a good way to get students to relate course concepts to the elections, their existing emotional investment in the elections might translate into heightened engagement in the course.  Second, the sheer relentlessness and omnipresence of the persuasive attempts in the media in those final months of the election meant that students couldn’t escape them and would be forced to think about social psychological concepts between class sessions.  Third, my class was composed of students with diverse political attitudes, and I thought this would be a perfect opportunity to have them work together in small groups to experience diversity and to practice civility. Fourth, it would give them a chance to develop an important research skill: the ability to examine emotionally laden social topics in as unbiased a manner as possible.

                Two weeks later, just as we began our coverage of persuasion in my Social Psychology class, I told my students that they would be working in groups to analyze persuasive tactics used in currently running political television advertisements.  Then, in my most obnoxious infomercial voice, I added, “But wait…there’s more!” and announced that they would also be writing and producing political ads of their own, and that on Election Day they would present and discuss their ads for the class.  I knew I was onto something good when a 64-year-old student in the front row immediately exclaimed, “Oh!  This is going to be fun!”

            Over the next five weeks, the five groups of four students each met frequently outside of class. First, they selected two ads from opponents in the same local, state, or national election and then pinpointed the specific persuasive tactics they believed the campaign teams were using in those ads.  While working on their analyses of existing ads, the groups also worked together to conceive of a fictitious political candidate and to invent details about that candidate’s life and campaign.  I gave students the option of inventing either a third candidate for the same campaign they’d selected for the first part of the assignment or a candidate in an entirely different campaign. Students then chose specific persuasive tactics we had covered in class and used those to produce their own 30-second ad.  I realized that not all students would have video production skills, so I gave them the option of creating either a television or radio ad and told them they could even act out their ad if they really feared technology.  Alas, I underestimated students’ technological adeptness, as no group went with the “Shakespearean option.”    

                On Election Day, I came to class armed with patriotic-themed cupcakes to help calm students’ public speaking jitters, and we began the 15-minute presentations.  Each group first showed videos of the two current ads they’d selected and presented their analyses of the intended persuasive goal and the effectiveness of each.  They then provided details about their fictitious candidate (e.g., age, gender, political affiliation), and about their candidate’s campaign (e.g., Was it early or late in the campaign? Was the candidate ahead or behind according to polls?), and played their original ad for the class.  Finally, the group gave an in-depth analysis of their original ad, including a discussion of the intended audience, the ad’s overall goal, and at least one persuasive tactic employed in the ad.

            Although I was initially nervous about trying out a new, potentially risky project that involved students working closely in groups for an extended period of time, I believe this project was the most successful (and certainly the most fun) I’ve ever used.  The level of work all groups put into the project far exceeded my expectations.  Their analysis of existing ads was sophisticated and thoughtful, and their original ads were creative and, in some cases, enormously entertaining and humorous.  What’s more, the class really loved the project, and despite the fact that several groups were composed of members at opposite ends of the political spectrum, I am happy to report that not only were there no thrown punches, but I witnessed true teamwork, high levels of civility, and the formation of strong bonds within groups of very diverse students.  Finally, the class as a whole was the most engaged and excited about learning I’ve experienced in my 15 years of teaching Social Psychology.  Of course, I can’t be certain that this was a result of the election project and its usefulness as a teaching outcropping, but I strongly suspect that it was.

            This project reinforced my belief in the value of seeking out and exploiting teaching outcroppings.  I fully intend to make use of the 2016 election outcropping, but in the meantime, I have amped up my intentional search for others.  This semester, for example, I simply asked students which current events most grab their attention.  The resounding answer was the debate surrounding gun control in the U.S., so I’m building an assignment that takes advantage of students’ high emotional involvement in that issue.  Regardless of the courses we teach, I believe we can all make use of teaching outcroppings; we need only be insightful enough to recognize them when they occur and flexible enough to change our plans in order to take advantage of them.  By recognizing these fleeting events in the world, we can develop creative coursework that grabs and holds students’ attention and emotionally involves them in their studies.  By doing this, we can not only better engage our students but, in some cases, truly transform a class.    


    References and Suggested Readings


    McKenna, R. J. (1995). The undergraduate researcher’s handbook: Creative experimentation in social psychology. Boston: Allyn and Bacon.

    Webb, E. J., Campbell, D. T., Schwartz, R. D., Sechrest, L., & Grove, J. B. (1981). Nonreactive measures in the social sciences (2nd ed.). Boston: Houghton Mifflin.

    Christie Cathey received her B.A. from Hendrix College and her M.A. and Ph.D. in Social Psychology from the University of Connecticut.  After teaching for nine years at Missouri Southern State University in Joplin, where she was an Associate Professor, she is now Lead Instructor for Introduction to Psychology at Ozarks Technical Community College in Springfield, Missouri.  She was a visiting professor at Tsinghua University in Beijing, China in 2009, and her research interests focus on an application of the Confucian ethical ideal, ren, to pedagogical practices.  She’s passionate about mentoring undergraduate researchers and was an Associate Editor for the Journal of Psychological Inquiry, a student research journal, for six years.

  • 02 Jan 2017 5:58 PM | Anonymous

    From Passive Learner to Active Participant:
    Examining the Effectiveness of Inter-Teaching

    Peter Frost, PhD

    Southern New Hampshire University

            Typically, inter-teaching requires that random pairs of students answer questions involving application, synthesis, and/or critical thinking by teaching each other during a portion of class (Boyce & Hineline, 2002; Saville et al., 2011). Generally, the professor prepares questions for each inter-teaching session. Students are expected to prepare answers to all of the questions, since they usually do not know which question will be used during a particular session. During each session, students are randomly assigned to dyads or triads and spend part (as in our approach) or all of class discussing the question and writing up a response. The professor or student helpers/coaches observe the groups to help correct any misconceptions or to help answer questions through Socratic dialogue. Write-ups of each group’s responses are collected, graded, and typically handed back by the next class meeting. Some versions of inter-teaching also include a peer review process. Many versions of inter-teaching exist; we describe our version in the Method section.
                Regardless of the variation used, inter-teaching is intended to encourage students to take ownership of their learning since they are responsible for contributing to their peer partnership and knowing the material well enough to teach it. The peer review process places additional pressure on students to know information ahead of class.
                The version of inter-teaching we used, adopted with some modification from Carroll (2011), also included the use of online practice quizzes (described in more detail in the Methods section). We designed the quizzes to ensure students knew basic and fundamental concepts ahead of each inter-teaching session, using an approach developed by Daniel and Broida (2004) described below.
            Past studies have shown that courses with inter-teaching lead to higher exam scores (Saville et al., 2011) and greater long-term recognition memory for course concepts (Saville et al., 2014) than traditional lecture courses. We suspected that inter-teaching would also facilitate intrinsic motivation. To test this, we examined whether students using inter-teaching in a section of Cognitive Psychology would find their section more stimulating and worthwhile than students in a traditional lecture section of the same course. As in past studies, we also suspected that students in the inter-teaching section would show evidence of greater learning and retention of course concepts.


    Method

    Participants
            We compared two sections of Cognitive Psychology offered during the Fall 2014 semester. One section (n = 22) was randomly assigned to implement inter-teaching, while the second (n = 24) was taught as a traditional lecture-based course. Both sections were taught by the same professor, covered the same content, and, outside of the inter-teaching sessions, used the same lecture format.

    Materials and Procedures

            We provided a study guide to the inter-teaching section at the beginning of the semester. The guide listed between two and five questions for each of the seven inter-teaching sessions conducted throughout the semester. We informed students that they had to prepare answers to all of the questions for each session, since they would not know which question would be part of a given inter-teaching session. Inter-teaching questions were designed to encourage thought, application, or synthesis.
                Each inter-teaching session ran at the beginning of class for about 15 to 30 minutes, depending on the difficulty of the question. We paired students randomly, and they all received the same question. If the students had questions during the session, the teacher would use Socratic dialogue to help prompt an answer (the answer was never provided). Student pairs handed in a written response to the question based on their discussion.
                We gave the students feedback about their written response by the next class. They were also asked to fill out a peer review assessment survey made available on Blackboard.
                Students in the inter-teaching section also took an online practice quiz designed to help them master the facts needed for the inter-teaching sessions. The parameters of our online practice quizzes (based largely on the approach described by Daniel and Broida, 2004) were as follows:
    • A large number of multiple-choice items were included (40–100).
    • Students could re-take each quiz as often as they wanted until the due date. The highest grade achieved was recorded.
    • Questions were scrambled, as were answer choices.
    • Once logged in, students had to complete the quiz.
    • The quiz was timed.
    • Students could view only one question at a time.
    • Feedback was restricted to ‘correct’ or ‘incorrect’ for each item.
    A Likert-like scale survey was given at the end of the semester to assess student motivation in each section of Cognitive Psychology. The questions took the following form:

                    Did you find that time in class was worthwhile (circle one)?

      Not at all   0%  10%  20%  30%  40%  50%  60%  70%  80%  90%  100%  Absolutely


    Results
            Although there were no main effects of Type of Course (inter-teaching versus traditional) or GPA (upper versus lower 50th percentile), ps > .05, there was a statistically significant Type of Course × GPA interaction, F(1, 42) = 4.23, p = .046. Inter-teaching appears to have improved the exam scores of students in the lower 50th percentile of GPA, but not those of students in the upper 50th percentile.
            Across all questions about student engagement, average Likert-like scale responses were higher for the inter-teaching section than for the lecture section. Participants found that time in class was more worthwhile in the inter-teaching course (M = 82.02, SD = 5.23) than in the standard course (M = 55.45, SD = 9.04), rated the inter-teaching course as more intellectually stimulating (M = 76.25, SD = 7.00) than the standard course (M = 61.90, SD = 9.61), and indicated wanting to learn more in the inter-teaching course (M = 78.66, SD = 10.01) than in the standard course (M = 65.72, SD = 11.63).  The average overall score on the student engagement survey was higher for the inter-teaching section than for the lecture section, t(44) = 15.52, p = .02. Attendance was also higher for the inter-teaching section (94% on average) than for the lecture section (86%).


    Discussion
            Our findings replicate those of other studies showing that inter-teaching methods and online practice quizzes can help improve exam scores (Daniel & Broida, 2004; Saville et al., 2011; Saville et al., 2014), but our results suggest the benefit is exclusive to students with lower GPAs. Inter-teaching did not improve test scores for higher-performing students, perhaps because their scores were closer to ceiling from the start of the semester.
                As predicted by our hypothesis, inter-teaching led to evidence of enhanced intrinsic motivation as shown by higher ratings (relative to an exclusively lecture-based course) associated with viewing the course as worthwhile and intellectually stimulating, rating the instructional method as helpful, and wanting to learn more. Moreover, inter-teaching was also associated with greater attendance.
            There are many variants of the inter-teaching method. For example, peer evaluation can either be figured into the grade or not (we did not include peer evaluation ratings as part of the grade), and some teachers devote significant class time to inter-teaching activities (we held only seven inter-teaching sessions over the semester, each lasting 15 to 30 minutes). Inter-teaching is versatile enough to be adapted to course needs.
            Our inter-teaching approach had several potentially beneficial aspects, but a limitation of our study is that we did not determine the extent to which each aspect benefited learning and motivation. Future research should analyze how different aspects and variations contribute to the effectiveness of the inter-teaching method. Given its effectiveness, both in improving academic performance for lower-performing students and in fostering motivation in all students, inter-teaching is well worth this further investigation.


    Boyce, T. E., & Hineline, P. N. (2002). Interteaching: A strategy for enhancing the user-friendliness of behavioral arrangements in the college classroom. The Behavior Analyst, 25, 215–226.

    Carroll, D. (2011, October). Development, application and evaluation of an 'inter-teaching' approach to learning. Paper presented at the meeting of the Northeast Conference for Teachers of Psychology, Fairfield, CT.

    Daniel, D. B., & Broida, J. (2004). Using web-based quizzing to improve exam performance: Lessons learned. Teaching of Psychology, 31(3), 207-208.

    Saville, B. K., Bureau, A., Eckenrode, C., Fullerton, A., Herbert, R., Maley, M., Porter, A., & Zombakis, J. (2014). Interteaching and lecture: A comparison of long-term recognition memory. Teaching of Psychology, 41(4), 325-329. doi:10.1177/0098628314549704

    Saville, B. K., Lambert, T., & Robertson, S. (2011). Interteaching: Bringing behavioral education into the 21st century. The Psychological Record, 61, 153–165.


    Peter Frost (Ph.D., Baylor University) is Professor of Psychology at Southern New Hampshire University (SNHU) and a Steering Committee member of the New England Psychological Association. He has been a recipient of the SNHU Excellence in Teaching Award and the SNHU President’s Merit Award. He is a firm believer that undergraduate Psychology majors should collaborate with faculty on original research projects. His current projects with students focus on the effects of using mobile devices on various aspects of higher cognition. Other studies have explored how personality relates to susceptibility to false memory and how faulty reasoning can alter autobiographical memory.

  • 15 Dec 2016 7:11 PM | Anonymous
    Evaluating Alternate Reality Games

    for Introductory Psychology

    J. Mark Cleaveland and Rachel Abril

    Vassar College


                    Game-based learning refers to the use of games in pedagogy.  We typically use game-based learning to increase a learner’s “engagement,” however operationalized, with a problem or content area.  The game in question might model a particular set of contingencies either explicitly or implicitly.  For example, in the board game “Freedom: The Underground Railroad,” players take on the roles of abolitionists who are attempting to aid slaves on their passage to freedom. In doing so, players interact with cards that detail historical events and personages (see Cleaveland, 2014).  Conversely, a game such as “Mastermind” is not explicitly about scientific reasoning, but we can use it to teach aspects of the scientific method implicitly and then, with discussion or targeted responding, bring out these points explicitly (see Strom & Barolo, 2011).  Another form of game-based learning is given by the “Reacting to the Past” consortium begun by Mark Carnes (see Carnes, 2014).  In these sometimes semester-length games, students role-play the personages and debates of particular historical periods.  Regardless of the specifics, however, a fundamental goal of all instances of game-based learning is to re-contextualize traditional pedagogy in creative ways.  Games or texts are no longer passive objects, but repositories of opportunities.  “Mastermind” is no longer a collection of pegs and a board, but also a physical metaphor for the scientific method.  A speech of Demosthenes is no longer only something to learn for a test, but also potential leverage for a team in an upcoming role-playing debate.  In other words, the best examples of game-based learning create a pedagogical narrative that naturally blurs the distinction between what happens in the classroom and the student’s day-to-day life.

                In April of 2015, we experimented with game-based learning in an Introductory Psychology class at Vassar College.  Specifically, we designed and ran an “alternate reality game,” or ARG, that we called “Backtrack.” The story was thematically centered on memory and used material covered in earlier lectures, and participation in its narrative was offered as an extra-credit opportunity. Students who signed up for the game received an email with a request for help from one of the characters, and by replying to this email began a narrative journey in which their knowledge of memory-related concepts would be highlighted.  Before going into the details of the game itself, we’d like to explain why we attempted this experiment and what we mean by “alternate reality game.”

                Introductory Psychology is taught as a single-semester survey course at Vassar College. Typical classes are limited to 30 students and meet for approximately 2.5 hours per week across the semester.  We are fortunate in that small class sizes allow for more in-class flexibility than is typical of many academic institutions. Nonetheless, the overwhelming amount of content in an Introductory Psychology course, especially one taught in a single semester, places severe constraints on pedagogy.  By necessity, class time must focus primarily on the systematization of “facts” that will tend, of course, to appear on tests.  As a skill, systematization has its place; however, what psychologists actually do is use this systematization in the service of open-ended exploration, constrained by methodology.  It is this latter activity–open-ended exploration with the intent of uncovering heretofore unnoticed contingencies–that is missing from many Introductory Psychology survey courses.  Our goal, therefore, was to see if we could come up with an activity that explicitly targeted and reinforced the creative detective work that undergirds our field.  For this reason we turned to ARGs.

                Alternate reality games (or ARGs) are games built around a single, cohesive narrative. The narrative is constructed by an individual or a group of so-called “puppet masters” and then broken into interactive elements that make use of a variety of media. For example, a story might unfold through texts, images, audio, video, or even real-life interaction. Players uncover the narrative through interaction and investigation, and can even influence the outcome of various in-game events.  Given that the narrative of an ARG is “found” more than it is simply presented, the lines between reality and fiction tend to blur in this medium.  Some ARGs go so far as to never overtly acknowledge that the events are part of a game; both players and game makers are expected to behave as though everything that happens in the game is true.

                The blurring of reality that lies at the center of an ARG narrative creates a uniquely immersive experience. Players are led to believe that every action they take in the game is significant, that they have a direct impact on the events that transpire and, perhaps most importantly, that they are forming real relationships with the characters they interact with in the narrative. This illusion creates a level of engagement that may be unmatched by any other kind of game, and it can offer a special benefit to education. Using an ARG as a teaching tool can provide students with “real world” applications of psychological concepts. Interacting with concepts from class outside of a classroom setting requires students to draw on their knowledge of course material to puzzle out the story without feeling that they are being formally tested. The hope is that this will provide a stronger connection to the source material and reinforce the concepts in the minds of the players.

                We ran “Backtrack,” our own ARG, in April 2015.  Because our game was only meant to cover one section of material (i.e., lectures specifically centered on memory), we decided that the game would last five days.  Ultimately we extended this time frame to a week because the players had difficulty determining what they were supposed to do.  The general plot was as follows.  It began with a message to the players from a fellow student identifying herself as “K.” This person claimed that her friend, “J,” was having memory problems but that he refused to believe her.  “K” asked the students to validate her concerns by directing them to a recording of a memory test that “J” had taken (https://www.youtube.com/watch?v=XR4EmyrU-Us).  Players were required to characterize “J’s” memory deficiencies before “J” contacted them via Skype.  This interaction led the students to an online journal of J’s that was filled with puzzles, coded sections, and general information that provided background on “J.”  For example, a linked page entitled “CBT” led to a description of a simple cognitive behavioral technique.  By figuring out how to work through the journal, the players ultimately came to a confession that “J” had fatally struck a dog with his car and disposed of the body.  The players then learned that the owner of the dog, a daughter of a family friend, had disappeared while searching for her dog, and that “J” blamed himself for her disappearance.  The players eventually determined that “K” was actually encouraging some of J’s memory problems (e.g., via attempts to plant false memories) in a misguided attempt to help her friend through a difficult time.  The game concluded with an in-person meetup with “K” and a scavenger hunt to locate an object that would hopefully aid in the retrieval of some of J’s lost memories.

                The participating students were asked to fill out a questionnaire at the conclusion of the narrative so that we could obtain a qualitative sense of their experience.  From this questionnaire we learned that students overwhelmingly enjoyed the collaboration with their fellow students that the ARG afforded.  Students also appreciated the central mystery of the narrative and their interactions with the story characters. All reported at least some explicit awareness of course concepts embedded in the narrative.  After the game had concluded, one student even sent a follow-up email to “comfort” one of the characters.  For these reasons, we feel that an ARG provides an interesting pedagogical tool that deserves further exploration.  That said, our recommendation of this tool comes with certain caveats.

                First, workload.  An ARG is not an undertaking that can be put together at the last minute.  Creating as much of the material in advance as possible is vital. Whether this entails outlining character interactions to avoid being sidetracked in a chatroom, drafting content to appear in an email or blog post, or creating web pages will vary depending on what media are being used to present the ARG. In most cases, ARGs contain at least one central website and one point of interaction between characters and players. For “Backtrack,” we opted to use a single static website, and we contacted the players through email, Skype, text messages, and one in-person character meeting. All of the content for the website was finished and uploaded before the game began. This proved immensely helpful once the game was underway, because it allowed the puppet master to focus on guiding the players through the narrative rather than having to worry about producing new content. For longer games, producing all content in advance may not be as feasible, but at the very least an outline of the planned events and the core concepts to which they’re tied should be created before the game is launched.

                Furthermore, the preparatory workload in our case was matched by the work required of the students.  “Backtrack” lasted only a week but still had to tell a complete narrative and incorporate an assortment of pre-determined course concepts. For this reason, the workload required of students was high, and multiple participants noted this in their feedback. One potential remedy would be to have the game last for a longer period of time, allowing students to play once they have dealt with their other commitments. Players also felt that course material should have been more central to advancing the story, rather than relying on classic ARG ciphers. Several of them mentioned the puzzle that required the players to teach the characters psychological concepts as a memorable instance of the course material being used, indicating that puzzles of this nature would be a wise choice for anyone considering making an educational ARG in the future. Such specificity of puzzles, of course, only increases the potential workload and creative demands on the part of the game makers.

                Finally, it should be pointed out that some students found the engagement that is central to ARGs to be difficult.  As mentioned above, our players had difficulty at the beginning of the story in determining how to play the game.  We eventually used both character prompting and feedback from the instructor to teach the students how to engage with the story.  This need to teach students how to engage via self-generated exploration was interesting, and perhaps unsurprising given how little such exploration is emphasized in most Introductory Psychology courses.  Our hope, moving forward, is to further amplify this element of the ARG experience while working to further embody psychological skills and concepts in the narrative itself.




    Cleaveland, J. M. (2014). http://boardgamegeek.com/thread/1219031/professors-playing-games-freedom-underground-railr

    Carnes, M. (2014). Minds on fire. Cambridge, MA: Harvard University Press.

    Strom, & Barolo. (2011). Using the game of Mastermind to teach, practice, and discuss scientific reasoning skills. PLoS Biology. doi:10.1371/journal.pbio.1000578

    J. Mark Cleaveland is an associate professor in Vassar College's Psychological Science department and Neuroscience and Behavior program.  At Vassar he teaches courses in the areas of comparative psychology, learning, and introductory psychology. An inveterate gamer, he has long possessed an interest in how games have been used in behavioral modeling and how they might inspire pedagogical frameworks.

    Rachel Abril graduated from Vassar College in 2015 with a Bachelor's of Arts degree in Psychology. She has long been interested in immersive fiction, and has been an active ARG player and designer since early 2010. Backtrack was part of an independent study at Vassar, and creating the website for this project inspired her to continue her studies. She is currently pursuing an accelerated Bachelors/Masters degree in Graphic Information Technology at Arizona State University.
