Society for the Teaching of Psychology: Division 2 of the American Psychological Association

E-xcellence in Teaching
Editor: Annie S. Ditta

  • 05 Nov 2024 9:33 PM | Anonymous member (Administrator)

    Jennifer Samson
    Queens University of Charlotte

    I love teaching Research Methods. When I accepted my present position, I enthusiastically agreed that this would be my primary teaching responsibility. (I think they might have expected me to run away screaming instead.) Our year-long methods sequence is unusual in that every year, my 25 or so students propose, design, implement, and report on their own projects in any area of Psychology. Every year there are some unbelievably thoughtful, creative, and interesting projects. But also every year, I have to drag myself through grading the drafts. Even the “good” projects. Especially when I feel like I’m providing the same feedback on draft after draft with little to no improvement.

    How can we help students understand the writing process as iterative instead of “one [draft] and done”? How can we as instructors help students become effective critics of their own writing? How can we keep our grading load manageable and still provide students with the feedback they need? This essay describes trials and errors, lessons learned, and lessons I’m still learning in my search for the elusive answers to these questions.

    Spoiler: There’s not one easy solution.

    Background:

    The year-long research methods sequence I teach is required for all Psychology majors in their junior year at my small, liberal arts university. By the time they get to me, students have completed their first-year writing sequence as well as the specific course prerequisites, including Introduction to Psychology, Statistics, and a class we call Information Literacy – reading and writing in a professional Psychology setting, including a focus on literature review. Ultimately, the majority of my students are relatively well-prepared for college-level work but are still learning to write the type of professional academic paper required for the course. It is also worth noting that a sizable minority every year take Information Literacy simultaneously with the fall semester of Methods and so need extra support in my course.

    The research methods sequence is centered around students’ individual empirical projects. In the fall, students complete a four-credit class where we delve into the study of research, emphasizing design for association versus cause/effect and critiquing for different types of validity (designing and critiquing research is an explicit goal for undergraduate students set out by the American Psychological Association, APA, 2023). Concurrently with the class, they complete a two-credit-hour lab where they write a proposal to identify a research question and propose methods to answer it. In the spring, students complete a second four-credit class where they collect data, analyze it, and revise/extend their proposal so it becomes a complete, journal-style empirical paper. They also present their work in a poster session at a local undergraduate conference. The completion of not only the proposal but the entire project and the opportunity for every student to present in a conference setting is a hallmark of our program.

    What I Tried:

    At the beginning of last academic year, I knew I needed to do something different. An influx of late transfers caused the class size to swell by almost a third, and I knew that, short of learning to clone myself, there was no way I could keep up with marking everyone’s drafts in a timely manner (see Ambrose et al., 2010, on the importance of constructive, timely feedback for student motivation). Meanwhile, I had already been looking at ways to increase student buy-in for writing as an iterative process and to increase students’ meta-cognitive skills as evaluators of their own work (see Ambrose et al., 2010; Bain, 2004). Therefore, I implemented the following procedures.

    I cancelled lab at key points in the semester (e.g., as outlines were coming together) to instead conduct 20- to 30-minute oral check-ins with each student individually during the pre-writing process. In these meetings, we discussed how the ideas were going to be organized within the paper. Meetings, and requiring outlines in the first place, hopefully got students thinking about their papers earlier in the semester than many would have otherwise and encouraged them to engage in prewriting organization rather than diving right into drafting, as many are prone to do. I returned minimal written feedback for these preliminary steps and marked primarily completion credit; if they did it thoughtfully and in a timely manner, they earned all the available points for that preliminary step.

    So far, what I was trying was not very different from what I’d done previously. But then, after students turned in their drafts of each section (e.g., literature review, methods), instead of marking each draft and returning it, I asked them to complete a short self-evaluation. The open-ended questions on the self-evaluation prompted them to focus on the areas of content and organization, common mechanical issues, time management completing the draft, and goals for revision. I then met with each student one-on-one to discuss their drafts. In these 30-minute meetings, we read the draft together and marked some key suggestions using Word’s track changes and comments features. I often used their self-evaluation as a starting place, especially if they had identified strengths or areas for improvement similar to those I noticed. I made a point not to mark the whole paper, but to mark targeted examples. For instance, if my suggestion was for the student to use more parenthetical and fewer narrative (“___ found”) citations, we edited one or two paragraphs together to show them what that might look like.

    At the end of these meetings, we completed the grading rubric (separate from the self-evaluation) together. Students left the meeting with their marked-up paper and (after my first round of meetings, where I learned it would be more efficient to record the score and send it on the spot) the completed rubric. Working together to evaluate the drafts not only got them graded more efficiently, but gave students ownership of their learning process and therefore, theoretically, more buy-in for it (see Doyle, 2011). The assessment was now part of the learning process (see Bain, 2004).

    At the end of the semester, students submitted not only their final paper, but also a revision reflection (similar to a revise and resubmit letter to the editor) in which they described the feedback they received, what they changed, and what they didn’t change (and why). On this revision reflection, I prompted them to describe the feedback they’d received on each section of their paper and how they’d incorporated it (or not).

    Conclusions So Far:

    Overall, I would say my experiments were a success and things are moving in a good direction, although they’re maybe not there yet. (Will it ever be perfect? Probably not.) Many of the self-evaluations were thoughtful and, anecdotally, I believe more of the students at least registered and gave some thought to the feedback they received. By the end of the academic year, after meetings for the literature review, methods, and results, I noticed that students were doing more of the evaluating as we discussed the rubric for their discussion drafts, rather than waiting for me to tell them what score they earned. I asked the class for their thoughts and, even on the anonymous evaluations, most of them chose not to comment (I’ll take that as, “no complaints”). One student did tell me that they liked having meetings instead of a paper returned so they could ask clarification questions.

    From a professor workload point of view, this approach was exhausting during meeting weeks, when I often had 6-8 meetings per day for several days in a row, but generally much more efficient than grading and returning papers. A few students (fewer than 5%) delayed scheduling their meetings and/or needed special arrangements for an evening or weekend because athletics, jobs, or other outside commitments took up most of their days, but we made it work. In the future I should probably be clearer that the onus is on the student to take the initiative and get these scheduled (aka your poor planning is not my emergency; schedule early to have enough choices that will work for you).

    In part because I was forced to stay on schedule, students got their feedback in a timelier manner, even though I spent about the same amount of time on each student (30 minutes per paper to mark vs. a 30-minute meeting). I’m hopeful, although I only have anecdotal evidence thus far, that the feedback was clearer. For example, instead of writing a comment, “be sure to clarify the main idea of this paragraph,” I was able to ask students face to face, “what’s the main idea of this paragraph supposed to be? Yes. Write that.” I am also hopeful that feedback was deeper. It’s easy when I’m reading a paper to get caught up in marking the details, but I found with a one-on-one conversation, I could focus more on discussing the bigger picture of organization and what points they were trying to make. One area where I saw a marked change was in reference formatting; in a face-to-face situation, I could point to a correctly formatted example and say, “this is correct. How is it different from this other one [that has an error]?”

    In short, I will keep this self-evaluation and oral feedback approach, but with some tweaks. First, I will likely spend some time scaffolding useful self-evaluation, so maybe more students will use the self-evaluation to their best advantage instead of (as I’m sure some did) seeing it as another box to check. For instance, I might do the first self-evaluations in class on the day after drafts are due and maybe show them some of my self-evaluation process on nearly-complete papers. I might also add another meeting, even earlier in the process, as students are collecting their potential sources. The biggest change, though, is timing. Last year I cancelled lab on the day the draft was due and held meetings then. This year, I’ll move meetings to the week after the draft due date. I think meeting as a lab on the day the draft is due will allow me to get the students started on next steps more efficiently, and having a gap between due date and feedback will allow me to do more skimming ahead and preparation for more effective one-on-one meetings.

    Nothing I’ve written about here is ground-breaking or even particularly innovative. But, sometimes it’s difficult to break from the way we were taught or the way we’re used to doing things. I hope that by sharing my journey so far I might contribute to the conversation as we, as individuals and as a field, strive for that magic solution that will be sustainable for us but still provide our students with the best possible learning experience.

    References:

    Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. Jossey-Bass.

    American Psychological Association [APA]. (2023). APA guidelines for the undergraduate Psychology major. Version 3.0. American Psychological Association. https://www.apa.org/about/policy/undergraduate-psychology-major.pdf

    Bain, K. (2004). What the best college teachers do. Harvard University Press.

    Doyle, T. (2011). Learner-centered teaching: Putting the research on learning into practice. Stylus Publishing.


  • 09 Sep 2024 8:46 PM | Anonymous member (Administrator)

    V. N. Vimal Rao
    University of Illinois at Urbana-Champaign

    “Thinking like a scientist involves more than just reacting with an open mind. It means being actively open-minded.” – Adam Grant

    Towards the end of my graduate schooling, I was at a department party and found myself chatting with a first-year student about their research interests. At one point I noticed a chain that they were wearing, and asked them about it. They reacted shyly, almost embarrassed that I had noticed. The chain included a religious symbol. I asked them about how their beliefs inform their research interests, and they seemed surprised at the question. They simply responded, “I’m not sure religion and science have much in common.”

    As a child of immigrants, I grew up in the United States perceived as a ‘hyphenated’ American. As a child, I often saw two versions of myself – the ‘American’ version that most people saw, and a second almost secret version disclosed with family or at cultural events. As an adult, I’m too tired to pretend to be anything other than fully me in everything that I do.

    It is in this spirit that I firmly disagree with my friend that religion and science do not have much in common. If you are both religious and scientific, then they have everything in common – they have you in common.

    It is from this value that one of my projects draws inspiration. I am Hindu. I enjoy reading and learning about the BhagavadGita and other Vedantic works. In the spirit of being fully me – an educational psychologist and a Hindu – I realized that I could borrow pedagogical insights from the BhagavadGita to help me teach my students Statistics.

    It might seem odd at first to think that modern psychological and educational research can learn a thing or two from religion. But let’s think about it as psychologists:

    1. Many religions have existed for centuries;
    2. They typically include something of a benchmark set of attitudes, beliefs, and practices;
    3. They have encoded into their systems various pedagogies to support the propagation of these attitudes, beliefs, and practices; and
    4. These religious instructional systems have existed for far longer than psychological (or statistical) pedagogies.

     

    Clearly, religions are doing something right in terms of their pedagogical strategies to have lasted for centuries. We would be remiss not to at least consider religious pedagogies as potentially viable strategies for our educational objectives.

    In the case of the BhagavadGita, religious instruction is presented as a dialogue between a student (Arjuna) and a teacher (Krishna). Krishna is hailed as jagadguru, meaning ‘teacher of the world,’ a title bestowed to those teachers whose teachings have worldly impact. While I do not teach the same content as Krishna, surely I can study a jagadguru’s pedagogical strategies. 


    The Structure of the BhagavadGita

    The BhagavadGita is set within the great epic Mahabharata. Specifically, it occurs immediately prior to the outbreak of a war between two sides of a ruling family. In Chapter 1 of the BhagavadGita, Arjuna asks Krishna (who is serving as Arjuna’s charioteer) to take him in front of the opposing army. Arjuna then sees his grandfather, his teachers, and many friends and relatives lined up with the opposing army, and has a crisis of conscience. Arjuna finds himself confused, anxious, and hopeless. It is from this despondence that Arjuna bows to Krishna, pleading for Krishna’s help. 

    Chapter 1 of the BhagavadGita is thus titled “Arjuna Vishada Yoga,” or “Arjuna’s despondency.” Throughout this entire chapter, Krishna stays silent. It is only after Arjuna seeks Krishna’s help, at the start of Chapter 2, that Krishna begins his instruction.

    The implication is clear, and is similar to the old English proverb that ‘You can lead a horse to water, but you can’t make it drink.’ Arjuna would not have been ready or willing to receive Krishna’s instruction prior to experiencing confusion and anxiety. Those feelings created the motivation necessary for Arjuna to steadfastly receive Krishna’s instruction, and earnestly imbibe it into his being. For Krishna’s teachings to have had any effect, Arjuna needed to be ready to do three things: devotedly listen, reflect and contemplate on the teachings, and assimilate the knowledge into his being. Krishna knew Arjuna was ready to do these things only after Arjuna asked for Krishna’s help. 


    My Class

    I teach a very large (500-700 students per section) general education introductory level statistics course. My students are not STEM majors. Most of them take my class because they have to – 99% of the 602 students (out of 1,019 enrolled) who responded to a Spring 2024 survey indicated that they took the course either as a major requirement or to fulfill a general education requirement. If it weren’t for those requirements, I might not have a course to teach.

    Consider these students’ motivation. Do they plan to devotedly listen to the content, reflect and contemplate on it, and assimilate statistical thinking into their lives? Do they perceive any need to actually learn statistics? With few exceptions, the answer is no[1]. Prior to Arjuna asking Krishna for help, Krishna stayed silent. My students do not walk into my class because they think they need my help. Following Krishna’s example implies that I too should stay silent.

    Despite staying silent, Krishna still played an important role in setting the stage for Arjuna’s desire to learn. When Arjuna asked Krishna to drive closer to the opposing army, Krishna could have driven anywhere. However, Krishna chose to drive right in front of Arjuna’s grandfather. By doing so, Krishna created the setting from which Arjuna’s plight would manifest, thereby increasing Arjuna’s motivation to learn and providing the opportunity for instruction.

    Like Krishna taking Arjuna right in front of his grandfather, I too decided that I could create an environment to set the stage for students to develop motivation to learn statistics. To achieve this, I would eschew discussing the syllabus until Day 2, and instead spend Day 1 telling a series of stories that require statistical thinking to resolve, hoping that students might relate to one, be unable to solve it, and thus develop a desire to learn statistics.

    On Day 1, I tell my students seven different stories, one for each of the content units in the course. Here, I will retell the last of the seven stories I tell my students. This is a true story. 


    My Day 1 Story

    It was a warm August night in Chicago. My grandfather was in the hospital, Day 3 of the current hospitalization due to dizziness and dehydration. The hospital only allowed visitors until 8 p.m., and as I was about to leave, the night nurse came in with some medication for my grandfather.

    The nurse had a pill for my grandfather’s hypertension – he had high blood pressure. I asked what my grandfather’s most recent blood pressure was. “110,” the nurse told me.

    110!? That was not normal for my grandfather. My grandfather measured his blood pressure every morning. He would complete his shower, get dressed, come to the dining room table, comb his hair, pray, measure his blood pressure and pulse, take his medications, and then only begin eating his breakfast. 

    I had spent the last year with him, bearing witness to the BP cuff beeping, inflating, and slowly deflating before giving my grandfather the data for the day, which he recorded in a notebook. Standing in the hospital, talking to the night nurse, I could not remember a single day that my grandfather’s blood pressure was as low as 110.

    I told the nurse that 110 was lower than my grandfather’s typical blood pressure. The nurse explained that they always give the medicine if a patient’s blood pressure is above 130, and won’t give it if the blood pressure is below 100, but when the patient’s blood pressure is between 100-130, it’s at the nurse’s discretion, but they typically give the medication anyway.

    With a blood pressure of 110, did my grandfather really need the medication? I didn’t think so, but the nurse did. If I was wrong and he really did need the medicine, his blood pressure would skyrocket overnight, and he would crash. If, on the other hand, we gave him the medicine but he really didn’t need it, his blood pressure would plummet overnight, and he would crash.

    How would you make a decision about whether to give the medication or not? Would you feel confident about the decision you were making? Would you simply relinquish decision making power and allow the nurse to do whatever they think, even though you know your family best?

    I decided to tell the nurse not to give the medicine, and I felt confident my decision was correct. Despite no medical training, I felt confident because I approached the problem by thinking statistically, and statistically, the answer was clear.

    You are in this class to learn how to think statistically. You are in this class to learn how to apply statistical thinking to the decisions you make in your life. You are in this class because statistics is the science of variability and decision making under uncertainty, and by thinking statistically, you will be better able to navigate the uncertainty you will undoubtedly face in your lives.

    You are in this class so that I can teach you how to think just like me if you ever have to face a situation in a hospital room like I did with my grandfather, and do so calmly and confidently.


    Students’ Reactions to the Story

    When I tell this story to my students on Day 1, I do not tell them my solution. My goal is to get my students to imagine themselves in the scenario, and to think about what they would do. I simply tell them that I know statistics, and that statistics allowed me to calmly make a decision without anxiety, without helplessness, and without despondence. If they find the situation stressful or unnerving, then they need to learn statistics, and I will help them learn how to think statistically. On Day 2 (and again in my last lecture), as a summary of the entire course, I do tell them the solution[2] – this is similar to the structure of the BhagavadGita, in which Krishna summarizes the entire teaching in both Chapter 2 and the final chapter, Chapter 18.

    To evaluate whether this intervention was successful at setting the stage for students’ learning, I conducted three surveys throughout the term – one immediately after the Day 1 lecture, one at the midpoint of the semester, and one immediately before my final lecture. With over 500 responses to each of the three timepoints, I am still in the process of fully analyzing the data. However, it appears that the intervention was indeed successful at motivating at least some students. This is evident from the following example responses to survey items:

    “[The problems] made me want to get an understanding of stats.”

    “It convinced me we need to use statistics for the answers.”

    “[The problems] made me want to learn what statistics does.”

    “I understood I needed to learn stats.”

    Additionally, in the survey data collected prior to the last day of class, this story about my grandfather was by far the story that the students best remembered and saw as important. While there was another story that students said they could imagine themselves in at higher rates (a story, told in the context of my sister and me running together, about making a causal inference on whether compression socks can improve your 5k time), the fact that students remembered the story about my grandfather’s blood pressure nearly three months after the first lecture and without reinforcement is, I believe, evidence of the story’s efficacy in imparting the necessity and value of learning statistics.

     

    Sources of Pedagogical Inspiration

    This is just one small example of how I strive to draw pedagogical inspiration from anywhere I can, even religion. I do not believe this strategy is unique to a single religion, nor any single source. Another example of pedagogical inspiration I have drawn from religion is from Vedic mahavakyas, i.e., great sayings. These great sayings, such as aham brahmasmi, meaning “I am brahman,” serve a role no different from that of great sayings in many other religions. Pedagogically, these short sayings are easy to remember but packed with meaning. They serve as a psychological anchor for content knowledge and further inquiry. What then are our fields’ great sayings? From this inspiration I began teaching my students to ask, “Who’s not here?” every time they see a graph, in an attempt to foster a critical statistical literacy habit of mind to question information about the sample and sampling strategy, especially regarding its representativeness and appropriateness for generalization.

    It might seem odd to seek pedagogical inspiration from religion, but it does not seem so odd to me to keep an open mind in terms of potential sources of pedagogical inspiration. Who knows from where revolutionary new ideas can come?

    I believe the best way to support new development and innovation in the teaching of psychology is for each of us to be fully ourselves in all contexts and at all times. Draw on all of your funds of knowledge and apply them generously to your work. Who knows where that may lead? Perhaps, and with any luck, it will lead us forward.

    [1] Only 7% of the 602 students (out of 1,019 enrolled) who responded to the Spring 2024 survey indicated that, if Statistics were not a required course, they would take it because they believe it is important to learn how to think statistically. 22% indicated that they would take Statistics if it were not required because they believe it might help them get a job.

    [2] Based on past data, and assuming my grandfather was in a stable condition consistent with how he usually felt over the previous few months, I predicted that his blood pressure should be around 140 – a simple model based on the mean. Accounting for variability, I knew that even if he was in stable condition, his blood pressure wouldn’t be exactly 140 – it could be as low as 120 or as high as 160, the typical amount of variation in his blood pressure. Based on this knowledge, I estimated that the RMSE was about 10, and that a middle 95% prediction interval for his blood pressure should run roughly from 120 to 160. If my grandfather was feeling like he normally did, his blood pressure should have been about 140. His actual blood pressure was 110. The prediction error was -30. The z-score for the prediction was -3. The measurement was well outside the middle 95% prediction interval for what I expected my grandfather's blood pressure to be. Either my grandfather was feeling completely normal and this measurement was an extraordinary coincidence, or the hypothesis that my grandfather was feeling like he normally does was not a good hypothesis.
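    The arithmetic in this footnote can be sketched in a few lines of Python. The values 140, 10, and 110 come from the story itself; the mean ± 2 × RMSE interval is the usual rough approximation for a middle 95% prediction interval:

```python
# Values from the story: typical morning readings and the hospital measurement.
typical_mean = 140   # predicted blood pressure under "feeling normal"
rmse = 10            # typical size of the prediction error, from daily readings
observed = 110       # the night nurse's measurement

prediction_error = observed - typical_mean  # -30
z_score = prediction_error / rmse           # -3.0

# Middle ~95% prediction interval: mean +/- 2 * RMSE.
lower, upper = typical_mean - 2 * rmse, typical_mean + 2 * rmse  # 120, 160

print(f"z = {z_score}, 95% prediction interval = [{lower}, {upper}]")
# 110 falls well below the interval, so "feeling normal" is a poor hypothesis.
```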

  • 16 Aug 2024 5:20 PM | Anonymous member (Administrator)

    Melissa C. Rothstein
    The University of Rhode Island

    Matriculating into a Behavioral Science Psychology PhD program at 21 years old, I eagerly joined the Health and Alcohol Related Problems (HARP) lab to work under the guidance of Dr. Amy Stamates. Fast forward to my third year of the program, and I am not just a student but the instructor of record for an advanced statistics and research methods course at the University of Rhode Island. My passion for research methods and statistics, coupled with my steadfast dedication to ongoing learning, empowers me to connect with students and foster a dynamic and engaging learning environment. Despite occasionally blending in with undergraduates (and, at times, being mistaken for one), I’ve honed the skill of blazer camouflage - an invisibility cloak for looking my age in academia.

    I currently teach Applied Methods in Psychological Research (a 400-level course), where students undertake the challenge of crafting a psychological manuscript comparable to a published journal article. With a class size typically comprising around 15-20 students, this manageable number enables me to provide personalized attention and facilitate hands-on learning experiences tailored to students’ needs and interests. Throughout the semester, students engage in the collection, cleaning, and analysis of empirical data. The culmination of their efforts results in a written manuscript, which is showcased orally as a presentation at the end of the semester. The assignments provided below have been personally developed, drawn from my undergraduate experience at SUNY Purchase College, or obtained from past instructors at the University of Rhode Island.

    The Data & The Data Cleaning 

    We utilize Qualtrics to administer a survey to undergraduate students, encompassing various questionnaires chosen by students, covering topics such as happiness and exercise. To enhance the students’ practical skills, I incorporate demonstrations on employing Qualtrics effectively for survey administration. This includes guidance on designing well-structured questionnaires and navigating various features within the Qualtrics platform. Students are explicitly informed that the data collected is solely for educational purposes, as it lacks approval from the Institutional Review Board (IRB) for broader dissemination. Following the initiation of data collection, students undertake cleaning the dataset generated by Qualtrics and coding the questionnaires to prepare for subsequent analyses addressing their research inquiries. Instruction covers data cleaning techniques (e.g., addressing normality, outliers, and missing data) and coding procedures (e.g., sum scores, reverse scoring). Post data cleaning and prior to conducting analyses for their research questions, students are introduced to and explore preliminary analyses, including missing data analysis and reliability analysis.
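    As a minimal sketch of the coding procedures mentioned above (the item names and the 1-5 Likert scale here are hypothetical, not taken from the actual survey), reverse scoring maps each response to (scale min + scale max − value), and a sum score totals the recoded items:

```python
def reverse_score(value, scale_min=1, scale_max=5):
    """Reverse-score a Likert item: on a 1-5 scale, 5 -> 1, 4 -> 2, etc."""
    return scale_min + scale_max - value

def sum_score(responses, reverse_items=()):
    """Sum one participant's item responses, reverse-scoring flagged items."""
    total = 0
    for item, value in responses.items():
        total += reverse_score(value) if item in reverse_items else value
    return total

participant = {"q1": 4, "q2": 2, "q3": 5}  # hypothetical 1-5 Likert responses
print(sum_score(participant, reverse_items={"q2"}))  # q2 recodes 2 -> 4; prints 13
```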

    The QMRI

    The QMRI (Question, Method, Results, Implications) serves as a valuable tool for conducting literature reviews and crafting the introduction section of a manuscript. I first encountered QMRIs during my undergraduate years at SUNY Purchase College, where they played a pivotal role in my understanding of scientific writing. Now, I incorporate this assignment into my own course. More details can be found here: https://www.purchase.edu/live/files/1244-the-literature-reviewpdf. Students are provided with a template to answer key questions based on the journal articles they read and cite in their introductions. QMRIs aid in paraphrasing and summarizing, proving particularly helpful for students in the manuscript writing process. The template comprises the following components:

    Q: What are the research questions/aims? What is the hypothesis?

    M: What is the method (participants, measures, procedures)? What are the independent and dependent variables?

    R: What are the results of the experiment in lay terms?

    I: What are the implications of the results? Why is this experiment important?

    The IRB Protocol Form

    In my experience, students often gain theoretical knowledge about Institutional Review Boards (IRB) and ethics but lack practical immersion in the process of obtaining approval for an empirical study on human subjects. Consequently, a lab assignment in my course requires students to complete an IRB protocol form for their proposed study (even though gaining approval is not required for students to be able to carry out their research in the course). Research suggests that writing protocols has the potential to function as an educational tool in various domains, such as clarifying and refining research questions, conducting literature reviews, enhancing writing clarity, and ensuring adherence to ethical principles in research (Balon et al., 2019). In this assignment, students work in small groups, typically four to five members, to collaboratively fill out a protocol form and submit it for “IRB approval.” I review the protocols and provide feedback (though my feedback is not as thorough as what the IRB would provide). This exercise encourages students to thoughtfully consider the intricacies of their cross-sectional study, providing valuable insights into the steps researchers take to achieve ethical data collection.

    The Peer Review

    Peer review is an integral part of the learning process in my course because it gives students valuable feedback from both their peers and the instructor. This feedback includes constructive criticism, insights, and suggestions aimed at enhancing the quality of students’ work. In fact, research shows that students who were more critical of their peers’ writing tended to achieve higher grades on their own writing (Yalch et al., 2019). All students receive training on how to provide insightful and professional peer review feedback before this assignment. After selecting research topics and receiving instruction on providing peer reviews, students are grouped based on shared interests. In these groups, students review each other’s work in two rounds: (1) introduction and method sections and (2) results and discussion sections. Each group consists of three students, with each student reviewing the work of two peers. Written reviews are 1-2 pages, encompassing a summary of the research, an impression of the paper, and identification of any major or minor issues. Students are also asked to post comments and use tracked changes in the document while reviewing to provide more direct feedback. Additionally, students evaluate their reviewers based on the timeliness, professionalism, and helpfulness of the feedback received. This evaluation is factored into students’ grades to account for the feedback provided and received.

    The Scaffolding

    Using scaffolding for a complex assignment such as writing a psychological manuscript has been beneficial for both students and myself. Scaffolding involves organizing assignments and course materials systematically to align with course learning objectives and ensuring clear communication of goals and processes to students. More information on this approach can be found here: http://www.brooklyn.cuny.edu/web/aca_facultywac/Workshops-AssignmentScaffolding-120412.pdf. At the beginning of the semester, students are tasked with formulating research questions and hypotheses based on survey topics. Subsequently, they develop an outline for their manuscripts before progressing to drafting the content. I divide the manuscript assignment into three parts: (1) a draft of the introduction and method sections, (2) a draft of the results and discussion sections, and (3) the final manuscript encompassing the title page, abstract, introduction, method, results, discussion, references, tables, and figures. Grading is reserved solely for the final manuscript. Feedback is provided on the drafts, concurrent with peer reviews, to guide students in scientific writing and enhance their skills. Training students in manuscript writing is time-intensive and requires thoughtful effort. My feedback usually centers on creating coherent writing, using scientific language, and accurately reporting statistical analyses. To further support students in their learning journey, I encourage an iterative process for manuscript development via scaffolding as described previously. After the initial drafts and feedback stages, students are given an opportunity for revisions before submitting the final manuscript.

    Furthermore, the scaffolding approach extends to collaborative learning experiences, where students engage in discussions and workshops focused on key elements of manuscript writing. These collaborative sessions foster a supportive environment for peer learning. By integrating scaffolding and iterative practices, the aim is to empower students not only in producing high-quality manuscripts but also in fostering a comprehensive understanding of the research and writing processes inherent in psychological studies.

    The Statistics: Guess That Test

    Throughout the semester, students are introduced to a range of statistical tests applicable to addressing their research questions. Students have already acquired the mathematical foundations for these tests in prior coursework: a Quantitative Methods in Psychology course, a Research Methods and Design in the Behavioral Sciences course, and related laboratory classes. The emphasis in this course therefore shifts toward fostering a conceptual understanding. The primary focus is on ensuring that students can confidently determine the appropriate statistical test for different research scenarios, emphasizing practical applications and decision-making in test selection. To enhance conceptual understanding, I employ interactive methods such as real-world examples and class discussions that allow students to apply their knowledge to practical situations. This reinforces their ability to discern the most suitable statistical test for a given research context. Here are two example questions:

    A psychologist is interested in assessing whether there is a significant difference in anxiety levels before and after a therapy intervention within the same group of participants. What statistical test should the psychologist use and why?

    Solution: Paired-samples t-test, because it enables comparisons between related measurements (pre- and post-intervention anxiety levels) within the same group of participants, facilitating the assessment of whether the therapy intervention led to a significant change.

    You are investigating the relationship between stress levels (measured on a Likert scale) and satisfaction with life (measured on a Likert scale). What kind of statistical test would you run?

    Solution: Linear regression, because this analysis is suitable for predictive modeling (predicting the value of the dependent variable based on the independent variable) and for analyzing relationships between continuous variables. However, because the data are cross-sectional, students commonly use terms like 'relationship' or 'association' to characterize such connections rather than language indicative of prediction or causation.
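    The two solutions above can be sketched in a few lines of Python. This is an illustrative example only, using simulated data (the sample sizes, effect sizes, and variable names are invented for demonstration) and assuming SciPy is available:

```python
# Sketch of the two "Guess That Test" solutions, using simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Q1: anxiety before vs. after therapy in the SAME participants
# -> paired-samples t-test (scipy.stats.ttest_rel)
pre = rng.normal(loc=30, scale=5, size=25)         # pre-intervention anxiety
post = pre - rng.normal(loc=4, scale=2, size=25)   # simulated improvement
t, p = stats.ttest_rel(pre, post)
print(f"Paired t-test: t = {t:.2f}, p = {p:.4f}")

# Q2: stress vs. life satisfaction (two continuous/Likert-type measures)
# -> linear regression (scipy.stats.linregress); with cross-sectional data
#    we describe an association, not a prediction or causal effect
stress = rng.integers(1, 8, size=100).astype(float)
satisfaction = 8 - stress + rng.normal(scale=1.0, size=100)
res = stats.linregress(stress, satisfaction)
print(f"Regression: slope = {res.slope:.2f}, r = {res.rvalue:.2f}, p = {res.pvalue:.4f}")
```

    Running a quick simulation like this alongside the conceptual discussion can help students connect "which test do I choose?" to what the output of that test actually looks like.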

    The Last Class: All About Graduate School 

    The majority of students enrolled in this class are actively in the process of applying to or preparing their applications for graduate school. As part of the curriculum, I dedicate the last class to discussing master’s and PhD programs in psychology to raise awareness and provide insight into the various opportunities available within the field of psychology graduate programs. I delve into key aspects of the application process, including important considerations when choosing between master’s and PhD programs, crafting a compelling personal statement, and securing strong letters of recommendation. The goal is not only to spread awareness but to provide students with the knowledge and resources needed for a successful transition to graduate studies in psychology. 

    In my role as an instructor, I strive to create an inclusive and collaborative learning environment where students feel inspired and empowered to actively engage in the field of psychology. Recognizing the diverse backgrounds and perspectives within the classroom, I encourage open discussions and harness the wealth of collective experiences. In addition to teaching Applied Methods in Psychological Research, I actively mentor students from my class and lab in their individual research endeavors. This mentorship extends beyond the classroom, providing students with personalized support and fostering a sense of community within the psychology department. As I navigate the dual roles of student and instructor, I remain committed to fostering a learning environment where curiosity thrives, critical thinking is encouraged, and each student feels empowered to explore the domains of psychological research.

    References

    Balon, R., Guerrero, A. P. S., Coverdale, J. H., Brenner, A. M., Louie, A. K., Roberts, L. W., … (2019). Institutional review board approval as an educational tool. Academic Psychiatry, 43, 285-289. https://doi.org/10.1007/s40596-019-01027-9

    Yalch, M. M., Vitale, E. M., & Ford, J. K. (2019). Benefits of peer review on students’ writing. Psychology Learning & Teaching, 18(3), 317-325. https://doi.org/10.1177/1475725719835070

  • 02 Aug 2024 4:23 PM | Anonymous member (Administrator)

    Alexis Grosofsky, Beloit College
    Jordan R. Wagge, Avila University
    Jared G. Branch, University of Utah


    Empirical research articles are an ideal pedagogical medium for helping teach core methodological and statistical concepts to psychology students. Rather than relying on fabricated descriptions of tools like surveys, experiments, and statistical tests, instructors can use full (but short!) research reports to ground these topics in real-world applications. This essay describes an open education resource (OER) we created called “Psychological Literacy for Undergraduate Methods and Statistics” (PLUMS): a collection of brief empirical articles for teaching methodology and statistics to psychology undergraduates. The articles are accompanied by targeted factual and discussion questions about the research and include information about the design(s), analysis(es), and any graphical/tabular displays. The methodological and statistical information is cross-referenced by “tags” (e.g., figures and graphs like bar graphs, statistical analyses like regression analysis, methodologies like convenience sampling, and subfields like social psychology), allowing instructors to select empirical articles that coincide with the topic(s) being covered.

    Research methods and statistics are the heart of psychology. Whichever subfield of psychology you choose, it involves research and statistics. After all, our discipline is an empirical science. Norcross and colleagues (2016) collected data using their Undergraduate Study in Psychology (USP) questionnaire and found that (as of 2014) almost all baccalaureate programs required courses in research methods (98%) and statistics (96%). It is therefore very important that we do a good job teaching students about these topics. This is a difficult task given that many undergraduates find these courses daunting and often try to put them off as long as possible. Students often do not think that research methods and statistics are real psychology. Instead, “real psychology,” to many undergraduates, is learned through content courses such as social, clinical, cognitive, or developmental.

    Despite what students may think, the American Psychological Association (APA) definitely believes that research methods and statistics are important. In their “Guidelines for the Undergraduate Psychology Major, Version 3.0” research methods and statistics are covered in two of their five goals:

    • Goal 2 “Scientific Inquiry and Critical Thinking” has more attention to statistical reasoning than in previous versions;

    and

    • Goal 4 “Communication, Psychological Literacy, and Technology Skills” describes communicating effectively and demonstrating psychological literacy.

    Both of these goals are ones that our Psychological Literacy for Undergraduate Methods and Statistics (PLUMS) project addresses.

    The fact that research methods and statistics are so fundamental to our discipline, coupled with their recognition by the APA, underscores the need to enhance the teaching of research methods and statistics. Students should come away from these classes realizing how important research methods and statistics are to the empirical science of psychology. They should also come away from these courses being (and feeling) competent in their understanding of these vital topics.

    Our idea of having students read real-life examples of research methods and statistics in empirical articles is supported by work done by Lewandowski and colleagues (2017). They describe how they have students read an empirical article covering the design students are learning about before introducing that design to the students. The idea is that students will learn the material better if their interest is captured first, as demonstrated by Sizemore and Lewandowski (2011), who found that lessons about confounds were more successful in capturing students’ interest when they were framed around clinical depression rather than memory.

    A book very similar to PLUMS was published by Milinki (2000, 2006). This text introduced articles by methodological technique (e.g., survey research, quasi-experimental research). Once a technique has been selected, the instructor then chooses from the two to five articles within that technique. Both the second and third authors have used articles from Milinki’s book when teaching research methods and statistics. They observed that using actual empirical reports resulted in their students showing more engagement than when they did not use such articles. The text does have some limitations. First, it has not been updated since the second edition was published in 2006. Additionally, its organization requires instructors to select only by methodological technique (rather than by statistical technique or other relevant tags).

    We sought to not only update Milinki’s (2000, 2006) work but also to expand upon it. Our project involved the following: first, we selected recent articles for 15 subfields in psychology (see Table 1).

    Table 1

    Subfields included in PLUMS

    _____________________________________________________________________

    Cognitive

    Cross-Cultural

    Development

    Disorders

    Drugs

    Emotion & Motivation

    Learning

    Marketing

    Memory

    Neuroscience

    Personality

    Sensation & Perception

    Sleep

    Social

    Stress & Health

    ______________________________________________________________________

    Each article includes its full reference, making it easy to find the original article. Additionally, we wrote targeted factual and discussion questions about the research for each article. The factual questions are accompanied by the correct answer as well as the page number where the answer is found and can serve as reading quizzes. For example, “How did the researchers collect data about age preferences?” [they used an Implicit Association Test (IAT), p. 957]. The discussion questions then go beyond simple factual questions and require students to think critically about the reading. For example, “Can you think of another way to conduct this type of research that does not involve using the IAT?” These can be used for classroom or LMS-based discussions. We also include information about the design(s), analyses, and graphical/tabular displays to allow for cross-referencing of information, giving instructors multiple ways to select empirical articles to coincide with whatever topic or technique is being introduced to their class.
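    The tag-based cross-referencing described above can be pictured as a simple lookup. The sketch below is purely illustrative: the article entries, tag names, and the `find_articles` helper are invented placeholders, not the actual PLUMS data or interface.

```python
# Hypothetical sketch of tag-based article selection: each entry carries
# subfield, design, analysis, and display tags, and an instructor filters
# by any combination of them.
articles = [
    {"title": "Age preferences and the IAT", "subfield": "social",
     "tags": {"convenience sampling", "regression analysis", "bar graph"}},
    {"title": "Sleep and recall", "subfield": "memory",
     "tags": {"within-subjects design", "paired t-test", "line graph"}},
]

def find_articles(*wanted_tags, subfield=None):
    """Return articles matching every requested tag (and subfield, if given)."""
    return [a for a in articles
            if set(wanted_tags) <= a["tags"]
            and (subfield is None or a["subfield"] == subfield)]

print([a["title"] for a in find_articles("regression analysis")])
# -> ['Age preferences and the IAT']
```

    An instructor covering regression in a social psychology unit, for instance, could combine both filters (`find_articles("regression analysis", subfield="social")`) to pull only the articles that fit that week's topic.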

    We envision instructors being able to not only complement current class topics but also to have additional options such as:

    • Assigning some of the articles as extra credit activities (e.g., having students answer the factual and/or discussion questions posed).
    • Using the articles to serve as jumping-off points for students to create a research proposal as a capstone project in a research methods and/or statistics course.
    • Enriching content courses with topical empirical articles related to the course’s subfield.

    The first author hand-selected 8-12 articles within each of 15 subfields of psychology, based on presumed undergraduate student readability and recency (publication year). We had undergraduate psychology students read and rate all of the selected articles. The students provided ease-of-reading ratings (on a 5-point Likert scale: 1 = easiest, 5 = hardest; M = 2.0, SD = .48) as well as interest ratings (again on a 5-point Likert scale: 1 = no interest, 5 = most interest; M = 4.5, SD = .71). We were successful in finding 5 articles for each included subfield that were rated as both relatively easy to read and interesting. In cases of conflict between the ratings, we prioritized articles rated easier to read over those rated more interesting.

    We created a system for instructors or students to submit articles (and the corresponding metadata) so that the materials are regularly updated and enriched. Contributions will be reviewed by members of the project, making it a peer-reviewed process. Accepted submissions will be acknowledged as contributions to the project that can be listed on an instructor's or student’s CV.

    Empirical article libraries, such as the one we built, explicitly help improve student competence with methodology and statistics by using real, published data that they may encounter as fledgling producers or consumers of research. Conducting research comparing having students read articles from our project (specifically selected to be brief, readable, and interesting) vs. using a traditional textbook would be relatively easy to do. In fact, several faculty members could collaborate on such research. We hope that including some of these articles will also make the topics of research methods and statistics classes (which can be dry) more enjoyable.

    We believe that incorporating empirical research will help to counter the perception that methods and statistics are boring (and isolated) subjects rather than the heart of the science of psychology. As instructors, we should be determined to have our students become better consumers of research/statistics and be more aware of what different research designs can (and cannot) tell us. This is especially important given that about 75% of students do not go on to graduate school (Lewandowski et al., 2017), and therefore must learn these skills as undergraduates. For instance, some of these discussion questions speak to applied issues (e.g., “How might we try to decrease bias against older individuals?”). Being psychologically literate will help all of our students become better citizens and better able to know what questions to ask when confronted with data.

    Our project is available at https://sites.google.com/beloit.edu/plums/home. We hope you find it useful.

    References

    Norcross, J. C., Hailstorks, R., Aiken, L. S., Pfund, R. A., Stamm, K. E., & Christidis, P. (2016). Undergraduate study in psychology: Curriculum and assessment. American Psychologist, 71(2), 89-101. https://doi.org/10.1037/a0040095

    Lewandowski, G.W., Ciarocco, N.J., & Strohmetz, D.B. (2017). Chapter 23: Research Methods 2.0: A New Approach for Today’s Students. In R. Obeid, A. Schartz, C. Shane-Simpson, & P.J. Brooks (Eds.) How We Teach Now: The GSTA Guide to Student-Centered Teaching. Retrieved from the Society for Teaching of Psychology web site: https://teachpsych.org/ebooks/howweteachnow

    Milinki, A. (2000, 2006). A Cross Section of Psychological Research: Journal Articles for Discussion and Evaluation. Pyrczak Publishing.

    Sizemore, O. J., & Lewandowski, G. W. (2011). Lesson learned: Using clinical examples for teaching research methods. Psychology Learning & Teaching, 10(1), 25-31. https://doi.org/10.2304/plat.2011.10.1.25


  • 16 Jul 2024 5:31 PM | Anonymous member (Administrator)

    Lisa Dierker
    Wesleyan University


    My Story

    I was still in my 20s when I arrived at Wesleyan University, fresh off a 3-year post-doctoral fellowship at the Yale School of Medicine. When asked to teach a research methods course, I had what felt like a brilliant idea driving home from the grocery store one day. I would not use a textbook and I would not deliver lectures. My own classroom training had been ineffective and uninspiring. As I tell my students, I learned 20 different kinds of post hoc tests but didn’t understand when or why to actually use one. So, instead of drowning my own students in information the way I had been drowned, I decided to get them involved with large, real-world data sets and support them in conducting original research. I would teach them what they needed to know when they needed to know it and not before. Their own questions would drive the learning and I would help them to experience the research process from start to finish. Passion-Driven Statistics was born!

    Ten years later, it would become a multidisciplinary introductory statistics course at Wesleyan and a National Science Foundation funded model serving thousands of students across disciplines and educational environments in the United States and Internationally (e.g., Canada, Ghana, Nigeria, Philippines, Peru, United Kingdom, and still reaching). Passion-Driven Statistics is now a widely used project-based curriculum that has been implemented as a statistics course, a research methods course, a data science course, a capstone experience, and a summer research boot camp. Liberal arts colleges, large state universities, regional colleges and universities, medical schools, community colleges, and high schools have all successfully implemented the model.

    The curriculum has been found to attract higher rates of under-represented minority (URM) students compared to a traditional statistics course, and students enrolled in Passion-Driven Statistics are more likely to report increased confidence in working with data and increased interest in pursuing advanced statistics coursework (Dierker et al., 2018). This project-based approach also promotes further training in statistics. Using causal inference techniques to achieve matched comparisons across three different statistics courses, students originally enrolled in Passion-Driven Statistics were significantly more likely to take at least one additional undergraduate course focused on statistical concepts, applied data analysis, and/or the use of statistical software compared to students taking either an activity-based psychology statistics course or a math statistics course (Nazzaro et al., 2020). In more recent research, Passion-Driven Statistics has been found to be associated post-graduation with a higher likelihood of holding a job in which a primary responsibility includes working with data, greater confidence in working with data, and a higher likelihood of earning more than $100K annually (Dierker et al., in press).

    A New Role

    I always thought that I understood the ingredients that make Passion-Driven Statistics so empowering, and if asked, I would have told you about the opportunity to ask your own research questions, or I would have pointed to its just-in-time and need-to-know approach to content knowledge, or even its focus on technical skills in the service of disciplinary content and critical thinking. This year, I stepped back in to teach the course after several years away from it. Seeing it with fresh eyes more than 20 years after that first spark of inspiration made me realize that so much of its power comes from the simple act of new learners teaching newer learners.

    I used to be the “new learner,” understanding exactly what it felt like to encounter and struggle with the abstract concepts, disciplinary jargon, mathematical complexity, and arcane programming syntax involved in authentic research. Two decades later, I find that my role in the course has changed. I am no longer a new learner, and as much as I try to recreate that space and those feelings in myself, the “curse of knowledge” and my hard-won expertise hold me back. Now, I am recognizing an entirely new role in supporting those former Passion-Driven Statistics students who have generously stepped in as peer mentors, warmly guiding our newest generation of students in the same empowering way that I once did. They are now the new learners teaching our newer learners from a place of empathy, passion, patience, high expectations, and mutual support. Every day in class, I see them using their new learners’ superpowers to inspire others, to explain concepts by getting to the simpler, more digestible parts faster, and to understand students’ perspectives in a deeply genuine way. I have loved watching them hone their listening skills, adapt to the needs of individual students, and nurture them in the ways that meaningfully shaped their own educational trajectories when they were the newer learners.

    Working this semester with some of the current peer mentors, Joyce Sun, Erin Byrne, and Luis Perez, has reminded me that Passion-Driven Statistics is as much a culture as it is a course. It is a space where no one needs to know everything, where we can all bring our best stuff, and where moral support and compassionate engagement allow our students to become the heroes of their own learning. Together, we take students out of their comfort zone and then love them through the fallout by creating an inviting classroom and an experience that gives students a safe and supportive space to get things wrong before they get them right.

    While my role as expert in this space may continue to be necessary and even valuable on rare occasions, it is also wholly insufficient. It is only together with new learners, our newer learners, and expert voices that we hold the necessary and sufficient ingredients to change lives in the data analytics space. I know, it sounds rather dramatic, but it is! 

    And if that were not enough, I am also marveling at the chorus that I have continued to hear from peer mentors across the years: that they “learn more as a peer mentor” than they did when taking the Passion-Driven Statistics course for the first time. Though secondary and post-secondary education continues to resist the power of learning through teaching, it is the most untapped, cost-positive tool that we currently have as educators. I believe that it is stronger even than the current promises of AI. Peer mentors may serve as volunteers, be paid through student work programs or training grants, or receive course credit as teaching assistants or through course designations (e.g., statistics education practicum). It does not have to be a promise for the future. We have everything that we need right now.

    The Next Step

    You might be interested to learn that my time away from teaching Passion-Driven Statistics has been spent designing a new project-based curriculum aimed at reimagining General Education. The goal of this new initiative is to expose students to a wide range of digital skills as they learn traditional disciplinary content. Within our digital “Introduction to Psychology” course, students explore concepts and content in the field of psychology through video storytelling, programming, data visualization, web development, design and more. This novel curriculum is aimed at solidifying new content knowledge, exposing students to modern digital tools, and providing them with the opportunity to create new learning artifacts.

    And with this, I have found myself a new learner again, not just conquering new content outside of my research subdiscipline, but learning new tools, new skills, new design principles and being useful again the way only a new learner can be. All this newness is of course accompanied by uncertainty, vulnerability, and the distinct possibility of utter failure. It is hard and that is what I love about it. I find myself feeling inspired again and eager to bound out of bed in the morning to face new challenges and to find the transformative experience that I first found in the Passion-Driven Statistics classroom all those years ago.

    I am always eager to network with passionate instructors excited about things we have not even imagined yet. Please feel free to reach out at ldierker@wesleyan.edu.

    Resources for Passion-Driven Statistics are available at https://passiondrivenstatistics.com/. Some that you might find particularly useful include a free e-book and translation code aimed at supporting the use of diverse statistical software. Resources for Digital Intro are available at https://digitalintro.wescreates.wesleyan.edu/. I encourage you to take advantage of our introductory psychology lessons and project videos on our YouTube channel. I am also happy to share a new project-sharing platform, OpenLab, where students can get inspired, post learning artifacts, and share their work and learning by creating a free digital portfolio. Follow us on Instagram or check us out on LinkedIn to learn more!

  • 15 Apr 2024 6:42 PM | Anonymous member (Administrator)

    Rachel T. Walker
    University of the Incarnate Word
    Click here for a link to the article with figures

    When I was an undergraduate student in biology, I decided to take a statistics course in psychology. I didn’t realize at that time that I would later be teaching this course in graduate school, and I couldn’t imagine that statistics would become one of my favorite courses to teach. Statistics can be a challenging subject for many students, but effective teaching can make a significant difference in how it is perceived (Pan & Tang, 2004). Over the years I have taught this course using a variety of teaching strategies, depending on how my department structured it. As I continued teaching it, I wanted the course to be flexible and responsive to students’ needs and to offer an active and effective learning experience in behavioral statistics.

    Over the years, I have continued to ask questions related to effective teaching strategies. What if I embedded videos or journal articles related to the real-world application of statistics? How could formative assessments such as quizzes, discussions, and polls during the course gauge students’ understanding? How can I use hands-on applications to illustrate the concepts of the material? Can I combine traditional lectures with interactive elements? Could I use a technology integration like SPSS (a software package used for the analysis of statistical data) to provide a hands-on project? How can I use scaffolded learning to break down complex statistical concepts? I will share some of the ways I have addressed these questions.

    What if I embedded videos and research related to the real-world application of statistics?

    I embed videos and research materials using a mixture of resources. Here are several examples of how I use short videos within the lecture. I show the videos during class, but students can also access them outside of class to confirm their understanding of the material.

    I incorporate various Crash Course Statistics videos into the semester, offering detailed examples that illustrate the practical applications of specific statistical concepts in our daily lives. Before the start of the semester, I reach out to students and share a Crash Course Statistics video that explains the purpose of statistics; for example, how meteorologists use statistical methods to analyze historical weather data, identify patterns, and make predictions about future weather conditions, and how companies use statistics to analyze consumer behavior, preferences, and trends. I incorporate additional Crash Course Statistics videos to provide a preview of specific statistical concepts, such as central tendency, before diving into the lecture content. For instance, before the central tendency lecture, I share a video that provides an overview of how these statistics can determine the center of both normal and skewed distributions.

    Crash Course Statistics Preview

    https://youtu.be/zouPoc49xbk?si=bBGlQy3SviHhirAH

    Mean, Median, and Mode: Measures of Central Tendency: Crash Course Statistics #3

    https://youtu.be/kn83BA7cRNM?si=arSRn7zQJddDpOhj

    In one of the classes, I cover the four levels of measurement along with fundamental definitions and a few examples. Following that, I present a brief video offering visual insights into the distinctions among the measurement scales.

    Data Science & Statistics: Levels of measurement

    https://youtu.be/eghn__C7JLQ?si=mOoqzh-k-adUtNz6

    Another instance of using a short video involves bar graphs. I first teach students how to use the X- and Y-axes to depict data. Students acquire the skills to construct histograms and bar charts and to interpret their representations. Once they grasp the fundamentals of bar graphs, I introduce a video that provides real-world instances of commonly shared misleading graphs.

    How to spot a misleading graph

    https://youtu.be/E91bGT9BjYk?si=4Rn8keUpH5yGpVC2

    In addition to videos, I also distribute sections of a journal article, giving students the chance to practice reading and interpreting the results section. I first provide students with the abstract of the article to offer a brief overview, highlighting the main objectives, methods, results, and conclusions of the work. I then share the results section to show how the statistic covered in that lecture is reported. This is usually the first time that students are introduced to reading the results of a scientific article in psychology. This process helps students understand how statistics are reported in a journal article and the use of APA format. In other psychology courses, students will be required to summarize scientific articles and understand the methods and analyses.

    How could formative assessments such as quizzes, discussions, and polls during the course gauge student understanding?

    Quick quizzes are embedded throughout the lecture to test student understanding after each small section of content. These questions, taken from Cengage’s instructor materials for the textbook (Gravetter et al., Essentials of Statistics for the Behavioral Sciences, 10th ed.), can be multiple choice, true or false, or applied research questions. This process allows students to confirm they understand the course material before we move forward in the chapter.

    I incorporate discussion group assignments in the course to encourage active engagement among students. Throughout the semester, I offer six discussion board opportunities, where students submit their discussion posts and respond to posts from their peers.

    Here are several examples of sources that could be used for creating discussion group assignments:

    1) A majority of Americans have heard of ChatGPT, but few have tried it themselves. Integrate the information from the tables into your overall understanding of the material.

    https://www.pewresearch.org/short-reads/2023/05/24/a-majority-of-americans-have-heard-of-chatgpt-but-few-have-tried-it-themselves/

    2) How to defend yourself against misleading statistics in the news.

    Integrate the information in the video in your overall understanding of misleading statistics.

    https://youtu.be/mJ63-bQc9Xg?si=CqIubt8xxzHLFtx8

    3) Correlating Barriers to Medication Adherence With Trait Anxiety, Social Stigma, and Peer Support in College Students With Chronic Illness

    Indicate how the information from the tables and results section fits into your overall understanding of the material.

    https://www.psichi.org/page/273JNFall2022#.Y8R15hXMK3A

    Directions for responses: Make sure your responses are well thought out and provide at least three sentences for each section. Respond to each of the following questions: Describe the topic covered by this resource. What did you find interesting? How does this relate to the real world? What did you find challenging to understand?

    Directions for replies: Replies to colleagues should be at least three sentences as well. Reply to another student's post; replies can include your thoughts about the student's perception of the source or your additional thoughts on the topic related to the source.

    Moreover, I employ Poll Everywhere in various ways within a lecture. For example, at the beginning of a lecture on descriptive statistics, students are asked, “What type of social media is used the most in the U.S.?” Once students submit their answers, I show them the data for adults, which typically does not match most of their responses. I then provide data on teens' social media use, which is closer to their answers. After the discussion, I lecture on descriptive statistics.

    Here are the links I shared from Pew Research Center; we discussed the changes over time.

    https://www.pewresearch.org/internet/2021/04/07/social-media-use-in-2021/

    https://www.pewresearch.org/journalism/fact-sheet/social-media-and-news-fact-sheet/

    https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/

    https://www.pewresearch.org/internet/2023/12/11/teens-social-media-and-technology-2023/

    I also use Poll Everywhere toward the end of a lecture to ensure that students understand the content. For example: rate your level of understanding of how to calculate an independent t-test. If students respond that they are struggling, this provides useful feedback, and students can ask specific questions about their difficulty.

    How can I use hands-on applications to illustrate the concepts of the material?

    Here is an example of how I utilize hands-on applications in class. First, I teach students how to read and understand a research scenario, determining information such as the alternative hypothesis, the alpha level, and the variables provided.

    During a lecture, I present how to use that information in the 4-step process for hypothesis testing.

    1. State null and alternative hypotheses.

    2. Identify the critical region based on alpha level, one or two-tailed hypothesis, and degrees of freedom.

    3. Compute the test statistic, showing all calculations.

    4. Draw the distribution, marking the critical region and the test statistic. Conclude and report the findings in APA format.

    After students take notes on this process, I provide them with another research scenario to solve during class. While students are working through the 4-step process, I assist them along the way. For example, if a student wants to know whether they are on the correct path, they might ask if their critical region is correct. If the student is incorrect, instead of saying no, I ask them a question: I ask them to show me how they arrived at that conclusion. This process allows the student to find the correct answer in most cases. Students can then proceed to complete homework questions using the hands-on applications introduced in class. In many real-world scenarios, the use of statistical software has become standard due to its efficiency, accuracy, and ability to handle large datasets. Manual calculation, however, can be more effective in conveying the step-by-step process, contributing to better conceptual understanding. And in some work situations with small datasets, manual calculations can be quicker than setting up statistical software.
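As a sketch of how the four steps play out, here is a hypothetical single-sample scenario worked in Python. The numbers are my own invention for illustration, not a scenario from the course:

```python
# Hypothetical scenario (invented numbers): a sample of n = 16 students
# scores M = 53 on a test with known population mean mu = 50 and sample
# standard deviation s = 8. Two-tailed test at alpha = .05, following the
# essay's four steps.
import math

n, M, mu, s = 16, 53.0, 50.0, 8.0

# Step 1: H0: mu = 50; H1: mu != 50 (two-tailed).
# Step 2: df = n - 1 = 15; the critical t for alpha = .05 two-tailed is +/- 2.131.
df = n - 1
t_critical = 2.131

# Step 3: compute the test statistic.
se = s / math.sqrt(n)        # estimated standard error = 8 / 4 = 2.0
t = (M - mu) / se            # t = 3 / 2 = 1.50

# Step 4: compare to the critical region and conclude in APA style.
reject = abs(t) > t_critical
decision = "reject H0" if reject else "fail to reject H0"
print(f"t({df}) = {t:.2f}; {decision}")
```

Here the obtained t of 1.50 falls short of the 2.131 cutoff, so the student would conclude that the difference is not statistically significant, t(15) = 1.50, p > .05.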

    Can I combine traditional lectures with interactive elements? Could I use a technology integration like SPSS to provide a hands-on project?

    How can I alter my previous teaching of behavioral statistics? I did something I thought I would never do: I removed some of the content to give students the opportunity to learn how to analyze, interpret, and summarize their results by integrating technology. I use SPSS, but other software, such as Microsoft Excel, can be used. I've excluded lectures covering paired t-tests, two-way ANOVA, and regression. While these statistics are referenced in a lecture, students won't receive in-depth information about these subjects. Our department offers an elective course in Advanced Statistics, providing students with the opportunity to explore more intricate statistical concepts. In addition, this change allowed me to use those class times to embed a lab component into the lectures.

    I provide students with a preexisting dataset that I collected earlier, which they use in the lab component of the class. This data is employed for descriptive statistics, independent t-tests, one-way ANOVA, and correlations. I familiarize students with the broader subject of the research they will be examining, which involves personality and social networking. Subsequently, I clarify the variables and their measurements, such as gregariousness and the frequency of social media usage. In the lab, I guide them through the SPSS layout to enhance their understanding of the software's functionality and then provide a lab for each of the four types of statistics that will be analyzed in SPSS. For example, after I teach the independent t-test, I hold a lab focused on how to calculate the independent t-test in SPSS, how to interpret the output, and how to write up the findings in APA format. I provide handouts for the lab that include an introduction, the steps to complete in SPSS, an example of the output, and a paragraph reporting the findings of the example. As I explain this process, students follow along by mimicking my steps. Subsequently, I task students with forming hypotheses derived from the measured variables. In the assignment, students are required to generate two hypotheses. I review each hypothesis before examining the analysis of the first one in the lab. Afterward, I provide feedback on the results of each student's first hypothesis before the conclusion of the lab session. Throughout the lab, I employ the Socratic method to facilitate learning and guide students in completing the assignment related to the second hypothesis outside of class.
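The labs use SPSS, but the computation behind the independent t-test lab can be sketched in a few lines of Python. The data and group labels below are invented for illustration; they are not the author's personality-and-social-networking dataset:

```python
# Invented data: daily social-media minutes for two hypothetical groups,
# illustrating the pooled-variance independent-samples t-test that SPSS runs.
import math

group_a = [30, 45, 40, 55, 50, 60]   # e.g., higher-gregariousness students
group_b = [20, 25, 35, 30, 15, 25]   # e.g., lower-gregariousness students

def independent_t(x, y):
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    ssx = sum((v - mx) ** 2 for v in x)        # sum of squares, group x
    ssy = sum((v - my) ** 2 for v in y)        # sum of squares, group y
    pooled_var = (ssx + ssy) / (nx + ny - 2)   # pooled variance estimate
    se = math.sqrt(pooled_var * (1 / nx + 1 / ny))
    return (mx - my) / se, nx + ny - 2         # t statistic and df

t, df = independent_t(group_a, group_b)
print(f"t({df}) = {t:.2f}")   # reported in APA style alongside means and SDs
```

This is the same pooled-variance statistic SPSS reports in the "equal variances assumed" row of its Independent Samples Test output; the student's write-up then takes the form t(10) = 4.11 with the associated p value.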

    How can I use scaffolded learning to break down complex statistical concepts?

    Teaching a course using scaffolding involves providing structured support to students as they learn new concepts, gradually removing this support as they gain mastery. Here's my step-by-step guide on how I implement scaffolding in a course:

    1) Assess prior knowledge: I use Poll Everywhere at the beginning of a lecture.

    2) Break down the information: I define terms, provide steps for analysis, and utilize quizzes.

    3) Provide guidance: I allow students individual practice in and out of the classroom.

    4) Encourage collaboration: I embed collaboration with the instructor and other students.

    5) Continuous assessment: I assess in-class calculations, Poll Everywhere responses, and quizzes.

    6) Gradual release of responsibility: I utilize the Socratic method in the lecture and lab.

    7) Applications to real-world tasks: I offer discussions on real-world situations and provide students with the opportunity to analyze, interpret, and report on existing data.

    8) Flexibility: I adapt my approach based on the level of support students need.

    It is essential to teach statistics according to students' needs and to foster an active and effective learning experience. This includes using active learning methods, such as hands-on activities and engaging discussions, to keep students motivated and involved in the learning process. Additionally, presenting practical situations, connecting statistical concepts to real-world scenarios, and equipping students with proactive skills and problem-solving abilities are key objectives of this approach.

    In summary, teaching statistics in a way that addresses students' needs and incorporates active learning methodologies enhances the overall learning experience, making the subject more accessible, engaging, and applicable to students' academic and professional pursuits.

    I consistently adapt and modify the course design in response to student feedback and through collaboration with fellow instructors. This ongoing process makes teaching this course a continuous and rewarding experience. This story never ends… which is why this course remains one of my favorites.

    Reference

    Pan, W., & Tang, M. (2005). Students' perceptions on factors of statistics anxiety and instructional strategies. Journal of Instructional Psychology, 32(3), 205.


  • 20 Mar 2024 2:15 PM | Anonymous member (Administrator)

    Mona Corinna Griesberg
    FernUniversität in Hagen, Germany

    During my psychology bachelor’s program in Germany, classes were mostly teacher-centered lectures that allowed little student engagement. Fortunately, psychology classes at a small liberal arts college in Michigan, USA, introduced me to less hierarchical, feminist teaching formats. At the beginning of her feminist psychology course, Dr. Karyn Boatwright shared her feminist teaching philosophy, which aims to create collaborative learning communities (Enns et al., 2005; Sinacore & Boatwright, 2005). She gave me the opportunity to facilitate a social action project focused on sexism research. In the following, I describe the project and reflect on its benefits and challenges. By sharing my experience as a teaching assistant, I hope to encourage fellow educators to create more wholesome learning opportunities for students to gain research and feminist leadership experience.

    Feminist Leadership & Project Goals

    Dr. Karyn Boatwright teaches Feminist Psychology of Women every winter term. Within ten weeks, two classes of circa twenty students each meet three times a week to discuss feminist issues from a psychological perspective. An integral part of the course is students’ participation in social action projects, which make up thirty percent of their grade. At the beginning of the term, students received a list of possible social action projects and informed Dr. Boatwright about their preferences. For example, projects included creating a more inclusive environment at the campus gym, a theatre performance on reproductive rights, and feminist peer-support groups for students. Each social action group consisted of five to ten students and met outside of class to work on a social justice issue. The projects aimed to promote community engagement, political activism, and long-term social change. In the following, I describe the social action group that I facilitated as a teaching assistant: a collaborative research group focusing on ambivalent sexism. As the facilitator, I endeavored to follow my professor’s example and apply feminist leadership principles. That meant creating collaborative learning spaces that were, contrary to my former and many others’ experiences in higher education, based on relationship building and that welcomed expressions of emotion, intuition, and vulnerability. I wanted to minimize hierarchies between the students and myself while allowing us to build trust and connection. Furthermore, the goal was to empower the students, to foster awareness of the needs of marginalized communities, and to increase understanding of how feminist research can contribute to social justice and improved well-being.

    Academia and research have long been less accessible to people with marginalized identities (DeBlaere, 2020). I hoped the project would help students feel less intimidated by research, connect research to personal experiences, and build confidence knowing that they have much to contribute to academic spaces. However, the project was not directly aimed at encouraging students to stay in academia or build a career in research. Instead, it offered opportunities for students to build their curiosity by finding and exploring topics about which they cared deeply and wanted to learn more. I wanted to open their minds to the different ways in which research can be conducted, so they have a broader view of what research and academia can look like, can contribute to change within the fields, and can make more informed career decisions.

    Project Activities

    Since the project was carried out during the COVID-19 pandemic, it was limited to synchronous and asynchronous online learning. Our weekly online meetings gave the project the necessary structure. In preparation for these meetings, students did their individual literature review on ambivalent sexism research (Glick & Fiske, 2018). I recommended that they invest up to two hours per week in their research. In an online drive, students could view recommended scientific research papers and other materials like podcast episodes. Moreover, I encouraged students to look for alternative materials to engage with the project topic in a way that suited their interests. If they found interesting alternative materials, they added them to our online drive.

    During our weekly meetings, we discussed their individual research of the past week. I invited students to pose questions and to share their newly gained knowledge. For example, if someone didn’t understand the statistics of a research paper they had read, we looked at it again together. We also shared observations from our daily lives that aligned with the research content we had learnt about. Towards the end of the term, we used our group meetings to plan our final online event. Even though students’ individual research was the basis for their learning, the regular meetings created a communal learning experience.

    In addition to our weekly project meeting, I organized online meetings with international researchers who worked on topics related to ambivalent sexism. Students attended on a voluntary basis and asked questions about the research and academic life. It was also an opportunity to practice networking, an important skill for career development. 

    Furthermore, I met students one-on-one online, at least twice throughout the semester. The first time was a chance to get to know each other better and discuss expectations for the research project and their first impressions. Towards the end of the project, we met again to reflect on the overall learning experience, to exchange feedback and discuss their grading.

    At the end of the term, our project group hosted a public, online event to which all students of Feminist Psychology and other interested community members were invited. At the event, the project members presented what they had learnt about ambivalent sexism and engaged the audience in a discussion. It was an opportunity to share their new knowledge, to practice presenting research in an appealing way and to get feedback from other community members about how the scientific concepts related to their personal experiences.

    Grading

    The grading was based on their attendance at the group meetings, their independent research, and their participation in the final online event about our project. At the beginning of the project, I explained to the students that they would keep track of their engagement themselves. At the end of the term, I met with each student; we reflected on the group project, and they told me how they graded their own engagement. Unless it was very different from my impression, the grade was set and contributed thirty percent to their final grade in the whole Feminist Psychology course.

    Most students had been very engaged in our group project and graded themselves accordingly. There was one student whose engagement seemed low to me at the beginning of the project. After the first few weeks, we met one-on-one and discussed how she could improve her learning experience and contribute more to our group learning. If educators get the impression that a student isn’t engaged, and capacities allow, I would advise them to do the same: get in touch with the student, try to find out how they feel about their current engagement, and, depending on their interest, reflect on how they may engage more.

    Learning Through Group Facilitation

    For me as a teaching assistant, the project was a fantastic opportunity to practice feminist group leadership, project development and social justice research. I engaged in networking and community building and learned about student supervision, skills that proved useful in graduate school and my further career. Besides, it was simply fun to get to know the students and to learn from them in challenging discussions. For example, we talked about how our families’ dynamics had contributed to us internalizing gender roles. We also reflected on the image of research and how it had influenced our own academic aspirations.

    In parallel to the social action project, I was working on my bachelor’s thesis about ambivalent sexism and the engagement with my peers helped my motivation and creativity. Furthermore, the group facilitation and collaboration with Dr. Karyn Boatwright enhanced my passion for researching and teaching psychology and combining these with social justice work. The positive student feedback also encouraged her to continue working with me.

    Towards More Wholesome Learning Experiences

    The success of the ambivalent sexism research project led to another social action project that I facilitated in 2023. This time, seven students took part, and the project centered on lesbian, gay, bisexual, transgender, queer and related (LGBTQ+) research. Because the pandemic restrictions had been loosened, the project was conducted on campus, creating new opportunities: in addition to independent research, weekly group meetings and one-on-one meetings, I invited students to engage in further learning activities. For example, some students and I attended external events together: a game night for community building at a local non-governmental organization and a presentation of recent LGBTQ+ research at a neighboring university. We also met for a crafting evening on campus where a queer artist joined us online to teach us how to craft queer zines. A few of us met for an intimate self-care morning including meditation and talking circles, which created special bonding moments between the students. These activities were all voluntary, and there was no grading penalty if students did not join.

    Lastly, I want to highlight a memorable project event: through an LGBTQ+ organization, we got in touch with bi- and pansexual women from the local community and invited them to join one of the feminist psychology classes. Four women agreed to meet and share their experiences and perspectives with the class. We met in a room on campus that allows comfortable seating in a circle next to a fireplace. Providing hot chocolate, tea, and snacks contributed to the rather informal and comfortable setting. The students in our project group had prepared questions about the women’s identities, wishes, experiences of discrimination, and coping strategies, and facilitated the class sessions. The speakers shared many personal insights, for example, about religious communities and female sexual empowerment. Their openness allowed for an empowering experience for all attendees. Many students explicitly mentioned the positive impact of the event in their course reflections.

    Compared to the ambivalent sexism research project, the LGBTQ+ research project took a more holistic learning approach by including independent research, group meetings focusing on scientific research as well as artistic expression and self-care, one-on-one meetings, online meetings with international researchers, an in-class meeting with local community members, and further voluntary events off campus. These voluntary activities were offered so that students could explore their academic and personal interests, find inspiration, experience a sense of community, gain confidence in scientific discussions and in their research skills, build connections to international researchers, and more. It was the students’ responsibility to decide how, and how much, they could and wanted to invest in the project and in the group. This freedom was essential to avoid emotional overload and to create positive experiences in research engagement.

    Future Directions and Considerations

    In other educational settings, similar projects may require different degrees of structure and flexibility. We implemented the group projects in Feminist Psychology of Women at a small liberal arts college in the Midwest, with a relatively small student body and small cohorts and classes. Fewer than ten students participated in each social action project. Other lecturers and teaching assistants may not have the resources to invest this much time and effort in a few students. However, I would like to encourage lecturers in higher education to acknowledge the resources that their teaching assistants, advanced students, and interns bring to the table. Sharing teaching and leadership responsibilities can not only ease the lecturer's work but also create new learning opportunities for the group facilitators. Each social action group in Feminist Psychology was facilitated by a teaching assistant. We had the responsibility of overseeing the progress of our project group. The lecturer, Dr. Karyn Boatwright, met with us teaching assistants weekly to talk about how the projects were going and whether we needed any additional support.

    With the support of Dr. Karyn Boatwright and few bureaucratic barriers, I had much freedom in developing and facilitating the projects. This may be different at other higher education institutions where, for example, curricula and grading guidelines may be stricter. Other challenges may be insurance and safety when attending external events with students or inviting external guests on campus. Before the project, lecturers and teaching assistants should make sure they know the risks and limits of working on their project off campus. They should be mindful that working on social justice issues might come with different risks for different students, who may need assistance in navigating those risks. For example, being associated with LGBTQ+ topics might be dangerous in communities that hold strong anti-LGBTQ+ attitudes. Therefore, the local and academic environment should be considered when selecting the research topic and activities.

    Teaching assistants should consider students’ multifaceted identities and positions. Students’ prior experiences and knowledge can differ, as can their expectations of and needs for the project. The group's composition will lead to a particular power dynamic, and certain intersecting identities will be underrepresented or less visible throughout the project. Thus, teaching assistants should consider how they can make space for those perspectives in the group discourse. For example, they can recommend research materials that address underrepresented identities and experiences. Encouraging students to bring in their own interests and alternative materials can also help diversify the learning content. Still, lecturers and teaching assistants should be aware that conflict mediation might be needed. Since students’ learning curves and their opinions on social justice issues differ, it is important to establish early on that the group should work towards a comfortable learning environment for everyone. To achieve this, it can help to collectively set some ground rules at the beginning of the project.

    I hope naming these possible challenges does not discourage educators from considering if and how they can apply my suggestions to their work. Every educational setting has its challenges and limits; therefore, each project and group will be different and require adaptation. Nonetheless, I see much potential in this approach of collaborative student research groups. To start small, educators might consider the following: Do they have the capacity to create multidimensional learning experiences for their students? Would the students be interested in project work on social justice topics? Are there possibilities for local or virtual community engagement? Can they share learning responsibilities with teaching assistants and students? How can they make research less intimidating and academia more accessible for a variety of students? Overall, how can they create collaborative learning spaces? I believe that answering these questions can move higher education towards more enjoyable and wholesome learning experiences for students and educators.

     

    References

    DeBlaere, C. (2020). Defining myself in: My early career journey. Women & Therapy, 43(1-2), 144-156. https://doi.org/10.1080/02703149.2019.1684672

    Glick, P., & Fiske, S. T. (2018). The ambivalent sexism inventory: Differentiating hostile and benevolent sexism. In Social Cognition (pp. 116-160). Routledge.

    Enns, C. Z., Sinacore, A. L., Acevedo, V., Akçali, Ö., Ali, S. R., Ancis, J. R., Anctil, T. M., Boatwright, K. J., Boyer, M. C., Byars-Winston, A. M., Fassinger, R. E., Forrest, L. M., Hensler-McGinnis, N. F., Larson, H. A., Nepomuceno, C. A., & Tao, K. W. (2005). Integrating multicultural and feminist pedagogies: Personal perspectives on positionality, challenges, and benefits. In C. Z. Enns & A. L. Sinacore (Eds.), Teaching and social justice: Integrating multicultural and feminist theories in the classroom (pp. 177–196). American Psychological Association. https://doi.org/10.1037/10929-011

    Sinacore, A. L., & Boatwright, K. J. (2005). The feminist classroom: Feminist strategies and student responses. In C. Z. Enns & A. L. Sinacore (Eds.), Teaching and social justice: Integrating multicultural and feminist theories in the classroom (pp. 109–124). American Psychological Association. https://doi.org/10.1037/10929-007

     

    Acknowledgements: I would like to thank Dr. Karyn Boatwright for her invaluable trust, support and supervision throughout the projects as well as her feedback on this essay. I would also like to thank the amazing students who participated in the projects and all researchers and community members who enabled the depth and variety of our learning.
  • 05 Feb 2024 7:47 PM | Anonymous member (Administrator)

    Brooke O. Breaux
    University of Louisiana at Lafayette

    My department’s Psychological Science course has two primary objectives: 1) for students to start building the underlying knowledge that they will need to become producers of psychology, and 2) for students to become more familiar with psychology as a major and a discipline. Psychological Science—designed for second-semester freshmen who have taken only an introductory psychology course—was integrated into my department’s 2020-2021 curriculum. Our intention was for Psychological Science to be taught as a traditional in-person course, but due to the precautions taken by my university in the midst of the COVID-19 pandemic, I taught this course first as a synchronous online course, then as a hyflex course with students deciding whether to attend class in person or online, and finally as a fully in-person course. Setting aside the complexities of teaching the course in formats different from the one we had in mind when developing it, Psychological Science itself is ambitious. At a minimum, students enrolled in this course are required to complete a pre-course and a post-course assessment, to take exams and/or quizzes, to construct an actionable plan for their professional development and career exploration, to earn a research ethics certification (i.e., Undergraduate Training on Human Subjects Research through the Collaborative Institutional Training Initiative (CITI)), to serve as participants in actual psychological research, and to write a brief APA Style research proposal. Faculty assigned to teach this course are required to cover topics ranging from psychology as a discipline—including degrees and careers in psychology—to psychology as a science—including research methods, research ethics, and APA Style writing.
When teaching this course for the first time, I made the incorrect assumption that if my goal was to have my students write quality research proposals, all I needed to do as an instructor was to provide them with the relevant research design concepts and a clear assignment rubric. What I learned that first semester was that such an approach was insufficient for many of my students and that they needed significantly more scaffolding to produce what I would consider to be a quality product.

    I have now taught this course six times and have dramatically changed the way in which I teach research methods. The approach I have developed is highly scaffolded, involving a sequence of three assignments. For each of the assignments, I have constructed explicit instructions, aligned the delivery of course topics with the assignment deadline, and eliminated unnecessary complexity; however, before diving into a more detailed discussion of my efforts to make the writing of an APA Style research proposal a much more integral part of the course, I thought it would be useful to discuss the development of our Psychological Science course, the integral role it plays in my department’s current curriculum, and our efforts to standardize certain elements of the course.

    Curricular-Level Enhancements: How Did We Get Here?

                When I was hired as a faculty member, undergraduate psychology majors did not take our Introduction to Psychology course. They took two courses designed for majors: one focused more on the basic science of psychology and the other focused more on the applied aspects of psychology. After several semesters of teaching the basic science half of this introductory course sequence, I advocated for a change in our curriculum. This change was supported by the faculty members teaching these introductory courses for majors, who agreed that our curriculum lacked a true research methods course, that we could do a better job of preparing students for our Psychological Statistics course, and that the order in which we introduced certain topics and assessed certain learning outcomes in our curriculum could be improved. To illustrate this last point, it is helpful to know that students enrolled in our basic science of psychology course for majors were typically freshmen taking the course during their first semester in college. This is the same semester in which the majority of students take their first general education English writing course, which requires them to write papers in MLA Style. Then, during the same semester, our basic science of psychology course for majors introduced students to the discipline of psychology; taught them about some of the major themes, concepts, and findings related to basic science topics, such as biological psychology and cognitive psychology; and required them to write an APA Style literature review. It is no wonder, then, that many students found it difficult to be successful in this course. I knew that our department could provide students with a better introductory learning experience and that such a change could also serve to strengthen our curriculum.

    We settled on a change that would require our majors to take a Psychological Science course, but only after taking our Introduction to Psychology course. The decision to have all students take our Introduction to Psychology course was supported by documents such as “Strengthening the Common Core of the Introductory Psychology Course,” in which the American Psychological Association (2014) explains that there is no evidence in the literature to suggest that having two introductory psychology courses—one for majors and one for nonmajors—is needed. The decision to create a new course for majors was supported by Stoloff et al. (2010), who suggest that departments that want a more robust Introductory Psychology course for their majors can modify other requirements and sequencing: “For example, departments that want to provide more early experiences might be better served by creating another course, such as one that addresses research methods (Stoloff et al., 2010), career preparation (Atchley, Hooker, Kroska, & Gilmour, 2012; Brinthaupt, 2010; Thomas & McDaniel, 2004), preparation for the major (Atchley et al., 2012; Dillinger & Landrum, 2002), or writing in the major (Goddard, 2003)” (American Psychological Association [APA], 2014, p. 20). To this end, we determined that students would benefit from the creation of a required Psychological Science course designed to target these specific objectives.

    Psychological Science is a critical course in our curriculum, providing students with a solid foundation in research methods and serving as a prerequisite for our required Psychological Statistics course. Because of its foundational nature in our curriculum and because it would inevitably be taught by a variety of faculty members, we determined that a minimum standardization of the course would be necessary to ensure similar outcomes across all students. Included in our standardization of this course is the requirement for all students to complete a brief APA Style research proposal, consisting of an APA Style title page, introduction with APA Style citations, method section, and APA Style reference entries; however, what we did not specify was a means by which faculty are to achieve this objective. There are two faculty members who regularly teach Psychological Science, but other faculty members are assigned to teach this course as needed. Everyone who teaches Psychological Science is considered a member of our standardization committee. The role of this committee is to address any issues a faculty member has with the standardization and resolve these issues by updating or changing the standardization.

    Course-Level Enhancements: What Am I Doing?

    Teaching psychological research methods to undergraduates who have only had an introductory psychology course is challenging, and requiring undergraduate students to complete research proposals within such a course can be overwhelming for everyone involved, especially when the class is not small (i.e., around 45 students), does not include a laboratory component, and takes place during a 15-week semester. In the context of research methods courses, project-based learning experiences, such as writing a research proposal, are generally encouraged; however, because the assignments described in the literature tend to focus on more advanced students (e.g., Chamberlain, 1986), I used trial and error to develop an approach that enables students to more effectively and efficiently produce quality research proposals. Interestingly, my intuitions ended up aligning with strategies that have been advocated by other instructors, such as reducing unnecessary complexity, especially as it relates to research design (e.g., Yoder, 1979), and offering students the opportunity to practice producing quality writing (e.g., Ishak & Salter, 2017).

    My initial approach to teaching this course was to provide lectures on the relevant topics in the order that they appear in the textbook, expecting students to incorporate this information into their research proposal document. My students found this part of the process exceedingly difficult, and this strategy resulted in research proposals that did not meet my expectations; therefore, I created a three-stage (i.e., Introduction Section, Method Section, and Appendices), step-by-step process for developing a research proposal. The instructions for each section are contained within step-by-step documents that are made available to students on our learning management system. To reduce unnecessary complexity, I reordered the course topics so that the concepts read about in the textbook and discussed in class are directly relevant to the part of the research proposal that students are working on at the time, and students are explicitly told which step in the step-by-step documents the textbook readings and lecture materials are relevant to. I also created a grading form that aligns with the step-by-step document, which enables me to provide timely feedback at each stage.

    Anyone interested in how I have aligned lecture topics, APA course objectives, and development of an introduction section, method section, and appendices can access this information in the form of a poster I presented at the APS-STP 2023 Teaching Institute (Breaux, 2023; https://www.dropbox.com/scl/fi/m2mfy7dot5vtdthrswk36/2-APS-2023-Teaching-Poster-BREAUX.pdf?rlkey=fe7e9wnhg9ucm99pu5kffacrw&dl=0).  Actual resources that I used during the Spring 2023 semester, such as the step-by-step guidelines (e.g., “Introduction Section Instructions”) and grading forms (e.g., “Introduction Section Rubric”), can be found in the main folder I created for the APS-STP 2023 Teaching Institute (Breaux, 2023; https://www.dropbox.com/sh/0mc2al4pcshfa90/AABtl7wI32Q8UD8UIJgQ0PDha?dl=0). Readers are invited to use or modify the resources provided for educational purposes only.

    I have also made the literature review portion of the introduction more manageable by requiring students to cite only four empirical research articles. This approach allows students to focus on basic skills, such as integrating information from different sources and using APA Style citations appropriately. It also helps students avoid both accidental plagiarism (often due to insufficient paraphrasing skills) and intentional plagiarism (often due to issues with time management). Another change that I made was to have the whole class focus on the same topic. I always try to select a topic that psychology undergraduates could relate to on a personal level, such as the extent to which college students believe psychological myths (e.g., Hughes et al., 2015) and the extent to which college students engage in self-care (e.g., Zahniser et al., 2017). I have found that topics related to the teaching of psychology and social psychology tend to be more accessible to students at this stage in their academic careers and that topics related to biological psychology and cognitive psychology are the most difficult. Pre-selecting a topic for the semester affords two primary benefits: Students can start reading the empirical literature sooner, and I can address issues specific to the topic during class time. My current approach to teaching Psychological Science shares similarities with Passion Driven Statistics (https://passiondrivenstatistics.wescreates.wesleyan.edu/), which is a project-based approach to teaching statistics that focuses on providing students with only as much information as they need to successfully complete the current tasks they have been assigned.

    Conclusion

    These improvements have made teaching psychological research methods to undergraduates who have only had an introductory psychology course feel much more manageable. Even though my evidence is primarily anecdotal, students seem less intimidated by the research proposal process because they are more aware of my expectations and the ways in which I want them to utilize the course materials when working on their research proposal. I hope that my experience can inspire other faculty members not only to continue improving their own courses to meet the needs of students but also to advocate for broader curriculum changes in their own departments, and I hope that what I have learned along the way can be used by others to improve how we teach psychological research methods to undergraduates.

    References

    American Psychological Association. (2014). Strengthening the common core of the introductory psychology course. American Psychological Association, Board of Educational Affairs. https://www.apa.org/ed/governance/bea/intro-psychreport.pdf

    Breaux, B. O. (2023, May 23-24). Benefiting from explicit instruction, content alignment, and strategic simplification [Poster presentation]. APS-STP 2023 Teaching Institute, Washington, D.C., United States. https://www.dropbox.com/sh/0mc2al4pcshfa90/AABtl7wI32Q8UD8UIJgQ0PDha?dl=0

    Chamberlain, K. (1986). Teaching the practical research course. Teaching of Psychology, 13(4), 204-207. https://doi.org/10.1207/s15328023top1304_8 

    Hughes, S., Lyddy, F., Kaplan, R., Nichols, A. L., Miller, H., Saad, C. G., Dukes, K., & Lynch, A.-J. (2015). Highly prevalent but not always persistent: Undergraduate and graduate student’s misconceptions about psychology. Teaching of Psychology, 42(1), 34–42. https://doi.org/10.1177/0098628314562677

    Ishak, S., & Salter, N. P. (2017). Undergraduate psychological writing: A best practices guide and national survey. Teaching of Psychology, 44(1), 5–17. https://doi.org/10.1177/0098628316677491

    Stoloff, M., McCarthy, M., Keller, L., Varfolomeeva, V., Lynch, J., Makara, K., Simmons, S., & Smiley, W. (2010). The undergraduate psychology major: An examination of structure and sequence. Teaching of Psychology, 37(1), 4–15. https://doi.org/10.1080/00986280903426274

    Yoder, J. (1979). Teaching students to do research. Teaching of Psychology, 6(2), 85-88. https://doi.org/10.1207/s15328023top0602_7

    Zahniser, E., Rupert, P. A., & Dorociak, K. E. (2017). Self-care in clinical psychology graduate training. Training and Education in Professional Psychology, 11(4), 283–289. https://doi.org/10.1037/tep0000172

  • 27 Nov 2023 3:13 PM | Anonymous member (Administrator)

    Amanda W. Joyce
    Murray State University

    Psychological research methods can be a dreaded course for students and instructors alike.  Students report negative emotions about and negative perceptions of research, they struggle to see the relevance of research-related material, and they are concerned about the complexity of the research process, all of which can negatively impact their understanding of the course content (Balloo, 2019; Murtonen et al., 2008; Rancer et al., 2013).  Similarly, instructors broadly report concerns about student tardiness, dishonesty, inattention to material, and lack of preparation (Fazli et al., 2018; Lashley & de Meneses, 2001), which could be exacerbated in challenging courses like research methods. 

    Thus, innovative techniques are needed to improve student and instructor experiences in research methods.  Frequently, this innovation comes in the form of applied, active learning that is directly relevant to student experiences—characteristics which have long been touted as beneficial for student learning (Ball & Pelco, 2006; Etengoff, 2023).  In fact, a recent study drawing upon interviews of experienced research methods instructors heavily emphasized the benefits of allowing students to apply what they learned, particularly through hands-on research experiences (Lewthwaite & Nind, 2016).

    Involving students in hands-on research experiences, however, can present still more challenges.  Individual student projects can lead to a heavy grading burden for instructors, and partnered or group projects can be fraught with interpersonal complaints and social loafing.  The purpose of this essay is to explore an option for whole-class collaborative data collection that still allows students individually to propose, analyze, write about, and present data on a project of their own personal choosing.  The collaborative data collection process encourages accountability and teamwork.

    The Project

    Pedagogical Context

    At my university, psychological research methods and statistics are taught in a combined three-course sequence, with the third course focusing on hands-on data collection in what is generally the students’ first research project.  Enrollment for this third course is typically 15 students, all Psychology majors.  The learning objectives for this course require successfully navigating the research process (e.g., “Generate an original research question,” “Conduct a research study in accord with APA’s ethical principles,” etc.).  Thus, the learning objectives of the course, as well as the teaching technique I propose here, encourage students to navigate the research process, from idea generation to final presentation. 

    The Research Project: What Works for Me

                I have personally had great luck with an approach to teaching research methods that intermixes individual and group work while leading students through their first ever quantitative research project.  I have found it to increase individual accountability and teamwork while reducing many of the headaches associated with individual or paired data collection.  I provide a brief overview of the project below.  I am also happy to share course resources with interested readers.

    Students’ experience with hands-on active learning through research comes in the form of a semester-long research project that unfolds in three main phases: (1) individual idea generation, (2) group questionnaire and database creation, and (3) individual data analyses and presentation.

    Individual Idea Generation

                Students begin the semester by individually generating research questions.  Research shows that students have better learning experiences when they work on projects that are personally meaningful (Andresen et al., 2020), and I have found this to be true in my classes as well.  We spend several class periods discussing the contents of a strong research hypothesis that would be testable under the constraints of a semester-long project with data collected from students at their university.  For instance, we discuss how longitudinal hypotheses, or hypotheses about overly specific populations whom we are unlikely to recruit on campus (e.g., the elderly, or fraternity members who have been diagnosed with schizophrenia), would not be appropriate.  I also limit students to correlational (as opposed to experimental) research designs, which work best within our collaborative data collection process that emphasizes surveys as the primary data collection method.  During the first week of classes, students submit a list of five research questions that they are interested in exploring, which means that they are generating ideas before they have had the benefit of all class discussions on the topic, but generally one or two of their ideas are appropriate, and I am able to guide them toward those ideas. 

    Then (week 3) students submit a final research question for approval before they dive into their topic of interest.  A librarian visits the course to teach students about how to use library resources to find peer-reviewed journal articles on their topics of interest, and students use this information to find five or more articles (week 5), which they summarize and later synthesize into an introduction section for their research paper (week 7).  

    Group Questionnaire and Database Creation

                Students then gather measures relevant to their individual research hypotheses.  They often overlap with their peers in their topics of interest, meaning that there is overlap, too, in the measures that they may choose.  For instance, one student may be interested in anxiety and sleep quality, while another is interested in fraternity and sorority membership and anxiety, and yet another is interested in sleep quality and religiosity.  I encourage students with overlapping topics to work together to find common measures so as to reduce their burden in working with said measures, and I find that they are happy to take this opportunity for reduced workload.  When students happen to not have variables in common with their peers, I encourage them to use brief measures, such as short-form versions of scales rather than full scales, so as to reduce participant burden.

    Students submit their measures (week 6) and, after I have reviewed each of them, we spend a class period gathering each measure into a class-wide shared Google Doc that will later become the questionnaire packet that participants receive.  Combining the measures into a single document during class ensures that everyone has the ability to closely supervise the process and catch any potential errors, like missing items or typographical errors, particularly in overlapping measures, which several students are monitoring very closely.

    Throughout the semester, students learn about the ethical aspects of research, and they have been working through ethical certification (CITI Training).  Thus, as soon as measures are gathered, we are ready to submit our project, as a single application, to our institutional review board (IRB) for approval.  I submit the application on students’ behalf, but I include the measures and hypotheses that they have provided to me, and we spend one class discussing the contents and importance of the IRB application and process.

    In the one to two weeks (usually weeks 7 and 8) needed for IRB approval, the class prepares for data collection.  First, we learn about the data collection process and how to write about it.  Students learn departmental policies for data collection, including how to reserve rooms, how to use our participant management system (SONA), and more, and they write drafts of methods sections for their final paper. 

    When students begin collecting data (usually week 8 or 9), they host research sessions individually, but they collect on the full research packet that was approved by the IRB.  In other words, even though students collect data individually, they collect data relevant to everyone.  This means that they have the ability to share research materials, that they can cover each other’s research sessions in case of emergency, and that they feel a personal accountability to the group to do good research.  It also means that they can have a large sample size, typically 100 or more students drawn from our department’s research participant pool.  I emphasize throughout the semester how we are a team working toward a common goal, and I find that students will often organically support one another in ways that I haven’t anticipated, such as offering up suggestions about where to find free or cheap printing for research materials.

    Similarly, we crowdsource data management.  We spend several class periods building a shared class database in Google Sheets.  Students are responsible for creating a key for their individual measures so that everyone knows how data should be entered for all measures.  Again, in a combination of individual and group efforts, each student is responsible for entering all data that they collect, meaning that they are helping to support not only their own research interests but also their peers’.  This shared data entry strategy is another way in which I find students embracing the collaborative nature of this type of work—many will offer to cover data entry for another student when they know the other student is overwhelmed with their participant workload.
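    The shared key described above is essentially a specification of which values each column of the class database may contain. For readers who want a concrete picture, here is a minimal Python sketch (all measure names and response ranges are hypothetical, not taken from the course itself) of how such a key can flag out-of-range entries before they reach the shared spreadsheet:

```python
# A data-entry key: for each (hypothetical) measure column,
# the set of values a cell is allowed to contain.
entry_key = {
    "anxiety_1": range(1, 6),   # 5-point Likert item (1-5)
    "anxiety_2": range(1, 6),
    "greek_member": (0, 1),     # 0 = no, 1 = yes
}

def check_row(row):
    """Return the columns whose entered value falls outside the key."""
    return [col for col, allowed in entry_key.items()
            if row.get(col) not in allowed]

# One participant's entered data, with a typo in anxiety_2 (7 on a 1-5 item)
problems = check_row({"anxiety_1": 3, "anxiety_2": 7, "greek_member": 1})
print(problems)  # ['anxiety_2']
```

A check like this is optional, of course; the pedagogical point is simply that a well-specified key makes entry errors detectable no matter which student entered the data.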

    Individual Analyses and Presentation

    When students finish data collection (week 11 or 12), we can begin the data analysis process.  Students are reminded as a group how to run the most common analyses (calculating a scale score from Likert data, determining participant demographics, running a reliability analysis, correlations, and t-tests).  Then there are several in-class workdays during which students can practice these analyses on their own data.  Each student is responsible for analyzing data relevant to their own research hypothesis.  I float around the computer lab to provide support to students with questions, but as there is only one of me, they find additional support in their classmates.  Students often answer one another’s questions and double-check analyses.  This is easily the most rewarding part of the semester, hearing students teaching and encouraging one another, and cheering when they see statistically significant results.

    Following analyses, students are responsible for sharing their results in a final research paper.  They previously submitted a draft of an introduction (week 7) and method section (week 9).  The initial method draft was written at a time when they did not know their participants’ characteristics, so in that draft, they left placeholders for these numbers.  Thus, one of their first tasks after data analyses is to write a new draft of their methods section with these placeholders replaced with actual data.  They submit this alongside their results section (week 12) with a discussion section to follow roughly two weeks later.  While writing generally can’t be completed fully in class, students have several in-class writing days so that they can consult with the instructor and their peers when questions arise. 

    Students then learn about data presentation and create a draft poster to be submitted during the last week of class. Again, because students are working on individual research hypotheses, each of these paper and poster drafts are individual, but students have the benefit of receiving feedback from peers and the instructor on drafts at all stages, meaning that final projects are often in phenomenal shape.

    Students submit their finished products early during finals week, and then individually present their research to the class during the final examination period.  This is another very encouraging part of the semester, as students learn more about their peers’ projects and offer encouragement for their hard work.  Furthermore, because the work was approved by the IRB, students are in a very good position to later take their research projects to other venues, such as on-campus undergraduate research conferences and/or regional professional conferences, to share their findings with a broader audience.

    The Outcome

                The structure of the class research project, intermixing group and individual components, is admittedly sometimes overwhelming, particularly if an individual student must miss class frequently, as in the case of student athletes.  In those cases, the student’s lack of attendance has the potential to hinder everyone’s progress on the collaborative project, so a fair amount of instructor foresight and flexibility is necessary in order to accommodate those absences and ensure that the project can still move forward.  That said, I have found the collaboration to be worthwhile.  Grades, attendance, and course evaluations have increased since I began collaborative data collection, as have student accountability and teamwork.  As students move in and out of group and individual efforts, they see the ways in which their efforts impact themselves and others, and they embrace the process of working toward a common goal.

                More than that, students recognize the ways in which collaboration has allowed them to more effectively manage their time so that they are not duplicating efforts.  For instance, by pooling their data collection, they avoid saturating the research pool and have access to many more participants than they would if they had collected data individually.  Similarly, from the instructor perspective, students’ collaboration allows me to more efficiently work with them (for instance, allowing me to work with one IRB application instead of 15), so that I can free up time to provide more detailed feedback on drafts throughout the semester, which also benefits the students.

    Teamwork makes the dream work.  Gone are the days of spending countless office hours listening to students complain about how their research partner isn’t doing their fair share of the work.  Gone, too, are the days of trying to grade results sections based on data collected from 7 participants.  Instead, I see students working together and holding themselves to a high standard, and I see their efforts resulting in extraordinary outcomes.  I hope that others can find relief and excitement in a similar approach.

    References

    Andresen, L., Boud, D., & Cohen, R. (2020). Experience-based learning. In Understanding adult education and training (pp. 225-239). Routledge.

    Ball, C. T., & Pelco, L. E. (2006). Teaching research methods to undergraduate psychology students using an active cooperative learning approach. International Journal of Teaching and Learning in Higher Education, 17(2), 147-154.

    Balloo, K. (2019). Students’ difficulties during research methods training acting as potential barriers to their development of scientific thinking. In M. Murtonen & K. Balloo (Eds.), Redefining scientific thinking for higher education: Higher-order thinking, evidence-based reasoning and research skills (pp. 107-137). Palgrave Macmillan. https://doi.org/10.1007/978-3-030-24215-2_5

    Etengoff, C. (2023). Reframing psychological research methods courses as tools for social justice education. Teaching of Psychology, 50(2), 184-190. https://doi.org/10.1177/00986283221097404

    Fazli, A., Imani, E., & Abedini, S. (2018). Faculty members' experience of student ethical problems: A qualitative research with a phenomenological approach. Electronic Journal of General Medicine, 15(3). https://doi.org/10.29333/ejgm/84952

    Lashley, F. R., & de Meneses, M. (2001). Student civility in nursing programs: A national survey. Journal of Professional Nursing, 17(2), 81-86. https://doi.org/10.1053/jpnu.2001.22271

    Lewthwaite, S., & Nind, M. (2016). Teaching research methods in the social sciences: Expert perspectives on pedagogy and practice. British Journal of Educational Studies, 64(4), 413-430. https://doi.org/10.1080/00071005.2016.1197882

    Murtonen, M., Olkinuora, E., Tynjälä, P., & Lehtinen, E. (2008). “Do I need research skills in working life?”: University students’ motivation and difficulties in quantitative methods courses. Higher Education, 56, 599-612. https://doi.org/10.1007/s10734-008-9113-9

    Rancer, A. S., Durbin, J. M., & Lin, Y. (2013). Teaching communication research methods: Student perceptions of topic difficulty, topic understanding, and their relationship with math anxiety. Communication Research Reports, 30(3), 242-251. https://doi.org/10.1080/08824096.2013.806259


  • 19 Jul 2023 7:12 PM | Anonymous member (Administrator)

    Daniel A. Clark, Madelynn D. Shell, & Andria F. Schwegler
    Texas A&M University--Central Texas

    *Note: For the version with the figure included, please follow this link: https://www.dropbox.com/s/9fis1ey479l1vdl/6.%20June_Clark%20et%20al.docx?dl=0

    Learning about research and statistics may be a much-maligned element of any undergraduate psychology program from the perspective of students (Harlow et al., 2009), but it is also widely viewed as an important element in psychological literacy (APA, 2013). On the faculty side, teaching these courses is often cited as challenging due to the amount of material required (Ciarocco et al., 2017). Instead of both faculty and students suffering in silence while engaging in these courses, we decided to take steps to improve how we teach all of our research-oriented undergraduate courses with the goal of distributing some of the content in the research methods course across other courses leading up to it. This redistribution of the workload was intended to ensure that students have equitable preparation for research methods and that students leave the program with equivalent experiences.

    To start the process, full-time faculty in the undergraduate psychology program began meeting regularly to discuss the desired alignment across the research course sequence (i.e., writing in psychology, statistics, and research methods) and rewrite the course learning outcomes in a manner that captured what we were doing in our individual classes. As academics, we did not always agree on everything, but we were inspired by a desire to improve our teaching and our students’ learning to find common ground. Putting the students’ learning ahead of our own idiosyncratic preferences enabled us to listen to each other’s perspectives, consider multiple ways to achieve a goal, and make decisions based on research across our respective content areas to facilitate learning. Such collaboration acknowledges that each faculty member has the academic freedom to teach using the methodology that they feel is best, but it also recognizes that courses do not exist in a vacuum (for further discussion see Cain, 2014). Courses exist in the context of programs, which requires that faculty members come together at the program level to: 1) articulate the scope and quality of education we are providing to our students and 2) develop alignment across the curriculum so students acquire the same basic skills regardless of instructor, enabling them to graduate from the program with comparable knowledge and experiences. On a personal level, we were also seeking to reduce our own frustrations from teaching the research methods course with students who were not adequately prepared for it.

    Step 1: Start with the End in Mind

    We started by looking at the big picture: the skills necessary for students to be successful in the research methods course and in their psychology degree in general, rather than getting bogged down in individual course outcomes and descriptions. Consistent with previous research on teaching research methodology (Ciarocco et al., 2017; Gurung & Stoa, 2020), we found that our end goals for student performance in the course and in the program aligned quite well despite some differences in structure and content. For example, we agreed that we wanted our students to conduct IRB-approved human subjects research and collect real-world data, a high-impact practice (American Association of Colleges and Universities, 2013). The larger goal was for these research projects to provide grist for student conference presentations and graduate school applications. Our discussions about how to set students up for success led us to articulate fairly specific skills (see Figure 1) and to clarify some wording in the program learning outcomes. These specific skills fit our needs well, though others might find that broader, more general wording allows for individual variation between faculty.

    Figure 1. Skill alignment across three research-oriented courses

    Step 2: Backtrack to the Beginning

    Our program is housed in a regional, upper-level university that offers only junior- and senior-level courses in partnership with 2-year colleges. The undergraduate psychology degree includes three four-credit-hour research-oriented courses that students take in sequence: writing in psychology, statistics, and research methods. Research methods is a content-heavy class, particularly when students design original research and collect data as part of the course, so we decided to introduce some research methods skills in the prerequisite courses. For example, at many universities, learning APA style starts in introductory or general psychology courses (Fallahi et al., 2006; Gurung et al., 2016). Because our university does not offer introductory-level courses, we added these skills to the first course in the research sequence, writing in psychology. We also added basic research design to the writing in psychology course, as evidence suggests this can improve scientific reasoning among students at the introductory level (Becker-Blease et al., 2021). These skills prepare students to read research articles critically, not only in the writing in psychology course but across the curriculum.

    In addition to shifting skills to the beginning of the program, we moved some skills to the second course in the sequence, statistics, which students take prior to research methods. For example, students often enter research methods not knowing how to write statistical analyses in APA style, create online surveys, or clean and format data in a spreadsheet. These skills are essential to successfully completing the research project in research methods. Instead of waiting to introduce these skills in research methods, we modified the lab portion of the statistics course to include instruction in these areas. Thus, students come into research methods with an introduction to many of the basic skills they will use.

    Step 3: Ground the Plan in Learning Research

    These revisions have improved consistency and quality across our program because they are aligned with current knowledge about learning. In our discussions, we drew on years of research documenting learning effects that should be incorporated into education. We know that prior knowledge improves subsequent learning, likely by reducing cognitive load (Simonsmeier et al., 2021). Spacing and retrieval practice also enhance learning (Latimier et al., 2021). By introducing important skills in earlier courses, we make more effective use of these known mechanisms. For example, as can be seen in Figure 1, relevant aspects of APA style are revisited in all three of the research-oriented courses in the curriculum. Although research methods instructors still teach APA style, they now know these skills have been introduced in previous courses and can focus on transfer and application rather than on teaching a brand-new skill. The goal of this explicit attention to introduction/encoding, spacing, interleaving, and retrieval of information is to make subsequent learning in research methods easier and longer lasting for students.

    Step 4: Put It in Writing

    After the end skills and curriculum map were sketched out in the first three steps, it was time to put those changes in writing so we could communicate them clearly to our students. We expanded and rewrote the course learning outcomes and course descriptions so that they aligned directly with each of the program learning outcomes and reflected the scaffolded structure of the content students were expected to demonstrate. We also reviewed course prerequisites to ensure students were acquiring the material in the order we had designed; required prerequisites help ensure that students build up their prior knowledge in sequence (Lauer et al., 2006). Finally, we discussed required assessments in each course. Although these were kept to a minimum to prioritize faculty academic freedom, we identified some core assessments that needed to be included in our courses. For example, a key outcome in research methods was writing a full research manuscript in proper APA style.

    Conclusion

    By aligning our course learning outcomes with program learning outcomes and identifying exactly where in the program these concepts are introduced and reinforced, we know that students are exposed to the basic knowledge before entering research methods. We are also assured that when students graduate from our program, regardless of the section they completed, they are all equipped with the same basic skillset. As a 100% transfer institution, our students come to us with very diverse backgrounds and preparation. Ensuring that every student has the same exposure to essential skills such as APA style, survey development, and statistical analysis before research methods facilitates the data-collection project. Importantly, this plan embeds the high-impact practice of undergraduate research into the required curriculum, creating equitable access and opportunities for all students, which have been chronic problems in the implementation of these experiences (Zilvinskis et al., 2022). By focusing on broader program and course learning outcomes and using them to align our research-oriented curriculum, we were able to give our students a better, more consistent experience without infringing on faculty members’ academic freedom to choose how they teach these outcomes. We found this a satisfying blend of faculty subject-matter expertise and collective articulation of expectations and standards that benefited both our faculty and our students.


    References

    American Association of Colleges and Universities. (2013). High-impact practices. https://www.aacu.org/trending-topics/high-impact

    American Psychological Association. (2013). APA guidelines for the undergraduate psychology major: Version 2.0. https://www.apa.org/ed/precollege/about/undergraduate-major

    American Psychological Association. (2011). Principles for quality undergraduate education in psychology. https://www.apa.org/education/undergrad/principles.aspx

    Becker-Blease, K., Stevens, C., Witkow, M. R., & Almuaybid, A. (2021). Teaching modules boost scientific reasoning skills in small and large lecture introductory psychology classrooms. Scholarship of Teaching and Learning in Psychology, 7(1), 2–13. https://doi.org/10.1037/stl0000173

    Cain, T. R. (2014, November). Assessment and academic freedom: In concert, not Conflict. (Occasional Paper #22). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment. https://www.learningoutcomesassessment.org/wp-content/uploads/2019/02/OccasionalPaper22.pdf

    Ciarocco, N. J., Strohmetz, D. B., & Lewandowski, G. W. (2017). What’s the point? Faculty perceptions of research methods courses. Scholarship of Teaching and Learning in Psychology, 3(2), 116–131. https://doi.org/10.1037/stl0000085

    Fallahi, C. R., Wood, R. M., Austad, C. S., & Fallahi, H. (2006). A program for improving undergraduate psychology students’ basic writing skills. Teaching of Psychology, 33(3), 171–175. https://doi.org/10.1207/s15328023top3303_3

    Gurung, R. A. R., Hackathorn, J., Enns, C., Frantz, S., Cacioppo, J. T., Loop, T., & Freeman, J. E. (2016). Strengthening introductory psychology: A new model for teaching the introductory course. American Psychologist, 71(2), 112–124. https://doi.org/10.1037/a0040012

    Gurung, R. A. R., & Stoa, R. (2020). A national survey of teaching and learning research methods: Important concepts and faculty and student perspectives. Teaching of Psychology, 47(2), 111–120. https://doi.org/10.1177/0098628320901374

    Harlow, L. L., Burkholder, G. J., & Morrow, J. A. (2009). Evaluating attitudes, skill, and performance in a learning-enhanced quantitative methods course: A structural modeling approach. Structural Equation Modeling. https://doi.org/10.1207/S15328007SEM0903_6

    Latimier, A., Peyre, H., & Ramus, F. (2021). A meta-analytic review of the benefit of spacing out retrieval practice episodes on retention. Educational Psychology Review, 33, 959–978. https://doi.org/10.1007/s10648-020-09572-8

    Lauer, J. B., Rajecki, D. W., & Minke, K. A. (2006). Statistics and methodology courses: Interdepartmental variability in undergraduate majors’ first enrollments. Teaching of Psychology, 33(1), 24–30. https://doi.org/10.1207/s15328023top3301_6

    Simonsmeier, B. A., Flaig, M., Deiglmayr, A., Schalk, L., & Schneider, M. (2021). Domain-specific prior knowledge and learning: A meta-analysis. Educational Psychologist. https://doi.org/10.1080/00461520.2021.1939700

    Zilvinskis, J., Kinzie, J., Daday, J., O’Donnell, K., & Vande Zande, C. (2022). Introduction: When done well – 14 years of chasing an admonition. In J. Zilvinskis, J. Kinzie, J. Daday, K. O’Donnell, & C. Vande Zande (Eds.), Delivering on the promise of high-impact practices: Research and models for achieving equity, fidelity, impact, and scale (pp. 1-12). Stylus.

