E-xcellence in Teaching
Editors: Manisha Sawhney & Natalie Ciarocco

  • 15 Aug 2017 11:31 AM | Anonymous

    That’s What She Said: Educating Students about Plagiarism


    Elizabeth A. Sheehan

    Georgia State University


    Dealing with plagiarism is one of the more unpleasant aspects of our job as instructors. There is the sinking feeling you get when you suspect plagiarism, the moment that your Google search returns the exact passage from your student’s paper, the uncomfortable conversation with the student, the documentation to your department, and the potential hearing with the honor board. I would venture to say most of us have either dealt with these ourselves or at least supported another colleague through the process. These cases range from the cringe-worthy (e.g., copying directly from an instructor’s own published article, turning in a paper written by another student in a past semester) to the more minor infringements (e.g., unintentionally omitting quotation marks around a direct quote).

    At the teaching conferences I have attended over the last few years, there has been a growing emphasis on learning outcome assessment and reliance on the APA’s learning outcomes for undergraduates with a psychology major (APA, 2007). One of those outcomes is for students to “demonstrate effective writing skills in various formats” (p. 18). There is also never a lack of presentations on how to incorporate writing assignments into your courses. Increasing writing assignments in your courses might mean increasing the chance you will encounter plagiarism; however, we might be able to prevent some of these cases with a greater focus on educating our students about plagiarism. Moreover, educating our students about plagiarism helps us address other APA learning outcomes about ethical behavior.



    To decrease plagiarism, a good place to start would be to try to understand WHY students plagiarize. At the last meeting of the National Institute on the Teaching of Psychology, I led a Participant Idea Exchange (PIE) on educating students about plagiarism (Sheehan, 2013). These PIE sessions are roundtable discussions on a topic. My group generated the following list of potential reasons students plagiarize:

    • difficulty comprehending a reading;
    • rushing through an assignment;
    • convenience;
    • cultural misunderstanding;
    • poor understanding of the definition of plagiarism;
    • not knowing how to integrate/synthesize/paraphrase;
    • the prevalence of plagiarism in the surrounding culture; and
    • lack of confidence in their ability to write.

    You may be familiar with some of these, especially time constraints, difficulty with reading comprehension, and the inability to paraphrase. The idea of culture stood out to me from the PIE discussion. First, some cases of plagiarism could be due to cultural misunderstanding. Stowers and Hummel (2011) provide some examples of how students from an Eastern culture may view the use of another’s work. For instance, they assert some Asian students may see it as a sign of disrespect to paraphrase or change someone else’s words.

    A second example of culture is how plagiarism takes place all around us in society. We regularly use the copy and paste functions on our computers in many different settings. People re-post others’ writing on their Facebook pages, re-blog someone else’s blog entry, forward YouTube videos to friends, etc. Usually these events can be accomplished with one or two clicks. While these aren’t examples of academic writing, they do provide precedents that we have to overcome in our courses.



    We had a discussion about plagiarism in my department, and our faculty reported a number of problems in pursuing cases of plagiarism, including some cases not being reported at all, faculty handling cases on their own, cases meeting our discipline’s definition of plagiarism being overturned by the college, not knowing the university reporting procedures, etc. It was clear we needed consistency and clarity. We also decided we wanted to focus less on policing, and to favor educating our students to prevent future plagiarism. You could probably guess that this led to a subcommittee (and the idea for my PIE). Our subcommittee created a standard definition of plagiarism that went into all syllabi, a writing workshop on plagiarism, a quiz, a contract for students, a flow chart of how to report plagiarism, and class activities to teach the identification of proper paraphrasing and citations. These materials (Lamoreaux, Darnell, Sheehan, & Tusher, 2012) are publicly available on the Society for the Teaching of Psychology website (http://teachpsych.org/Resources/Documents/otrp/resources/plagiarism/Educating%20Students%20about%20Plagiarism.pdf).

    At my PIE, I asked other faculty how they educated their students about plagiarism. Below are the techniques they listed:

    • a quiz on plagiarism;
    • a quiz on the student handbook;
    • list policies in the syllabus on paraphrasing and/or a link to school policy;
    • discussion on the first day of class;
    • starting early in introductory classes or freshman year before students are allowed to register for classes; and
    • using technology (e.g. Turnitin or SafeAssign).

    One quiz recommended by multiple instructors is available through Indiana University, and can be found at https://www.indiana.edu/~istd/. At this site, students can complete a tutorial on plagiarism, see examples, take a quiz, and get a certificate of completion. My department uses this site as a part of our plagiarism training for students.

    A lot of us put a plagiarism policy in the syllabus and reference it on the first day of class; however, this alone is not enough. First, we can’t always rely on students to read it or to follow a link to the university policy. Second, we can’t assume they will understand the policy. Gullifer and Tyson (2010) present data demonstrating students have a great deal of confusion over what constitutes plagiarism despite online access to a policy. Students in their study also reported wanting education on plagiarism. These findings are also corroborated by data from Holt (2012).

    Holt provided basic information about plagiarism to a control group of students and training in paraphrasing to an intervention group. The control group received a definition of plagiarism in the syllabus, a link to the university policy, one example of proper paraphrasing, and a 10-minute demonstration of improper paraphrasing in class. The intervention group received training in paraphrasing and proper citation, along with in-class assignments. As you might expect, the group with additional training was able to identify plagiarism more accurately than the group without it. This study also identified reasons for unintentional plagiarism. For example, students thought that quotation marks were not needed, or that material did not have to be paraphrased, if a citation was provided.

    Something as simple as a weekly paraphrasing activity can help. For 6 weeks of the semester, Barry (2006) gave students a paragraph from a famous developmental theorist. Students had to paraphrase the passage and provide a proper citation. After completing the activity, students’ definitions of plagiarism were more complex than those offered at the outset of the study. Not only did they define plagiarism as “taking someone else’s idea”, they added “not giving credit” to their definition. This isn’t necessarily evidence that the activity would reduce the number of plagiarism cases, but it is evidence of students gaining a better understanding of plagiarism.

    You could also incorporate plagiarism as a theme in your course. Estow, Lawrence, and Adams (2011) designed a research methods class where the assignments and projects in the class related to the topic of plagiarism. For example, their students designed a survey about plagiarism, collected data, and wrote a research report on their findings in one set of assignments. The researchers compared the progress of this class to one with the same assignments but a different theme. The students in the plagiarism-themed course were able to better identify plagiarism and generate more strategies for avoiding plagiarism.

    Plagiarism is scary for both professionals and students. The consequences can be steep: it has resulted in failed assignments, expulsion from school, and revoked degrees, and it has even ended careers. Students often tell me how terrified they are of unintentional plagiarism; Gullifer and Tyson’s participants also expressed fear of unintentional plagiarism and its consequences. Implementing some of these fairly simple ideas in our courses will enhance our students’ understanding of plagiarism. A better-informed student should be less fearful, more confident in their ability to write, and less likely to plagiarize.




    American Psychological Association. (2007). APA guidelines for the undergraduate psychology major. Retrieved from http://www.apa.org/ed/precollege/about/psymajor-guidelines.pdf

    Barry, E. (2006). Can paraphrasing practice help students define plagiarism? College Student Journal, 40(2), 377-384.

    Estow, S., Lawrence, E. K., & Adams, K.A. (2011). Practice makes perfect: Improving students’ skills in understanding and avoiding plagiarism with a themed methods course. Teaching of Psychology, 38(4), 255-258.

    Gullifer, J., & Tyson, G.A. (2010). Exploring university students’ perceptions of plagiarism: A focus group study. Studies in Higher Education, 35(4), 463-481.

    Holt, E. (2012). Education improves plagiarism detection by biology undergraduates. BioScience, 62(6), 585-592.

    Lamoreaux, M., Darnell, K., Sheehan, E., & Tusher, C. (2012). Educating students about plagiarism. Retrieved from the Office of Teaching Resources in Psychology, Society for the Teaching of Psychology website: http://teachpsych.org/Resources/Documents/otrp/resources/plagiarism/Educating%20Students%20about%20Plagiarism.pdf

    Sheehan, E. A. (2013, January). Kick plagiarism to the curb: How to educate students before they head down that road. Participant Idea Exchange conducted at the National Institute on the Teaching of Psychology, St. Pete Beach, FL.

    Stowers, R. H., & Hummel, J. Y. (2011) The use of technology to combat plagiarism in business communication classes. Business Communication Quarterly, 74(2), 164-169.



    Elizabeth Sheehan is a Lecturer at Georgia State University. She earned her PhD in Psychology from Emory University in Cognition and Development. She currently teaches Intro Psychology, an integrated version of Research Methods and Statistics, and Forensic Psychology. She has presented her work on designing study abroad programs, teaching with technology, and incorporating writing assignments into courses at teaching conferences, such as the Southeastern Conference on Teaching of Psychology and the Developmental Science Teaching Institute for the Society for Research in Child Development.


  • 01 Aug 2017 8:36 AM | Anonymous

    Supporting Students Using Balanced In-Class Small Groups


    Hung-Tao Michael Chen

    Eastern Kentucky University


    The use of in-class small groups has been shown to improve students’ learning experience (Johnson & Johnson, 2002). Although many studies have demonstrated this effect, few have looked at how the specific composition of group members could support students who are at risk of dropping out of college. This essay describes a pilot study that used the College Persistence Questionnaire to group students (Davidson, Beck, & Milligan, 2009). Preliminary results are inconclusive but suggest that high-performing students might be benefitting more from the small groups than low-performing students.


    Creating Small Groups in the Classroom

    Student persistence has been one of the greatest challenges faced in higher education (Seidman, 2005; Tinto, 2006; Tinto, 2010). While many researchers have identified students who are at risk of dropping out and proposed intervention strategies, few have looked at the effectiveness of balanced in-class small groups in promoting peer networking and support. Conventionally, most instructors who use small groups in the classroom form the groups by random selection or allow the students to form their own groups. The author of this essay proposes, instead, to form the small groups by first identifying students who are at high risk of dropping out of college and grouping these students with those who are not at risk. These “balanced” small groups should provide students with greater peer support in the classroom.

    We have all encountered students who are underperforming in the classroom and are at risk of dropping out. Personal, cultural, economic, and social factors all affect a student’s ability to persist in college (Tinto, 2006). Strategies such as building learning communities and cohort systems have been implemented by many universities to improve student retention (Tinto, 2010). The problem with many of these retention strategies is that they generally require institutional support and substantial financial backing to ensure success and longevity. Is there a strategy that an instructor could easily implement in the classroom, one that requires neither major course redesign nor financial support?

    One strategy that requires only a small investment from the instructor is the use of balanced small groups in the classroom. The use of small groups in the classroom is not a new idea, and it has proven to be an effective way of promoting learning (Johnson & Johnson, 2002, 2015). Past research has also shown that peer support increases a student’s college persistence (Eckles & Stradley, 2012; Skahill, 2002). However, not much research has addressed the use of small groups to support students who are at risk of dropping out of college. When students are randomly grouped or form groups of their own, there will inevitably be a few groups composed entirely of students who are at high risk of dropping out. The idea behind the balanced small groups is simple—students who are at high risk and low risk of dropping out should be evenly distributed across all groups. If the cognitive and social mechanisms behind the effectiveness of small groups hold true, then students who are at lower risk of dropping out should be able to support and anchor students who are at higher risk. This idea is based on social interdependence theory: people, when placed in cooperative groups with a positive environment, will help each other to achieve a common goal (Johnson & Johnson, 2015).


    Implementing and Evaluating the Idea

    The first step in creating balanced small groups is to identify and classify students who are at high, moderate, and low risk of dropping out. The author of this essay used a modified version of the College Persistence Questionnaire (CPQ) to gauge students’ likelihood of persisting in college at the beginning of the semester (Davidson, Beck, & Milligan, 2009). The original CPQ by Davidson and colleagues was modified to fit the specific characteristics of the author’s home institution. The modified questionnaire was built in Qualtrics and distributed to the students at the beginning of the semester. It should be noted that the author adopted a “flipped classroom” teaching model, where at least half of each class period involved small-group problem solving (Lage, Platt, & Treglia, 2000). The students had to work together to solve short-answer questions and multiple-choice quizzes. Each group had to turn in one copy of the short-answer worksheet and one copy of the multiple-choice quiz at the end of every class period. The in-person class met twice a week for 75 minutes each. The first 30 minutes of each class took the form of a lecture with interactive clicker questions. The other 45 minutes were used to solve an in-class worksheet and a multiple-choice quiz question in groups of four. Students were allowed to use their notes while solving the worksheet, but not while completing the multiple-choice quiz during the final 15 minutes of class. A total of four undergraduate teaching assistants who were not enrolled in the class assisted with the small-group problem-solving portion.

    After students’ responses to the CPQ had been collected, the author calculated a cumulative score for each student. The students were then divided into four categories: those at or below the 25th percentile, those in the 26th–50th percentile, those in the 51st–75th percentile, and those above the 75th percentile. Those above the 75th percentile were students at very low risk of dropping out; those at or below the 25th percentile were at high risk of dropping out. The class had a total of 80 students; half of the students were put into balanced small groups using their CPQ scores, and half were placed into small groups randomly, regardless of their CPQ score. Each group had four students. Each balanced group contained one student from each of the four CPQ categories; the random groups were created based on student ID number. The students stayed in the same group throughout the semester and were encouraged to collaborate with each other. The author used a variety of bonus points and team-building tasks throughout the semester to help the students foster a positive and cooperative learning environment (Johnson & Johnson, 2015).
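    For concreteness, the quartile classification and group assignment described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the author’s actual procedure; the function names, the tie-free scores, and the assumption that cumulative CPQ scores have already been computed per student ID are mine.

```python
import random

def classify_quartiles(scores):
    """Map each student ID to a CPQ quartile (0 = bottom 25%, 3 = top 25%)."""
    ranked = sorted(scores, key=scores.get)  # IDs ordered from lowest to highest score
    n = len(ranked)
    return {sid: min(4 * rank // n, 3) for rank, sid in enumerate(ranked)}

def balanced_groups(quartiles, seed=0):
    """Form groups of four, drawing one student from each quartile."""
    rng = random.Random(seed)
    pools = {q: [s for s, sq in quartiles.items() if sq == q] for q in range(4)}
    for pool in pools.values():
        rng.shuffle(pool)  # randomize which students from a quartile end up together
    return [tuple(pools[q][i] for q in range(4)) for i in range(len(pools[0]))]

def random_groups(student_ids):
    """Form groups of four by consecutive student ID, ignoring CPQ score."""
    ids = sorted(student_ids)
    return [tuple(ids[i:i + 4]) for i in range(0, len(ids), 4)]
```

    In practice, tied scores and class sizes not divisible by four would need a rule of their own; the sketch assumes distinct scores and equal-sized quartile pools.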

    This method of balanced small groups was first piloted during the Spring 2015 semester at a large state university. The results were inconclusive because the comparison between the random groups and the balanced groups did not yield any significant difference. The general trend of the means, however, seemed to show that students who were already at low risk of dropping out benefitted more from the balanced small groups than students at high risk. Future studies should probably compare balanced groups, in which students of varying risk levels are mixed, against matched groups, in which students of similar risk levels are grouped together. Qualitative and survey data should also be gathered in addition to student performance data. There was also the concern that the balanced-group manipulation appeared to benefit the higher-performing students more than the lower-performing students who were at high risk of dropping out. This was probably a result of a social loafing effect, in which the high-performing students did most of the work. The worksheets and quizzes were graded per group, but they should have been issued and graded on an individual basis. Future studies should design the assessments such that every student is held equally responsible; this way, any effect of social loafing should be minimized.


    Author’s note: This essay was based on a study presented at a poster session at the Society for the Teaching of Psychology’s 15th Annual Conference, Decatur, GA, October 2016.




    Davidson, W. B., Beck, H. P., & Milligan, M. (2009). The College Persistence Questionnaire: Development and validation of an instrument that predicts student attrition. Journal of College Student Development, 50(4), 373-390.

    Eckles, J. E., & Stradley, E. G. (2012). A social network analysis of student retention using archival data. Social Psychology of Education, 15(2), 165-180.

    Johnson, D. W., & Johnson, R. T. (2002). Learning together and alone: Overview and meta-analysis. Asia Pacific Journal of Education, 22(1), 95-105.

    Johnson, D. W., & Johnson, R. T.  (2015). Theoretical approaches to cooperative learning.  In R. Gillies (Ed.), Collaborative learning:  Developments in research and practice (pp. 17-46).  New York:  Nova. 

    Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. The Journal of Economic Education, 31(1), 30-43.

    Seidman, A. (2005). College student retention: Formula for student success (ACE/Praeger series on higher education; American Council on Education/Praeger series on higher education). Westport, CT: Praeger Publishers. 

    Skahill, M. P. (2002). The role of social support network in college persistence among freshman students. The Journal of College Student Retention: Research, Theory, and Practice, 4(1), 39–52.

    Tinto, V. (2006). Research and practice of student retention: What next? Journal of College Student Retention: Research, Theory & Practice, 8(1), 1-19.

    Tinto, V. (2010). From theory to action: Exploring the institutional conditions for student retention. In J. C. Smart (Ed.), Higher education: Handbook of theory and research (pp. 51-89). Netherlands: Springer.


    H.-T. Michael Chen is an Assistant Professor of Psychology at Eastern Kentucky University in Richmond, KY. He graduated from Berea College with a degree in Biology, and earned his M.S. and Ph.D. in Experimental Psychology from the University of Kentucky. He teaches courses in research methods, cognition, and human factors. His research interests include teaching strategies in the classroom and the design of better educational technologies.


  • 16 Jul 2017 1:29 PM | Anonymous
    STP’s SoTL Writing Workshop: A.K.A. How I Wrote a Paper in Two Days

    Michelle A. Drouin

    Indiana University–Purdue University Fort Wayne


    In this paper, I describe my experiences with the Society for the Teaching of Psychology’s (STP) Scholarship of Teaching and Learning (SoTL) Writing Workshop. I first describe the obstacles preventing me from joining such efforts and then describe the process and structure of STP’s Writing Workshop. As a result of my participation, I not only wrote a manuscript from (practically) start to finish in two days, but I also finished three other SoTL papers and developed and implemented a SoTL Writing Retreat on my own campus.

    It is very difficult to say “no” to Regan Gurung. He is charming and charismatic, and as the former President of STP, he is kind of a psychology celebrity. So in May, 2012, when Regan invited me to apply to the STP’s Scholarship of Teaching and Learning (SoTL) Writing Workshop (www.teachpsych.org/conferences/writing/index.php#.UcpdcZzNnUk), try as I might, I could not say “no.”

     “But it’s hard for me to travel,” I said. “I have two young children, five and three.”

    “Perfect! Mine are six and four,” Regan responded.

    “I actually have a lot of projects going on, so I am really doing well on my SoTL writing,” I countered.

    Regan smiled, “Are they finished? You owe it to teachers and students everywhere to get them out.”

    “Teachers and students everywhere?” I pondered, “That’s a lot of people depending on me... .”

    “Ok, I’m in,” I replied.

    Thus began my journey with STP’s SoTL Writing Workshop.



    The Obstacles

    As I look back on that day, I can clearly identify the obstacles that were keeping me from engaging in writing workshops generally and this one specifically:

    • I thought I had SoTL writing figured out. I had a few SoTL research papers published and had written two invited book chapters. Although I did not consider myself an expert in SoTL, I was certainly one of the SoTL leaders at my university. I knew I could do the work, so I really did not know what the SoTL Writing Workshop could do for me.
    • I did not think I had the time for a workshop. I was already time pressed—hence the many unfinished projects—so how would I find the time to travel and participate in a workshop?
    • I thought that unfinished projects were a normal part of academic life. My colleague (who has been in his position for 9 years) still has an unfinished project from graduate school. I have many unfinished projects, and as the years go by, that list is growing, especially for SoTL projects. I accepted this as a normal part of my academic journey.
    • I am actually a good, prolific writer. I don’t struggle with writing. I spend much of my academic work time writing both disciplinary and pedagogical papers, and I am successful in getting my work published. According to the 2010-11 UCLA Higher Education Research Institute Faculty Survey, only about 20% of faculty at all baccalaureate institutions had five or more papers accepted or published in the last two years (Hurtado, Eagan, Pryor, Whang, & Tran, 2012), and I am pleased to say that I am in that 20%.


    Despite my many internal protests, I engaged. Two weeks later, I was describing via email my various unfinished SoTL projects to my three fellow group members and reading Optimizing Teaching and Learning: Practicing Pedagogical Research (Gurung & Schwartz, 2009), which Regan sent to workshop participants. I was also learning more about the workshop through email and had received a participant timeline with “soft deadlines to make the workshop most effective”:

    May:  Introductions and basic idea sharing.

    June-August:  Preliminary consultations.

    August 30th:  Project proposal/status—Write a 1-2 page proposal for the topics you would like to research. If there is data collected, then list key hypotheses driving the study and draft a method section.

    September 15th: Complete a preliminary literature search for articles relating to topic of interest or study conducted (outline Intro section).

    Oct 1st: Final report on activity/project status due to Mentors.

                                                 (R. Gurung, personal communication, May 29, 2012)

    Through this email correspondence, I also learned two important things: (1) that the mentors would provide follow-up consultations and draft reading (or other types of assistance) post-workshop, and (2) that the goal of the workshop was to have a SoTL publication submitted by the end of the 2012-13 academic year. As I hoped to finish at least one of my papers by that deadline, I thought this was a realistic goal for me. However, one of the hurdles I faced during my preliminary consultations with Regan was trying to decide which of my many projects to bring to the workshop.

    Getting Organized

    At the time of our initial correspondence, I had SEVEN unfinished SoTL projects. I was already in the writing phase of an online lecture paper and decided to finish that one outside of the SoTL writing workshop; the workshop only accelerated my timeline. Thereafter, I turned my focus to three others: an iPad project, an online decision tree for psychology majors project, and a lecture capture project. In preparation for the August 30 deadline, I was overzealous and finished and submitted the decision tree paper, which left me with five papers to complete and nothing firm to bring to the writing workshop. At this point I had to reassess and emailed Regan in desperation—“what project should I now bring to the SoTL writing workshop?”

    Regan replied, “Given that you are progressing well, how about you aim to send a plan of what YOU hope to have done on EACH of the 3-4 topics.  A few sentences on each so you have a clear picture of goals.” (R. Gurung, personal communication, August 29, 2012).

    At this point, I finally committed to paper the goals I had for my various SoTL writing projects and constructed a table that would guide me through the rest of the process. In this table, I listed my five unfinished projects and the goals I had for them for the October workshop (summarized here):

    • iPad cohort & lecture capture projects: Data analyzed; results and methods sections written, literature review mostly done
    • Research assistantship, blogs as learning tools, and research review and presentation projects: Data cleaned; sources gathered

    Creating this table gave me clarity. This was the first time in my academic career that I had actually listed all of my ongoing projects and created goals for each. Until this point, the projects were all quite nebulous—I did not even know how many unfinished SoTL projects I had. After I created the table, I had a visual reminder of my goals, and this was a breakthrough. As I thought about my goals, I knew that if I could arrive at the writing workshop with at least cleaned data sets and relevant sources gathered, I would be able to make the most of the personalized statistical consultations and also be able to get advice on publication. Minimally, this is what I hoped to accomplish, and in the end, this is what I had accomplished when I boarded the plane for Atlanta in October, 2012.

    Attending the Conference

    Early in my career, I heard a rumor about two professors who would get together and complete manuscripts (from start to finish) in a weekend. I remember the questions that rushed through my head at the time—“How did they do it? What did they do to prepare for this writing extravaganza? Did they each work independently, or did they work collaboratively?” Because the source of this rumor had so few answers, I dismissed it as urban legend. However, now I know that this feat can be accomplished.

    When I arrived in Atlanta for the SoTL writing conference, I had 733 words (mostly methods), a cleaned data set, and sources gathered for a manuscript on the effects of using lecture capture in an introductory psychology course. I focused on this paper because after cleaning the data sets of three other projects (research assistantship, blogs, and research review), I decided I needed to collect more data. Meanwhile, although I had enough data for the iPad project, it was not specific enough to psychology to make use of the mentorship I was about to receive. Thus, my lecture capture project became my official SoTL workshop baby.

    The SoTL writing conference runs concurrently with STP’s Best Practices Conference, so we were able to attend the keynote addresses for the Best Practices Conference; however, the rest of the time we were to devote ourselves to our SoTL projects. The structure of the conference was:

    Day 1: Evening arrival, dinner, presentation on doing SoTL research by Regan Gurung, large-group introductions with explanations of our SoTL projects.

    Day 2: Writing, individual consultations with mentor, individual consultations with statistician and ToP editor.

    Day 3: Writing, presentation by Drew Christopher (Editor, Teaching of Psychology) on getting published, departure in the afternoon.

    I spent most of my time writing in the hotel lobby, side by side with other workshop participants, pausing at times to ask for their feedback on something I had written, but mostly staying in my own private writing abyss. I had a few consultations with Regan, in which he pointed me to relevant sources and asked me to include additional information. I talked through my statistical analyses with Georjeanna Wilson-Doenges, who helped me see that what I was actually proposing was a mediation model. And I also spoke at length with Drew Christopher, who encouraged us all to be tenacious with our papers. When I boarded the plane to go home, I had 5,697 words and a paper that was nearly complete. A few days later, I sent it to Regan for feedback, and approximately one week later, I sent it out for review.


    A few months later, my paper (Drouin, 2014) was accepted with minor revisions for publication in Teaching of Psychology. However, this was not the only positive outcome of my SoTL writing workshop experience. Two other papers I prepared as part of this process (the lecture format study and the iPad project) have now been accepted for publication, and I am currently revising another (the online decision tree) in response to a revise-and-resubmit decision. This is the greatest number of SoTL papers I have ever written in a one-year time frame and is equivalent to the number of SoTL articles I had accepted before I joined this workshop.

    These accomplishments, however, are overshadowed by my biggest take-away from the conference. In May 2013, just one year after my initial conversation with Regan, I coordinated my own SoTL writing retreat on my campus. We had 12 participants working side by side with four experienced SoTL mentors, a statistical consultant, and librarians, who assisted with gathering sources and finding publication venues. Sponsored by IPFW's Committee for the Advancement of Scholarly Teaching and Learning Excellence, this SoTL writing retreat was the first of its kind on our campus and was a great success. Although I did not follow the STP Writing Workshop model exactly (e.g., due to time constraints, we did not provide consultations in advance, and we did not create a firm structure for follow-up consultations), we included the key elements that had made the workshop a success for me. More specifically:

    1. We had an application process. Participants were asked to describe the projects they were working on, where they were in the process, and what they hoped to accomplish during the retreat.
    2. Participants were paired with mentors who had knowledge of the content area or data collection method. Based on the applications, we formed mini-groups composed of people who were working on similar projects or using similar data collection methods, and we matched mentors with writers on this basis.
    3. The writing retreat lasted only two days. Longer writing workshops or writing lockdowns that meet over weeks or months, like those highlighted by Belcher (2009) or Jakobsen and Lee (2012), certainly have their strengths, but my university already had writing groups, and I had never engaged with them because I feared the long commitment. Workshops of a limited duration are perfect for commitment-phobes like me, and because this format had worked for me at STP’s workshop, I wanted others to be able to experience it.
    4. It was a retreat, with large chunks of time devoted to writing. We had only two short workshops on IRB proposals and publication venues; the rest of the time was devoted to manuscript writing or other types of SoTL writing activities (e.g., writing an IRB proposal, writing out a plan for the research).

    Feedback on the workshop was overwhelmingly positive, though I did receive suggestions to do more preparatory work with participants before the retreat, which aligns well with STP’s model. Overall, participants appreciated the time devoted exclusively to their projects and the synergy we created during those two days in the campus library. It was inspirational for me, and in a sense, I felt I was paying it forward.

    As I closed the writing workshop, I chose my words carefully, echoing the year before and foreshadowing the essay you are now reading: “This is important work. You owe it to students and teachers everywhere to get it out.”


    Drouin, M. (2014). If you record it, some won’t come: Using lecture capture in introductory psychology. Teaching of Psychology, 41(1), 11-19.

    Gurung, R. A. R., & Schwartz, E. (2009). Optimizing teaching and learning: Pedagogical research in practice. Malden, MA: Blackwell.

    Hurtado, S., Eagan, M. K., Pryor, J. H., Whang, H., & Tran, S. (2012). Undergraduate teaching faculty: The 2010–2011 HERI Faculty Survey. Los Angeles: Higher Education Research Institute, UCLA.

    Jakobsen, K. V., & Lee, M. R. (2012). Faculty writing lockdowns. In J. Holmes, S. C. Baker, & J. R. Stowell (Eds.), Essays from E-xcellence in Teaching (Vol. 11, pp. 26–29). Retrieved from the Society for the Teaching of Psychology Web site: http://teachpsych.org/ebooks/eit2011/index.php

    Michelle Drouin earned her bachelor’s degree in psychology from Cornell University and her DPhil in experimental psychology from the University of Oxford. She is an associate professor of psychology at Indiana University-Purdue University Fort Wayne, where she teaches courses in introductory psychology, developmental psychology (child and lifespan), social and personality development, and language development. Her research, both disciplinary and pedagogical, focuses on literacy, language, and the ways in which technology affects communication and learning. She has written numerous pedagogical papers and invited book chapters focused mainly on online teaching and the integration of technology in the classroom.


  • 02 Jul 2017 4:41 PM | Anonymous

    Flipped out: Methods and outcomes of flipping abnormal psychology

    Amanda K. Sommerfeld, Ph.D.
    Washington College


    The Background

    Abnormal psychology is taught in virtually every undergraduate psychology department across the country (Perlman & McCann, 1999). However, despite its popularity, the course is not immune from critique. Like many college courses, abnormal psychology is often lecture-based (Benjamin, 2002). Although this pedagogical approach is popular among faculty because of its effectiveness in maximizing content delivery (Kendra, Cattaneo, & Mohr, 2012), in some cases lectures may be less effective than other methods for promoting learning (cf. Halonen, 2005).

    Abnormal psychology courses have also been critiqued as lacking both context and nuance. As Norcross, Sommer, and Clifford (2001) note, in abnormal psychology classes, “the painful human experience of psychopathology is frequently overshadowed by descriptions of disembodied symptoms and impersonal treatment” (p. 126). As a result, despite many professors’ intentions to use abnormal psychology courses to decrease stigma (Kendra et al., 2012) and increase student understanding of the contextual factors that shape psychiatric conditions (Lafosse & Zinser, 2002), courses may fall short of these desired outcomes. That was certainly my experience when I first taught abnormal psychology.


    The Issue

    Psychopathology I (PSY 233) is a core course for students majoring in psychology with a clinical/counseling concentration at my college. Because of this, as well as the content, the class is frequently filled to capacity (40 students). When I inherited the class in Fall 2014, I kept using what Benjamin (2002) refers to as “the Velveeta (cheese) of teaching methods” (p. 57), otherwise known as a lecture-centered approach (comparable to the cheesy foodstuff in that, although no one admits to liking it, it remains the most popular pedagogical approach; Halonen, 2005). I enhanced the class with media critiques, group projects, and in-class discussions; however, class time remained lecture-driven.

    According to my students, the course was successful. Students gave high ratings on course evaluation items (rated from 1 = strongly disagree to 5 = strongly agree) such as “The use of teaching aids was effective” (M = 4.9) and “The instructor answered questions in class in a patient and helpful manner” (M = 4.9). Students’ qualitative feedback supported these ratings.

    Despite this positive feedback, I was dissatisfied with several aspects of the course. For example, lower student ratings on items such as “I learned a great deal in this class” (M = 4.6) and “The course raised challenging questions or issues” (M = 4.6) led me to wonder whether students were basing their assessments on how much they liked the course rather than on their actual learning. What is more, at the end of the semester I did not feel confident that I had met my objective of challenging students to consider how cultural norms and biases contribute to psychiatric conditions. As a result, I was left with the sense that, because of the format, I had reduced the course content to a list of diagnostic criteria, leaving little time for acknowledging symptom variation, challenging stereotypes, or encouraging the development of advocacy attitudes. To combat these shortcomings, I decided to change the class radically, and, with the support of a grant from my college’s Cromwell Center for Teaching and Learning, I flipped, or inverted, the class.


    The Solution

    There is no single definition of flipped instruction (He, Holton, Farkas, & Warschauer, 2016). However, the underlying intent of the approach is to move lecture-based material outside of class, leaving in-class time for “face to face engagement between students and teachers” (Forsey, Low, & Glance, 2013, p. 472). This is commonly achieved by delivering course content before class meetings using recorded lectures, podcasts, or videos. Material is then applied during face-to-face meetings through discussions, activities, and hands-on demonstrations.

    To date, the research on flipped instruction is incomplete. As O’Flaherty and Phillips (2015) note, few studies have “actually demonstrated robust evidence to support that the flipped learning approach is more effective than conventional teaching methods” (p. 94). Despite this, preliminary evidence is encouraging, with some studies reporting that flipped instruction results in greater student engagement (cf. Jamaludin & Osman, 2014) and higher test scores and overall grades (cf. Mason, Shuman, & Cook, 2013). Based on this available evidence, and the issues I observed in the first iteration of Psychopathology I, flipping the class seemed a worthwhile venture.


    The Implementation

    Flipping Psychopathology I required me to create two sets of materials: out-of-class and in-class. The bulk of the course content (e.g., diagnostic criteria, prevalence rates, and treatment approaches) was delivered outside of class through video lectures uploaded to the course’s online learning platform. For the first iteration of the flipped course, these lectures were simple, with my voice recorded over PowerPoint slides using SnagIt. The videos were limited to ten minutes each so students could easily review the information. Prior to class, students were required to watch between one and three videos and complete an online quiz. The quizzes were intended to encourage mastery, so students were able to repeat them multiple times.

    In-class time was focused on application and discussion (Pluta, Richards, & Mutnick, 2013). This required me to create individual and group activities for each class meeting. Sample activities included evaluating media depictions of psychiatric disorders for accuracy, writing vignettes about imaginary clients, and discussing the systemic factors that affect how clients manifest symptoms.

    I evaluated the effectiveness of flipped versus traditional instruction based on data collected at two time points: following the lecture-based course in Fall 2014 (N = 27) and following the flipped course in Fall 2015 (N = 34). The data collected at both points included student test scores and grades, student course evaluations, student responses to questions developed for the Web Learning Project (Calderon, Ginsberg, & Ciabocchi, 2012), and instructor reflections.
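    For readers who want to reproduce this kind of between-section comparison with their own evaluation data, the sketch below computes a pooled-variance independent-samples t statistic, the same family of test behind the t(df) values reported here. All rating values in the sketch are invented for illustration; they are not the study's data.

    ```python
    # Hypothetical sketch: comparing course-evaluation ratings from two course
    # sections with a pooled-variance independent-samples t test.
    # The rating values below are invented, NOT the study's actual data.
    import math
    import statistics

    def independent_t(sample_a, sample_b):
        """Return the pooled-variance t statistic and degrees of freedom."""
        n_a, n_b = len(sample_a), len(sample_b)
        mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
        # statistics.variance() is the sample (n - 1) variance
        var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
        pooled_var = ((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)
        se = math.sqrt(pooled_var * (1 / n_a + 1 / n_b))
        return (mean_a - mean_b) / se, n_a + n_b - 2

    # Invented 1-5 evaluation ratings for a lecture and a flipped section
    lecture = [5, 4, 5, 5, 4, 5, 4, 5, 5, 4]
    flipped = [4, 3, 4, 4, 3, 4, 4, 3, 4, 4]

    t, df = independent_t(lecture, flipped)
    print(f"t({df}) = {t:.2f}")  # positive t favors the first (lecture) sample
    ```

    In practice, a statistics package (e.g., SPSS, or scipy.stats.ttest_ind in Python) would also return the p value; the hand computation above simply makes the formula behind the reported t(df) values visible.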


    The Outcomes

                Data from the traditional and flipped offerings of Psychopathology I suggested the pedagogical change affected outcomes in three domains: student learning, student engagement, and instructor experience.


    Impacts on student learning

    Researchers suggest that flipped instruction is successful because students are able to learn and review pre-class material on their own time and at their own pace (McDonald & Smith, 2013). Many of my students agreed with this assessment; as one shared on the course evaluations, “I like how the videos were before class. It allowed for deeper understanding of the material because I can pause, write down questions, and review as needed.” Accordingly, students in the flipped class also rated the “adequacy of resources” significantly higher than students in the lecture class did (t(54) = -2.11, p = .04).

    In contrast to the literature, however, the accessibility of material outside of class did not translate into higher grades for my students. Although there were no significant differences in final grades between the two classes, students in the flipped class had significantly lower exam grades than students in the lecture-based class (t(58) = 2.42, p = .02). What is more, student responses to the item “I learned a lot in this course” were lower in the flipped course (M = 4.3) than in the lecture course (M = 4.6).

    It is possible that some of the learning drawbacks of the flipped class were related to perceptions of the difficulty of the course. In comparison to students in the lecture class, students in the flipped class rated the course as having a significantly higher “workload” (t(56) = -6.02, p < .01) and as being more “difficult” (t(55) = -3.19, p < .01). Further, students’ qualitative feedback reinforced these ratings, suggesting the flipped style made learning more difficult for some students.

    This perception runs contrary to previous studies suggesting that students perceive flipped courses as less difficult than courses taught using traditional methods (He et al., 2016). However, similar to previous research (cf. O’Flaherty & Phillips, 2015), students in my flipped course indicated that the difficulty stemmed predominantly from the increased responsibility they felt: “The flip style makes learning just a little bit harder because it puts all the responsibility on what you do outside of the classroom.”


    Impacts on student engagement

    The fundamental purpose behind flipped instruction is to use in-class time for active learning. Given this, some of the feedback from students in the flipped class led me to question the effectiveness of my in-class activities. For example, students in the flipped course rated the “learning value of in-class materials” significantly lower than students in the lecture course (t(56) = 2.33, p = .02). These data were supported by comments such as, “class meetings are interesting but not necessarily informative.”

    Based on these data, it seems that my implementation of flipped pedagogy may have fallen short because of how I structured face-to-face meetings. It may be that, similar to O’Flaherty and Phillips’ (2015) findings in their scoping review, I failed to explain the link between the pre-class activities and the face-to-face sessions. As a result, the in-class material may not have engaged the students.

    With that said, the data also suggested students interacted more in the flipped class, which may have facilitated engagement. For example, students in the flipped course rated the amount and quality of “interaction with other students” significantly higher than students in the lecture course did (t(56) = -6.06, p < .01). Student comments reinforced these data, with one student noting, “I like that we get more time to ask questions in class,” and another mentioning that “the interaction during class time helps to solidify the information.”


    Impact on instructor

    Researchers who study flipped instruction routinely note how demanding it is on instructors. That was certainly my experience in flipping Psychopathology I. Similar to other instructors’ experiences, it took considerable planning and preparation for me to design engaging, interactive in-class activities (cf. Mason et al., 2013). A great deal of lead time was also required to record and edit lectures in advance of class meetings.

    The process of making the videos was further complicated by the limited technical support available to me. Although I consulted with members of the academic technology team at my institution, they did not have the time or resources to help me record or edit the videos. As a result, I had to learn the software and troubleshoot issues on my own. Given the amount of time and expertise required to create even simple videos, it is not surprising that researchers have recommended having support staff or a technical team available (cf. Ferreri & O’Connor, 2013).

    Despite these issues, I found the flipped course had multiple strengths. Most importantly, because students could access and review the lectures before class meetings, they were less concerned with taking notes in class. This freed them to listen to their classmates, contribute to discussions, and engage fully in activities. As a result, a greater proportion of students participated in the flipped class than in the traditional class.

    Finally, I found the flipped class provided students with increased opportunities to consider more nuanced issues related to psychiatric disorders. In particular, because students were introduced to diagnostic criteria and prevalence rates prior to class, they were better prepared to apply and critique that material in class, opening up discussions about stigma, social norms, and the systemic forms of privilege and oppression that affect psychological health and illness.


    The Conclusions

    The flipped version of Psychopathology I had both strengths and weaknesses. Students appreciated the opportunities for review that the flipped style provided, were better able to consider the nuances of psychiatric conditions, and were more engaged during in-class meetings. On the other hand, some students reported the flipped style made learning more difficult, and I found the flipped course took more time to prepare. Given these data, it is not possible to say that flipping Psychopathology I improved the course as a whole, at least not after the first offering. However, with revision, the flipped course could hold considerable promise for helping students develop more critical perspectives on topics relevant to abnormal psychology.




    Benjamin, L. (2002). Lecturing. In S. F. Davis & W. Buskist (Eds.), The teaching of psychology: Essays in honor of Wilbert J. McKeachie and Charles L. Brewer. Mahwah, NJ: Lawrence Erlbaum Associates.

    Calderon, O., Ginsberg, A.P., & Ciabocchi, L. (2012). Multidimensional assessment of pilot blended learning programs: Maximizing program effectiveness based on student and faculty feedback. Journal of Asynchronous Learning Networks, 16(3), 23-37.

    Ferreri, S., & O'Connor (2013). Instructional design and assessment: Redesign of a large lecture course into a small-group learning course. American Journal of Pharmaceutical Education, 77(1), 19.

    Forsey, M., Low, M., & Glance, D. (2013). Flipping the sociology classroom: Towards a practice of online pedagogy. Journal of Sociology, 49(4), 471-485.

    Halonen, J.S. (2005). Abnormal psychology as liberating art and science. Journal of Social and Clinical Psychology, 24(1), 41-50.

    He, W., Holton, A., Farkas, G., & Warschauer, M. (2016). The effects of flipped instruction on out-of-class study time, exam performance, and student perceptions. Learning and Instruction, 45, 61-71.

    Jamaludin, R., & Osman, S. Z. (2014). The use of a flipped classroom to enhance engagement and promote active learning. Journal of Education and Practice, 5(2), 124–131.

    Kendra, M. S., Cattaneo, L. B., & Mohr, J. J. (2012). Teaching abnormal psychology to improve attitudes toward mental illness and help-seeking. Teaching of Psychology, 39(1), 57-61.

    Lafosse, J. M., & Zinser, M. C. (2002). A case-conference exercise to facilitate understanding of paradigms in abnormal psychology. Teaching of Psychology, 29(3), 220-222.

    Mason, G., Shuman, T., & Cook, K. (2013). Comparing the effectiveness of an inverted classroom to a traditional classroom in an upper-division engineering course. IEEE Transactions on Education, 56(4), 430-435.

    McDonald, K., & Smith, C. M. (2013). The flipped classroom for professional development: Part I. Benefits and strategies. The Journal of Continuing Education in Nursing, 44(10), 437.

    Norcross, J. C., Sommer, R., & Clifford, J. S. (2001). Incorporating published autobiographies into the abnormal psychology course. Teaching of Psychology, 28(2), 125-128.

    O’Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. Internet and Higher Education, 25, 85-95.

    Perlman, B., & McCann, L.I. (1999). The most frequently listed courses in the undergraduate psychology curriculum. Teaching of Psychology, 26(3), 177-182.

    Pluta, W., Richards, B., & Mutnick, A. (2013). PBL and beyond: Trends in collaborative learning. Teaching and Learning in Medicine, 25(S1), S9-S16.

  • 15 Jun 2017 1:15 PM | Anonymous
    Using The iPad In Your Academic Workflow:
    Best iPad Productivity Tools For Your Classroom Practices

    David Berg, Ph.D.
    Community College Of Philadelphia

    This essay is based on workshops I presented at the 35th Annual National Institute on the Teaching of Psychology.

    Introduction To “Using The iPad In Your Academic Workflow”
    In the academic world, our workflow involves a number of different elements, which may include planning and scheduling, project management, reading and writing, information management (gathering, sorting, storing), collaboration (with students, colleagues, departments, colleges, and organizations), participation in meetings and committees, and interfacing with cyberspace (email and the web). We could add many more things to the list; however, it is best to emphasize that workflow on the iPad looks like old S→P→O psychology. The workflow starts with the INPUT (stimulus) into the iPad from your computer (via iTunes sync), from the cloud (via DropBox or Wi-Fi), or from your own thoughts and ideas. The workflow ends with the OUTPUT back to your computer, to the cloud, to a projector, or perhaps to a printer. OUTPUT can take many forms: written and marked-up documents, media (audio/video/artwork/photos), presentation materials, podcasts, collaborative documents, and so on. What goes on in the middle is the PROCESSING, which entails the use of many interconnected tools or apps on the iPad itself; the majority of this essay focuses on that processing stage.

    iPad In The Classroom
    Over the past two years or so, more and more faculty have been making use of the iPad as the “tool of choice” in their academic lives. As the iPad (and iOS) have matured, we’ve seen greater numbers adopting the device for their personal use. What about the iPad in the classroom? Beyond some simple usage, most faculty have not tapped the full potential of the iPad, still relying on laptops, smart carts, and the classroom smart podium (nice if your classroom has one). My favorite classroom is currently outfitted with 1976-era technology: a 27” wall-mounted monitor with an attached VHS/DVD player (that works most of the time). Schlepping the smart cart from A/V services around campus is a Herculean chore not for the faint of heart; as for getting all of the parts working and set up for class...well...resistance is futile!

    So I made an executive decision. Though on a shoestring budget, I decided not to upgrade my old laptop but to invest in the new tablet technology instead and adapt it to both my classroom needs and my academic workflow. Mind you, I have a decent, up-to-date desktop computer that provides a way around some of the content-creation issues that come up with tablet computing.

    The next section is aimed at the professional user who wants to make the most of the iPad in the classroom. It does not cover colleges that give every student an iPad (we should only be so lucky), but rather how to make the iPad your go-to technology.

    The four biggest issues usually raised when we discuss using the iPad are content creation vs. consumption, laptop vs. iPad, device integration, and college vs. high school teaching. When the original iPad was first released, it functioned as a superb consumption device: great for personal use but lacking in many ways as a tool for creating content. Times have changed! You can now create to your heart’s content, albeit with some limitations in a few areas; there isn’t much that you can’t do. For academics, probably the most serious limitations are in creating major presentations (PowerPoint and Keynote), developing large media projects, and other areas such as business applications (large Excel spreadsheets and the like). You can do these things, but not with the same ease as on a laptop or desktop computer.

    This brings us, of course, to the next issue: laptop vs. iPad. The iPad excels as a portable device, whether at the college, in the classroom, at home, or for travel. In a classroom, the iPad can be connected to any monitor or projector with ease, and it can also be used as a whiteboard, making for an interactive class. The laptop may be preferable for data management, content creation of presentations and media, or research and data analysis. If you need to make a decision, think in terms of what your needs are rather than in terms of what device to buy. I have a wonderful desktop machine, so I have given up my old laptop in favor of my iPad; when I retire, I will give up the desktop machine. If you do not have access to a good working computer, you might think about updating.

    Once these first two issues are sorted out, you can consider the third, device integration. NOT A PROBLEM. When the iPad first appeared, about the only way to get information in and out was through iTunes sync. Now, with the proliferation of cloud computing, integration is no longer a difficulty. I prefer to connect my iPad to my computer every few days and use the apps/file-sharing method in iTunes. However, many people prefer to use DropBox as their primary means of transferring information between their iPad and their Mac or PC. For specific types of documents, both Google and Microsoft have also introduced their own versions of the cloud for document syncing and collaboration.

    Finally, high school psychology teachers may have responsibilities that college instructors do not, such as interfacing with an administrative network, putting together course lessons for five-day-per-week classes, and making lesson plans available to supervisors. There are now a number of apps to facilitate these functions.

    Fair Use Guidelines & Copyright Issues
    We need to exercise great caution in what we download, copy, and/or display. Distribution of copyrighted materials is a serious issue, although simply displaying the material may not be. There are strict copyright guidelines regarding such matters, so understanding the fair use guidelines and their exceptions is very important. My experience has been that permission is easily obtained with a brief email request, which avoids many hassles. For an overall view, the Center for Social Media has provided a “best practices” paper dealing with copyright, along with an FAQ review (http://centerforsocialmedia.org/fair-use/related-materials/codes/code-best-practices-fair-use-online-video).

    Accessories
    Some accessories are a must to make full use of the iPad. Choose among the categories based upon personal look, feel, and expense. Trying before you buy is always best, so speak to colleagues and friends to determine what works best for you. If you live near an Apple Store or Best Buy, go play. If you cannot, four reliable online sources for accessories are Amazon.com, Meritline.com, Buy.com, and Handhelditems.com. Must-have accessories include:

    • Bluetooth Keyboard (stand-alone or in a folio case, approximately $50)
    • Folio style case or iPad cover (approximately $35)
    • Stylus (approximately $20) and Screen Cleaner (approximately $10)
    • Auxiliary speakers & headphone (range in price from $5 to $200)
    • Extra charger for office or auto (approximately $20)       

    There are a few excellent websites that will be helpful for both workflow and classroom teaching with the iPad.

    What Do You Want To Do?
    Probably the biggest question is, “What do you actually want to do with your iPad?” This needs to be well thought out because it will entail investments of time, training, and some cash (for apps and accessories). I have arbitrarily divided the use of the iPad in the workflow and the classroom into a number of areas. These overlap and are by no means exhaustive. I have also listed apps that are highly rated in each category; some are free and others are not. Check them out at the iTunes Store online or in the App Store app on the iPad. Download the freebies and play. For those that cost money, read the reviews and click the “most critical” link before buying.

    The Workflow and Classroom Categories & Specific Apps
    Beginning and Ending the Workflow: Input and Output

    Getting your documents into the iPad is a fairly straightforward procedure called syncing.

    The two most popular and efficient ways are iTunes sync and DropBox. Simply drag a file into your DropBox folder on your computer (PC/Mac), and it will show up on your iPad (assuming both devices have an internet connection). Once the document is on the iPad, use the “open in” command to move the file to the appropriate app. Reversing this process moves the document back to your computer.

    iTunes sync occurs when you attach your iPad to the computer. A window in iTunes lists all of the apps that share your documents. Simply add your document to this window, and it will sync to your iPad. Reversing the process copies the updated document back to your computer, where it can be saved.

    The advantage of DropBox is that you don’t have to attach the iPad to the computer; further, you can set up folders to share with other people over any network. The advantage of iTunes sync is better organization and control of your documents. I prefer iTunes sync.

    Output from the iPad is pretty much the reverse of the processes listed above. In addition, we can add email and printing as output methods. While I list presentation and communication apps later, printing is a special case because it can take several steps. Some apps are AirPrint enabled, meaning that they will print to an AirPrint-enabled printer without any extra steps. All of the major manufacturers make such printers, so if you are purchasing a new printer, look this up in the specs. For those of us who do not need a new printer, several apps are available in the iTunes Store that will let you print to a printer on the same Wi-Fi network. Choose apps that come in two versions: a lite (free trial) version as well as a paid version. Download the lite version and give it a try; if it works, purchase the full paid version. Loading the app onto the iPad and the companion version onto your Mac or PC will enable you to print wirelessly over your network. There are several choices: I have used PrintCentral from Eurosmartz ($10) since the iPad came out (it was one of the first such apps), and it works just fine for me.

    Project and Task Management
    This category includes apps useful for project and event planning. Particularly popular are apps that use the built-in Calendar and Reminders; those of you who use Google’s apps may want to integrate Google Calendar into your iPad use. Additionally, for those who really like to have more control, there are a number of to-do apps (e.g., Wunderlist, which is free, and ToDo, which costs $5). If you want to do graphic layouts of projects, Popplet and Corkulous are quite good. For special presentations and projects, Exhibit A ($10) is worth investigating. (Costs of the apps below are listed with each app; free apps are denoted by “F”.)

    Project and Task Management Apps
    • Calendar (F)   
    • Corkulous (F + $5)   
    • Popplet Lite (F)  
    • ToDo ($5)  
    • Wunderlist (F)    

    Writing, Collaboration, and Communication Tools and Apps
    These include apps for writing and note taking, grading papers, email, Skype, Google Docs, DropBox, podcast and screencast production, and web browsing.

    Apps to Substitute for MS Office and Note Taking
    • CloudOn (F)
    • DocsToGo ($10)
    • Google Docs (F)
    • Notability ($1)
    • Pages ($10)
    • Penultimate ($1)
    • Smart Office ($5)
    • SoundNote ($5)

    Good Utilitarian Browsers
    • Chrome (F)
    • Life Browser ($1)
    • Safari (F)

    Browsers That Play Flash
    • Photon ($5)
    • Puffin (F)
    • SkyFire ($3)

    Utility Apps for Recording, Communications, Bar Code Reading
    • Dictate (F)
    • Display Recorder ($10) 
    • FaceTime (F)
    • i-nigma (F) (QR codes)
    • Skype (F)    
    • Twitter (F)       

    Utilities for Printing                 
    • PrintCentral ($10)  
    Utilities for Displaying
    • Reflector ($15)    
    • Splashtop ($2)

    Finding WiFi
    • Wi-Fi Finder (F)

    Information Management
    These apps include textbooks, readers, databases for informational materials, lecture-note replacements, and PDF readers/annotators.

    Apps for Information Storage (a personal file cabinet)
    • DropBox (F)     
    • EverNote (basic app is free, there is also a premium version for $5/month)
    • Exhibit A ($10)
    • GoodReader ($5)   
    • Google Drive (F)           

    WebPage Storage Apps (Read webpages offline without an internet connection)
    • Instapaper ($4)
    • JotNot ($2)
    • Offline Pages ($5)
    • Pocket (F)
    • Safari (F)

    Research, Reading, and Reference
    • APA Journals (F) (priced by subscription)
    • CourseSmart (F) (books – prices vary)
    • Inkling (F) (books – prices vary)
    • Mendeley Lite (F)   
    • Wolfram Alpha ($5)

    PDF Annotators, PDF Readers, and Book Readers
    • iAnnotate ($10)
    • iBooks (F)
    • Kindle (F)      
    • neu.Annotate+ ($2)
    • Nook (F)

    Apps for Presentations, Whiteboards, Digital Jukeboxes, and Surveys and Polls (without clickers). For a digital jukebox, use GoodReader, Keynote, or any app that will play PowerPoint slides.
    • GoodReader ($5)
    • Keynote ($10)
    • Lecture Tools (F)
    • Poll Everywhere (F+)
    • SlideShark (F)

    Classroom Management
    This category includes apps that are used for organizing the class, such as calendars, grade books, and attendance (roll book). If working with these types of apps feels cumbersome, then setting up a spreadsheet grade book on your computer and transferring it to the iPad may be a good choice. (I personally use the spreadsheet method, but some faculty like an all-in-one app.)
    • Calendar (F)
    • Google Calendar (F)
    • Numbers ($10) (an office spreadsheet)
    • Reminders (F)
    • ToDo ($5)
    • Wunderlist (F)

    The following are specific apps to organize classrooms, attendance, and gradebooks.
    • Class Organizer Complete ($5; for students)
    • GradeBook Pro ($10)
    • InClass (F; for students)
    • TeacherKit (F)
    • Teacher’s Aide (F)

    Demonstration Apps
    This category includes specific psychology-related demonstration apps. These range from apps that can be used as “labs,” class A/V displays, and digital jukeboxes (brain and body) to informational apps for both the professor and students. The list is by no means exhaustive.

    General Psychology Information Apps
    • Psych Drugs (F)
    • PsychExplorer (F)
    • PsychGuide (F)
    • PsychTerms (F)
    • PsycTest Hero ($4)
    • Psychology Latest (F)

    Lab Demos
    • Cardiograph ($2)  
    • PAR CRR ($4)
    • Puffin (APA OPL) (F)
    • Stroop Effect (F)   
    • TouchReflex (F)

    Anatomy & Physiology
    • 3D Brain (F)
    • Brain Tutor (F)
    • Cardiograph ($2)
    • EyesandEars ($1)
    • Grays Anatomy ($1)
    • iMuscle ($2)

    Sensation & Perception             
    • 3D illusions (F)
    • Eye Illusions ($2)
    • EyeTricks ($1)

    Audio/Visual Informational Resources
    • iTunes U (F)    
    • Podcasts (F)   
    • SoundBox ($1)

    DIY Presentations
    • Educreations (F)
    • Explain Everything ($3)

    Video Presentations
    • Apple Video (F)
    • Netflix ($8 monthly subscription for streaming)
    • YouTube (F)

    Social Media
    • FaceBook (F)
    • Twitter (F)

    You can find a digital version of this document, with live internet links where applicable, on my college webpage (http://faculty.ccp.edu/faculty/dsberg/) under “TUTORIALS & DEMOS.”

    David Berg is Professor of Psychology at Community College of Philadelphia where he was the recipient of the Lindback Foundation Award for excellence in college teaching, and where he served as past chair of the Behavioral Sciences Department. He received his Ph.D. from Temple University in experimental psychology and completed postdoctoral training in family systems theory from Drexel University/Hahnemann Medical College. David has pioneered workshops focusing on “wellness in the workplace” and has presented these to government, business, and educational institutions. He trains other psychologists to enable them to perform similar workshops. Dr. Berg has presented a number of workshops that focus on the use of writing in Psychology courses, both at NITOP and at APA. Further, he has presented a number of NITOP workshops on use of technology in the classroom. Since the advent of laptop computers, David has consulted with academic teaching faculty to bring them up to the cutting edge in using technology in the classroom. He also serves as a resource for those who teach in institutions on a “shoestring budget” like his own. He views and uses technology as a means to heighten the standards of critical thinking and writing in teaching rather than as a mere adjunct to lecturing.


  • 01 Jun 2017 8:19 AM | Anonymous

    Flipping the Classroom Improves Performance in
    Research Methods in Psychology Courses

     Ellen Furlong
    Illinois Wesleyan University

    Despite having taught it many times, Research Methods in Psychology remains one of the most challenging courses I teach. The difficulty arises primarily because Methods has two major goals: (1) to teach students the required concepts and (2) to enable them to understand, evaluate, design, and conduct research. In short, we must teach both content (what is a hypothesis?) and skill (where is the hypothesis in this article? Is it strong? What is my hypothesis?), usually in just one semester.

    The first few times I taught Methods, I tackled this problem by covering content in class and relying on a semester-long APA-style research proposal to give students practice. On the surface this worked modestly well: students typically wrote interesting papers with at least a superficially solid ability to apply their knowledge.

    One semester I challenged my students with something new: I assigned a very short two-page article (Kille, Forest, & Wood, 2013) and asked questions about it (e.g., True or False: One of Kille and colleagues’ (2013) hypotheses was a rating of the likelihood that the marriages of four well-known couples would break up in the next five years). This activity was a disaster. Although students readily defined a hypothesis or a dependent variable, almost none could correctly identify or differentiate them in the article. This revealed both a shallow understanding of the psychological concepts and a lack of practice applying and working with them.

    I found this troubling not only for my students who would go on to graduate school or take upper-level seminars, but perhaps even more so for my students who would likely not receive more training in methods and might graduate without the ability to consume research critically. Successful consumers of research need not only to describe the concepts involved in research, but also to apply them readily to the newspapers, blog posts, or Buzzfeed articles that they read. This is especially important in today’s age of disinformation and fake news.

    In short, the problem with Research Methods is that to practice the skills involved in research, students first need to understand the concepts. And given the pressures of the semester we often don’t have enough time for them to do both.

    This is hardly a new problem; others facing similar difficulties have often turned to flipped classrooms (see, for example, Peterson, 2016, and Wilson, 2013, who used flipped designs for similar reasons in statistics courses). A typical flipped classroom involves presenting traditional lecture-based material (i.e., the foundational concepts) in an online video that students watch on their own before coming to class. During class, students then work together under the guidance of the instructor to practice applying these concepts and honing skills (e.g., Lage, Platt, & Treglia, 2000). This allows students to do the “easy” parts of learning (listening to a professor lecture, memorizing material, etc.) at home, while doing the hard parts (actually thinking about and applying the material) in the classroom with the professor’s help.

    Flipped classrooms have many advantages. First, students can learn the content at their own pace because they can watch the lectures as often as they need to in order to understand the content. Second, through classroom activities, students can assess their own knowledge early, so they know what they don’t know before the exam, and target their practice accordingly. Third, because students practice their research skills in the classroom I can provide one-on-one time with them. I can offer instant feedback, can see where they struggle, and can scaffold them to success. I can correct their mistakes while they are making them, and adjust activities in the moment to ensure they fully meet my course goals.  When students practice their skills at home I may have no idea where or how they struggle.

    In effect, flipping the classroom allows me to move from a “sage on the stage” to a “guide on the side”, emphasizing the skill involved in assessing and designing research rather than providing definitions and rote memorization of the jargon.

    Implementing a flipped classroom is very time consuming and difficult. For every 10-20-minute video I made, I spent at least 3 hours writing a script (don’t think you can do this on the fly; you hem and haw and students feel like you’re wasting their time), creating slides, recording the video, editing it, and posting it to our course management system. Sometimes I found other people’s work that was far better than what I could have done (see Ben Goldacre’s Battling Bad Science TED Talk: https://www.youtube.com/watch?v=h4MhbkWJzKk), and that saved me hours, but for the most part I made my own lectures. I wrote online quizzes and discussion forums to ensure that students watched the lectures, and on top of all that I had to create an entirely new set of in-class activities to help my students practice their skills, the entire point of this exercise (The Society for the Teaching of Psychology (http://topix.teachpsych.org/w/page/19980993/FrontPage), Teach Psych Science (http://www.teachpsychscience.org/), and others have excellent resources on their websites). Each of these took at least another 2-3 hours to prepare, many of them much longer. In short, between making your own videos, exploring other people’s work, writing quizzes, and developing new in-class exercises, this is a daunting undertaking, not to be taken on lightly.

    However, despite the immense amount of time and effort it took to flip my course, the outcomes were phenomenal, and I hope they will be encouraging enough to motivate you to pursue flipping and, equally importantly, to motivate your students to give a flipped class a chance.

    A brief word about what I will show you here: in the Fall of 2013 I taught Methods as a traditional lecture-based course, and in the Fall of 2014 I taught the same course flipped, with 16 video lectures spread throughout the semester. I compared two fall semesters even though my first flipped offering occurred in the Spring of 2014; I did not examine the spring data because students in fall and spring typically differ in systematic ways (e.g., more first-semester juniors in the fall and more second-semester sophomores in the spring).

    I assessed three measures over the course of both semesters: (1) applied exam questions, (2) a large APA-style research paper, and (3) student evaluation of instruction scores. I chose exam questions that focused on particularly difficult foundational concepts and for which there were at least two questions per topic. For the APA-style research paper, I randomly selected 5 student papers per class for in-depth assessment. These were scored on a scale of 1 (absent) to 6 (exceeds expectations). There was a good correlation between these scores (r = .87) and the grading rubric I had initially used to score the papers. Student evaluation of instruction scores ranged from 1 (strongly disagree) to 5 (strongly agree) and included a number of questions that I will discuss below. Finally, because the sample size was small, I accepted alpha values of .10.
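    For readers who want to run a similar agreement check on their own paper scores, the rubric comparison above is just a Pearson correlation between two sets of ratings. Here is a minimal sketch in Python; the scores below are invented for illustration (the actual paper data are not reported here):

    ```python
    import math
    from statistics import mean

    def pearson_r(x, y):
        """Pearson correlation between two equal-length lists of scores."""
        mx, my = mean(x), mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x)
        sy = sum((b - my) ** 2 for b in y)
        return cov / math.sqrt(sx * sy)

    # Hypothetical 1-6 ratings of five papers: the in-depth assessment
    # and the original grading rubric (rescaled to the same 1-6 range).
    indepth = [5, 3, 6, 4, 5]
    rubric = [5.5, 3.0, 5.5, 4.5, 5.0]
    r = pearson_r(indepth, rubric)
    ```

    A value of r near 1 would indicate that the two scoring methods rank the papers in much the same way, which is what the .87 reported above suggests.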

    T-tests revealed that students in the flipped course (F) and the traditional course (T) scored fairly similarly on most applied exam questions (Design: F: 88%, T: 90%, p = .82; Hypotheses: F: 81%, T: 76%, p = .69; Sampling/Assignment: F: 85%, T: 80%, p = .38; Reliability/Validity: F: 83%, T: 78%, p = .39), but for two of the hardest concepts, variables and causation, students in the flipped course greatly outperformed students in the traditional course (Variables: F: 90%, T: 79%, p = .06; Causation: F: 92%, T: 73%, p = .015).
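    An independent-samples t test of this kind can be sketched in a few lines of Python for anyone wanting to compare their own flipped and traditional sections. The function below uses Welch’s formula, which does not assume equal variances; the percent-correct scores are invented for illustration, not Furlong’s actual data:

    ```python
    import math
    from statistics import mean, variance

    def welch_t(group1, group2):
        """Welch's independent-samples t statistic and approximate df."""
        n1, n2 = len(group1), len(group2)
        v1, v2 = variance(group1), variance(group2)  # sample variances (n - 1)
        se2 = v1 / n1 + v2 / n2                      # squared standard error
        t = (mean(group1) - mean(group2)) / math.sqrt(se2)
        # Welch-Satterthwaite approximation for degrees of freedom
        df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
        return t, df

    # Hypothetical percent-correct scores on the hardest exam questions
    flipped = [92, 88, 95, 90, 85]
    traditional = [78, 82, 75, 80, 70]
    t, df = welch_t(flipped, traditional)
    ```

    The resulting t statistic would then be compared against the t distribution with df degrees of freedom (e.g., via a statistics package) to obtain a p value like those reported above.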

    Though this was impressive, the largest improvement showed in the APA-style research papers. Interestingly, students in the flipped course used evidence better (F: 5.2, T: 3.4, p = .02), had better argument organization (F: 4.8, T: 3.2, p = .05), stronger hypotheses (F: 6, T: 4.2, p = .03), better proposed methods (F: 5.13, T: 4.13, p = .03), were able to discuss their predicted findings in more profound ways (F: 5.6, T: 4.35, p < .01), and had overall better papers than students in the traditional course (F: 5.45, T: 4.5, p = .06). Students in the flipped course were also marginally better at synthesizing information across sources (F: 5, T: 3.8, p = .11). However, it wasn’t simply that students in the flipped course were better writers (Writing style: F: 4.54, T: 4.47, ns) or better at following directions (APA Style: F: 5.13, T: 4, ns), so their improvements in these areas seem targeted and important.

    Student evaluation of instruction scores also told an interesting tale—students in the flipped course were more likely to recommend the course (T: 4.13, F: 4.70, p = .10) even though they found it provided a greater intellectual challenge (T: 4.40, F: 4.90, p = .06) and they found the difficulty level less appropriate (i.e., they reported that the course was too hard: T: 4.67, F: 4.10, p = .01). So even though students found the course harder they were more likely to recommend the flipped class to others compared to those in the traditional course.

    While we’re talking about student evaluation scores, I will point out that my evaluation scores suffered a little the first semester I flipped the course (Spring 2014). While they dropped in some areas (e.g., students found me less available for help and thought my comments were not as useful), the overall evaluation scores stayed fairly similar (4.58 vs. 4.59). Further, this ‘hit’ to my evaluations disappeared after one semester. My interpretation is that I was frantically writing lectures and prepping in-class activities and didn’t have as much time to spend with the students and on comments. Now that all that work is done, I have more time than ever to spend on my students. Since then my evaluation scores have stayed the same or risen (averages: 2014-2015: 4.58, 2015-2016: 4.60, 2016-2017: 4.82). Open-ended student evaluations indicate that students very much valued the flipped experience and used it just as I would hope. For example, one representative comment said:

    Teaching this particular material in a “flipped course” was effective. The nature of the material is generally easy to understand with previous experience in psychology but it was not always as simple to apply it; therefore, practicing application in class was helpful. Overall this fostered the ability to apply the knowledge across useful areas both in this course and other courses.

    In summary, flipping the course in Research Methods is hard, but it benefits the students. While this benefit may not necessarily show up on every exam it shows where it counts—when students use their knowledge of methods to evaluate articles or design their own research. They are better able to think about important scientific controls, to design better experiments, and to keep their interpretation within reach of their data set. In short, this improves their training as scientists and consumers of research which we hope will persist throughout their lives. Though this work is hard (for both you and the students), it pays off.

    I’ll leave you here with a few quick words of advice about flipping your own course. First, you don’t need to flip your entire course all at once. Consider flipping one day this semester and see how it goes; next semester, add another. Second, borrow from people who have done this already. Raid listservs and teaching websites. Email me and I will happily send you my materials (scripts, videos, quizzes, activities, etc.) or give you a pep talk. Talk to your colleagues and share with them. Third, tell your students they will be in a flipped course and, importantly, why. Give them the data I’ve given you; reassure them that their papers will be stronger, their grades will be better, and they will be happier. They will get on board. Fourth, and perhaps scariest for junior faculty like me, accept that the first semester you flip, your teaching evaluations may take a hit. Know that you’re gambling, yes, but on a good bet: they will likely rise higher down the road once you’ve sold your students, once they know what they’re getting by enrolling in your course, and once you have mastered the flip.



    Kille, D.R., Forest, A.L., & Wood, J.V. (2013). Tall, dark, and stable: Embodiment motivates mate selection. Psychological Science, 24, 112-114.

    Lage, M.J., Platt, G.J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. The Journal of Economic Education, 31, 30-43.

    Peterson, D.J. (2016). The flipped classroom improves student achievement and course satisfaction in a statistics course: A quasi-experimental design. Teaching of Psychology, 43(1), 10-15.

    Wilson, S.G., (2013). The flipped class: A method to address the challenges of an undergraduate statistics course. Teaching of Psychology, 40(3), 193-199.


    Biographical Sketch

    Ellen Furlong is an Assistant Professor in Psychology and Director of the Comparative Cognition Lab at Illinois Wesleyan University. She received her B.A. in Mathematics from Transylvania University and her Ph.D. in Psychology from The Ohio State University. Before joining the faculty at Illinois Wesleyan University in 2013, she served as a postdoctoral fellow at Yale University. Ellen has taught several courses with "flipped" components including a survey level fully online course, a writing intensive research methods course with flipped lectures, and a team-taught, cross-institution (Illinois Wesleyan and Transylvania Universities) May Term travel course with flipped lectures and skyped class sessions.

  • 16 May 2017 8:57 AM | Anonymous

    Teaching with Affordable Technology to Increase Student Learning

    Judith Pena-Shaff (Ithaca College)

     Amber Gilewski (Tompkins Cortland Community College)


    Last year at the APA Convention in Orlando, we participated in a symposium about the use of Open Educational Resources (OER) to increase student learning. Judith had little familiarity with OER, while Amber had been using these resources in her classes for the past two years, on the recommendation of her provost, who was enthusiastic about them. A few days later, the president of Judith’s institution began his all-faculty meeting by cautioning about the threat that the OER known as Massive Open Online Courses (MOOCs) posed to traditional institutions of higher education. As a current participant in an Introduction to Psychology class offered through Coursera, Judith began to wonder about the educational and learning value of these resources. Will OER increase students’ learning? And if so, how? In this essay, we discuss the value of open educational resources for increasing student learning opportunities, as well as their challenges and promises.

    Open Educational Resources (OER) are “teaching, learning, and research resources that reside in the public domain or have been released under an intellectual property license that permits their free use or re-purposing by others” (Atkins, Brown, & Hammond, 2007, p. 4). Inspired by the Open Source Software (OSS) and Open Access (OA) movements of the mid-1990s (Baraniuk, 2008; Wiley & Gurrell, 2009), OER are a relatively new phenomenon that aims to (1) provide free, or at least affordable, access to knowledge and digital educational and research resources, and (2) reduce the high cost of teaching materials. Philanthropically, it is hoped that OER will help to equalize worldwide access to knowledge and provide everyone with the opportunity to share, re-use, and re-conceptualize knowledge (Atkins et al., 2007; Baraniuk, 2008). OER include, but are not limited to, learning resources such as full online courses, courseware (e.g., syllabi, lectures, quizzes, and homework assignments), learning objects, assessment tools, software (e.g., the IHMC CmapTools program), learning management systems (e.g., Sakai), textbooks, encyclopedias (e.g., Wikipedia), simulations, and other resources or techniques used to support access to knowledge (Hylén, 2006; Downes, 2007). Some well-known open education projects are Connexions, which started in 1999; Wikipedia, launched in 2001; a series of OER projects sponsored by the Hewlett Foundation; MIT OpenCourseWare, which began in 2002; and, more recently, platforms such as Coursera, Udacity, and edX (a joint venture between Harvard and MIT), which offer MOOCs.

    There are many reasons why psychology instructors might decide to adopt OER in their traditional face-to-face or distance learning classes. First, OER allow us to provide students with affordable access to information and knowledge. For example, Amber provided students with the option of using an OER textbook in her general psychology community college classes (Gilewski, 2012). They could either read the book online or print it for a small fee. She found, in contrast with previous semesters, that students spent less on their class materials, their grades improved, and there was a reduction in the number of course withdrawals. However, it is impossible to know whether these results were caused by students’ access to affordable reading material.

    Second, OER give instructors the opportunity to customize their course materials, providing students with different types of learning aids that better fit the course objectives and benefit different types of learners. For example, Audley-Piotrowski and Magun-Jackson (2012) used a custom-designed DVD with different types of learning resources to increase student preparation and involvement in a developmental psychology course. Their study revealed that different types of learning aids engaged different types of students: non-traditional students and students who defined themselves as independent learners benefited more from the ancillaries the course DVD offered than did more traditional, dependent learners.

    In addition, OER can be used to combine different tools to help students develop shared knowledge through communities of practice. Draper (2012) explored how knowledge-building activities, such as individually and collaboratively creating concept maps, helped her students develop knowledge convergence. She used Moodle, a free course management system with asynchronous online communication tools for student collaboration, and IHMC CmapTools, a concept-mapping software package that can be downloaded for free at http://cmap.ihmc.us/download/. Integrating these learning resources with instructional activities increased student engagement and participation and fostered the development of complex knowledge structures in both online and blended classroom environments.

    So far, we have presented the inclusion of OER in somewhat traditional course environments. MOOCs, however, are a different species of OER. Although the first course using the name MOOC was offered in 2008, the term became a buzzword at the beginning of 2012, with the creation of Coursera, an online platform that offers entire college courses for free. This company, started by two Stanford professors, now has contracts with well-known universities that offer free courses, although not yet for credit, through its online platform. Judith’s experience taking an Introduction to Psychology class taught by University of Toronto professor Steve Joordens has been very positive so far, although not very challenging. The lectures are 15 minutes or less and are geared to introduce a few basic psychology concepts and theories to an audience that is very diverse in age, occupation, and geographical location. At the end of each lecture there are two ungraded multiple-choice items related to the lecture, links to free online videos (usually from YouTube), and additional readings. The online discussions are lively, and some participants have been promoted to the level of teaching assistants because of the feedback they often give to others. Other participants write lecture notes and share them with the class. Judith, like others, just watches the lectures. To obtain a certificate of completion, a student must complete two multiple-choice exams with a grade of 70 or higher. These tests permit reviewing the lecture and retaking items, allowing the student to correct wrong responses (much like B. F. Skinner’s programmed instruction technique). In addition, a short, peer-reviewed argument paper can lead to a “certificate of completion with distinction.”

    From these examples we can see that OER offer instructors and students certain advantages. Students find them more affordable than commercial sources. Thus, if access to textbooks is an issue for our students, then OER become very appealing. OER also provide equal access to learning resources worldwide. For example, in the Coursera Introductory Psychology course, all participants have access to the videos and readings, no matter where they live or their levels of education. Many of the resources can be customized by instructors (e.g. editing the textbook, adding or simplifying information). They also give instructors the flexibility to combine different learning resources to better serve their students, to favor different pedagogical approaches (from memorization to knowledge construction), and to complement the textbook. They can be designed to follow a non-linear format. Instructors can link the course syllabus to the readings, videos, and Internet resources to help students gain a better understanding of the course content. All these factors sound very appealing.

    For faculty interested in infusing more OER into their own courses, some resources include, but are not limited to, the Community College Consortium for Open Educational Resources (http://oerconsortium.org), the Carnegie Mellon Open Learning Initiative (http://oli.cmu.edu), Saylor (www.saylor.org), and OpenStax College (http://openstaxcollege.org). Amber has been involved for the past few years with the Kaleidoscope Project (http://www.project-kaleidoscope.org), a cross-institutional collaboration for using the best existing OER. They are always looking for new adopters in this grant-funded work.

    However, there are also challenges in adopting OER. For example, increased access does not necessarily mean enhanced or increased learning or motivation. Research shows that less than 30% of psychology students read their textbooks before class and less than 70% read them before an exam (Clump, Bauer, & Bradley, 2004). Of the 60,000 individuals who registered for the Coursera-based Introduction to Psychology class that Judith is observing, 12,000 (20%) were still actively participating at the time we wrote this essay (class announcement, June 4, 2013). This was before the first assessment took place. We wonder how many participants will actually complete all the course assignments and finish the course.

    Also, research on students’ perceptions of textbooks’ pedagogical aids (Marek, Griggs, & Christopher, 1999) shows that students tend to prefer aids that directly relate to test preparation (such as chapter glossaries, boldface definitions, chapter summaries, and self-tests) rather than aids that might lead to a deeper understanding of the course material. Therefore, it was not surprising that students in Audley-Piotrowski and Magun-Jackson’s (2012) case study focused only on the readings and concepts and not on the other resources, since the test focused mainly on the readings.

    Issues also arise from our lack of familiarity with, and concerns over the quality of, OER. Of course, this is not much different from selecting textbooks in our own area. The main difference is that we can usually get feedback from colleagues about textbooks; since OER are not as well known, we are less likely to get feedback and have to figure things out on our own. Also, we must seek out OER ourselves, whereas textbooks usually come to our offices via publishers’ representatives.

    A major challenge relates to the sustainability of OER in terms of funding (so far most OER funding has come from educational institutions’ or foundations’ grants), technical upkeep (e.g., What happens when a problem occurs? Who maintains the sites?), and content (updating the content, reliability of sources, and so on). Several models have been proposed, particularly for the sustainability of MOOCs, such as charging participants for certificates of completion, charging employers who might be given access to participants’ grades, and of course, sponsors.

    While we have different, affordable learning technologies available today, some of the problems we face as instructors are still the same. For example, Hammer (2012) discussed students’ lack of metacognitive skills and learning strategies. Basically, many of our students do not know how to study or which learning strategies work best for them. We need to teach students these strategies directly, and help them become more conscious and purposeful in their learning. One way to do this could be by creating assignments that make them reflect on how they learn, regardless of the type of learning resources or environment where learning takes place.

    Students also need to be active in learning. To encourage more active learning in her Introduction to Psychology classes, Amber has been involved with the Carnegie Mellon Open Learning Initiative, which provides a more interactive approach to learning the material. Students read material online, watch embedded videos, engage in “Learn-By-Doing” and “Did-I-Get-This?” activities that provide immediate, targeted feedback, before they go on to take graded Checkpoints after each module. She has seen a dramatic increase in her students’ success and interaction with course material, which she’ll present at a symposium at the APA’s 2013 Convention in Hawaii. 

    In conclusion, OER provide affordable access to learning resources. Integrating OER with active learning strategies might help foster complex knowledge structures. Our role is to guide our students so they use and take advantage of these resources.


    Atkins, D.E., Brown, J.S., & Hammond, A.L. (2007). A review of the Open Educational Resources (OER) movement: Achievements, challenges, and opportunities (Report to the William and Flora Hewlett Foundation). Retrieved June 2013 from:  http://www.hewlett.org/uploads/files/ReviewoftheOERMovement.pdf.

    Audley-Piotrowski, S. R., & Magun-Jackson, S. (2012, August). Textbook alternatives and student learning in a lifespan development course. In A. M. Gilewski and D. C. Draper (chairs), Teaching with affordable technology to increase student learning: What works. Symposium presented at the annual convention of the American Psychological Association, Orlando, FL.

    Baraniuk, R. G. (2008). Challenges and opportunities for the open education movement: A Connexions case study. In T. Iiyoshi & M. V. Kumar (Eds.), The Collective Advancement of Education through Open Technology, Open Content, and Open Knowledge (pp. 229-246). Cambridge, MA: MIT Press.

    Clump, M. A., Bauer, H., & Bradley, C. (2004). The extent to which psychology students read textbooks: A multiple class analysis of reading across the psychology curriculum. Journal of Instructional Psychology, 31, 227-233.

    Downes, S. (2007). Models for sustainable open educational resources. Interdisciplinary Journal of Knowledge and Learning Objects, 3, 29-44. Retrieved June 2013 from: http://www.ijklo.org/

    Draper, D. C. (2012, August). Instructional strategies to promote knowledge convergence in online communities of practice. In A. M. Gilewski and D. C. Draper (chairs), Teaching with affordable technology to increase student learning: What works. Symposium presented at the annual convention of the American Psychological Association, Orlando, FL.

    Gilewski, A. M. (2012, August). Using open educational resources to improve student success in introduction to psychology courses. In A. M. Gilewski and D. C. Draper (chairs), Teaching with affordable technology to increase student learning: What works. Symposium presented at the annual convention of the American Psychological Association, Orlando, FL.

    Hammer, E. Y. (2012, August). Meta-studying: Teaching metacognitive strategies to enhance student success. Paper presented at the annual convention of the American Psychological Association, Orlando, FL.

    Hylén, J. (2006, September). Open educational resources: Opportunities and challenges. Proceedings of Open Education 2006: Community, culture and context.  Utah State University (pp. 49-63). Retrieved June 10, 2013 from: http://library.oum.edu.my/oumlib/sites/default/files/file_attachments/odl-resources/386010/oer-opportunities.pdf.

    Marek, P., Griggs, R. A., & Christopher, A. N. (1999). Pedagogical aids in textbooks: Do college students' perceptions justify their prevalence? Teaching of Psychology, 26(1), 11-19.

    Wiley, D., & Gurrell, S. (2009). A decade of development. Open Learning, 24(1), 11-21.  doi:10.1080/02680510802627746.



    Judith Pena-Shaff is an associate professor and chair of the psychology department at Ithaca College. She earned her Ph.D. in educational psychology from Cornell University in 2001. Dr. Pena-Shaff’s research interest is in instructional technology. Specifically, she is interested in the knowledge construction processes students use in computer-mediated learning environments with the purpose of creating a taxonomy to help instructors assess student learning.  In addition, Dr. Pena-Shaff is highly engaged in her community, often conducting evaluations of educational programs run by schools and local organizations.


    Amber Gilewski is an assistant professor of psychology at Tompkins Cortland Community College in upstate NY. She is a Psychology Fellow on the Kaleidoscope Project, which is a Next Generation Learning Challenges grant-funded collaboration of colleges in the U.S. devoted to improving student success and retention in general education courses, through the use of OER. She earned her master’s degree in Clinical-Counseling Psychology from LaSalle University in 2002 and has been teaching at community colleges since 2004.


  • 02 May 2017 7:44 PM | Anonymous

    A Short Writing Assignment for Introductory Courses and Beyond
    Mitchell M. Handelsman
    University of Colorado Denver


    I don’t want to be a downer or anything, but I have a lot of problems in my teaching. Among them:

    • Getting students to do the readings
    • Getting students to think
    • Getting students to think about the readings they do
    • Wanting to have students write in meaningful ways
    • Having too much work to do
    • Getting bored reading papers that all say the same thing
    • Having students read without being accountable until the test, which may be weeks away (Handelsman, 2016)
    In this essay I describe an assignment that solves, or at least addresses, these problems. I have students write very short papers about their reading assignments in which they do more than summarize or question. To get a sense of the assignment, imagine that you are an introductory psychology student, and you read this in the syllabus:


    Processing and Reflecting on Psychology (PROPS)
    • Actors need props, right? If you want to act like a student, you need PROPS!
    • PROPS are short reflections on—and explorations of—your reading. They can be as short as a few sentences and no longer than 1 page. You will process (do something with, reflect on) at least 2 major concepts or key terms from the material you read. Here’s what I mean by processing:
      • You can ask and answer a question about what you’ve read.
      • You can differentiate key terms from each other, or show how you might remember them.
      • You can generate a couple of new examples of a couple of key terms.
      • You can relate the concepts to material from other modules, courses, or experiences.
      • In general, you can do anything beyond just questioning (e.g., “What does the hindsight bias mean?”) or reporting (e.g., “The psychoanalytic approach deals with unconscious material.”).
    • I assign PROPS to encourage you to:
      • do the reading (Course Goals 1 and 2) and do it actively (Course Goal 3).
      • practice active learning skills (Course Goal 3), such as self-reflection, applying, and elaborating.
      • come to class, and come prepared to work (Good for ALL course goals!).
    • Logistics
      • You will write 15 PROPS this semester. At the top of each, put your name, the date, the module covered, and the number of the PROP (e.g., the first prop you submit will be “PROP 1”).
      • PROPS need to be typed, double-spaced, 12-point font, 1-inch margins, no longer than 1 page.
      • You can hand in a PROP any day for which there is a reading assignment. The 2 (or more) concepts you process must be from the reading assigned for that day.
      • You can only hand in 1 PROP per class.
      • You have some choice about when you hand in PROPS, but I encourage you to start soon!! If you wait until the beginning of March, for example, you will have to hand in a PROP every class period.
    • Grading
      • You can earn 2 points for each of your PROPS. You will earn 2 points for showing that you’ve done the reading and are doing something more than reporting or questioning 2 concepts. You will earn 1 point if you hand in the PROP on time but have not processed or reflected actively upon 2 concepts.
      • I don’t grade PROPS on accuracy, but on activity! You are rewarded for taking risks and trying to learn.
    • Hints
      • The best PROPS are those that help you answer test questions by going beyond simple, sweeping statements or stories about your life. Take risks to see if you understand.
      • Use the language of psychology. Show that you’ve done the reading (Course Goals 1, 2, and 4).
      • If you discuss personal experiences, do more than tell a story: show explicitly how the concepts apply to your experience. For example: To say that you use coping strategies and tell a story about one of them is not enough. To show why some of your strategies are problem-focused and some emotion-focused is better. To relate your coping to some other information in the book, like speculating on some biological, social, or psychological factors in your coping, is wonderful!
      • PROPS can demonstrate that you appreciate the complexity of human behavior (Course Goal 2) by avoiding simplistic and extreme statements. For example, instead of, “I find it interesting that most fields of practice use the scientific method. This means that psychology is no different than any of the other fields of study in the world,” this might be better: “Many fields of study use the scientific method. Thus, psychology shares one characteristic with fields like biology and physics. In other ways, of course, psychology is different from other fields.”

    By the way, here are the course goals that the assignment refers to:

    I teach this course so you can:

    1. Learn major concepts and findings in psychology.
    2. Appreciate the complexity of human behavior.
    3. Develop and practice more active ways of studying and learning, including writing to learn, active reading, reflection, participating in class (individually and in groups), and more effective test-taking skills.
    4. Appreciate how psychologists think; e.g., how they use scientific methods to study behavior.
    5. Develop the ability to meet deadlines and follow directions.
    Students can earn a total of 400 points in the course; thus, these papers represent 7.5% of the final grade. Of course, the relative weight of the assignment is up to you depending on your goals. In my course, students earn 300 points for test performance and the rest for two larger papers in which they process at least three concepts across at least two chapters. One of these papers can be revised, and one can be an expansion of a PROP.

         I used to have students submit hard copies of their PROPS at the beginning of class, to encourage attendance. Recently I’ve been having students submit these types of papers on our LMS a few hours before class so I have the chance to read at least some of them before class (Handelsman, 2014). This gives me a chance to address misunderstandings and tailor exercises to incorporate students’ efforts.

         You can adapt this assignment for other courses and purposes (Handelsman, 2014). For example, you can specify additional elements for one or more of the PROPs, such as having students apply concepts from the text to an outside reading, an upcoming presentation, or previous PROPS. You can increase the number of concepts as the semester goes on. In upper-division courses you might specify the type of higher-order thinking you want students to do.

         Although the final product is short, I find it helpful to let students know that they may need to write much more than one page and then edit it to show me their best work. Here is the way I often explain it:

    “A one-page paper is like a traditional five-page paper with the extra verbiage removed. In high school (or other college courses), you sometimes spend the first four of the five pages summarizing what you’ve read. Then, you have a page to go and you don’t have anything else to summarize, so you say to yourself, ‘Let me just mess around and throw in something from the previous unit that seems to relate.’ It’s on that last page that you actually do something with what you’ve read. That’s what I want! I don’t need the summary. So you may have to write all those pages, but cut out the first ones and polish up the part where you’re thinking!”

     Of course, there are still problems with this assignment (What kind of academic would I be if I didn’t see problems?):

    1. There is not enough opportunity for students to revise their work, and I do not spend enough time on grammar, style, and other aspects of writing. In my defense, I want freshmen to have ideas. Once they have something of their own to say, they may be more motivated to learn how to share their thoughts in effective ways.
    2. I still have a lot of reading to do. However, PROP reading is more interesting than reading a bunch of summaries, and the short length makes grading easier. And, of course, the assignment fits my short attention span.
    3. Students can still read the first paragraph, or any paragraph, of a chapter and write something that would work. But I figure that even a little effort is better than nothing! They still have more of an opportunity to think and read differently (Handelsman, 2016).

    I hope you see some of the advantages of this assignment and ways to adapt it to your own course objectives. And forgive me for taking more than a page to explain it.


    Handelsman, M. M. (2014, August 19). This year I’m having my freshmen do POT: Four reasons to have students rolling in papers [Blog post]. Retrieved from http://www.psychologytoday.com/blog/the-ethical-professor/201408/year-im-having-my-freshmen-do-pot-0

    Handelsman, M. M. (2016, September 28). Reading with purpose, or purposes [Blog post]. Retrieved from http://www.psychologytoday.com/blog/the-ethical-professor/201609/reading-purpose-or-purposes




    Mitchell M. Handelsman, Ph.D., is Professor of Psychology and CU President's Teaching Scholar at the University of Colorado Denver, where he has been on the faculty since 1982.  Dr. Handelsman has won numerous teaching awards, including the 1992 CASE (Council for the Advancement and Support of Education) Colorado Professor of the Year Award, and APA’s Division 2 Excellence in Teaching Award in 1995.  He has co-authored three books, Ethics for Psychotherapists and Counselors: A Proactive Approach (2010; with Sharon Anderson), Ethical Dilemmas in Psychotherapy: Positive Approaches to Decision Making (2015; with Samuel Knapp and Michael Gottlieb), and The Life of Charlie Burrell:  Breaking the Color Barrier in Classical Music (2015, with Charlie Burrell). He is an associate editor of the APA Handbook of Ethics in Psychology (2012). 


  • 16 Apr 2017 9:18 AM | Anonymous
    OMG RU Really Going to Send That?
    Email Communication with Students

    Andrew Peck, PhD

    The Pennsylvania State University

         Electronic communication plays an important role in traditional collegiate education and online learning. In 2001, email messages outnumbered letters sent by the United States Postal Service (Levinson, 2010). In 2002, Bloch reported that the typed word had begun to establish itself as a primary means of interpersonal communication, mentioning a case in which a student broke up with her boyfriend via email. In fact, email has become the most widely used instructional technology (see Wilson & Florell, 2012). Recognizing this, at least one college tells students that email is the “lifeline of [their] communication with the college” (http://www.gwinnetttech.edu/webmail/, sec. 1). Interestingly, while we are most likely to initiate electronic correspondence to send course announcements or meeting requests, students tend to use their “lifeline” to make appointments, ask questions, and offer excuses (Duran, Kelly, & Keaten, 2005).


          Email can benefit faculty members and students in a variety of ways. Email is a relatively inexpensive way to communicate with many people quickly, it fosters collaboration, file sharing (Hassini, 2004) and group problem solving (Hassini, 2004; Wilson & Florell, 2012), and it provides an electronic record or “paper trail” for later reference (Wilson & Florell, 2012). Email can also increase the accessibility of the instructor (Hassini, 2004; Wilson & Florell, 2012). We can use email to provide feedback, which can foster academic development (Duran, Kelly, & Keaten, 2005), motivation (Duran, Kelly, & Keaten, 2005; Kim & Keller, 2008), and achievement (Kim & Keller, 2008). Some have noted that email can increase student writing (Hassini, 2004), although others have expressed concerns about the quality of students’ electronic correspondences (see Bloch, 2002). Email can increase communication with students who struggle with face-to-face communication, including foreign, shy, or disabled students (see Bloch, 2002; Duran, Kelly, & Keaten, 2005). Finally, email use can improve students’ perceptions of us, especially when our responses are helpful and prompt (Sheer & Fung, 2007), and include appropriate emotional content (Wilson & Florell, 2012).


          Like other instructional technologies, email is a tool, and misuse can result in unexpected consequences. Although the option to send a message to a large group of people quickly can be helpful, email does not come with “you probably shouldn’t send that” warnings, and sometimes people will send ill-conceived electronic messages to many recipients, as these examples of public Tweets (posts on Twitter) demonstrate:

    “I can't believe my Grandmothers making me take out the garbage   I'm rich f*** this I'm going home I don't need this s***”   - 50 Cent (Note: I’ve added spaces and censored the message to make it more readable and appropriate for readers)

     “With so many Africans in Greece, at least the mosquitoes of West Nile will eat homemade food.”   - Voula Papachristou, Greek triple jumper who was removed from the Greek Olympic team for posting this sarcastic comment

     Although many of us are fortunate enough to have students who don’t send inappropriate mass mailings to classmates regularly, email does provide an avenue for upset students to vent before they’ve fully considered the consequences. Furthermore, while email increases the accessibility of the instructor, it also means that students have increased expectations about our availability and personal attention. Consequently, responding to email seems to have changed the nature of our work.

         Some of us prefer to use email as little as possible because the loss of non-verbal, social, and contextual cues can increase misunderstandings (Hassini, 2004), but many of us seem to treat it as a job requirement (and sometimes it is). Nonetheless, it can be time consuming to respond appropriately to student messages (Hassini, 2004), and sometimes responding becomes “the third shift in an already overcrowded day” (Mason, 2010, para. 3). Sometimes, when it is clear that students did not take the time to read important announcements sent via email, we wonder if sending email is worth the time it takes us to compose the message.

         To make matters worse, sometimes we wonder whether the email messages students send are actually written by the student listed as the sender. In our department, my colleagues and I have received messages from student accounts that were actually written by those students’ friends, roommates, and parents. Ironically, some of us might wish students’ parents wrote messages for their children more often, as student messages can be too casual for many educators (see Bloch, 2002). It is not uncommon for electronic messages to lack proper grammar and punctuation, as this example demonstrates:

     “can i come 2 ur office i need 2 meet w u b4 the test i have ?s thx”

     Faculty Member Expectations

          Faculty members vary in their expectations of student email (Biesenbach-Lucas, 2007). To help students understand specific expectations, some of us include a statement about email communication in our syllabi. Here is an excerpt from a sample syllabus that focuses on instructor accessibility and other concerns:

    Email policy: On weekdays, I check my mail once -- in the early morning. If you send me an e-mail after 6 a.m., do NOT expect an answer until the next day. I do NOT check my mail at all on weekends. So if you send me a message anytime after 6 a.m. on Friday, you will not get an answer until Monday morning. I do not open emails with attachments. I do not open emails without subject lines. I do not open emails written in languages I can’t read – so be sure if you have your email set to a non-English format that your name and information come through in English. (http://public.wsu.edu/~mejia/Handbook/Sample_105_Syllabus.htm, para. 2)

    Here is an excerpt from another syllabus that focuses on tone and style:

     …all email communication will follow the guidelines enumerated here.  Email should be composed in formal, professional language, and with attention to the propriety accorded to the position of the writer, and the addressee…(http://www.hist.umn.edu/hist3722/syllabus.html, para. 9)

     Some might worry that including these types of statements in a syllabus might cause students to view them as overly strict, but students may not be aware of how they come across in their email and may appreciate knowing teacher expectations (Martin, 2011).

         While a syllabus statement can help, challenging email messages seem to come with the job. There are no recipes or guidelines for constructing the perfect email message, but people have offered a number of helpful considerations. To sort them out, I have organized them below using the popular green-yellow-red color-coding scheme to reflect the potential gravity of the student’s message or the educator’s response.

     Code Green Messages

          Fortunately, we sometimes get “Code Green” messages. These messages are complimentary or positive in tone and content (I wanted to thank you for…, I enjoyed your course, are you teaching others…), ask for appropriate information respectfully, or include appropriate requests. Generally, these messages are easy to respond to professionally, so there is little need to offer strategies for responding to these types of messages.

     Code Yellow Messages

          Unfortunately, “Code Green” messages are often outnumbered by “Code Yellow” messages. These messages require us to proceed cautiously, as they might require a considered response. Experience suggests that there are several types of “Code Yellow” messages: those that demonstrate that students misunderstand their own responsibilities, messages containing inappropriate personal information, and messages motivated by students’ anxieties (see Wilson & Florell, 2012, for an excellent review).

         Sometimes students misunderstand their own responsibilities, and deflect or request accommodations to compensate (Wilson & Florell, 2012). For example, my colleagues and I get messages from students like these:

    Dr. __ , I didn’t do well on your final exam. I am on the __ team and need an A in your class to get into my major and retain my scholarship. Please help.

     Dr. __ , I didn't realize the ___ was due yesterday. What can I do to make-up those points?

     Dr. __, I won’t be prepared for class discussion and can't do the first reading quiz because I just ordered the book. I apologize for any inconvenience.

     Dr. __ , I didn't make it to class today. Can you please send me the notes I missed?

    Sometimes students will include personal details of their lives inappropriately to justify a request. Sometimes lonely students just write to be friendly, and sometimes students seeking relationship advice confuse us with writers for the Dear Abby column. Consistent with examples provided by Wilson and Florell (2012), here are some example messages my colleagues and I received:

    Dr. __, How are you? I would like to make an appt. to meet with you. I don’t have anything specific to discuss, I just thought I would stop in to say hi and chat. I have two dogs named….

    Dr. __ , Help!…me and my friend hooked up once in the beginning of the semester and I liked her but didn't think she liked me back so I moved on, and……but now...what should I do?

    Sometimes “Code Yellow” messages are sent by conscientious and responsible students whose anxieties get the best of them.  Consistent with examples provided by Wilson and Florell (2012), here are some example messages we received:

    Dr. ___  , I am in your 11:00 am class. I completed the extra credit writing assignment in class today, but I didn't receive credit in the online grade book yet. Please get back to me right away. I really need this credit. [message sent at 1:30 pm]

    Dr. ___ , I wonder if the study guide you gave us is really everything we need to know for the final. We didn’t cover Chapter 11 in class, and it isn’t on the syllabus, but should I study it anyway? I emailed you earlier today, but I didn’t hear back yet.

    Sometimes, students send “Code Yellow” messages requesting information that is outside of the responder’s expertise. In these cases, it is appropriate to redirect the student to the appropriate resource, often an academic advisor or a health services professional. However, many “Code Yellow” messages are class specific, requiring us to respond directly. We should try to treat these as “teachable moments”: model professionalism, maintain a professional tone, and offer appropriate content (Wilson & Florell, 2012). Leading by example can help, and one never knows who will read the message, especially when technology makes it easy to share electronic correspondence with others.

         As mentioned above, students appreciate it when we include emotional content in our responses (Sheer & Fung, 2007), but it is important to balance a congenial tone with a professional one. One way to do that is to express empathy or sympathy when saying “no” (Wilson & Florell, 2012).

    Example: Thanks for letting me know. I appreciate your dilemma. I hope that you can stay on the team and keep your scholarship. I’d really like to accommodate your request, but I have to assign your grade on the basis of merit and abide by the grading policies in our course syllabus or I will…. violate departmental and college policies….create an unfair situation for other students….

    Wilson & Florell (2012) have also recommended that we provide students with perspective and encourage responsible action.

    Example: Unfortunately, you can’t make it up, but it is only worth…you can still do well in the course if you…..

    Example: Yes, you can do that. Please see the syllabus for details.

    They also recommend ending our messages with a positive and sincere tone when possible, while recognizing that a persistent student may struggle to take “no” for an answer. In these cases, it is up to us to end the conversation directly, but not aggressively, ignoring additional email from the student about the same issue.

    Example: Thanks for following-up and providing more information. I hope you have a good weekend.

    Example: I appreciate your continued concerns, but as I said, there isn’t anything else I can do without violating college/course policies. I consider this matter closed.

    Code Red Messages

    While “Code Yellow” messages require us to slow down and respond cautiously, “Code Red” messages often require us to stop what we’re doing to construct a planned response. “Code Red” messages are highly emotional, highly critical, or have an aggressive tone. Examples include pleas for help, student disclosures of abuse or suicidal inclinations, or hostile messages from irate students. While discussing strategies for responding to aggressive behavior, Tunnecliffe (2007) listed a number of potential causes for students’ anger.  He noted that some aggression stems from the lack of critical knowledge or inaccurate information, unrealistic expectations, or previous rewards for aggressive behaviors. Research on the development of the teenage brain also suggests that teenagers are more likely to become highly emotional than we are, and that emotion may cloud students’ reasoning abilities (for an example, see Spinks, 2013). Regardless of the factors involved, many aggressive messages seem to be triggered by perceptions of unfairness or inequity.

         Because of the nature of “Code Red” messages, there are a number of things to consider when responding. On many campuses, when faculty members are alerted to imminent threats of harm (including student self-harm) they are required to alert their chairs/department heads and campus or local police. Many campuses have counseling or intervention teams, other student resources, or partnerships with community programs to offer student resources. When appropriate, we should introduce these resources to victimized students and should consider facilitating student contact/appointment scheduling. If nothing else, we can encourage victimized students to go to the local hospital, where hospital personnel and case-workers can get involved.

         On some campuses, faculty members are instructed NOT to take on the role of detective or police officer or to ask the student specific questions about a traumatic experience. Doing so can increase feelings of victimization and make it less likely that the student will share critical details with law enforcement officials, student conduct authorities, police, or health professionals. Instead, we are advised to take the information the student has provided at face value, ask a few general questions (What happened? When? Where?) so that information can be passed on to authorities, reassure the student that we will do what we can to help, and then follow campus guidelines for helping.

         Dealing with aggressive students can be challenging and emotional for us. My colleagues and I have found it helpful to walk away from the computer and let some time pass before we respond (usually 12-24 hrs). This gives us time to cool down so that we can respond more professionally, and it gives the student time to cool down, too. Occasionally, students will realize their message contained inappropriate content or had an inappropriate tone, and they will send a follow-up apology. While there isn’t any research on successful strategies for responding to aggressive email, recommendations can be drawn from discussions about the best ways to communicate with angry students to promote de-escalation. It is important to avoid using a reprimanding tone (Tunnecliffe, 2007), which can promote defensiveness and increase perceptions of victimization. It is also important to recognize that anxiety can increase threat perceptions (Craske, Rauch, Ursano, Prenoveau, Pine, & Zinbarg, 2009), and that anxious students are more likely to interpret ambiguous information or references to authority as more threatening than intended. A calm, jargon-free tone might be more successful (Tunnecliffe, 2007; University of Oregon Counseling and Testing Center, 2012). With this in mind, we should avoid using capitalized words or bold text for emphasis, as some students interpret these formatting cues as yelling rather than emphasis (Hassini, 2004). The University of Oregon Counseling and Testing Center recommends acknowledging the student’s emotion, and Larson (2008) recommends using content cues that facilitate an empathetic or sympathetic tone (e.g., I can see this is really important to you). We should use the present tense, focusing on the present situation rather than rehashing the past (Tunnecliffe, 2007), and explain what we can do (Larson, 2008) rather than why we can’t address the student’s concerns, even if that is nothing more than an offer to meet and discuss.

         Some of us might want to respond to criticisms from students directly. We all make mistakes, and sometimes students’ criticisms are based on something legitimate. In these cases, it might be best to agree with what is accurate and share your plan for corrective action (Tunnecliffe, 2007). If criticism is vague, it is fine to ask for clarification (Larson, 2008). Sometimes the initial criticism, or the response to your request for clarification, can be lengthy. In these cases, it might be best to address concerns globally rather than respond to individual concerns (Tunnecliffe, 2007). If none of these strategies sound appealing, we can always deflect the criticisms by simply thanking students for sharing their views (Tunnecliffe, 2007).

     Final Thoughts: Maintain Perspective

    Regardless of how you choose to respond to critical email messages, it is important to consider Alexander Pope’s “to err is human; to forgive divine” and to cut ourselves some slack (Tunnecliffe, 2007). It is also important to recognize that, while we can make the most out of “teachable moments,” we can’t get through to everyone (Larson, 2008). Research has shown that readers who are angered by email attribute the tone to the writer’s personality (Levinson, 2010). Student politeness affects our feelings towards the student, our beliefs about the student’s competence, and our motivations to help (Stephens, Houser, & Cowan, 2009; Bolkan & Holmgren, 2012). So, it is critically important to remember and apply the lessons we teach our students about the Fundamental Attribution Error and consider that situational, rather than dispositional, factors can lead the student to send inappropriate email.

         Steve Johnson, a football player for the Buffalo Bills, blamed God for a dropped pass and posted the following to Twitter:


    So, the next time you read an annoying email message from a student, take a moment to appreciate that you are in good company.


    Biesenbach-Lucas, S. (2007). Students writing emails to faculty: An examination of e-politeness among native and non-native speakers of English. Language Learning & Technology, 11(2), 59-81.

    Bloch, J. (2002). Student/teacher interaction via email: The social context of internet discourse. Journal of Second Language Writing, 11, 117-134.

    Bolkan, S., & Holmgren, J.L. (2012). ‘‘You are such a great teacher and I hate to bother you but...’’: Instructors’ perceptions of students and their use of email messages with varying politeness strategies. Communication Education, 61(3), 253-270.

    Craske, M. G., Rauch, S. L., Ursano, R., Prenoveau, J., Pine, D. S., & Zinbarg, R. E. (2009). What is an anxiety disorder? Depression and Anxiety, 26, 1066–1085.

    Duran, R. L., Kelly, L., & Keaten, J. A. (2005). College faculty use and perceptions of electronic mail to communicate with students. Communication Quarterly, 53(2), 159-176.

    Gwinnett Technical College. (n.d.). Student webmail. Retrieved from http://www.gwinnetttech.edu/webmail/

    Hassini, E. (2004). Student–instructor communication: The role of email. Computers & Education, 47, 29–40.

    Kim, C. & Keller, J.M. (2008). Effects of motivational and volitional email messages (mvem) with personal messages on undergraduate students’ motivation, study habits and achievement. British Journal of Educational Technology, 39(1), 36–51. doi:10.1111

    Larson, J. (2008). Angry and aggressive students. Principal Leadership, January, 12-15. Retrieved from http://www.nasponline.org/resources/principals/Angry%20and%20Aggressive%20Students-NASSP%20Jan%2008.pdf

    Levinson, D.B. (2010). Passive and indirect forms of aggression & email: the ability to reliably perceive passive forms of aggression over email. (Unpublished doctoral dissertation). Wright Institute Graduate School of Psychology, Berkeley, CA.

    Martin, R. C. (2011, June 21). Avoiding the angry email [Web log post]. Retrieved from http://blog.uwgb.edu/alltherage/avoiding-the-angry-email/

    Martin, R. C. (2012, March 2). Responding to the angry email: A follow-up [Web log post]. Retrieved from http://blog.uwgb.edu/alltherage/responding-to-the-angry-email-a-follow-up/

    Mason, M. A. (2010, July). Email: The third shift. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/E-Mail-the-Third-Shift/66312/

    Mejia, E. (n.d.). Sample English 105 syllabus. Retrieved from http://public.wsu.edu/~mejia/Handbook/Sample_105_Syllabus.htm

    Richtmyer, E. (2007). History 3722 syllabus. Retrieved from http://www.hist.umn.edu/hist3722/syllabus.html

    Sheer, V.C., & Fung, T.K. (2007). Can email communication enhance professor-student relationship and student evaluation of professor?: Some empirical evidence. Journal of Educational Computing Research, 37(3), 289-306.

    Spinks, S. (2013). One reason teens respond differently to the word: Immature brain circuitry. Retrieved from http://www.pbs.org/wgbh/pages/frontline/shows/teenbrain/work/onereason.html

    Stephens, K. K., Houser, M. L., & Cowan, R. L. (2009). R U able to meat me: The impact of students’ overly casual email messages to instructors. Communication Education, 58(3), 303-326.

    Tunnecliffe, M. (2007). Behavioural de-escalation. Retrieved from http://www.education.nt.gov.au/__data/assets/pdf_file/0014/2318/Module7TeacherNotes.pdf

    University of Oregon Counseling and Testing Center. (2012). Strategies for Dealing with Angry Students Outside the Classroom. Retrieved from http://counseling.uoregon.edu/dnn/FacultyStaff/DisruptiveThreateningStudents/DealingwithAngryStudentsOutsidetheClassroom/tabid/325/Default.aspx

    Wilson, S., & Florell, D. (2012). What can we do about student e-mails? Observer, 25(5), 47-50.

  • 03 Apr 2017 9:00 PM | Anonymous

    Submitted by William S. Altman and Lyra Stein, Editors, E-xcellence in Teaching Essays

    Using media in the classroom: A cautionary tale and some encouraging findings

    Lynne N. Kennette
    Durham College


    Instructors should use caution when implementing new teaching methods or assessments: just because students like something doesn’t mean their learning necessarily benefits. I learned this recently in one of my classes when I tried a new activity. However, as student comments revealed, there is a silver lining (read on!).


    One of the key skills that psychology instructors try to develop in their students is the identification of independent variables (IVs) and dependent variables (DVs), which form the basis of research design and analysis. The very foundation of the scientific method is identifying changes in one variable and how they relate to another variable. I wondered whether students would show a performance advantage with (or any preference for) media clips over written scenarios when identifying the IVs and DVs in experiments. So, I presented students with video clips from episodes of the television series MythBusters (Discovery Channel), audio clips from National Public Radio’s Radiolab series, and my traditional written experiment scenarios.

    Burkley and Burkley (2009) reported the benefits of using MythBusters clips to illustrate experimental designs: students enjoyed the clips in class and performed better on MythBusters-related exam questions (compared to control questions). I suspected that students would prefer the video and audio scenarios for their entertainment value, but wondered whether their performance would actually benefit. Previous research suggested that students might both prefer and benefit from multimedia formats because they stimulate interest and thus improve retention (Nowaczyk, Santos, & Patton, 1998). Media may also be more engaging than a written description, and engaging content leads to better learning (Tobias, 1994); as we know, students put more effort into tasks they find interesting (Renninger, 1992).

    However, it is also possible that the additional information provided by audio and video clips could distract students from the relevant information required to complete the task of identifying IVs and DVs (Walker & Bourne, 1961). This distracting information may come from the irrelevant “story-telling” details required to make these media commercially appealing (especially in the case of MythBusters). Additionally, because the learner cannot as easily control the stream of information (i.e., the speed at which information is delivered), students may experience a cost when presented with media compared to the traditional written format.


    In two sections of my advanced cognitive psychology laboratory course, and following a brief review lecture on IVs and DVs, students were presented with traditional written scenarios, video clips, and audio clips and had to identify the IVs and DVs. Students were assessed three times: immediately following the IV/DV review lecture (Time 1), during the second-to-last week of class (Time 2), and on the very last day of class (Time 3; here I presented previously encountered scenarios to measure retention, but this time point produced ceiling effects and was therefore difficult to analyze). At the end of the course, I also asked students (anonymously) some qualitative questions about their perceptions of the three question types (e.g., which of the three they perceived as easiest).

    Results and Discussion

    After adjusting for final course grade, students reassuringly improved over the course of the semester, F(2, 252) = 50.87, p < .001, ηp² = .288. Student performance on the three formats also differed, F(2, 252) = 4.01, p = .019, ηp² = .031: students answered the traditional written scenarios more accurately than the Radiolab questions (Mwritten = 78%, MRadiolab = 68%, p = .005), but performance on the written scenarios did not differ from the MythBusters questions (p = .128). What is perhaps even more interesting is that students perceived all three formats to be of similar difficulty but preferred the MythBusters clips to the Radiolab audio clips. In addition, many students provided unsolicited feedback about how much “fun” the video and audio clips were and said these clips allowed them to finally “get” IV manipulation and DV measurement.
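    For readers who want to sanity-check effect sizes like these, partial eta squared can be recovered from a reported F statistic and its degrees of freedom using the standard identity ηp² = (F × df_effect) / (F × df_effect + df_error). A minimal sketch (the function name is my own) applied to the values reported above:

```python
def partial_eta_squared(f, df_effect, df_error):
    """Recover partial eta squared from an F ratio and its degrees of freedom."""
    return (f * df_effect) / (f * df_effect + df_error)

# Effect of assessment time: F(2, 252) = 50.87
time_effect = partial_eta_squared(50.87, 2, 252)    # ~= .288
# Effect of question format: F(2, 252) = 4.01
format_effect = partial_eta_squared(4.01, 2, 252)   # ~= .031
print(round(time_effect, 3), round(format_effect, 3))
```

    Both computed values match the effect sizes reported in the text, which is a quick way to confirm that a published F, its degrees of freedom, and its ηp² are internally consistent.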

    So, does showing students video and audio clips actually benefit learning or performance on assessments? My experience with this activity is particularly interesting because it taught me that using media or multimedia for classroom assessment may not lead to better understanding, even though students expressed a preference for these formats. Student preference does, however, suggest that multimedia can be a valuable tool for instructors because it increases student engagement with course material.


    Some of the factors that instructors should consider when contemplating the use of multimedia for teaching and assessment include:

    Familiarity: The written format is a common way to expose students to IV and DV identification, and they may have encountered it in previous courses. It is also the most common assessment method (tests and assignments), so students are familiar with it from high school. Instructors planning to use multimedia for assessments should give students ample time to practice with those less familiar formats.

    Superfluous information: Both types of media clips contained details that were not directly relevant to the experiment. These extraneous details could distract students (especially those not yet proficient enough in experimental design to suppress irrelevant information). Walker and Bourne (1961) found a linear decline in performance on a problem-solving task with each added piece of irrelevant information (see also Mayer, Heiser, & Lonn, 2001, for a more recent investigation).

    Entertainment: Students’ previous experience with MythBusters, Radiolab, or both (or perhaps with television and radio more generally) as entertainment may make it difficult to focus on the relevant experimental features of the clips (i.e., the IVs and DVs), leading to poorer performance than with the written experimental scenarios.

    Concluding remarks

    Instructors should use caution when implementing new technologies and teaching strategies. As my recent experience demonstrated, just because students like something doesn’t mean they necessarily learn, perform, or retain it better. Similarly, these new techniques or formats (although interesting for students) may not be appropriate to use during assessments. However, it is encouraging to know that they can increase student engagement (e.g., MythBusters), which can in turn increase learning in class! Because student engagement is so important, instructors should use many tools to encourage student learning in their discipline, while keeping in mind the considerations outlined above.



    Burkley, E., & Burkley, M. (2009). Mythbusters: A tool for teaching research methods in psychology. Teaching of Psychology, 36(3), 179–184. doi:10.1080/00986280902739586

    Mayer, R. E., Heiser, J., & Lonn, S. (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology, 93(1), 187–198. doi:10.1037/0022-0663.93.1.187

    Nowaczyk, R. H., Santos, L. T., & Patton, C. (1998). Student perception of multimedia in the undergraduate classroom. International Journal of Instructional Media, 25(4), 367–382.

    Renninger, K. A. (1992). Individual interest and development: Implications for theory and practice. In K. A. Renninger, S. Hidi, & A. Krapp (Eds.), The role of interest in learning and development (pp. 361–398). Hillsdale, NJ: Erlbaum.

    Tobias, S. (1994). Interest, prior knowledge and learning. Review of Educational Research, 64(1), 37–54. doi:10.3102/00346543064001037

    Walker, C. M., & Bourne, L. E. (1961). The identification of concepts as a function of amounts of relevant and irrelevant information. The American Journal of Psychology, 74(3), 410–417. doi:10.2307/1419747

    Author bio

    Lynne N. Kennette, Ph.D., is a professor of psychology and program coordinator for General Arts and Sciences programs at Durham College in Oshawa, Ontario (Canada). She is a graduate of Wayne State University (Detroit, Michigan; M.A. and Ph.D.) and the University of Windsor (Windsor, Ontario; B.A.). She teaches primarily general education courses in introductory psychology, and her research focuses on the scholarship of teaching and learning (SoTL) as well as how the mind processes languages. This research was conducted at Wayne State University.
