Help Sheet Content Predicts Test Performance

18 Dec 2017 4:01 PM | Anonymous

Mark R. Ludorf and Sarah O. Clark
Stephen F. Austin State University

Readers of E-xcellence in Teaching know the importance of finding the best teaching methods and techniques to reach students. Although instructors rightfully seek to improve their teaching to enhance student learning, too much focus is often placed on enhancing “input” and not enough on enhancing the fidelity of “output.” That is, instructors should not only explore methods that make them better teachers but also consider innovative ways to create better measurements of what students have learned.

Professors regularly confront the challenge of teaching to a student population with diverse levels of academic ability. To address this diversity, instructors have implemented various pedagogical methods, many of which are time-consuming and tedious. One such method is to allow students to access information during a test. Some instructors limit the amount of information that is accessible (e.g., an index card or a standard sheet of paper), while others allow access to an unlimited amount of information (i.e., “open book”).

Ludorf (1994) allowed students to select the amount of information they could access on each of five statistics tests. Results showed significantly higher average test performance when less information was accessed than when more was accessed (72% versus 62%), a result consistent with previous findings (Boniface, 1985).

During the last three decades, numerous researchers (e.g., Dorsel & Cundiff, 1979) have explored the role of help sheets (also known as cheat sheets or crib sheets) and how their use is related to test performance (Dickson & Bauer, 2008; Dickson & Miller, 2005; Hindman, 1980; Visco, Swaminathan, Zagumny, & Anthony, 2007; Whitley, 1996), learning (Dickson & Bauer, 2008; Funk & Dickson, 2011), and anxiety reduction (e.g., Drake, Freed, & Hunter, 1998; Erbe, 2007; Trigwell, 1987). Overall, the results have been mixed regarding help sheet use and the variables investigated.

One aspect of help sheets that has received little attention is the relationship between the content of a help sheet and test performance. Most of the research cited above examined the relationship between test performance and whether or not a student used a help sheet. Only a few studies (Dickson & Miller, 2006; Gharib, Phillips, & Mathew, 2012; Visco et al., 2007) have explored how the specific content of a help sheet is related to performance.

Dickson and Miller (2006) found significantly higher test performance when students used an instructor-provided help sheet than when they used a student-provided help sheet. However, this result may be confounded, as help sheet condition may have varied systematically with the amount of studying students did. Visco et al. (2007) examined student-generated help sheets and concluded that students likely need additional direction on what content to include on a help sheet in order to enhance performance. Finally, Gharib et al. (2012) examined the quality of students’ help sheets, rating each sheet for organization and amount of detail, and found a reliable positive relationship between the quality of the help sheet content and test performance.

To summarize the relevant research, the use of help sheets is not reliably or consistently related to student performance, learning, or anxiety levels. Moreover, help sheet quality appears to vary across students and such variation may explain the body of results. Thus, help sheet content should be examined more systematically.

The current study provided a systematic exploration of whether characteristics of help sheet content (e.g., overall quality, inclusion of process information, density of information, etc.) were related to test performance. Results of the study may be used to provide students with guidance (Visco et al., 2007) when constructing a help sheet in order to enhance performance.



Method

Participants (N = 21) were students enrolled in a required junior-level psychological statistics course. Other sections of the course were taught by different instructors; students who enrolled in this section were unaware of the assessment that would be conducted. A majority of the participants were women. No other demographic information was collected.


Students created a one-page, 8.5 × 11 in. (21.6 × 28 cm) help sheet to use on each test. The help sheet could contain any information a student wanted to include, and both sides of the sheet could be used. Students were informed that help sheets would be collected. Both sides of each help sheet were scanned to create an electronic copy, and all help sheets were returned when the tests were returned.


Students were required to construct a help sheet for each test, though there was no requirement to use the help sheet. Based on informal observation during the test, all students appeared to use the help sheet to some degree.

Tests in the statistics course were all problem based and were graded on a 100-point scale. Student help sheets were collected, scanned, and rated by two raters on the variables of interest described below. Both raters were blind to students’ test performance at the time the ratings were made.

Variables of interest. Help sheets were evaluated on the following variables:

- Overall Quality (0–4, with 4 being the highest quality)
- Verbal Process information, i.e., instructions (1 = very informational, 3 = neutral, 5 = not very informational)
- Numeric Process information, i.e., solved problems (1 = very informational, 3 = neutral, 5 = not very informational)
- Density of information (rated in deciles, 10–100%)
- Organization of information (1 = very organized, 3 = neutral, 5 = very unorganized)
- Use of Color (present or absent)
- Submission Order (ordinal position in which the test was submitted)
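For readers who code their own ratings, the scheme above can be encoded for data entry with simple range checks. This is only a sketch; the field names below are our own shorthand, not labels from the study.

```python
# Sketch: encoding the rating scheme with range validation.
# Field names are hypothetical shorthand for the scales described above.
RATING_SCALES = {
    "overall_quality": (0, 4),     # 4 = highest quality
    "verbal_process":  (1, 5),     # 1 = very informational, 5 = not very
    "numeric_process": (1, 5),     # 1 = very informational, 5 = not very
    "density_pct":     (10, 100),  # deciles, 10-100%
    "organization":    (1, 5),     # 1 = very organized, 5 = very unorganized
}

def validate_rating(sheet):
    """Return the ratings dict, raising ValueError if any rating is out of range."""
    for scale, (lo, hi) in RATING_SCALES.items():
        value = sheet[scale]
        if not lo <= value <= hi:
            raise ValueError(f"{scale}={value} outside [{lo}, {hi}]")
    return sheet

example = validate_rating({
    "overall_quality": 3, "verbal_process": 2, "numeric_process": 4,
    "density_pct": 60, "organization": 2,
})
```

Dichotomous (Color) and ordinal (Submission Order) variables would be recorded alongside these scale ratings.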




Results

The analyses were based on students’ help sheets and test performance from a single test. Interrater reliability was computed for the two raters across the scales described above and ranged from moderate to high, from .521 (Organization) to .978 (Density).
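The article does not specify which reliability statistic was used; a common choice for two raters on continuous scales is the Pearson correlation between their ratings, sketched below with fabricated data for illustration only.

```python
# Sketch: interrater reliability as the Pearson correlation between two
# raters' scores on the same help sheets. Ratings below are made up;
# the article does not report raw data or name its reliability statistic.
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of ratings."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    varx = sum((a - mx) ** 2 for a in x)
    vary = sum((b - my) ** 2 for b in y)
    return cov / (varx * vary) ** 0.5

# Hypothetical Overall Quality ratings (0-4 scale) from two raters
rater1 = [4, 3, 2, 4, 1, 3, 2, 0, 4, 2]
rater2 = [4, 3, 3, 4, 1, 2, 2, 1, 4, 2]

print(f"Interrater reliability (Pearson r) = {pearson_r(rater1, rater2):.3f}")
```

An intraclass correlation would also be defensible here; Pearson r is shown only because it is the simplest two-rater index.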

Help sheet ratings from the two raters were averaged and then regressed against students’ test scores to determine which characteristics of the help sheet predicted test performance. Results showed that higher quality help sheets predicted higher test performance (b = 33.20, p < .001), as did lower density of information (b = -.35, p = .05). In addition, the verbal process scale was a significant predictor (b = 13.14, p < .01): higher ratings on that scale, which indicate less verbal process information, predicted higher performance. None of the other variables were related to performance (ps > .05).
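The flavor of this analysis, reduced to a single predictor, is an ordinary least-squares regression of test scores on averaged ratings. The data below are fabricated for illustration; only the method mirrors the study.

```python
# Sketch: simple OLS regression of test score on one help sheet rating.
# Quality and score values are hypothetical, not the study's data.
from statistics import mean

def ols_slope_intercept(x, y):
    """Least-squares slope (b) and intercept for y regressed on x."""
    mx, my = mean(x), mean(y)
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return b, my - b * mx

# Hypothetical averaged Overall Quality ratings (0-4) and test scores (0-100)
quality = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
score = [55, 60, 68, 70, 80, 85, 95]

b, a = ols_slope_intercept(quality, score)
print(f"Each one-point rise in rated quality predicts {b:.1f} more test points")
```

The study's actual analysis regressed scores on several predictors simultaneously, which requires a multiple-regression routine rather than this single-predictor formula.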

Discussion, Conclusion and Recommendations

Results of these preliminary analyses suggest that it is not enough to consider simply whether a student has access to a help sheet; rather, a careful examination of the help sheet content is required. Similar to Gharib et al. (2012), overall quality was found to be a very important characteristic of the help sheet: as overall quality increased, test scores also increased.

Density of information was also significantly related to performance. Although not the strongest effect, having less information on the help sheet predicted higher performance. This pattern is consistent with previous research (Visco et al., 2007) and may indicate that density of information is an inverse proxy for learning. That is, students who have a robust understanding of the material do not need to include as much information and so create a less dense help sheet; conversely, students who lack such understanding include as much information as possible to compensate, creating a high-density help sheet.

One surprising finding was that students who included more verbal process information, such as instructions on how to carry out particular procedures, scored lower than students who included less of it. Similar to the density argument above, it may be that these students included more verbal process information precisely because they were not comfortable completing such problems without it.

Finally, in examining the help sheet research, two issues stand out. First, help sheets do not facilitate student performance in courses involving mostly content knowledge, such as abnormal psychology (Hindman, 1980), developmental psychology (Dickson & Miller, 2005, 2006), or social psychology (Whitley, 1996). However, when a course involves more process than content knowledge, as in the current course and in other studies of statistics (Ludorf, 1994; Gharib et al., 2012) or engineering (Visco et al., 2007), students’ test performance appears to be related to help sheet content. Second, given the research showing that the content of a help sheet is related to test performance, we join Visco and colleagues in calling for instructors to become more involved in help sheet construction as a way to provide students of all abilities with a high-quality help sheet.


References

Boniface, D. (1985). Candidates’ use of notes and textbooks during an open-book examination. Educational Research, 27(3), 201-209.

Dickson, K. L., & Bauer, J. (2008). Do students learn course material during crib card construction? Teaching of Psychology, 35, 117-120.

Dickson, K. L., & Miller, M. D. (2005). Authorized crib cards do not improve exam performance. Teaching of Psychology, 32, 230–232.

Dickson, K. L., & Miller, M. D. (2006). Effect of crib card construction and use on exam performance. Teaching of Psychology, 33, 39–40.

Dorsel, T. N., & Cundiff, G. W. (1979). The cheat-sheet: Efficient coding device or indispensable crutch? Journal of Experimental Education, 48, 39–42.

Drake, V. K., Freed, P., & Hunter, J. M. (1998). Crib sheets or security blankets? Issues in Mental Health Nursing, 19, 291–300.

Erbe, B. (2007). Reducing test anxiety while increasing learning – The cheat sheet. College Teaching, 55(3), 96-97.

Funk, S. C., & Dickson, K. L. (2011). Crib card use during tests: Helpful or a crutch? Teaching of Psychology, 38, 114-117.

Gharib, A., Phillips, W., & Mathew, N. (2012). Cheat sheet or open-book? A comparison of the effects of exam types on performance, retention, and anxiety. Psychology Research, 2(8), 469-478.

Hindman, C. D. (1980). Crib notes in the classroom: Cheaters never win. Teaching of Psychology, 7, 166–168.

Ludorf, M. R. (1994). Student selected testing: A more sensitive evaluation of learning.  Paper presented to the American Psychological Society Institute on The Teaching of Psychology, Washington, DC.

Trigwell, K. (1987). The crib card examination system. Assessment and Evaluation in Higher Education, 12, 56–65.

Visco, D., Swaminathan, S., Zagumny, L., & Anthony, H. (2007). AC 2007-621: Interpreting student-constructed study guides. ASEE Annual Meeting and Exposition Proceedings, Honolulu, HI.

Whitley, B. E., Jr. (1996). Does “cheating” help? The effect of using authorized crib notes during examinations. College Student Journal, 30, 489–493.


Author Notes

Mark Ludorf is a cognitive psychologist who joined the faculty at Stephen F. Austin State University (SFA) in the fall of 1990 and is currently a full professor of psychology. He has served in university-wide administrative positions at two universities (SFA and Oakland University in Rochester, MI) and was an American Council on Education (ACE) Fellow in Academic Administration. Ludorf has been active in the use of technology in higher education; he has taught online since 2001 and has developed several online courses. His other academic interests are leadership and study abroad, and he has offered numerous study abroad programs in Italy. Ludorf currently serves as Senior Editor of the Journal of Leadership Studies. At SFA, he has been recognized as the Alumni Distinguished Professor and was awarded the SFA Foundation Faculty Achievement Award.

Sarah Clark was an undergraduate teaching assistant in statistics at Stephen F. Austin State University, where she completed her bachelor’s degree in psychology. She was also the 2013 recipient of the Jeff and Jackie Badders Award, which is given to the top graduating senior psychology major.
