This updated analysis of commercial contract cheating prevalence does not suggest an upward trend between 1990 and 2020. In addition, the analysis indicates that the rate of engagement in this behavior is persistently low, in the range of 2.5–3.5 percent of students self-reporting engaging in commercial contract cheating. A caveat, of course, as mentioned earlier, is that people tend to under-report socially undesirable behaviors. Still, there are two questions to consider: 1) if cut-and-paste plagiarism is trending down because of text-matching software, why is it not being replaced by commercial contract cheating, and 2) given the availability of commercial contract cheating services, why do relatively few students report engaging in this behavior?
Table 3 Trends in commercial contract cheating, 1990–2020

| Analysis | Number of studies | Number of students | Students contract cheating | % contract cheating | Spearman's correlation (year vs. % cheating) | p |
|---|---|---|---|---|---|---|
| Newton (2018) | 71 | 54,514 | 1,919 | 3.52 | 0.368 | .0016* |
| Newton (2018) plus studies 2016–2020 | 76 | 79,745 | 2,737 | 3.43 | 0.303 | .008* |
| Studies 1990–2020 | 71 | 78,354 | 2,712 | 3.46 | 0.206 | .085 |
| Studies 1990–2020, English-speaking countries | 54 | 65,843 | 1,620 | 2.46 | –0.135 | .332 |

Note: * = significant at p < .05
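To make the trend tests in Table 3 concrete: each analysis amounts to computing Spearman's rank correlation between study year and the percentage of students reporting contract cheating, with a positive coefficient indicating an upward trend. The short Python sketch below illustrates the calculation on invented study-level data; the years and percentages are hypothetical examples, not the data behind Table 3.

```python
# Illustrative sketch of a Spearman rank-correlation trend test.
# The year/percentage pairs below are invented for demonstration only.

def ranks(values):
    """Assign average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical studies: publication year vs. % reporting contract cheating.
years = [1992, 1996, 2001, 2005, 2009, 2013, 2017, 2020]
pct = [2.1, 3.8, 2.9, 3.5, 2.4, 3.9, 3.1, 2.7]
rho = spearman(years, pct)
print(round(rho, 3))  # → 0.143 (weak, non-significant upward trend)
```

A coefficient near zero, as in this invented example and in the 1990–2020 rows of Table 3, is consistent with no meaningful trend over time.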
Rundle et al. (2019, 2020) offer several answers to the questions above. First, regarding the possibility that commercial contract cheating has not replaced cut‐and‐paste plagiarism, they suggest that this outcome is consistent with research on “crime displacement”. Specifically, when crimes of a certain type, or in a certain place, are reduced, they are not replaced to the same extent by different crimes or by the same crimes occurring elsewhere (Cornish and Clarke, 1987). If this pattern of behavior occurs for academic misconduct, reducing one kind of behavior should not necessarily cause others to increase. Second, Rundle et al. (2019) found that students indicate that they mostly do not engage in commercial contract cheating because they perceive it as immoral, non‐normative, and undermining of their learning goals. In addition, Rundle et al. (2020) suggest that there are more, and stronger, practical and psychological barriers preventing contract cheating than other forms of unethical assessment behavior.
The extended analysis of Newton's (2018) results suggests that commercial contract cheating did not appear to trend upwards as a substitute for the other forms of cheating and plagiarism that trended downwards. Therefore, it is worth examining whether academic integrity interventions that have been implemented in the past 30 years may provide a better explanation for the downward trend in the prevalence of cheating and plagiarism.
STUDIES OF INTERVENTION‐RELATED CHANGE IN ACADEMIC MISCONDUCT
Numerous studies in the past 30 years have examined interventions designed to prevent or detect plagiarism and cheating. Importantly, there is substantial evidence that academic integrity interventions have proliferated over this time. For example, the text‐matching software Turnitin™ went from 1 million submissions in 2002 to 500 million in 2014 (Turnitin.com, 2020). Stoesz and Yudintseva (2018) identified 21 studies of educational interventions (workshops and tutorials) designed to improve academic integrity reported in published literature between 1995 and 2016, and many of these describe sustained or ongoing interventions created in the last 30 years.
Rather than duplicating existing analyses here, it is worth noting that two excellent recent reviews of the literature on the effectiveness of academic integrity interventions capture most of the intervention studies in the academic integrity literature of the past 30 years. In addition to Stoesz and Yudintseva's (2018) systematic review, mentioned above, Marusic et al. (2016) used the Cochrane methodology to assess interventions designed to promote research and publication integrity, which included many studies focused on reducing plagiarism. Additionally, three papers not cited in these reviews are particularly noteworthy, as they report studies that tracked academic integrity interventions over extended periods (Levine and Pazdernik, 2018; Owens and White, 2013; Perkins et al., 2020).
Academic integrity interventions in the past 30 years seem to come in one of four principal forms: 1) the implementation of honor codes designed to crystalize a shared understanding of acceptable behavior and influence students’ attitudes regarding plagiarism and cheating, 2) educational modules (classes, tutorials, online activities) designed to educate students about academic integrity and/or appropriate citation practices, 3) the use of text‐matching software, often accompanied by education to help students understand text‐matching reports and interpret differences between matched text and plagiarism, and 4) some combination of the above.
Although there has been strong advocacy for the use of honor codes to improve academic integrity, the evidence for their impact is often of limited quality. Studies reported by McCabe et al. (2002), for example, typically show a correlation between the use of honor codes and the prevalence of self‐reported plagiarism and cheating. However, any apparent impact of honor codes on rates of cheating may be a problem of self‐selection bias rather than an effect of the codes themselves (McCabe, 2016). In other words, institutions with honor codes may attract non‐cheating students or encourage under‐reporting of cheating by students. Still, there is some longitudinal evidence (i.e. with pre‐test and post‐test measures) that indicates that implementing honor codes makes students view cheating more negatively (e.g. Raman and Ramlogan, 2020). In addition, honor codes set expectations about standards of acceptable behavior, and studies indicate that academic integrity standards influence students’ misconduct behavior (e.g. Curtis et al., 2018). However, asking students to pledge to be honest before submitting assignments may not be enough to reduce misconduct. Evidence that simply making an ethical pledge before submitting work reduces cheating (Shu et al., 2012) has recently been found to have been based on fabricated data (Baskin, 2021). The best evidence for honor codes reducing cheating seems to be when students are regularly reminded about the codes (Tatum and Schwartz, 2017).
The evidence from the reviews by Marusic et al. (2016) and Stoesz and Yudintseva (2018) indicates that training in citation skills and paraphrasing is generally helpful, although the effects are modest. Marusic et al. (2016), in particular, concluded that training involving practical exercises and text‐matching software showed the most promise in reducing plagiarism (e.g. Barrett and Malcolm, 2006; Batane, 2010; Rolfe, 2011). Stoesz and Yudintseva (2018) concur that educational interventions, which may be automated and delivered online (e.g. Belter and Pré, 2009; Curtis et al., 2013), may be enhanced with hands‐on, in‐class experiences. These reviews suggest that anti‐cheating educational interventions may improve students’ attitudes toward integrity, not just their skills, thus having a similar effect to honor codes. However, the reviews generally