Is ChatGPT a Game Changer?
Academic integrity has been a cornerstone of education systems from K-12 to post-secondary, but this principle faces a new challenge with the recent emergence of the general-purpose AI chatbot ChatGPT.
ChatGPT is a highly popular AI chatbot that uses Natural Language Processing (NLP) models to generate human-like text for a large variety of applications. This AI technology makes use of training data from sources like websites, textbooks, and articles to complete a wide range of prompts, such as writing articles, stories, and poetry, translating text, and writing or debugging code. A recent survey of 1,000 U.S. college students indicates that ChatGPT is being rapidly adopted: 30% of surveyed students have used it on written homework (and 60% of this group use it on the majority of their assignments). The survey also found that the majority of these students believed using it was cheating but continued to use it anyway.
While the underlying technology is complex, ChatGPT is notable for being highly accessible (users can sign up for free in minutes, and no training is required) and remarkably capable in various settings. For example, the chatbot has passed several different tests required for professional degrees: users have found that it could pass a final MBA exam (with a B to B- grade), U.S. medical licensing exams, and law exams. However, while researchers have been impressed by the power and versatility of this tool, these tests also highlight the weaknesses of this AI software; indeed, it can be challenged by mistakes in “relatively simple calculations at the level of 6th grade Math. These mistakes can be massive in magnitude.” Furthermore, its knowledge is limited to information from 2021 and earlier rather than being continuously updated.
While education systems have needed to be aware of the risk of plagiarism (such as the uncredited use of someone else’s work or ideas) or the risk of students using electronic devices to cheat during testing, the rapid adoption and popularity of OpenAI’s ChatGPT program seem to be far more disruptive.
As a result of the popularity of ChatGPT (and the potential for even more powerful tools in the future), schools and universities are quickly moving to develop policies that address the concerns of teachers, administrators, and parents. Some teachers and professors worry about the ease of generating content for exams or writing assignments that can bypass existing software solutions designed to catch cheating (such as Turnitin). Notably, America’s largest school district, in New York City, has blocked student and teacher access to ChatGPT on its networks and devices due to “negative impacts on student learning, and concerns regarding the safety and accuracy of content.” In Canada, the Quebec Ministry of Education and the Hamilton-Wentworth District School Board in Ontario are assessing the effects of this AI technology, while some universities and school districts in Ontario are taking a “wait-and-see” approach or offering training for staff on the use of the technology.
As with all technologies that can be used in harmful ways, there is a cat-and-mouse game of developing new solutions or deterrents. One example is GPT Zero, an app developed by the Canadian computer science student Edward Tian to quickly and efficiently identify whether a text was written by a machine or a human. Based on the popularity of this detection app, Tian is developing a tool specifically designed for educators; 33,000 individuals have already signed up for its waitlist. OpenAI, the maker of ChatGPT, has also tried to address concerns related to cheating by introducing a new tool called AI Text Classifier that can help teachers detect whether passages were written by humans or AI. However, OpenAI notes that its AI Text Classifier has limitations and will be imperfect: it may generate false positives, its detection is unreliable on short texts (fewer than 1,000 characters), and it is recommended only for English text. Furthermore, like many AI technologies, the inner workings of these AI identification tools are often opaque, and it is unclear how their determinations are made.
Conversely, some educators argue that there are also potential benefits to this new technology. Like other tools, it could be used for positive purposes, such as improving student writing or encouraging education systems to develop better ways of measuring student learning beyond the recall of facts or basic knowledge. Indeed, ICTC research has highlighted emerging opportunities for technology to improve student learning, whether in the classroom or in remote and hybrid settings. Some teachers have suggested that they have a responsibility to teach their students how to use ChatGPT and, further, that learning outcomes should evolve so that they are less easily achieved by a chatbot. ChatGPT in the classroom may accelerate prior efforts in education systems to develop a “focus on creativity, voice, and process—things the bot can’t imitate.” This requires a renewed emphasis on projects or assessments that are not easily AI-generated, such as oral presentations, essays on personal lived experiences, or local community-based projects. Other education experts have encouraged the use of group assignments to assess teamwork and collaboration skills (a human strength) or performance-based assessments where students can demonstrate understanding through “hands-on activities or projects such as science experiments, art projects, or mock trials.” Further ideas include replacing traditional writing assignments with podcast production, student debates, or conducting interviews to demonstrate knowledge of the material.
Integrating chatbot tools in the classroom may also prepare students for tasks in their future workplaces as opportunities emerge for AI as a co-pilot or collaborator to assist in writing, searching large volumes of text to find the most relevant passages, or refining ideas.
Chatbot tools are not only for students: teachers might have the opportunity to use AI technologies for customized teaching, guiding students and tailoring lessons to match student needs and interests at their own pace. ChatGPT can be used as a coach to help students in specific areas, such as grammar usage. Rapid customization could be a boon, given the challenges teachers face in helping students who struggle in some academic areas and need individual attention. For example, AI can already be used to help guide students to improve their draft assignments before a final assessment; one example is OpenEssayist, an intelligent linguistic analytics tool that can generate feedback on essays.
Perhaps AI can be used to help find solutions for the challenges it creates. When queried for ways to evaluate student learning (while maintaining academic integrity), ChatGPT provided this response:
While ChatGPT can be a valuable tool for learners, it is important to evaluate student learning and maintain academic integrity in the following ways:
Students have always had shortcuts available to them, but the continued advancements in digital technologies have forced educators to consider new and unexpected impacts. In previous eras, there were fears that calculators would prevent students from learning arithmetic or that using Wikipedia and other internet resources would lead to widespread plagiarism and the inability to look critically at information sources. Yet, the education system ultimately adapted to the positive and negative aspects of these tools to bolster student learning. Ultimately, educators at all levels will need to carefully consider how to best manage ChatGPT and other emerging AI applications. Capable AI technologies are available now, and students will need to understand their future impacts on the workplace and society.