Generative Artificial Intelligence (AI) models, such as ChatGPT, have sparked debates and moral panic since their introduction in November 2022. Concerns have been raised about the impact of generative AI on the integrity of creative and academic work, specifically its ability to generate human-like texts and images.
ChatGPT is a generative AI model powered by machine learning. It produces responses that resemble human language by recognizing patterns in data. While it may seem like the model is engaged in natural conversation, it relies on extensive datasets to generate coherent answers.
Higher education is one sector where the rise of AI tools like ChatGPT has raised concerns about ethics and the integrity of teaching, learning, and knowledge production.
We are a group of media and communication academics teaching at South African universities. We conducted an online survey among undergraduate students from five South African universities: the University of Cape Town, Cape Peninsula University of Technology, Stellenbosch University, Rhodes University, and the University of the Witwatersrand. Our aim was to understand how students utilize generative AI and AI tools in their academic practices.
The results indicate that the moral panic surrounding the use of generative AI is unfounded. Students are not overly focused on ChatGPT specifically. We found that students often use generative AI tools for active learning and have a critical and nuanced understanding of these tools.
However, what might raise more concerns in terms of teaching and learning is the students’ use of AI tools to generate ideas for assignments or essays when they are stuck on a particular topic.
Breaking Down the Data
The survey was completed by 1,471 students, the majority of whom spoke English as their first language, followed by isiXhosa and isiZulu. Most respondents were first-year students enrolled in Humanities, Natural Sciences, Education, and Economics faculties. While the sample skewed towards first-year Humanities students, the survey provides valuable indicative findings for researchers exploring new terrain in the educational process.
We asked students if they had used specific AI tools, citing some of the most popular tools across various categories. Our survey did not explore lecturers’ attitudes or policies regarding AI tools. That will be investigated in the next phase of our research, which will include focus groups with students and interviews with lecturers. Our research does not specifically focus on ChatGPT, although we did ask students about their use of this particular tool. We explored the widespread use of AI technologies to gain insights into how students utilize these tools, which tools they use, and where ChatGPT fits into these practices.
Here are the key findings:
– 41% of respondents primarily use laptops for their work, followed closely by smartphones (29.8%). Only 10.5% use desktop computers, and 6.6% use tablets.
– Students had typically used various other AI tools before ChatGPT, including translation tools and reference tools. Regarding writing assistants like Quillbot, 46.5% of respondents stated that they use such tools to improve their writing style for assignments, and 80.5% indicated that they have used tools like Grammarly to help them write in standard English.
– Fewer than half of the respondents (37.3%) said they had used ChatGPT to answer essay questions.
– Students acknowledged that AI tools could enable plagiarism or undermine their learning. However, they also said that they do not use these tools in problematic ways.
– Respondents were highly positive about the potential of digital and AI tools to facilitate their progress through university. They mentioned that these tools can help with clarifying academic concepts, generating ideas, structuring essays, improving academic writing, saving time, checking spelling and grammar, understanding task instructions, finding information or academic sources, summarizing academic texts, assisting non-native English speakers in enhancing their academic writing, studying for exams, paraphrasing effectively, avoiding plagiarism, and citing sources correctly.
– Most students who found these tools helpful in the learning process used ChatGPT and similar tools to clarify concepts from their studies that they did not fully understand or that they felt were inadequately explained by lecturers.
We were particularly interested in discovering that students frequently use generative AI tools for engaged learning. This educational approach places responsibility on students for their own learning. They actively develop thinking and learning skills, strategies, and the formulation of new ideas and understanding through conversation and collaborative work.
Through their use of AI tools, students can personalize content to address their specific strengths and weaknesses, resulting in a more engaging learning experience. AI tools can also serve as personalized online “tutors” with whom students can “converse” to help them understand difficult concepts.
Concerns about how AI tools might undermine academic grading and integrity are valid. However, those working in higher education must incorporate student perspectives in charting new approaches to assessment and learning.
This article was written by Marenet Jordaan, Admire Mare, Job Mwaura, Sisanda Nkoala, Alette Schoon, and Alexia Smit.
Tanja Bosch, Professor of Media Studies and Production, University of Cape Town, and Chikezie E. Uzuegbunam, Lecturer & MA Program Coordinator, Rhodes University.
This article was originally published on The Conversation and is republished under a Creative Commons license. Read the original article.