How AI overdependence hinders critical thinking skills

What you need to know:

  • Experts warn that the trend not only undermines the educational process but also raises significant ethical concerns.

In an era where technology permeates every aspect of life, artificial intelligence (AI) stands at the forefront of innovation.

From enhancing business operations to transforming healthcare, AI has made significant strides. In the realm of education, AI offers an abundance of benefits, ranging from personalised learning experiences to instant access to vast amounts of information.

However, as with any powerful tool, AI is vulnerable to misuse. Alarmingly, university students are increasingly relying on AI tools to think for them, particularly when it comes to completing assignments.

This trend not only undermines the educational process but also raises significant ethical concerns.

Educational software, online learning platforms, and even administrative processes have been integrating AI for years. Tools like ChatGPT, Grammarly, Turnitin, and various learning management systems have become staples in many educational institutions.

These tools assist students in improving their writing, checking for plagiarism, and managing their coursework more efficiently. However, more sophisticated AI tools, like OpenAI's GPT-4, have brought about new challenges.

Mr Jackson Nzira, an assistant lecturer at the University of Dar es Salaam (UDSM), says AI has the potential to revolutionise education by providing personalised learning and aiding in the management of educational content. However, he cautions that its misuse can lead to “a low quality of education if students rely on it to do their work.”

One of the most significant issues with AI tools is the temptation they present. Mr Alfani Mduge, a lecturer at Saint Augustine University of Tanzania (SAUT), shares that university students, often under immense pressure to meet deadlines and achieve high grades, may resort to using AI to complete their assignments.

“Instead of engaging deeply with the material, these students command AI tools to generate essays, solve problems, and even conduct research. While this might seem like a quick fix, it significantly undermines the learning process,” he says.

His counterpart from SAUT, Ms Zabibu Idrissa, notes: "I have seen students submit assignments that are generated by AI. They lack the depth and critical analysis that we expect from university-level work. This is a growing problem that we need to address."

GPT-4, a state-of-the-art AI language model developed by OpenAI, can generate human-like text based on the prompts it receives.

While it is an excellent tool for various applications, its misuse in academic settings is alarming. Students can input assignment prompts and receive well-structured, coherent essays in return. This bypasses the need for critical engagement with the material, as students rely on the AI to do the thinking for them.

"AI tools can be really helpful if used correctly. But it's easy to fall into the trap of letting the AI do all the work. I've seen classmates submit AI-generated essays and get good grades, but they don't learn anything from the process," says Mary, a second-year student at the University of Dar es Salaam.

Aireth Sumuni, a final-year student at the Institute of Finance Management (IFM), admits there is a lot of pressure to succeed, and sometimes using AI feels like the only way to keep up. “I know it's not right. I've tried to avoid it and focus on doing my work, but it's tough.”

Aireth says, "I was curious about GPT-4 and decided to use it for one of my assignments. It felt like cheating, and I didn't get the satisfaction of understanding the material, but I continued to use it because it helped me get good grades with less tension and pressure."

Critical thinking is a fundamental skill that education aims to develop. It involves analysing information, evaluating evidence, and constructing well-reasoned arguments. When students rely on AI to complete their assignments, they miss out on these crucial opportunities for intellectual growth.

Over time, this can lead to a decline in their ability to think critically, solve complex problems, and engage in independent learning.

Dr Benard Mnzava, a lecturer at IFM, emphasises that critical thinking is not just about getting the right answers. He says that it's about understanding the process, questioning assumptions, and developing the ability to think independently. He notes that by using AI to bypass this process, students are doing themselves a disservice.

The misuse of AI tools raises several ethical issues. Firstly, it constitutes academic dishonesty. Universities have strict policies against plagiarism and cheating, and using AI to complete assignments falls into both categories.

Furthermore, it creates an uneven playing field. Students who misuse AI tools gain an unfair advantage over their peers who put in the effort to complete their work honestly. This undermines the integrity of the academic system.

"Academic integrity is the cornerstone of higher education. When students use AI to cheat, they are not just breaking the rules; they are undermining the value of their education and society's trust in our institutions," asserts Dr Mnzava.

Many educators are aware of the misuse of AI tools and are concerned about the implications.

Dr Catherine Mushi, a lecturer at Ardhi University (ARU), says AI tools are meant to assist students, not replace their thinking. When students use these tools to complete assignments without engaging with the material, they cheat themselves out of a valuable learning experience.

The lecturer advocates integrating AI literacy into the curriculum to teach students how to use these tools ethically and responsibly.

Similarly, Mr Mduge, the lecturer at SAUT, suggests teaching students how to use AI as a tool for learning, not as a crutch. "This means incorporating AI literacy into our courses and discussing the ethical implications of its use." He says addressing the problem also requires lecturers to redesign assignments so that they are harder to complete with AI alone.

“This might include incorporating more in-class activities, oral presentations, and reflective essays that require personal engagement with the material,” he says.

Rebecca Malimo, an assistant lecturer at ARU, points out that just as AI can be used to generate content, it can also be used to detect its misuse. Tools like Turnitin are evolving to identify AI-generated text, helping educators spot instances of academic dishonesty.

“Encouraging students to explore topics that genuinely interest them can reduce the temptation to misuse AI. When students are passionate about their work, they are more likely to engage deeply and produce original content,” she shares.

She adds that institutions also have a critical role to play in addressing the misuse of AI tools.

“They need to set clear policies regarding the use of AI in academic work and ensure that students understand these policies. Institutions should invest in resources and support systems to help students manage their workload without resorting to unethical practices,” she says.

Mr Nzira, the assistant lecturer at UDSM, notes that "we need to create an environment where students feel supported and are encouraged to engage deeply with their studies. This means providing resources like counselling, study skills workshops, and academic support services."

He adds: "AI is a powerful tool that can transform education, but it must be used wisely. We need to educate both students and faculty about its potential and its pitfalls and work together to create a future where AI supports rather than detracts from learning."