Could AI-Powered Tools Such as ChatGPT Provide Students With an Unfair Advantage?

With the rapid expansion of AI-powered tools that enhance existing human abilities, many are worrying about their effects on education. In short, could AI-powered tools such as ChatGPT be used by students to cheat? This is the question being asked by the Financial Times. As we’ve seen over the last few weeks, there is already pushback against AI technology within the art community, where artists argue not only that artificial intelligence gives non-artists an unfair advantage, but that the technology also steals from artists because it learns from datasets built on existing art.

The Financial Times is looking into programs such as ChatGPT and how the rapid advancement of AI could rock the foundations of the academic world. First, what is ChatGPT? For those unfamiliar with the program, it was created by OpenAI. It is a large language model trained on a massive corpus of text, including books, and can produce coherent replies to questions by repeatedly predicting the most likely next words. That said, ChatGPT is often incorrect when it comes to specifics and still requires plenty of oversight. Even so, it has proven to be an effective means of writing out large sections of text.
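For readers curious what this kind of predictive text generation looks like in practice, here is a minimal sketch using the open-source Hugging Face transformers library with GPT-2, a much smaller, openly available predecessor model. It is purely illustrative: ChatGPT itself is proprietary, far larger, and further tuned with human feedback, but the underlying principle of next-token prediction is the same.

```python
# Minimal illustration of next-token prediction with a small open-source
# language model (GPT-2). ChatGPT operates on the same basic principle at a
# vastly larger scale, with additional fine-tuning from human feedback.
from transformers import pipeline

# Load a text-generation pipeline backed by GPT-2.
generator = pipeline("text-generation", model="gpt2")

prompt = "The causes of World War I include"

# The model repeatedly predicts a likely next token given the text so far,
# extending the prompt into a continuation.
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1)

print(outputs[0]["generated_text"])
```

The output from a model this small will be far less coherent than ChatGPT's, which is part of why the larger model's human-like fluency has caught educators off guard.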

In recent weeks, the program has swelled in popularity thanks to its ability to write in a human-like manner. Many find the text it produces quite convincing, and it’s difficult to discern whether the writer was human or AI. Because of its effectiveness at turning prompts into finished prose, academics and higher-education professionals are sounding the alarm: for students, it could become the go-to tool of choice. But the technology to detect this isn’t sitting on the sidelines. Software such as Turnitin, which is used by over 16,000 school systems across the globe, is working to improve its ability to detect AI-assisted writing. According to Annie Chechitelli, Chief Product Officer at Turnitin, the company is developing tools to help educators find evidence of assistance from AI programs such as ChatGPT.

Even companies specializing in plagiarism detection, however, are wary of starting a so-called “arms race” against artificial intelligence. One reason for calm: according to a 2020 Rutgers University study, most students who rely on such tools tend to do poorly on exams anyway. Kay Firth-Butterfield, Head of Artificial Intelligence at the World Economic Forum in Davos, also urges calm when it comes to AI: “Students are not going to be getting automatic As by submitting AI-generated content; it is more of a workhorse than Einstein.”

Though many professionals within the AI community are unwilling to sound the alarm, others are less sanguine and point to the rapid advancements in artificial intelligence during 2022 as grounds for their concern.

ODSC Team

ODSC gathers the attendees, presenters, and companies that are shaping the present and future of data science and AI. ODSC hosts one of the largest gatherings of professional data scientists, with major conferences in the USA, Europe, and Asia.
