Cars. Toasters. Typewriters.
These technologies were all created for the same purpose: to make human life and daily tasks more efficient. Technology has been built to better serve human use, which has ultimately normalized its place in society.
On Nov. 30, 2022, ChatGPT was released to the general public. The program is a form of artificial intelligence (AI) built on a large language model and can assist a user in creating well-written and formatted text. Microsoft has invested in OpenAI, the maker of ChatGPT, in what the company called a “multi-billion dollar investment,” according to the Associated Press.
Due to their high level of intelligence, ChatGPT and other AI programs have raised questions about how these technologies will affect the way people live and work. In education specifically, the question is how AI will change the way students learn and educators teach.
Students can use ChatGPT to generate ideas, formatting and text for their writing. Andrea Wolfe, assistant teaching professor of English, said she will allow her students to use ChatGPT as long as they are transparent about it.
“I know ChatGPT exists, and I know that we want to learn how to use it,” Wolfe said. “It’s probably going to become important to have these skills of using it in the professional world. So, what I’m asking students to do is if [they] choose to use it…I want it in the header. I want to know from the beginning [of the paper] that [AI is being used].”
Wolfe assigns her ENG 114 students a research paper during the course. Students spend over half of the semester delving into a specific topic of their choice. She said she predicts some students will want to use ChatGPT to assist them in writing this paper, and other students will stick to their established writing process without using AI.
Because Wolfe’s research paper assignment is so extensive, she said she doubts ChatGPT will be able to write a large portion of a student's paper that meets her criteria. However, she does believe this type of AI will allow her to home in on what her students may need the most help with in their writing.
“Someone who is using [ChatGPT], their phrasing is probably going to be a little bit better, and their grammar might be a little bit better,” Wolfe said. “But they may not have incorporated their quotes as seamlessly because they are going to have to pull in quotes. I can pay attention to … how [they used] what the AI generated, and add and revise in order to meet the demands of the assignment. I can focus on helping them in a different way.”
Wolfe said she doesn’t have many concerns regarding AI and is excited to use it in her classes.
Because of these benefits of AI, Huali Fu, assistant professor of art animation, said he wishes more programs were available for his students to use. Fu teaches 3D animation, which deals specifically with 3D modeling. He said AI has been integrated into many areas of animation, such as 2D animation, but not into 3D modeling.
Fu said that, in general, AI can significantly reduce the amount of time spent creating an animation. He said this is both beneficial and harmful: beneficial to experienced animators, who won’t have to spend hours on something AI could do in minutes, but harmful to new animators, because AI can produce work of roughly the same quality as theirs. This can significantly hinder opportunities for new animators.
“When you are first starting out as an artist, the things that you’re going to get to do are not probably going to be fantastic,” Fu said. “[In 2D video games] they have certain 2D images that your head artist is not going to do … that’s going to someone who’s lower down the food chain. For younger artists starting out, the AI is quite detrimental to them because the AI can do close to the same quality of work they could do.”
In 2D animation, Fu said, anyone could type dimensions and a description of a specific scene into an AI program and get a simple version of that image. He said this allows animators to take simple AI-generated pieces and refine and add detail to them. Fu said this will allow animators to work more efficiently.
“AI is a tool for someone who is mid-career or a senior artist, and it will make them much more productive,” Fu said. “As a senior artist, you have something that is 75% [AI generated] … [then] you manipulate it, you can change it to the way that you want, and you’re done … In business, time is money. The faster you’re able to turn around things, the faster you’ll be able to be more productive.”
In 3D animation, Fu said there are “tedious technical aspects” he wishes AI programs could assist with. AI programs are being developed for some of these areas of animation, but Fu said what they produce is messy and not yet usable. He said he will not be integrating AI into his classrooms simply because the tools are not available yet.
With the introduction of ChatGPT last year, staff in the Office of the Vice Provost for Academic Affairs had conversations in spring 2023 about how to address AI usage on campus. Dr. Kristen McCauliff, associate provost for faculty affairs and professional development, said her office has already dealt with unethical use of ChatGPT.
However, ChatGPT wasn’t the first AI program to stir the pot in education. McCauliff said her office has had experience in dealing with other AI programs for years.
“We have [always been] pretty familiar with artificial intelligence and the way they can function both productively and unethically,” McCauliff said. “We’ve [always been] using our ethics policy to allow faculty to punish students who used what is called in the policy, ‘unauthorized materials.’ If a student uses any unauthorized materials to complete an assignment, they can be charged with academic dishonesty.”
While the use of AI like ChatGPT can be abused, McCauliff recognized how it can be used as a tool in classrooms. The Office of the Vice Provost for Academic Affairs provides an academic integrity page with links to resources that help faculty members learn how to use ChatGPT and how to address it in their syllabi. The office allows individual faculty to decide whether or not to permit the use of AI in their courses. McCauliff said she understands where ChatGPT may be more beneficial than harmful.
Mike Gillilan, director of student conduct, agrees with McCauliff that AI can be used as a tool. He said the code of conduct has not been changed to specifically address AI, mostly because such a change is unnecessary.
“The reason we haven’t made any changes [to the code of conduct] is [because] ChatGPT doesn’t change what cheating is [and] it doesn’t change what plagiarism is,” Gillilan said. “The code of conduct and [the] policies are designed to be flexible and incorporate a number of things. Copying or receiving assistance that is not authorized [is already in the code of conduct.]”
Gillilan said the code of conduct purposely uses broad phrasing to make individuals aware that not everything deemed unethical will be specifically written into the code, which is why AI is not explicitly addressed.
The consensus among these faculty members is that AI like ChatGPT should be used in some circumstances in courses because it will eventually become part of the careers current students will enter.
“The students that will come in future years will just be more and more familiar with [AI],” McCauliff said. “We as faculty have to become more comfortable with it too … once we become more comfortable with this technology, we’re going to realize that it helps us … my faculty colleagues will realize how much it contributes to our own workplace.”
Contact Maya Kim at mayabeth.kim@bsu.edu or on X @MayaKim03