Texas Tech puts resources and tools in the hands of faculty members as AI and its applications continue to transform higher education.
(This story is part of a series looking at examples of innovation on the Texas Tech campus. This installment looks at the university's engagement with AI.)
Artificial intelligence (AI) could be the best thing or the worst thing ever to happen to higher education. Perspective is everything, but information is power, and Texas Tech University is working to make sure its faculty are equipped and ready for whatever might happen.
“What I would say is AI is going to shift the landscape of higher education,” said Suzanne Tapp, assistant vice provost of faculty success and managing director of the Teaching, Learning, and Professional Development Center on campus. “Ready or not, it's here, and it is going to change the way we interact with our students.”
Armed with that sobering and rapidly changing truth, Tapp also serves the university as co-chair of the Resource and Guidelines Committee alongside Matt Gregory, Dean of Students and Vice Provost for Student Life. The committee is a multidisciplinary group grappling with a handful of priorities, including providing AI guidance to faculty in real time.
“This is a really broad group, and we are looking at AI from multiple perspectives,” she said. “One of the things we're seeing is faculty who are either really focused on the cheating and academic misconduct component or faculty who are interested in incorporating it into their classrooms.”
Tapp understands both positions. There is a worry that, left to their own devices, students would rely exclusively on AI to generate their assignments. At the same time, AI offers exciting possibilities and, implemented judiciously, can amplify the educational mission of Texas Tech.
“I think our faculty have very sound reasons for either perspective,” she said. “On the one hand, there is a concern that our students will not gain foundation-level knowledge, and the ability to write is so important.”
Underscoring the worry is another inconvenient reality: detecting whether work is AI-generated is virtually impossible.
“One problem is there is not a verifiable way to detect AI use at this point,” Tapp said. “Anyone saying they have a tool that can tell whether content is AI- or human-generated wants to make some money. Maybe that is something that will come, and there will be interesting strategies ahead of us.”
Tapp explained there are several reasons AI work cannot be reliably detected, including biases against non-native English speakers that can generate false positive results. Work is ongoing in the area, though, with institutions such as Stanford University recently announcing beta testing on an AI-detection project and vendors racing to find solutions.
For now, instructors still have options if they have suspicions about the originality and authenticity of student work.
“If a faculty member had questions about certain parts of a writing assignment, they could ask the student about it and say something like, ‘Tell me more about this' and use oral interviews and testing to assess competence,” Tapp said.
“That's a challenge for faculty members with really large sections. It's not a perfect strategy because it's time-consuming, but it's an interesting way to ask students to prove their knowledge.”
While academic corner-cutting is a legitimate concern, Tapp's group is investigating ways that faculty members can help their students engage with AI, because developing critical thinking skills is also an obligation of the higher education community.
“The committee's perspective leans toward incorporating AI and thinking about its future,” she said. “It is our responsibility to teach students how to critically use AI and how to think about it ethically. In other words, let's look at some of the concerns that come with AI, and let's get our students to ask questions about those concerns.”
For example, in a class last summer, Tapp asked students how many were familiar with AI. Half were and half weren't.
“And the half that didn't know were scared,” she said. “That is because there was some misinformation among students creating fear, so let's take on that fear and misinformation and break it all down.”
From there, Tapp shifted direction, walking students through a presentation on AI and then leading them into a discussion of a previous class topic. The students asked AI to explore the topic, and the class went through the results together.
“We began by putting on a critical lens,” she said. “We looked at the information, checked it for accuracy and bias and asked where it originated. The purpose was to get them to evaluate it critically.
“It's a great tool that faculty members can leverage in a meaningful way. But there is also an ethical piece that has to be explained.”
Tapp said numerous faculty members have been creative in their approach to AI, giving students the opportunity to explore its contours and come away with a deeper understanding of what it can do and what it shouldn't be asked to do.
“We are seeing really creative, really interesting things happening on campus in relation to AI,” she said. “One of my favorite things is seeing faculty add reflection into their assignments. They look at what ChatGPT generated and then they ask students to comment on the content and evaluate it.”
Because the abilities and applications of AI are rapidly expanding, it's difficult to say how large a disruption it will wind up being for higher education.
“I've heard it compared to the calculator,” Tapp said. “But I think you could compare it to the internet itself, as it became a tool more people had access to and became more commonly used. People would wonder about the credibility of a website, and we had to teach our students how to know what was credible, what was a scam and what came from a biased point of view. Some of the questions we asked then, we're asking now.”
The committee began meeting last year, gathering resources and information as breakthroughs in AI gave it a momentum all its own.
“We realized quickly that we were going to have to decide which side we wanted to come down on,” she said. “Were we going to try and beat this or were we going to try and use it, and we decided we were not going down the detection path. That was one of our first decisions.”
Instead, the emphasis shifted to being proactive. The committee put together workshops and webinars, trying to get as much information into the hands of as many members of the campus community as possible.
“There has almost been a crowdsourcing feel to it,” she said. “Like, ‘Here is what we're seeing or what we're hearing from students. What are you seeing and hearing?' We've received really good engagement with the approach.”
Tapp said the University Library has been a repository of information, hosting a variety of related resources on its webpage with a consistent focus on demonstrating how AI can be a tool for faculty members in teaching and research.
The group's work culminated in some ways as it developed several AI-related syllabus statements for faculty to use as the fall semester began.
“Essentially, we created a framework of possibilities as recommended statements that faculty could include in their syllabi,” she said. “The university is not in a place where we're going to have one statement that represents everyone, so we came up with four variations for faculty to choose from, based on their teaching philosophy and learning goals for their class.”
The broadest options range from allowing AI for everything in a class to prohibiting it entirely. In between are two other possibilities: use AI only for specific assignments, or use it but cite and fully disclose the extent to which it was used.
“We're not unique in our perspective,” she said. “A number of schools have taken a similar approach.”
Tapp said the flexibility was also necessary because Texas Tech has such a wide variety of disciplines represented on campus.
“I think when people initially thought about higher education and AI, they thought writing,” she said, “but there are so many applications. For example, coding is a huge component to think about. In computer science, learning how to code is foundational.
“In chemistry, we have a faculty member who has ChatGPT in class and poses questions, so it's almost like another student. The class can see how the questions are handled and see if there are flaws in the logic, which is an interesting approach.”
“I think one of the most important things is realizing this conversation about AI is ongoing,” she said. “The goal of higher education is to prepare citizens, so what we have to do is figure out how to help students think critically and reflectively about AI and what it is, and I know this: Texas Tech will not be left behind.”