Could AI be the next college teaching assistant? Some Colorado professors believe so

Student Ashley Stafford wears a blindfold and tries to upload an image into Copilot, an AI tool, on April 15, 2024, at the University of Colorado Boulder. Stafford said the VoiceOver function was 'annoying' because they would accidentally tab past the command they needed. (Emma VandenEinde / KUNC)

It’s a calm afternoon in the computer lab-style classroom at the University of Northern Colorado (UNC). Junior Vivian Johnson is busy typing ideas into her computer. Today’s assignment is all about adapting beloved children’s stories using ChatGPT. Johnson asked the AI chatbot to rewrite “The Cat in the Hat” in a science fiction style.

“It switched it to, like, the hats gave them magic powers and then strange things started happening in the town,” she said. “And they found out it was because of this mad scientist that was trying to get powers from the plants and animals.”

But the exercise is not just about writing silly stories. The driving question is, can AI be used as a tool – just like essays and class projects – to teach children’s literature?

The elementary education students continue to refine the stories by adding more vocabulary words or making them readable at a first-grade level. It’s fun and educational.

Holly Hyman (middle) works with her group partners on an AI class assignment at the University of Northern Colorado. At this point in the activity, the class was asked to look at children's books and see how they could improve their stories using ChatGPT. (Emma VandenEinde / KUNC)

For example, student Holly Hyman worked with her group partners to ask ChatGPT to rewrite their story with more of a focus on their main character’s family history.

“Despite the love and warmth of her adoptive family, Lily often found herself wondering about her past. Who were her parents? Why did they leave her behind?” she read from the program. “Hey, there you go, that’s some character development!”

Johnson said she already knew what ChatGPT was, but she had never viewed AI as a teaching tool. Now, it’s a whole new discovery. She hopes to use the technology to create content for second-language learners in her future classroom.

“I love this class, it’s just so interesting,” Johnson said. “The buzzword ‘AI’ is really scary to some people, and I'll admit that I was even kind of that way. But I've noticed a lot more on the benefits of AI…I feel like we get a lot of good strategies out of this.”

Artificial intelligence has been used to edit photos, enhance home appliances and more. Now it’s being widely introduced in the classroom. The storytelling exercise is one of many class activities designed to show these future teachers what innovative things AI can do.

“They can have their students…analyze those stories,” said Matthew Farber, professor of a UNC class called "Integration of Technology into Content and Pedagogy." “Then they get to work closer to the evaluation of narrative and literary devices through using AI.”

As AI seeps into the classroom, it hasn’t come without critique. Universities are enacting policies on using the technology to write papers. Some teachers ban it altogether. Even on the student side, more than half of 1,000 surveyed college students believe using AI is cheating.

But Farber believes it won’t replace teachers or their work. He compared AI to computers and the initial fear that they would become teaching machines. He referenced how Seymour Papert, the creator of the Logo programming language and tools that work with Lego robotics, helped change the perspective on computers.

Matthew Farber, professor of a class called "Integration of Technology into Content and Pedagogy" at UNC, stands in front of the class and leads a discussion of the 'hackathon' his students participated in with children's literature. He wants them to consider what makes a good prompt for the AI software to work with. (Emma VandenEinde / KUNC)

“Papert said, ‘It's a children's machine, children learn by teaching the computer,’” Farber recalled. “That is the approach that I'm taking with AI, that AI is not teaching students so much as we'll learn by teaching AI and working with AI.”

Farber also emphasized that many kids have already interacted with AI outside of the classroom, and not just in the form of ChatGPT.

“Anytime they play an app or a video game that has a (virtual) second player, whether you're playing chess online, or you're playing Minecraft, all of the other non-playable characters are AI,” he said. “So how do we harness that to be a power for good?”

He believes children should not be passive consumers of technology, but rather should learn how to reflect on it and see it as a tool.

“The big takeaway is, how do we play with this tool in a thoughtful way, in an ethical way, that will then help us understand our humanity better?” he said. “It’s really important to embrace it.”

‘A profound benefit, if we get it right’

AI is still a new concept for teaching. Last year’s Educator Confidence Report found around 75% of educators wanted to increase their use of it. But only 20% felt confident enough to do so.

Alex Kotran is the co-founder of the AI Education Project (aiEDU). It helps equip teachers with AI lessons and knowledge to prepare students for a world where AI is everywhere. He started the project in part because his mother, who’s a teacher in Akron, Ohio, asked him to present an “AI 101” class to her students.

“It was more surprising, like surely (she must) have a teacher that talks about this, I mean, how does she talk to students about the future of work?” he said. “How is it possible that school students who are most impacted by this, like they're not part of these conversations? It just doesn't make sense.”

He looked around for curricula in schools, but could only find lessons in computer science classes and summer camps. He wants to close that gap in education and bring it out of STEM. He believes every graduating student should be learning about AI to be successful in the job market.

Larissa Schwartz, professor of the "Generative AI" class at CU Boulder, asks her class to consider how AI is doing in terms of accessibility. She shared how AI has helped web designers notice accessibility issues like text legibility and color contrast, but these programs still have a long way to go in order to be fully usable. (Emma VandenEinde / KUNC)

"If you're a truck driver, if you're an accountant, if you're an investment banker—like, every vocation is going to be impacted in some way. Some of them are going to be significantly impacted,” he said.

aiEDU has connected with the Colorado Education Initiative to form a steering committee that will look into best practices for AI by testing and scaling approaches.

“There's only a few states, Colorado being an example, that have done consistent convening… (and) have a sustained conversation about AI in education,” he said. “Superintendents and school board associations are seeing this and they're realizing, ‘Okay, this is actually something that our school leaders really care about, and they want to see action.’”

Data from an international educational journal shows the use of AI in the classroom can increase student engagement and improve learning outcomes. But Kotran thinks some AI tools are not yet completely accurate and could become educational crutches that shortcut metacognitive skill development.

“It'll definitely be a net good,” he said. “But the question is, is it going to be so good?”

Testing its limitations

Kotran is not the only skeptic. University of Colorado Boulder student Ashley Stafford pulls up a picture of a needle-felted cat on their computer during their "Generative AI" class. Today, the class is focused on accessibility. The assignment is to find any photo on the internet, describe it in great detail to a partner and draw it.

”The head has two big eyes that are brown with huge irises,” Stafford said to their group partner, Chi Chi Kari. “The creature is lying on its side slash stomach in a relaxed position.”

Professor Larissa Schwartz is simulating a real-life technology called Be My Eyes, which allows blind or low-vision people to call volunteers via video chat to describe what is in front of them. The program recently added an AI feature that describes the contents of a photo in place of a human volunteer.

But when they put the original photo into the AI software Copilot, things go awry.

Chi Chi Kari (left hand) holds up her interpretive drawing of a needle-felted cat image and compares it to the real image that Ashley Stafford (right hand) described. The class activity was designed to see how reliable a human description of a photo is compared to an AI description. (Emma VandenEinde / KUNC)

“I'm like, ‘Describe the image’ and it's like, ‘It's something of a delightful fantasy world. Its simplicity and unique design,’” Stafford recalls. “I'm like, ‘Dude, just tell me what it looks like,’ you know? It’s not my friend and I want it to stop acting like it is.”

Other students found Copilot could not identify key celebrities in the photos or distinguish various Asian symbols.

Student Jackson Greer, who studies creative technology and design, said it’s hard to argue the technology isn’t helpful to some degree – it’s either trusting the computer or getting no information at all. But he said AI needs to be specific and built for its purpose to be effective.

"If it's just describing how whimsical and magical the cheese in the fridge is, like, I don't care (if I was) a blind person, I want to know where the cheese is,” he said.

Another common exercise Schwartz runs has her class blindfold themselves and try to upload a picture to Copilot using VoiceOver technology. As students pressed the tab key furiously to find the right setting, they got frustrated as the computer whizzed through the commands verbally. Some even found the program would give complicated commands, like pressing “Command + Shift + Option Up Arrow.”

“You couldn't go back,” Stafford said about the activity. “So I'm tabbing through it, and it says the option I need and I accidentally clicked past it. So I had to like go all the way back around, which was very annoying…I kept peeking under the blindfold.”

But the class isn’t all doom and gloom. Throughout the course, Schwartz touches on how AI is used in industry, education and accessibility. She’s shown how blind athletes rely on AI to guide them on trails and how designers use it for creative inspiration.

“I think if you can collaborate with your ideas, and then moving that into these different applications, you're only going to come up with a better solution,” Schwartz said.

Professor Larissa Schwartz helps a student during her "Generative AI" class. Schwartz has experience working with blind athletes who use AI and wants to teach about accessibility in her class since it is not as widely discussed. (Emma VandenEinde / KUNC)

Some students said they wouldn’t know about the breadth and depth of AI if it were not for this class.

“I know myself and some other people were coming in expecting a more technical aspect of like, here's how to program AI in Python,” Stafford said. “But it's been really interesting to see the more human applications of it.”

But Schwartz knows that not all of these technologies are going to be reliable. She said caution and consideration when using AI are key, especially for those with a disability.

“Anyone working within technology, whether you're developing or you're designing, you need to have an understanding of every single type of person that is using technology, not just people that can have all five senses,” she said.

Regardless, AI is not going away. Global data company Statista predicts more than 240 million people in the U.S. will be using AI by the end of the decade. That includes educators and students.
