Gati Aher already has some ideas about how artificial intelligence could improve her after-school job.
It’s a small business with a heavy focus on marketing, where some employees spend hours each day tracking their ad clicks. AI, she says, could handle a lot of that grunt work while helping the company strategically target its advertising dollars.
“If you have AI communicate over the network to direct those ads to people who actually want them, you could use technology more efficiently,” says Aher, a graduate of Burlington High School in Massachusetts.
That, she admits, barely scratches the surface of AI’s potential. Now that she’s coded her first chatbot, she understands just how powerful the technology can be – and it’s a skill she plans to develop.
“I think once we integrate AI into more things, it will allow us to be smarter about what decisions we’re making for companies,” she says. “Right now, I myself don’t really know how to utilize it to expose its potential. Once I figure that out and find other people willing to bring that in, our business could do better at marketing.”
Being the first generation to grow up alongside AI puts Aher and her peers in a unique position. As lifelong consumers of the technology, they’re naturally adapted to using it. But because AI operates so invisibly within their lives, they may also be inclined to take it for granted.
Now, with a new generation of AI-powered tools at their fingertips, including the newly released ChatGPT chatbot, students can use artificial intelligence to create in all sorts of new ways, from animating videos to composing symphonies, without ever peeking under its hood.
“Most students don’t know what AI is. They take for granted a lot of the things technology does for them,” says tech integration specialist Deb Norton, who teaches ISTE U’s course on artificial intelligence aimed at educators. “By teaching them what it is and how to work with it effectively, we have big aspirations that kids will be able to think about what else it might be good for, what they might do with it.”
ISTE has partnered with GM to create the professional learning program AI Explorations and Their Practical Use in School Environments in an effort to cultivate future AI programmers and provide professional learning for educators to support student-driven AI explorations.
While using tools that have been supercharged with AI has its value, many educators believe it doesn’t go far enough in preparing students to become stewards of this revolutionary technology. To truly understand how AI works, and to become effective problem-solvers with it, students need to learn how to build it themselves.
A different type of coding
Instructional technology specialist LeRoy Wong isn’t a programmer, but he doesn’t let that hold his students back.
As the facilitator for Burlington High School’s student-run help desk, he gamely agreed to advise an after-school coding club – and when the opportunity arose to participate in an ISTE and GM-sponsored pilot involving chatbots, he followed where his students’ interests led.
Using Amazon Web Services tools such as Lambda and Lex, they were challenged to code an AI chatbot that addressed a classroom or schoolwide need. Their goal was a bot that would provide tech support to teachers and students.
“It was a struggle for me,” Wong confesses. “Help desk isn’t a CS class per se, though I feel like we’re doing more things in computer science.” Elements involving the Python coding language posed a particular challenge, but he was able to get some help from professionals in the tech field. “We managed to learn enough to get it working. I’d like to get it working even better.”
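A chatbot like the one the club built typically pairs an Amazon Lex bot, which matches what the user says to an intent, with a short Python function running in AWS Lambda that fulfills that intent. As a minimal sketch of such a fulfillment handler (the `TechSupport` intent, `DeviceType` slot and reply text here are hypothetical examples, not the students' actual bot; the event and response shapes follow the Lex V1 Lambda format):

```python
def lambda_handler(event, context):
    """Minimal AWS Lambda fulfillment handler for an Amazon Lex (V1) bot.

    Lex passes the matched intent and its slot values in the event;
    the handler returns a response telling Lex what to say back.
    """
    intent = event["currentIntent"]["name"]
    slots = event["currentIntent"].get("slots") or {}

    # Hypothetical help-desk intent: build a canned answer for a device type.
    if intent == "TechSupport":
        device = slots.get("DeviceType") or "your device"
        message = f"Try restarting {device} first. If that doesn't work, open a help desk ticket."
    else:
        message = "Sorry, I don't know how to help with that yet."

    # Tell Lex the conversation is fulfilled and what to reply with.
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": message},
        }
    }
```

The division of labor is the point: Lex's machine learning handles the hard part (understanding free-form speech or text), while the students' own code only has to decide what to do once the intent is known.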
It’s one thing to talk about machine learning in class or help students create with AI tools. But even tech specialists and computer science teachers can be intimidated by the prospect of coding artificial intelligence.
“The truth is, it’s not that difficult of a concept to truly understand,” Norton says. “Is it difficult to create? Yeah, on some level, but there are also some basic low-level tools for incorporating or creating AI. It isn’t just for people who are into computers or teaching science. It’s for any educator and any student at any level.”
For Aher, building AI was both easier and harder than she expected.
“It didn’t have as much programming as I thought it would. Amazon Web Services provided a lot of the underlying framework for machine learning and intelligence, but using that as a development tool was kind of tricky,” she says.
“I think coding AI is different in the sense that when you’re writing a piece of code in class, you’re using very simple libraries such as math function libraries and graphics. With Amazon, you’re drawing from a larger code base. The functions still work as functions, but the objective is more abstract. Mapping out what functionalities you want your chatbot to have is the tricky part.”
Programming AI, she learned, isn’t just about writing lines of code. It requires students to think about the big picture and understand how the various pieces of code interact with each other.
Although educators have typically considered artificial intelligence a peripheral topic within computer science, researcher Ben Shapiro argues that AI technologies such as machine learning have transformed the core of what tomorrow’s computer scientists will need to know. Traditional coding requires students to think in terms of algorithms and data structures – in other words, to think like mathematicians. Machine learning (ML), on the other hand, requires them to think more like scientists.
“While traditional software is built by human programmers who describe the steps needed to accomplish a goal (how to do it), a typical ML system is built by describing the objective that the system is trying to maximize (what to achieve),” says Shapiro, assistant professor of computer science at the University of Colorado and co-author of How Machine Learning Impacts the Undergraduate Computing Curriculum. “To succeed with ML, many students will not concentrate on algorithm development, but rather on data cleaning, model choice and statistical testing.”
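Shapiro's contrast can be made concrete with a toy example. In the hand-written version, the programmer states the decision steps directly; in the learned version, the programmer only states an objective (here, accuracy on labeled examples) and lets a search find the parameter that maximizes it. Everything below is an illustrative sketch, not code from the article:

```python
# Hand-written rule: the programmer specifies HOW to decide.
def is_spam_rule(num_exclamations: int) -> bool:
    return num_exclamations > 3  # threshold chosen by a human


# Machine-learned rule: the programmer specifies WHAT to maximize
# (accuracy on labeled data) and lets a search pick the threshold.
def train_threshold(examples):
    """examples: list of (num_exclamations, is_spam) pairs."""
    def accuracy(t):
        return sum((x > t) == label for x, label in examples) / len(examples)

    # Brute-force "training": try candidate thresholds, keep the best one.
    return max(range(0, 11), key=accuracy)


# Tiny labeled dataset standing in for real training data.
data = [(0, False), (1, False), (2, False), (5, True), (7, True), (9, True)]
learned_t = train_threshold(data)
```

The learned model ends up doing the same job as the hand-written rule, but the human's work shifted from writing the steps to curating the data and defining the objective – exactly the shift Shapiro describes.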
Reverse engineering AI
In David Lockett’s middle school STEM classes, students take a more experimental approach to AI. His project-based learners are testing the limits of machine learning by creating with a variety of emerging AI technologies.
They doodle in Quick, Draw! to see whether machine learning can recognize their drawings. They conduct AI symphonies through Google’s Semi-Conductor. They compose music with the help of coding tools such as Apple Swift. They use AI to pare down existing musical compositions so they can better understand how various elements are combined to create a song.
The more they explore AI-powered creative tools, the more curious they get. How does machine learning actually work? What else can they do with it?
During a lesson on Alexa development apps in coding class, students listened to Lockett deliver voice commands while a monitor showed them the coding behind the virtual assistant’s responses. It’s a form of reverse engineering – getting to know what machine learning can do first and then dissecting how. Once they saw how simple the code was, they began clamoring to build their own AI chatbots.
Since then, Lockett’s students have been obsessed with coding AI. They brainstorm project ideas in the bus line. They go home and immerse themselves in machine learning technologies.
“They’re just amazed by the possibilities that can come from it,” Lockett says.
Ethics, empathy and machine learning
Not all of those possibilities are good, however. While AI can help solve humanitarian problems, it can also be used to exploit users’ private data, manipulate public opinion and widen inequality gaps – to name just a few harms.
“Like any other tool, people can misuse it,” says Yiannis Papelis, research professor and director of the Virtual Reality and Robotics Lab at the Virginia Modeling, Analysis and Simulation Center. “There’s a saying that to err is human, but to really screw up you need a computer. This applies a thousand times to AI. It can be used for good or bad.”
That’s why empathy plays a key role in AI instruction for April DeGennaro, a gifted education specialist at Peeples Elementary School in Fayetteville, Georgia. She’s teaching fourth graders about machine learning using Cozmo robots, which display personality and emotion as they learn from their users. Not only can students interact with the robot, but they can program the AI to perform new functions.
During a recent project, students collaborated in groups to devise problems and then program the robot to solve them. They also made up stories around the problems. In one scenario, Cozmo was a seeing-eye robot for a blind person at the grocery store. His task was to find three specific items, represented by cubes bearing different symbols. Once the robot learned to recognize the symbols, he was able to fetch the items.
DeGennaro hopes these experiences with Cozmo will inspire her students to become ethical creators of AI.
“We talk a lot about empathy and how so much of the AI community is working on solutions to make the world better,” she says. “Robots can map out a terrain and find people underground or rescue people who have been lost in water accidents. There are so many things AI is doing in the field that are so helpful and so positive.”
On the other hand, AI toys like Cozmo can also help highlight the privacy concerns smart devices raise. When children interact with these toys, what information are they revealing about themselves? Who owns this information once it’s captured? How will it be used?
To navigate these legal and ethical thickets, students “will need to be well-rounded and capable of considering a range of implications, including legal protection, ethical considerations, and what will happen when machines become more like humans,” says Michelle Zimmerman, author of the ISTE book Teaching AI: Exploring New Frontiers for Learning. “Will we treat AI as machines or as humans? These are critical conversations as we attempt to impose order on the Wild West frontier of AI.”
Bias is another area of concern. Because AI learns from data furnished by humans, machines can inherit the biases of their programmers and of the data itself. Plus, developer bias influences which problems machine learning will ultimately solve. To encourage a more diverse pool of creators, students of all genders and backgrounds need in-depth exposure to AI. One group that’s making strides in addressing this gap is AI4ALL, a nonprofit working to increase diversity and inclusion in AI by creating pipelines for underrepresented talent.
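How a model inherits bias from its data can be shown with a deliberately simple, hypothetical example: a "model" that just learns the most common past outcome for each group will faithfully reproduce whatever skew exists in its training records. The hiring scenario and numbers below are invented for illustration:

```python
from collections import Counter

# Hypothetical historical hiring records: past decisions were skewed,
# so most "hire" labels went to group A applicants.
history = ([("A", "hire")] * 8 + [("A", "reject")] * 2 +
           [("B", "hire")] * 2 + [("B", "reject")] * 8)


def majority_predictor(data):
    """'Learn' by predicting the most common past outcome per group --
    a stand-in for how real models absorb patterns in training data."""
    outcomes = {}
    for group, label in data:
        outcomes.setdefault(group, Counter())[label] += 1
    return {g: c.most_common(1)[0][0] for g, c in outcomes.items()}


model = majority_predictor(history)
# Otherwise-identical applicants from different groups now get
# different predictions -- the bias came from the data, not the code.
```

Nothing in the training code mentions either group, which is the lesson: a model can be discriminatory even when its programmer never wrote a discriminatory rule.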
Building AI helps pull back the curtain on what makes artificial intelligence tick, showing students that the keys to this powerful technology are within their grasp.
“The connection I’m making for them is that they’ve already learned to code, and there is coding behind AI,” DeGennaro says. “It is not magic, just like the computer is not magic. Someone coded it to do what it does.”
And so can they.
Nicole Krueger is a freelance writer and journalist with a passion for finding out what makes learners tick. This is an updated version of an article originally published June 17, 2020.