What Future Inventors Need to Know About Biological and Artificial Intelligence
Author: Dr. Shree Bose
Harnessing the Connections Between Biological and Artificial Intelligence to Shape STEM Education
Estimated Reading Time: 8 minutes
Piper has always been about teaching the first step of learning about technology — our goal is to instill familiarity with the basics of hardware and software engineering.
I have long thought that our goal isn’t to turn every kid who uses our products into a computer scientist; our goal is to broaden their perspective on how technology can be applied in innovative ways.
We do this by sparking curiosity and encouraging students to think beyond the conventional uses of technology. By showing them how to connect what they know with innovative applications, they learn to see the potential in every technology they use.
In this blog, you’ll learn how to introduce the basics of artificial intelligence to young learners through hands-on, approachable methods that build confidence and understanding.
Foundations of Artificial Intelligence
So, what are the building blocks of artificial intelligence (“AI”), and how do we approach them in a way that inspires confidence while still covering the essential content that is important to know?
At the end of the day, kids know what “intelligence” means — so let’s start there.
Understanding biological intelligence first
If you asked a room full of kids what a smart — or “intelligent” — response to seeing a bear on a hike is, many might say to back away slowly. (Turns out, running is not a smart move!)
But what exactly happens in the split second between seeing a bear and deciding to act? The cascade of events is quite complex.
The rods and cones in the back of your eye detect light and colors, and synapses trained on all the pictures and posters of bears you’ve ever seen recognize the pattern as a bear.
The bear’s growl, traveling through space as sound waves, hits the thin membrane of your eardrum and reverberates through the small bones in your ear, activating different synapses trained on growls you’ve heard on TV and in movies.
The combination of these inputs then sets off complex brain processing that immediately triggers the fear response, decides what to do, and sends coordinated nerve impulses out to your muscles to make you move.
Biological intelligence is truly a beautiful, intricate thing. How could humans ever “teach technology to do what our biological brains do” to create Artificial Intelligence?
Biological and artificial intelligence — linking what’s known to what’s new
Understanding AI begins with understanding biological intelligence (i.e., how your human body and brain perceive and react to sensory input).
The simplest concepts that kids already know and understand instinctively — seeing and hearing — are foundational inputs for AI. And for students, this is the place to start.
There is perhaps no better duo to teach the parallels between biological and artificial intelligence inputs than our quintessentially artificial robot, Piperbot, and his trusty biological Martian mouse buddy, Pip.
Where Pip has eyes to see and understand the world, Piperbot has cameras and neural networks to process images. Where Pip has eardrums that detect the vibrations of sound waves and the neurons to process those into meaning, Piperbot has microphones to detect sound and AI algorithms to “understand” what the sounds mean.
Supporting students in making complex connections between artificial intelligence and biological intelligence
Granted, the next steps in our “biological intelligence” cascade get a bit harder to connect with what kids know.
How does your brain look at the pattern of lights and colors and decide there is a bear? How do you know that the sound is a bear’s growl?
In a basic sense, your brain is “trained” on all the depictions of bears you’ve ever seen — that one clip of Yogi Bear, that one poster of Smokey Bear, that TikTok of the bear getting into a car, and so on.
Your brain has created an understanding of what a bear is, and in a split second, you can tell when you come across one in the wild. AI, particularly machine learning, does a very similar thing — we use a “training set” to teach the AI what something looks or sounds like.
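The "training set" idea can be sketched in a few lines of code. This is not how Piper Make (or any production system) actually implements recognition; it's a toy nearest-neighbor classifier, with made-up feature numbers standing in for what a real model would extract from an image:

```python
# Toy illustration of a "training set": a nearest-neighbor classifier.
# Each training example is a (features, label) pair. The feature values
# are invented for illustration -- imagine them encoding size, fur, and
# ear shape extracted from a picture.

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(example, training_set):
    """Label a new example with the label of its nearest training example."""
    nearest = min(training_set, key=lambda pair: distance(pair[0], example))
    return nearest[1]

# "All the bears you've ever seen": a tiny, hypothetical training set.
training_set = [
    ((0.9, 0.8, 0.7), "bear"),      # big, furry, brown
    ((0.8, 0.9, 0.6), "bear"),
    ((0.1, 0.3, 0.2), "squirrel"),  # small
    ((0.2, 0.2, 0.3), "squirrel"),
]

print(classify((0.85, 0.75, 0.65), training_set))  # prints "bear"
```

The more (and more varied) examples the training set contains, the better the split-second recognition becomes, for brains and for models alike.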
Learning voice recognition and AI through hands-on experience
There’s no better way to learn than through experience, and we’ll use one of our newest Piper Make missions, Speech Command, to help illustrate AI voice recognition.
In the mission story, Piperbot’s spaceship malfunctions. Piperbot and Pip have to get the ship’s fabricator working again to make parts to repair the ship. In this mission, students train Piperbot’s AI assistant, PAL, to listen for certain commands to start up the fabricator. PAL is trained with a database of 100 recordings of different commands, including “stop” and “go.”
By associating each recording with its meaning, our Piper Make AI model creates a voice recognition algorithm that recognizes and interprets what “stop” and “go” sound like. When students say “go” and see a green LED light up—followed by their fabricator in the Explore tab pumping out parts—it’s a cool, tangible illustration of all the concepts above.
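The train-then-recognize loop in the mission can be sketched with a toy model. This is an assumption for illustration, not PAL's actual algorithm: each "recording" is just a short list of made-up feature numbers, and the model is a simple nearest-centroid classifier.

```python
# Sketch of training a voice-command recognizer from labeled recordings.
# Real systems extract rich audio features; here each "recording" is a
# made-up feature vector, and the model averages the recordings for each
# command into a centroid.

def centroid(vectors):
    """Average the feature vectors of all recordings for one command."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(recordings):
    """recordings: dict mapping each command label to its feature vectors."""
    return {label: centroid(vecs) for label, vecs in recordings.items()}

def recognize(model, recording):
    """Pick the command whose centroid is closest to the new recording."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(model[label], recording))
    return min(model, key=dist)

# Hypothetical training data: a few recordings each of "stop" and "go".
model = train({
    "stop": [[1.0, 0.1], [0.9, 0.2], [1.1, 0.0]],
    "go":   [[0.1, 1.0], [0.2, 0.9], [0.0, 1.1]],
})

command = recognize(model, [0.95, 0.15])
print(command)  # prints "stop"
```

The output of `recognize` is what would drive the green or red LED: a classification decision derived entirely from the training recordings.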
Nurturing critical thinking and calling out bias
"One of Piper’s key goals is to get kids to create technology rather than just consume it, and we think that understanding the pitfalls of these powerful technologies is a cornerstone of that endeavor. "
Take, for example, our Piper Make Speech Command mission above. If the recordings the model was trained on included only American-accented speakers saying the word “stop,” the model could easily fail to recognize “stop” spoken in other accents, right? Pausing throughout our missions to point out the moments where algorithmic bias can creep in is incredibly valuable.

When creating AI products and services, even unintentional algorithmic biases can have far-reaching consequences. We can help students become more aware of how the data used to train AI models shapes their outcomes, fostering a more thoughtful and ethical approach to building the next generation of AI.
Understanding how things go wrong
What about when AI simply does not give us the right output?
Let’s use our second Piper Make AI mission, Pose Detect, to show kids what this looks like. With the ship up and running again, Piperbot’s navigation remains somewhat out of whack and needs help.
By using camera images to “see” where your arms, legs, and face are, PAL can steer the ship in different directions. But sometimes the model misjudges where your nose or arms are! And, like much of the AI out there, there is no clear way to trace exactly why the model thought your hand was your nose.
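Even when we can't explain why a model confused a hand for a nose, we can write simple checks that catch implausible outputs before acting on them. Here is a hedged sketch of that idea; the keypoint names and the image-coordinate convention (y grows downward) are assumptions for illustration, not the actual Pose Detect internals:

```python
# Sketch of a sanity check on pose-model output: if the "nose" lands
# below the shoulders, the model has probably confused a hand for a nose,
# so we skip the frame rather than steer the ship with bad data.

def plausible_pose(keypoints):
    """Return True if the detected nose sits above both shoulders.
    Coordinates are (x, y) in image space, where smaller y = higher up."""
    nose_y = keypoints["nose"][1]
    shoulder_y = min(keypoints["left_shoulder"][1],
                     keypoints["right_shoulder"][1])
    return nose_y < shoulder_y

good = {"nose": (100, 50),  "left_shoulder": (60, 120), "right_shoulder": (140, 120)}
bad  = {"nose": (100, 200), "left_shoulder": (60, 120), "right_shoulder": (140, 120)}

print(plausible_pose(good))  # prints True
print(plausible_pose(bad))   # prints False: ignore this frame
```

A check like this is exactly the kind of human oversight the mission is meant to motivate: the model's output is a guess, and someone has to decide when the guess is good enough to act on.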
These mistakes are low stakes when it’s a simple video game but high stakes when you’re talking about using AI for hiring, or for law enforcement, or the many other ways that our users will be using AI in the coming years.
It’s the same when it comes to using generative AI to create new blocks of text, videos, or images. Validating the results is the responsibility of a human. After all, you rarely confuse a hand for a nose.
The power of the "Aha Moment”
The magic of learning is clearly seen in the “aha moments” – those moments when kids’ faces light up as they program an LED for the first time, or make their robot dance to accomplish a task in the storylines of our characters, Piperbot and Pip.
But Piper’s profound success comes afterward, in what kids carry with them from the experience: the kid who played the Piper Make Light Show growing up to be an artist incorporating lights and sounds in their art, or the kid who played with our Piper Make Temperature Sensor growing up to be a surgeon who designs a cautery that can turn itself off when it gets too hot.
These aren’t just hypotheticals to us.
We see them at the end of each Piper program in the Piper Make-A-Thon, when students come up with the coolest innovations, from automated, full-service laundry machines to parking garage sensors.
"While we teach the zero-to-one of what you can do with technology, our users expand that to possibilities we could have never even dreamed of."
Our mission: to help you responsibly shape tomorrow’s technology leaders
In a world where technology continues to evolve rapidly, nurturing a generation of fearless learners is essential.
By providing kids with a foundational understanding of technology — especially concepts like artificial intelligence — we’re equipping them with knowledge and empowering them to explore, innovate, and improve the world around them.
"Children possess an innate curiosity and a willingness to experiment without fear of failure. Our mission is to channel that fearlessness into a well-informed perspective on technology. "
By engaging with projects like Speech Command and Pose Detect, students learn how to create and control technology and how to approach it critically. They discover the importance of diverse perspectives, recognize potential biases, and understand the responsibilities that come with creating AI.
As these young innovators carry their experiences into the future, they’ll be armed with the confidence to tackle complex challenges and the wisdom to navigate the ethical landscapes of their creations.
We’re excited to witness their fearless spirit shaping tomorrow's technology — one “aha moment” at a time.
FREQUENTLY ASKED QUESTIONS
1. What is the connection between biological and artificial intelligence?
Play Piper’s AI-centered missions seek to demonstrate the differences and similarities between biological intelligence and artificial intelligence (“AI”). Missions are designed so that students build deep conceptual knowledge of AI — not just how it works, but why it works and why it exists. Biological intelligence is the more familiar starting point, as it deals with how humans take in sensory information through their senses and recognize patterns. Exploring biological intelligence through storytelling helps students understand how AI seeks to replicate these human processes using technology such as cameras, microphones, and neural networks.
2. How does Play Piper teach AI concepts to kids?
Play Piper starts with concepts kids already know, like seeing and hearing, and links them to AI concepts such as image processing and sound detection in machines. The curriculum uses a cascading structure that meets students where they are: it begins by assessing what students already understand, builds on that knowledge, and moves from less complex concepts to more complex and abstract STEM ideas.
3. What is Pose Detect, and what does it teach about AI?
Pose Detect is one of 10 missions that involve storytelling with the characters Piperbot and Pip on Mars. The activity teaches students how AI interprets body movements, showing them how AI can make mistakes, like confusing body parts, and reinforcing the need for human oversight in AI applications. It involves hands-on building using a Raspberry Pi microcontroller, wires, LEDs, a breadboard, and more. Students also use digital tools like drag-and-drop block coding.
4. What are training sets in AI, and how does Play Piper use them?
Training sets are collections of data used to teach AI models to recognize patterns. Play Piper uses missions like Speech Command to demonstrate how AI can be trained to understand commands like “stop” and “go.”
5. How does Play Piper encourage responsible use of AI?
Play Piper seeks to empower students to explore, innovate, and challenge the status quo, equipping them with the skills and confidence to navigate future technological challenges responsibly. Critical thinking is part and parcel of Play Piper’s approach to teaching and learning AI. Play Piper teaches students about algorithmic bias by showing how training data, like voice commands in different accents, can lead to biased AI outputs if not properly accounted for. Play Piper encourages students to create technology, not just consume it, teaching them to identify and mitigate biases and inaccuracies in AI models.