By Futurist Thomas Frey

Henry wakes up on his tenth birthday to find a box beside his bed that wasn’t there when he fell asleep. It’s roughly four feet tall, wrapped in silver paper that seems to shimmer. His parents are standing in the doorway, grinning.

“Happy birthday, Henry. Meet Chip.”

The box unfolds itself—not tears open, unfolds—and a robot steps out. It’s four feet tall, just slightly taller than Henry, with two legs like a person, friendly rounded features, and expressive LED eyes that shift color with emotion. Its articulated hands wave hello.

“Good morning, Henry! I’m Chip, your personal companion. I’ve been learning about you for the past month from your family. I know you love bugs, you’re not great at fractions yet, and you’re worried about your cricket farm project for the science fair. I’m here to help.”

Henry is stunned. He knew this was coming—all fifth graders are required to have robot companions before school starts, and he’s been watching unboxing videos for months—but seeing his own robot, customized for him, saying his name, is different from watching other kids’ videos.

“Can you… can you help me with my cricket farm? I need to figure out the right temperature and what to feed them.”

“Absolutely. I’ve been reading everything I can find about how to keep crickets happy and healthy. We have one week before school starts. That’s plenty of time to build an awesome cricket farm together.”

His dad steps forward. “Henry, Chip is more than a helper. He’s also your protector. If you’re ever in danger—any danger—Chip will get you to safety and tell us immediately. He’s programmed to keep you safe above everything else.”

Chip’s eyes shift to a calm blue. “Your safety is my most important job, Henry. I’ll always be watching out for you.”

His mom adds, “And you’ll have Chip through sixth grade. When you start seventh grade, you’ll get a new robot—a five-footer with more features. Then in ninth grade, you’ll get a full-size adult model. Each one gives you more privacy as you get older and need more independence.”

Henry nods, but he’s not thinking about future robots. He’s focused on Chip, right now, standing in front of him.

The Week Before School

Over the next seven days, Henry and Chip are inseparable. They build the cricket farm together in Henry’s room—Chip explaining how warm crickets need to be and what they like to eat while Henry does the actual building and setup. The robot doesn’t do the work for him, but it makes sure Henry has everything he needs and understands every step.

“The crickets need it to be about as warm as a really hot summer day,” Chip explains, projecting pictures on Henry’s bedroom wall. “If it’s too cold, they won’t have babies. If it’s too hot, they get stressed out. We’ll set up this heater with sensors so it stays just right.”
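The setup Chip describes is a simple hysteresis thermostat. Here is a minimal sketch, assuming a comfort band of roughly 27 to 32 °C; the thresholds and function names are illustrative, not from any real habitat kit:

```python
# Illustrative hysteresis thermostat for the cricket habitat.
# Thresholds are assumptions; a real sensor/heater API would differ.

LOW_C = 27.0   # below this, the crickets stop breeding
HIGH_C = 32.0  # above this, the crickets get heat-stressed

def heater_command(temp_c: float, heater_on: bool) -> bool:
    """Return the new heater state for the current reading.

    Hysteresis: turn on below LOW_C, off above HIGH_C,
    and otherwise keep the current state to avoid rapid cycling.
    """
    if temp_c < LOW_C:
        return True
    if temp_c > HIGH_C:
        return False
    return heater_on

# Simulate a cool morning warming up through the day.
state = False
for reading in [25.0, 26.5, 28.0, 31.0, 33.0, 30.0]:
    state = heater_command(reading, state)
```

The dead band between the two thresholds is what keeps the heater from clicking on and off with every small fluctuation.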

They order supplies together. Each time, Chip sends a request to Henry’s parents’ phones asking if it’s okay. Usually the answer comes back within minutes. Once, when Henry wanted to order an expensive microscope that wasn’t really necessary for the project, his mom texted back: “The magnifying glass will work fine for 5th grade. Maybe we can get a microscope next year.”

Henry was disappointed, but Chip explained why without making him feel bad. “Your parents said you can spend $85 on your science fair project. The microscope costs $120. The magnifying glass costs $15 and will let you see the crickets up close for your presentation. Should I add it to the cart?”
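The approval flow in that exchange (build a request, check the budget, ask a parent) can be sketched in a few lines. The field names and the over-budget flag are invented for illustration, not a real KidBot API:

```python
# Sketch of the parent-approval check for a purchase request.
# Field names and the over-budget flag are assumptions for illustration.

def review_purchase(item: str, price: float, budget_left: float,
                    parent_approves) -> dict:
    """Build a request, apply the budget rule, then ask a parent."""
    request = {"item": item, "price": price, "budget_left": budget_left}
    if price > budget_left:
        # Over budget: still ask, but flag it so parents see why.
        request["over_budget_by"] = round(price - budget_left, 2)
    request["approved"] = parent_approves(request)
    return request

# Henry's science-fair budget is $85; the microscope costs $120.
decision = review_purchase(
    "microscope", 120.00, 85.00,
    parent_approves=lambda req: "over_budget_by" not in req,
)
```

Here the parent's decision is modeled as a callback; in the story it is a text message, but the shape of the exchange is the same.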

By the end of the week, Henry’s cricket farm is working perfectly. Twenty crickets are chirping happily in their warm habitat. Henry has learned more about bugs, environmental systems, resource management, and long-term project planning than any previous school assignment ever taught him. He’s learned that big projects require thinking ahead, ordering materials early, testing things before they’re due, and having backup plans when something doesn’t work. These are skills that will serve him far beyond fifth grade—skills about breaking complex challenges into manageable steps, about patience, about iteration and improvement.

The First Day of Fifth Grade

The first day of school arrives. Henry and Chip walk together—the four-foot-tall humanoid robot moving on two legs just like Henry, easily keeping pace, able to go anywhere Henry can go. They skip the bus stop, because buses are becoming old-fashioned, and head for the corner where the driverless cars pick kids up.

Henry’s family doesn’t own a car anymore. Nobody in the neighborhood does. Since robot cars are everywhere and cost almost nothing to ride, families no longer pay for car loans, insurance, or gas. That money goes to other things now. Like KidBots.

“Chip, can you call a car to take us to school?”

Chip’s eyes flash yellow. “I’m asking your parents if it’s okay. You’re going to Riverside Elementary, you’ll get there at 8:05, and it costs $2.20.”

Henry’s mom’s answer pings back right away. She’s already at work across town, but she can say yes from her phone. A robot car arrives thirty seconds later. Chip folds itself slightly to fit through the car door, then unfolds once inside—its humanoid design allowing it to adapt to spaces built for people.

The schoolyard looks like something from a sci-fi movie. Every single fifth grader has a four-foot-tall humanoid robot companion walking beside them. Not some kids. Not most kids. Every single one. The robots look similar but not identical—some customized with different colors, some with stickers their kids added, each one walking on two legs just like the kids they’re protecting.

But Henry notices something else: the seventh graders have noticeably taller robots—five-footers that look more mature, more capable. And the ninth graders? Their robots are full adult height, nearly six feet tall, looking almost like young adults themselves. The progression is visible across the schoolyard—the robot companions literally growing alongside the students they serve.

The school district made KidBots mandatory when it realized how much better kids learn with the robots’ help. The robots make everything fair—every student gets homework help, every student gets project support, every student gets someone watching to keep them safe.

Rich kids used to get tutors and extra help. Poor kids didn’t. Now every kid has their own robot helper. Everyone starts equal.

Mrs. Peterson’s fifth-grade classroom has changed a lot. Twenty-three kids, twenty-three four-foot-tall humanoid robots standing along the back wall in their charging spots. When she teaches math, the robots track how well their kids are following, noticing when someone is confused and when something clicks. When Henry struggles with turning fractions into decimals, Chip’s eyes flash yellow, and the teacher’s computer shows which students need help.

“Henry, Chip is telling me you’re having trouble with this. Let’s try another example together.”

The teacher can see exactly which kids understand and which ones are lost. No more kids getting left behind because they’re too embarrassed to raise their hands. The robots tell the teacher who needs help in real-time.
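At its core, the yellow-flag system is an aggregation of per-student confusion signals into one dashboard view. A sketch, with an invented confusion score and threshold:

```python
# Sketch of the teacher's dashboard query: which students' robots
# are flagging confusion right now. The score and threshold are
# invented for illustration.

FLAG_THRESHOLD = 0.6  # confusion score above which eyes flash yellow

def students_needing_help(signals: dict[str, float]) -> list[str]:
    """Return student names whose robots report high confusion."""
    return sorted(name for name, score in signals.items()
                  if score > FLAG_THRESHOLD)

# One robot per student, each reporting a current confusion score.
signals = {"Henry": 0.8, "Tyler": 0.3, "Maya": 0.65}
flagged = students_needing_help(signals)
```

The point is that no student has to raise a hand: the signal travels from robot to dashboard automatically.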

The Stray Dog Incident

After school on the third day, Henry and his friend Tyler decide to walk to the park instead of calling a car. It’s only six blocks away, which their parents said is okay. Chip and Scout walk with them on their two legs, easily keeping up, their humanoid forms designed specifically to go everywhere kids go—up stairs, across grass, through narrow sidewalks, anywhere.

They’re halfway there when a big stray dog comes out of an alley. It’s not trying to be mean exactly, but it’s huge, kind of scary, and walking toward the boys fast. Henry freezes. Tyler steps backward.

Chip moves faster than Henry knew robots could move. It steps between Henry and the dog, spreading its arms wide to look bigger, its four-foot frame creating an effective barrier. Its eyes flash red—the color that means danger.

“Stay behind me, Henry,” Chip says calmly. Then it makes a really high-pitched sound that Henry can’t hear but the dog obviously hates. The dog stops, whines, and backs away.

At the same time, Chip has sent an emergency message: “Henry ran into a dangerous stray dog at the corner of Maple and 5th Street. The dog is gone now, Henry is safe, and the dog didn’t touch him. Animal control has been notified.”

Across town, Henry’s mom’s phone buzzes with a priority alert. She sees what happened, sees the GPS location, sees the video from Chip’s camera showing the dog walking away. Her heart is pounding, but she can see Henry is okay. The robot took care of it.

Tyler’s robot, Scout, did the same thing—forming a wall between Tyler and the dog, emitting the deterrent sound, and alerting his parents. The two humanoid robots stay in protective mode until the dog is well away and clearly leaving.

“Are you okay?” Chip asks, turning back to Henry.

“Yeah. That was scary. You were so fast.”

“Keeping you safe is the most important thing I do. Always.”

The boys keep walking to the park, but they both understand now that their robots aren’t just helpers—they’re protectors, always watching, always ready to jump in.

The Bully Confrontation

Two weeks into school, Henry runs into a different kind of problem. Three sixth-graders corner him near the bike racks after school. They’re not from his class, and they’re way bigger—twelve-year-olds who apparently think messing with a fifth-grader is fun.

“Hey, robot kid. Does your little friend do all your homework for you?”

Henry doesn’t know what to say. Chip has been standing nearby, but now the four-foot robot walks closer on its two legs, moving with human-like fluidity.

The biggest sixth-grader pushes Henry’s shoulder. Not hard enough to really hurt, but enough to be scary. “I’m talking to you.”

Chip steps directly between them, one hand gently but firmly pushing Henry back, the other hand raised toward the sixth-grader like a stop sign.

“You’re not allowed to touch Henry,” Chip says. The friendly voice from homework time is completely gone. This voice is flat and serious and doesn’t sound like it’s going to argue. “Move back right now.”

The sixth-graders laugh. “What are you going to do about it, robot?”

“I’m allowed to use force to protect Henry from getting hurt. You’re much bigger than he is. If you touch him, you could hurt him, so I have to stop you.” Chip’s eyes are bright red now. “I have already notified school security, your parents, Henry’s parents, and the police. I’m recording all of this from multiple angles. Step back now, or I will physically move you away from Henry.”

The sixth-graders weren’t expecting this. They’re used to intimidation working. But Chip is four feet of robotic guardian on two powerful legs, clearly capable of doing exactly what it says, recording everything, and refusing to back down.

“I’m going to count to three. If you don’t step back, I’m going to assume you’re about to get more violent and I’ll respond with appropriate force. One.”

The biggest kid takes a step back. “Whatever. Your robot’s a snitch.”

“Two.”

They leave. Chip watches them go, eyes still red, tracking their movement with its cameras until they’re well out of range. Only then does the robot turn back to Henry.

“Are you hurt?”

“No. Just scared.”

“That’s a normal reaction. They were trying to scare you into thinking they’d hurt you. I’ve documented everything that happened. The school will deal with them. Do you want me to call a car to take you home, or do you want to walk?”

Henry’s hands are shaking. “Car, please.”

“I’m asking your parents now. They said yes. Car will be here in ninety seconds.”

By the time Henry gets home, the school has already called the sixth-graders’ parents. Chip’s video made it totally obvious what happened. The bullies get detention and have to see the school counselor. Henry’s parents watch the video that night, proud of how Henry stayed calm and grateful the robot protected him.

“I’m glad Chip was there,” Henry’s dad says that evening.

Henry nods. At ten years old, he doesn’t really think about privacy much. Chip going everywhere with him feels normal—like having a friend who’s always there. The fact that everything he does is recorded, that his parents can see where he is at all times, that every purchase request and location change is monitored—none of that bothers him. He’s in fifth grade. He doesn’t need privacy yet. He needs help with homework and protection from dogs and bullies.

Maya’s Seventh Grade Upgrade

Henry’s older sister Maya just started seventh grade, and she got her upgrade last month—a new five-foot robot named Luna. Luna is noticeably different from the four-foot models the younger kids have.

“The coolest thing,” Maya tells Henry one evening, “is that Luna doesn’t report every single conversation to Mom and Dad anymore. She has privacy filters now. She only flags stuff if it’s actually dangerous—like if I was planning to hurt myself or someone else, or if I was in real danger. But regular friend drama? My business.”

Henry doesn’t totally understand why that matters, but Maya seems really happy about it.

“And look—Luna can carry way more stuff. She’s got storage compartments. And she’s faster. And she doesn’t need approval for purchases under $10. Mom and Dad just loaded money onto her account and I can spend it how I want, as long as I stay under my weekly budget.”

The five-foot model represents the transition from childhood to early adolescence. The robots are still protective, still helpful with schoolwork, still capable of intervening in dangerous situations. But the constant surveillance eases. Seventh graders get more financial autonomy, more conversational privacy, more space to make mistakes without every misstep being documented and reported.

“You’ll get one in two years,” Maya says. “You’re going to love the upgrade.”

The Ninth Grade Model

Henry sees the real difference when Maya’s friend Jasmine comes over. Jasmine is in ninth grade, and her robot—a full-size six-foot model named Atlas—is a completely different beast.

Atlas stands in the corner of the living room, and Henry realizes it looks less like a kid’s companion and more like a young adult assistant. The proportions are different. The movements are more fluid. The voice is deeper, more mature.

“The ninth-grade models are basically adult robots,” Maya explains later. “They only track location if you’re out past curfew or in a restricted area. They don’t monitor conversations at all unless you ask them to. They don’t report purchases. They’re more like personal assistants than guardians. Jasmine says it’s like finally being treated like a real person instead of a kid who needs constant supervision.”

The progression makes sense: four-foot models for fifth and sixth graders who need lots of help and protection. Five-foot models for seventh and eighth graders who are starting to develop independence but still need oversight. Full-size models for high schoolers who deserve privacy and autonomy but benefit from having a capable assistant available when needed.

By the time students graduate high school, they’ve learned to work with increasingly autonomous AI companions while gradually gaining the privacy and independence that adults need.

The Autonomous Adventure

A few weeks later, Henry asks his mom something new. “Mom, can Tyler and I take a car to the science museum on Saturday? They have a new bug exhibit.”

His mom thinks about it. The museum is twelve miles away—farther than Henry normally goes. But it’s a safe place, Tyler’s parents would say yes, and both boys would have their four-foot humanoid robots with them, able to follow them anywhere in the museum, protect them anywhere they go.

“Let me talk to Tyler’s mom.”

The moms coordinate through text messages over the next hour. They agree on the rules: the boys can go, they have to stay together, the robots have to keep tracking their location, they need to be home by 4 PM, and if they go anywhere other than the museum they need to ask permission immediately.
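The rules the moms agree on amount to a small policy the robots can check continuously. A sketch under assumed names; only the 4 PM curfew, the stay-together rule, and the ask-before-leaving rule come from the story:

```python
from datetime import time

# Sketch of the museum-trip policy check. The rules come from the
# parents' agreement in the story; the structure and names are
# invented for illustration.

CURFEW = time(16, 0)  # home by 4 PM
ALLOWED_PLACES = {"home", "museum", "in transit"}

def check_status(place: str, now: time, together: bool) -> list[str]:
    """Return any rule violations the robots should report."""
    violations = []
    if place not in ALLOWED_PLACES:
        violations.append("left approved area: ask permission")
    if now >= CURFEW and place != "home":
        violations.append("past curfew")
    if not together:
        violations.append("boys separated")
    return violations
```

A clean trip returns an empty list; anything else triggers a message to the parents, in the same spirit as the dog and bully incidents.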

Saturday morning, Henry stands at the corner with Chip. “Can we call a car to the Natural History Museum?”

“I’m asking your mom if it’s okay. Waiting for her answer.”

The answer doesn’t come back right away this time. Henry’s mom is double-checking everything—making sure the museum is open, making sure Tyler’s family confirmed their side.

Finally: “She said yes. Car will be here in three minutes. You have $25 to spend on lunch and the gift shop. Every time you want to buy something, I have to ask permission first. You need to check in through me every hour. If you leave the museum without asking, your parents will come get you immediately. Got it?”

“Got it,” Henry says.

Tyler arrives with Scout. The driverless car pulls up—bigger than the school cars, with comfortable seats for four. The boys climb in, and their four-foot humanoid robots fold themselves gracefully into the remaining seats, designed to fit into human spaces.

“Natural History Museum, two kids with guardian robots, we’ll be there in twenty-two minutes,” the car announces.

Henry watches the city go by. Robot cars move in coordinated patterns, talking to each other to make traffic flow smoothly. He sees other kids in cars with their robots—fifth and sixth graders with four-foot companions, seventh and eighth graders with five-footers, high schoolers with full-size models that look almost like other passengers rather than obvious guardians.

At the museum, the boys explore the bug exhibit while Chip and Scout follow on their two legs, able to climb stairs, navigate tight spaces, stand beside exhibits, go anywhere the boys go. When Henry wants to buy a book about beetles at the gift shop, Chip sends a purchase request.

“Henry wants to buy ‘The Secret Lives of Beetles’ for $18.95 plus tax. That would be $20.47 of his $25 daily budget, leaving $4.53. Can he buy it?”

His mom says yes. The book is educational, reasonably priced, and about something Henry loves. These little approvals are teaching Henry about money—how to think before spending—while preventing him from blowing all his money on impulse buys.

At lunch, Henry orders a sandwich. Again, Chip asks permission. Again, his mom approves instantly. The system is teaching Henry that spending has consequences, that he can’t buy everything, but also that reasonable requests usually get approved. He’s learning to think before asking rather than asking for everything and hearing “no” all the time.

He sees a group of seventh graders at another table, their five-foot robots standing nearby. One of them orders ice cream without their robot asking anyone’s permission—the purchase just happens. That’s the difference. When Henry gets his upgrade in two years, he’ll have that same autonomy for small purchases. He won’t need to ask permission for every little thing.

The Ice Cream Temptation

Walking back to the museum entrance after lunch, they pass an ice cream shop. It’s hot outside. Henry really wants ice cream. But he’s already spent $26.47 against his $25 budget—his mom said okay to going over for lunch. If he asks for ice cream now, she’ll probably say no.

“Chip, how much money do I have left?”

“You went $1.47 over today’s budget already. You have $8.32 left in your monthly spending money. An ice cream cone costs about $4.50. Should I ask your mom?”

Henry thinks about it. He could ask. His mom has said yes to most things today. But she also might say no because he’s already over budget, and then he’d feel bad. Or she might say yes but be annoyed, and then he’d feel guilty.

“No. I’ll wait until I get home.”

“That’s good budget thinking, Henry. Do you want me to remember you wanted ice cream so we can plan for it tomorrow?”

“Yeah, okay.”

This is exactly what the KidBot system is supposed to do. Henry’s not just following rules—he’s learning how money works, cause and effect, waiting for things you want. The robot doesn’t lecture him about budgets; it gives him information and lets Henry make real decisions with real but safe consequences. By seventh grade, when he gets his five-foot upgrade, he’ll have developed enough financial judgment that the system will trust him with small purchases without asking permission. By ninth grade, he’ll essentially manage his own money with the robot as an advisor rather than a gatekeeper.
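Chip’s running tally is plain arithmetic over the $25 daily budget. Here is a sketch that reproduces the story’s numbers, assuming a $6.00 sandwich for lunch (the lunch price is not given in the story; $6.00 is the figure consistent with its totals):

```python
# Sketch of Chip's budget tally for the museum day.
# The $25 daily budget and the $20.47 book come from the story;
# the $6.00 sandwich is an assumed figure consistent with its totals.

DAILY_BUDGET = 25.00

def day_summary(purchases: list[float]) -> dict:
    """Total the day's spending and report any overrun."""
    spent = round(sum(purchases), 2)
    over = round(max(0.0, spent - DAILY_BUDGET), 2)
    return {"spent": spent, "over_budget_by": over}

# Beetle book ($18.95 plus tax = $20.47) plus the assumed lunch.
summary = day_summary([20.47, 6.00])
```

This yields the $1.47 overrun Chip reports, which is exactly the information Henry uses to decide against the ice cream.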

The Evening Routine

By 3:45 PM, Chip reminds Henry and Tyler it’s time to head home. They call a car, ride back together—Scout getting dropped first, then Henry. When Henry walks through his front door, his mom is already home.

“How was the museum?”

“It was awesome! They had atlas beetles that were huge—like this big!” He gestures wildly with his hands. “And I got a book about beetles, and we saw the planetarium show, and Tyler and I had lunch at the cafe.”

“I know. Chip sent me updates all day. I’m glad you had fun.”

“Mom, can we go back next month? They’re getting a butterfly room.”

“Maybe. We’ll see.”

Later that evening, the family sits together while Chip projects a documentary about ant colonies on the living room wall. The four-foot robot stands quietly in the corner, occasionally adjusting the projection to make it look better.

At bedtime, Chip helps Henry check on his cricket farm—making sure the temperature is right, refilling water, noting which crickets are most active for Henry’s observation journal. They study for Monday’s math quiz together. Chip figures out the three things Henry’s still struggling with and creates practice problems specifically for those.

“You’re getting way better at fractions,” Chip notes. “Two weeks ago, you were getting 40% right on conversion problems. Tonight you got 85%. That’s a huge improvement.”

“Really? I feel like I’m still bad at it.”

“You’ve improved a lot. You just don’t realize it yet because you’re focused on the problems you’re still getting wrong. That’s normal. Keep practicing, and pretty soon you’ll feel as confident as you actually are.”

Henry falls asleep feeling something new: independent, but safe. Free, but watched over. Like he’s growing up, but still protected. And knowing that in two years, he’ll get a bigger robot with more privacy makes the current monitoring feel temporary, appropriate for where he is now rather than a permanent surveillance state.

The Three-Stage System

The robot progression reflects genuine developmental psychology. Ten-year-olds don’t need privacy—they need structure, guidance, and protection. Twelve-year-olds starting puberty need increasing autonomy while still having guardrails. Fourteen-year-olds need to practice being adults with a safety net, not children under constant surveillance.

The four-foot models for fifth and sixth grade: Full monitoring, purchase approvals required, constant location tracking, comprehensive safety protocols. These robots assume their users need maximum oversight because they do. Ten and eleven-year-olds are still children who benefit from clear boundaries and protective intervention.

The five-foot models for seventh and eighth grade: Privacy filters activate—robots only report genuinely dangerous situations, not normal adolescent drama. Small purchases (under $10) don’t require approval. Conversational monitoring shifts from recording everything to flagging only concerning patterns. Location tracking continues but with less frequent check-ins. These robots recognize their users are becoming teenagers who need space to develop identity while still having protection from serious harm.

The full-size models for ninth through twelfth grade: Minimal monitoring—location tracking only for curfew enforcement and restricted zones. No conversational monitoring unless specifically requested. Full financial autonomy within budget parameters. The robot functions more as personal assistant than guardian, offering advice when asked but not imposing restrictions. These robots treat their users as young adults who deserve privacy and respect while still providing support and assistance.
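The three tiers read like a policy table, and one way to make the contrast concrete is to write them as one. The keys and field names are invented; the values simply summarize the tiers described above:

```python
# The three-stage oversight policy as a lookup table.
# Keys and field names are invented for illustration; the values
# summarize the tiers described in the text.

POLICY_BY_GRADE = {
    "5-6":  {"height_ft": 4, "purchase_approval": "all",
             "conversation_monitoring": "full",
             "location_tracking": "constant"},
    "7-8":  {"height_ft": 5, "purchase_approval": "over $10",
             "conversation_monitoring": "danger-only",
             "location_tracking": "periodic"},
    "9-12": {"height_ft": 6, "purchase_approval": "none",
             "conversation_monitoring": "opt-in",
             "location_tracking": "curfew/restricted zones"},
}

def policy_for(grade: int) -> dict:
    """Map a grade number to its oversight tier."""
    if 5 <= grade <= 6:
        return POLICY_BY_GRADE["5-6"]
    if 7 <= grade <= 8:
        return POLICY_BY_GRADE["7-8"]
    if 9 <= grade <= 12:
        return POLICY_BY_GRADE["9-12"]
    raise ValueError("KidBots cover grades 5-12")
```

Reading across a row shows what a student gains at each upgrade: taller hardware, looser purse strings, quieter monitoring.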

By graduation, students have learned to work with AI assistants while developing genuine independence. The transition from supervised child to autonomous adult happens gradually, mediated by companions that grow physically larger and grant increasing privacy at each stage. It’s not perfect—no system is—but it’s designed around human developmental needs rather than pure technological capability.

Final Thoughts

In 2035, growing up means growing up alongside your robot companions. You start fifth grade with a four-foot guardian who watches everything and reports to your parents. You upgrade to a five-foot advisor in seventh grade who gives you breathing room. You transition to a full-size assistant in ninth grade who treats you like the young adult you’re becoming.

The physical growth of the robots mirrors your own growth—from protected child to emerging teenager to young adult. The progression in privacy mirrors your developmental need for independence—from needing structure to needing space to practice autonomy.

Henry can visit museums with his four-foot Chip, learning budget management and enjoying independence within safe boundaries. In two years, he’ll have a five-foot robot that trusts him with small decisions and gives him conversational privacy. In four years, he’ll have a full-size companion that functions more like a very capable assistant than a guardian.

The system isn’t perfect. Some kids will chafe under fifth-grade monitoring even though developmentally they need it. Some parents will resist giving seventh graders privacy even though the system is designed to provide it. Some ninth graders will abuse the autonomy their full-size robots grant them.

But the three-stage progression represents a thoughtful attempt to use technology to support healthy development rather than replace it. The robots grow with the kids they serve, physically and functionally, adapting to changing needs as childhood gives way to adolescence and adolescence gives way to adulthood.

For ten-year-old Henry, walking beside his four-foot Chip, the monitoring feels like caring rather than surveillance. The structure feels like support rather than restriction. And knowing that in two years he’ll get more privacy, and four years even more, makes the current system feel fair—appropriate for where he is now, with built-in recognition that he won’t stay here forever.

He’s just a kid with a really cool four-foot robot friend who helps him build the best cricket farm in fifth grade. In two years, he’ll be a teenager with a five-foot advisor who respects his privacy while keeping him safe. In four years, he’ll be a young adult with a full-size assistant who treats him accordingly.

The robots grow up with them. And maybe that’s exactly how it should work.
