Raising AI Kids: Issue 12
Sandboxing 101 — Let Them Build Without You Losing Sleep
David had the talk with Sam. The vibe coding talk. "You can build things now," he said, feeling pretty good about it. "Just describe what you want to the AI and it'll write the code."
Sam's eyes lit up. Within 20 minutes, she had a working habit tracker running in her browser. Within 25 minutes, she had modified it to include a points system. Within 30 minutes, she had broken the family laptop.
Not catastrophically. But enough that the browser was acting strange and David spent his Saturday afternoon restarting things and checking for malware. Which, of course, there wasn't any. But he didn't know that for the first hour.
The lesson: vibe coding is great. Until it's not. And the difference is whether you're running code in a sandbox or on the family machine.
What a Sandbox Actually Is
Think of it this way. Your kid wants to try a new recipe. Do you let them cook on the actual family stove, with the actual family oil, next to the actual family cat? Or do you hand them a bowl, a spoon, and some ingredients on the kitchen counter and let them figure it out?
A sandbox is the digital equivalent of that bowl and spoon. It's a space where code can run, experiment, and break — without touching anything that matters. The worst thing that can happen is you refresh the page and start over.
The reason this matters is that code from an AI is like a recipe written by someone who's never tasted anything. It looks right. The steps make sense. But until you actually make it, you don't know if the sauce will curdle or the oven will catch fire. Sandboxing is the fireproof apron. It's the floor mat. It's the thing between "fun experiment" and "ruined Saturday."
The Sandboxes You'll Actually Use
You don't need many. You need two: one for web stuff (HTML, CSS, JavaScript) and one for everything else (Python, data, automation).
For web stuff, the best starting point is Claude Artifacts or ChatGPT Canvas. Both let AI write code and run it right in the browser window, with zero setup. Your kid sees a result instantly. They can modify the code, run it again, break it, fix it — and none of it touches your laptop. This is the entry point. Use it.
For anything more serious — Python projects, data analysis, anything that needs to run for a while or use files — Replit is the tool. It's a full coding environment in your browser. You write code, run code, save it, share it, and come back to it later. The free tier is genuinely sufficient for family learning. It's isolated, hosted by Replit, and won't touch your computer no matter what your kid runs.
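To make the Replit side concrete, here is a minimal sketch of the kind of first project a kid might build there: a habit tracker with a points system, like Sam's. Every name and number below is invented for illustration; your kid's AI-generated version will look different, and that's the point of the sandbox.

```python
# A toy habit tracker with points -- hypothetical example code,
# safe to run (and break) in a Replit sandbox.

habits = {}  # habit name -> times completed

def log_habit(name):
    """Record one completion of a habit."""
    habits[name] = habits.get(name, 0) + 1

def total_points(points_per_check=10):
    """Every completion is worth the same number of points."""
    return sum(habits.values()) * points_per_check

log_habit("brush teeth")
log_habit("brush teeth")
log_habit("read 10 minutes")
print(habits)          # {'brush teeth': 2, 'read 10 minutes': 1}
print(total_points())  # 30
```

If your kid wants to experiment, "what happens if I change `points_per_check` to 100?" is exactly the kind of question a sandbox makes free to answer.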
That's it. Two tools. Artifacts or Canvas for quick web experiments. Replit for anything bigger. Everything else is scope creep.
Why This Actually Works as a Parent
Here's the thing nobody tells you: sandboxing isn't just about safety. It's about permission.
When your kid knows they can run code, break code, and modify code without asking permission or risking anything, they try things. They get curious. They follow the thread of "what if I change this?" And that curiosity — that willingness to experiment without fear — is the actual skill we're trying to build.
Without a sandbox, every experiment has a potential cost. "If I mess this up, Dad's going to be mad." So they don't try. They don't experiment. They copy exactly what the AI says and call it done. The learning stops before it starts.
With a sandbox, the cost of trying is zero. They can be genuinely curious. They can follow the thread. They can build something, break it, understand why it broke, and rebuild it better. That's where the learning actually happens.
Your job isn't to supervise every line of code. Your job is to make sure they're in a space where it's safe to be wrong.
The Real Risks (And Why a Sandbox Blocks All of Them)
Here's what can go wrong when code runs outside a sandbox. I'm not trying to scare you — I'm trying to make the case for the 30 seconds of setup that prevents all of it.
Code can read files on your computer. Not usually, and not from reputable AI services, but it can. A badly written or malicious script could access documents, photos, or sensitive files. Code can install software. It can change system settings. It can open network connections and send data somewhere you didn't expect. And sometimes code is just wrong — it has bugs — and those bugs can make your computer slow, unstable, or unresponsive.
None of this is likely with code from a mainstream AI. But "not likely" is not "never." And the downside — identity theft, data loss, a compromised machine — is real. The sandbox eliminates all of it. The worst thing a sandbox does is give you a blank screen when you refresh.
That trade is obvious.
How to Make This a Habit
You don't need to set up anything complicated. You just need one rule:
Before code runs on your family machine, it runs in a sandbox.
That's the only rule. Everything else follows from that. If your kid gets code from an AI, it goes into Artifacts first. If it works and they want to build on it, it goes into Replit. If they need it to do something more than a sandbox can handle, you have a conversation about what that means and whether it's worth the risk.
The goal isn't to limit what they can build. It's to make sure the building is happening somewhere that can't hurt anything. Once that habit is in place, you can stop worrying about the "what if" scenarios. You're not saying no to building. You're saying yes to building safely.
✅ Do Now: Set Up the Safety Net Once
This week, spend 10 minutes setting up the two sandboxes your family will use:
1. Open Claude.ai (free), ask it to write a simple webpage, and watch the result render as an Artifact right in the chat window. That's it. You've now used the web sandbox.
2. Go to replit.com, create a free account, and create a new Python project. Type print("hello") and hit Run. You've now used the serious sandbox.
Show your kid both. Tell them: "Anything we build with AI goes in one of these two places first. Everything else is off-limits."
One 10-minute setup. Infinite safe experiments. That's the deal.
What's Next
Next issue: AI at Work — what it actually means for your kid's future. What does "career ready" even mean when AI is changing what careers look like? We're going to have the conversation most parents are avoiding — and give you actual language for it.
Until then — set up the sandboxes. Make the rule. And let them build.
— The Raising AI Kids crew
Raising AI Kids is for parents navigating the AI era — one issue at a time.
P.S. — If your kid asks why you're so serious about the sandbox rule, be honest. "Because I don't want to spend Saturday fixing my computer instead of having fun with you." That's the real reason. Kids get it.