One of the earliest things we did when we started using AI tools in our home-ed was just… type things in and see what came out. Which is fine as a starting point; it’s how most people start. The problem is that it can stay at that level indefinitely, and the gap between “occasional vague query” and “structured, useful prompt” is where most of the real skill lives. It is a skill your soon-to-graduate-from-home-ed teenagers are going to need to master.
Prompt writing is genuinely teachable. It’s also one of the few AI-adjacent skills that transfers across every tool your teenager will ever use, and one of the AI skills employers are already looking for. It’s worth doing properly rather than just hoping they pick it up by osmosis.
This is how we approached it.
Start offline
Before anything opens on a screen, talk through what a prompt actually is. It’s not a Google search (those are usually fragments — three words, a postcode, a brand name). A prompt is a brief: you’re telling a piece of software what you want it to do, who you want it to be, what you already know, and what the output should look like. The software can’t read your mind. It works entirely with what you give it.
A useful framework for this is RCTF: Role, Context, Task, Format. It sounds more complicated than it is.
Role: who do you want the AI to be? A science tutor? A sceptical editor? A travel writer who specialises in medieval history? Giving the AI a role shifts the register and depth of the response in ways that are immediately obvious once you’ve tried it.
Context: what does it need to know to be useful? Is the answer for a twelve-year-old who has never studied the topic before? Is it a starting point for an essay or a final check? Context prevents the AI from pitching its response at entirely the wrong level.
Task: what, specifically, do you want it to do? Explain, summarise, argue for a position, generate five options, critique a draft? Vague task instructions produce vague results every time.
Format: how do you want the output presented? A bulleted list, a short paragraph, a dialogue, a table? Specifying format stops you getting four paragraphs of prose when you wanted a quick comparison.
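For anyone who likes to see the structure made concrete, the four elements can be sketched as a simple fill-in-the-blanks template. This is just an illustration, not a required tool; the function name and the example topic are made up for the sketch.

```python
# A minimal sketch of the RCTF structure as a reusable template.
# The topic and wording here are illustrative examples only.

def rctf_prompt(role: str, context: str, task: str, fmt: str) -> str:
    """Combine the four RCTF elements into one prompt string."""
    return "\n".join([
        f"Role: act as {role}.",
        f"Context: {context}",
        f"Task: {task}",
        f"Format: {fmt}",
    ])

prompt = rctf_prompt(
    role="a patient science tutor",
    context="the learner is twelve and has never studied photosynthesis",
    task="explain how photosynthesis works",
    fmt="a bulleted list of five short points",
)
print(prompt)
```

The point of the template is simply that every prompt answers all four questions before it is sent; the exact wording is up to the writer.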
Give your learner a pen and paper and ask them to write three prompts on any topics they like — something they’re currently studying, something they’re curious about, something completely random. The only rule is that each prompt has to use all four RCTF elements. This forces thinking before touching the keyboard, which is good discipline and also usually produces noticeably better prompts than starting directly on screen.
If they’ve never written a prompt before, model one first. Think aloud through each element. That visible thinking is often more useful than any amount of explanation.
Then go online
Once they have their handwritten prompts, open Claude or ChatGPT (or both — comparing outputs across tools is a lesson in itself) and run them.
The almost inevitable result is that some prompts work better than others. This is the productive part. Look at the outputs together and ask: what did it do well? Where did it pitch it wrong? What would you need to add to the context for it to get closer? Can you chain a second prompt onto this to refine it?
Prompt chaining is worth introducing here: the idea that you don’t need to get everything in one go. You can ask for a draft, then ask for it shorter, then ask for the examples to be more specific to a particular age group. Each prompt builds on the last. Most good AI-assisted work looks less like one perfect prompt and more like a conversation with progressive refinement.
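The chaining idea can also be sketched in a few lines: each new prompt is added to a growing conversation history, and the refinement lives in that accumulated history rather than in one perfect first prompt. The `ask` function below is a stand-in, not a real AI service; it just echoes, so the sketch runs on its own.

```python
# A sketch of prompt chaining as a growing conversation history.
# `ask` is a placeholder for whatever AI tool you use; here it only
# echoes the prompt back so the example runs without any service.

def ask(history, prompt):
    history.append({"role": "user", "content": prompt})
    reply = f"[response to: {prompt}]"  # placeholder for the model's answer
    history.append({"role": "assistant", "content": reply})
    return reply

history = []
ask(history, "Draft a one-paragraph summary of the water cycle.")
ask(history, "Make it shorter.")
ask(history, "Use examples a ten-year-old would recognise.")

# Three prompts and three replies: each new prompt builds on
# everything already in the history.
print(len(history))  # → 6
```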
Ask them to rewrite at least one of their original prompts based on what they saw in the output, run it again, and compare. The improvement is usually immediate enough to be satisfying.
A note worth making here, especially with teenagers who are already using AI tools for schoolwork: there’s a meaningful difference between using AI to avoid thinking and using it to think better. A weak prompt that produces a complete essay for copying is a very different use of the same tool than a structured prompt that generates a framework your learner then interrogates, expands, and rewrites in their own voice. The skill is in the intentionality. This is a good moment to state that distinction explicitly. (We have AI usage rules that help us prevent the copy-paste instinct – you can read those here.)
Back offline to consolidate
This is the step that’s easiest to skip and probably the most useful for actual retention.
Ask your learner to start a prompt notebook. Not a folder on a computer (though a Prompt Log document is useful later as your teens study different elements of AI). In this context, a prompt notebook is a physical notebook, specifically for AI prompt structures that worked. Writing things down by hand is a different cognitive process from typing, and the notebook becomes a personal reference that genuinely gets used, like a vocabulary book for language learning.
At minimum, the entry should include: the RCTF framework in their own words, the two or three prompt structures that produced the best outputs, and one thing they’d do differently.
If you want to extend the session, ask them to write a short reflection: what does a good prompt have in common with other kinds of clear writing they’ve done? The underlying answer is that clarity of purpose, audience awareness, and specific instruction produce better results than vagueness in any medium, and that is the more important lesson beneath all the AI-specific content. AI is being used as a tool to reinforce all the research skills they’re developing across their education.
This post is part of a series supporting our complete AI Skills Curriculum. If you’d be interested in using this curriculum with your home-learners – a six-month programme that builds real, employable AI skills – sign up below.
