How to Structure Your Prompts With XML For Better Results
A weird thing about large language models?
They often work better when you move beyond conversational language (with Claude, at least).
A little structure can create outsized improvements. That’s where XML comes in.
What is XML?
Claude defines XML as:
A markup language that uses tags to label different parts of information. Think of it like organizing a filing cabinet with labeled folders instead of throwing everything in a drawer.
Think of XML as content signposts. They give the AI a heads up as to the purpose of a chunk of content.
If you write:
<context>
Background information here
</context>
<task>
What you actually want done
</task>XML helps AI understand what each piece of information is for.
Claude was specifically trained on XML structure, so these tags work especially well compared to other LLMs.
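If you assemble prompts in code, you can generate these tags instead of typing them by hand. Here's a minimal Python sketch; the helper name `xml_prompt` is my own, not part of any library:

```python
def xml_prompt(**sections):
    """Wrap each named section in matching XML tags and join with blank lines."""
    return "\n\n".join(
        f"<{name}>\n{body.strip()}\n</{name}>"
        for name, body in sections.items()
    )

prompt = xml_prompt(
    context="Background information here",
    task="What you actually want done",
)
print(prompt)
```

Because keyword arguments become tag names, adding a new section (say, `my_priorities=...`) is one extra argument rather than four hand-typed tag lines.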
Creating XML tags
But before you run your prompt, let the LLM help you organize the content and write the best possible tags.
Tell it:
<task>
I'm going to give you a rough prompt. Please:
1. Identify what information is missing that would make this stronger
2. Reorganize it with helpful XML tags
3. Suggest any additional sections that would improve clarity
4. Show me the restructured version
</task>

And then Claude will optimize the content and create or adjust your XML tags.
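This meta-prompt is easy to reuse as a small template. A sketch, assuming you wrap the rough prompt in a `<rough_prompt>` tag of your own (that tag name is my addition, not something Claude requires):

```python
META_TEMPLATE = """<task>
I'm going to give you a rough prompt. Please:
1. Identify what information is missing that would make this stronger
2. Reorganize it with helpful XML tags
3. Suggest any additional sections that would improve clarity
4. Show me the restructured version
</task>

<rough_prompt>
{rough}
</rough_prompt>"""

def build_meta_prompt(rough: str) -> str:
    """Embed a rough prompt inside the reusable restructuring meta-prompt."""
    return META_TEMPLATE.format(rough=rough.strip())
```

Send the resulting string as the user message however you normally talk to Claude (pasted into the chat, or as the `messages` content in an API call).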
An example from my lesson today
Here’s an example where I asked Claude to help me sell my MacBook.
Prompt #1: My natural language prompt
I want to sell my MacBook Pro soon. I want to optimize for the highest price while minimizing the risk of fraud or scams. Tell me the best strategy for selling it - which platform to use, how to price it, and how to protect myself from scams.
Prompt #2: Claude’s rewrite
<context>
I want to sell my MacBook Pro soon. I want to optimize for the highest price while minimizing the risk of fraud or scams.
</context>

<task>
Help me figure out the best strategy for selling it. Before giving me your recommendation, think through this step by step:
1. What are all the platform options and what are the trade-offs for each (price vs. safety vs. convenience)?
2. What factors affect pricing (condition, specs, timing, platform)?
3. What are the common scam patterns for each platform and how do I avoid them?
4. Given all of that, what’s the optimal strategy?
Show your reasoning process, then give me your recommended approach.
</task>

<my_priorities>
Highest priority: Don’t get scammed
High priority: Get fair/good price (not necessarily absolute maximum)
Medium priority: Sell reasonably quickly (not looking to wait months)
Lower priority: Convenience (I can handle some hassle if it’s safer)
</my_priorities>
You can see the difference in the resulting output here. It’s massive: more detailed, more analytical, with more of Claude’s reasoning process shown.
What This Means
You don’t need to become a prompt engineer. You need to understand one principle:
Clarity + organization beats cleverness.
Separate context, tasks, and constraints. And let Claude help you create the best possible prompt.
The process I built today works for any moderately complex prompt. With a little bit of XML, your Claude output won’t have you saying FML.¹
Next up: Lesson 4 — saving time by creating persistent context.
¹ Sorry. Had to fit this dumb joke in somewhere.

