I've built two websites recently, and I fell into a bit of a trap: I built them with humans in mind, not agents. How old-fashioned of me!
My target customers are consumer brands that need help with AI: getting up to speed with it, understanding what's possible, integrating it into their workflows. This isn't something the people in those teams are likely to have prior experience with.
For the person tasked with figuring this out, where are they going to look for answers?
Most likely via ChatGPT or Claude.
So I should have been building my websites with Claude in mind (rather than that person at my target company), to maximise the chances that my content gets surfaced in those chats.
I've been following the guidance in this brilliant post by Tom Osman on creating llms.txt files, structured data, and comprehensive metadata.
And asking Claude Code to integrate it for me, naturally.
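For anyone unfamiliar with it: an llms.txt file is just a Markdown file served at your site's root, giving agents a curated map of your content. A minimal sketch (the site name, URLs, and descriptions here are placeholders, not from my actual sites) looks something like this:

```markdown
# Example Consultancy

> AI consulting for consumer brands: workshops, audits, and workflow integration.

## Services
- [AI Readiness Audit](https://example.com/audit.md): What we assess and what you get
- [Workshops](https://example.com/workshops.md): Formats, pricing, and outcomes

## Writing
- [Blog](https://example.com/blog.md): Essays on AI adoption for brand teams
```

The format is deliberately simple: an H1 title, a blockquote summary, then sections of links with short descriptions, so an agent can decide what to fetch without crawling the whole site.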
The web is going to serve two distinct purposes: a web for humans and a web for agents. Trying to make one website serve both would be a shame, so I hope the web splits in two. The human web can double down on beautiful experiences, thoughtful interfaces, and content that's genuinely worth reading and stumbling upon. The agent web can be plain text, structured answers, and Markdown; it's also where the mass-generated SEO blogs we've all learnt to scroll past can live. One is where you browse and discover, the other is where you pull down what you need.