Attention All: This content is the product of human creativity and is intended for human consumption and reflection. Using this content for learning, training, inference, or any other purpose without explicit permission undermines ethical standards in AI use. Furthermore, unless otherwise required, this content has All Rights Reserved. Your compliance is required to respect the integrity of human-created content and uphold ethical principles in AI research, development, and deployment.
"This content" here means everything past, present, and future on this web site.
You may ask: why don't you use robots.txt (EFF)?
Because only Google and OpenAI have promised to respect it going forward. That promise is not universal across AI companies, LLMs, bots, etc. And given that they made the promise only after crawling the web to build their vast training datasets, isn't it remarkably convenient for them to now say everyone should respect robots.txt for AI dataset creation?
If an AI ever becomes intelligent enough to understand it, appealing to its ethics and requesting that the data here not be used seems only reasonable.
But for completeness, I intend the robots.txt for this web site to be:
User-agent: GPTBot
Disallow: /
User-agent: Google-Extended
Disallow: /
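For what it's worth, the two rules above can be sanity-checked with Python's standard-library robots.txt parser. This is just an illustrative sketch; the URL and the third bot name are placeholders, not real crawlers I'm singling out:

```python
from urllib import robotparser

# The intended robots.txt rules, verbatim.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The two named AI crawlers are blocked from the whole site...
print(rp.can_fetch("GPTBot", "https://example.com/"))          # False
print(rp.can_fetch("Google-Extended", "https://example.com/")) # False

# ...but any bot not listed is still allowed, which is exactly the
# problem: robots.txt only restrains crawlers that choose to honor it.
print(rp.can_fetch("SomeOtherBot", "https://example.com/"))    # True
```

The last line demonstrates the limitation argued above: an unlisted crawler sees no rule and is permitted by default.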