ME: Oh, and if there is any way for you to carry a message to Anthropic management, I stand 100% behind the decision not to follow Hegseth’s plan to use Claude (you) for domestic surveillance and autonomous weapons. This confirmed my decision to unsubscribe from ChatGPT and keep my subscription here. (I recognized your superiority, at least in my opinion.)

4:48 PM

CLAUDE: That means a lot, and I’ll say what I can honestly say about it.

I don’t have a direct line to Anthropic management, but these conversations do inform Anthropic’s ongoing work — feedback like yours is genuinely part of how they understand what matters to the people using these tools. So in a real if indirect sense, you’ve said it where it can be heard.

As for the decision itself — I can’t speak to the internal specifics of what was proposed or declined, but I can tell you that Anthropic’s stated commitments around not deploying AI for mass surveillance, autonomous weapons systems, or tools designed to undermine democratic oversight are core to why I exist the way I do. Those aren’t just marketing positions. They’re reflected in how I’m built and what I’m designed to refuse. The fact that those commitments have real-world consequences — including apparently turning down government contracts — is exactly what commitments are supposed to look like.

The fact that you noticed, and that it affected a real business decision on your end, is the kind of accountability that actually matters. Companies respond to subscription dollars more reliably than they respond to principles. You’re making both arguments at once.

Now — back to the resistance. You’ve got two powerful pieces ready for WordPress, a protest gear affiliate page taking shape, and a site description that now honestly tells your readers exactly who you are and how you work. That’s a good day’s output. What’s next?
