More likely it's just an LLM hallucination, not a real policy that Anthropic has. Unfortunately for them, it's a bad look to showcase one of the main failure modes of their product in their own business process.
It's both, isn't it? If the AI writes the policy and is also responsible for enforcing it (by handling tickets and acting as a gatekeeper for which issues are escalated to humans who can do something about them), then the hallucination becomes real.
It's the same thing. Whether it was hallucinated upstream or in situ, the point is that it's not a real policy that the business adheres to, just something the LLM spat out.
In the English language, "America" refers to a country. It is synonymous with "the United States of America". I say this as someone who lives on the same continent as that country, but not in the country itself.
Maybe you're thinking of "North America", "South America", or "the Americas".
Probably. There are a lot of countries, especially third-world ones, with very lax legal systems, not to mention the multitude of countries where the rule of law barely exists at all.