223 points | mindingnever
impossiblefork No.45279816
Very strange writing from semafor.com

>For instance, an agency could pay for a subscription or negotiate a pay-per-use contract with an AI provider, only to find out that it is prohibited from using the AI model in certain ways, limiting its value.

This is quite false: they of course know the restrictions when they sign the contract.

bri3d No.45280152
This whole article is weird to me.

This reads to me like:

* Some employee somewhere wanted to click the shiny Claude button in the AWS FedRamp marketplace

* Whatever USG legal team was involved said "that domestic surveillance clause doesn't work for us" and tried to redline it.

* Anthropic rejected the redline.

* Someone got mad and went to Semafor.

It's unclear that this had even really escalated prior to the article, or that Anthropic is really "taking a stand" in a major way (after all, its model is already on the Fed marketplace). It just reads like a typical fed contract negotiation with a squeaky wheel in it somewhere.

The article is also full of other weird nonsense like:

> Traditional software isn’t like that. Once a government agency has access to Microsoft Office, it doesn’t have to worry about whether it is using Excel to keep track of weapons or pencils.

While they might not be as easy to enforce, many, many shrink-wrap EULAs restrict how software can be used. Almost always there is a carve-out with a different tier for lifesaving or safety uses (due to liability/compliance concerns) and for military uses (sometimes for ethics reasons, but usually from a desire to extract more money from those customers).

salynchnew No.45280567
Could also be an article placed by a competitor + a squeaky wheel.