
223 points by mindingnever | 2 comments
impossiblefork No.45279816
Very strange writing from semafor.com:

>For instance, an agency could pay for a subscription or negotiate a pay-per-use contract with an AI provider, only to find out that it is prohibited from using the AI model in certain ways, limiting its value.

This is of course quite false: they know the restrictions when they sign the contract.

replies(5): >>45280028 #>>45280048 #>>45280152 #>>45280390 #>>45280820 #
bri3d No.45280152
This whole article is weird to me.

This reads to me like:

* Some employee somewhere wanted to click the shiny Claude button in the AWS FedRamp marketplace

* Whatever USG legal team was involved said "that domestic surveillance clause doesn't work for us" and tried to redline it.

* Anthropic rejected the redline.

* Someone got mad and went to Semafor.

It's unclear that this had even really escalated prior to the article, or that Anthropic is really "taking a stand" in a major way (after all, their model is already on the Fed marketplace). It just reads like a typical fed contract negotiation with a squeaky wheel in it somewhere.

The article is also full of other weird nonsense like:

> Traditional software isn’t like that. Once a government agency has access to Microsoft Office, it doesn’t have to worry about whether it is using Excel to keep track of weapons or pencils.

While they might not be as easy to enforce, many, many shrink-wrap EULAs restrict the ways in which software can be used. There is almost always an EULA carve-out with a different tier for lifesaving or safety uses (due to liability/compliance concerns) and for military uses (sometimes for ethics reasons, but usually from a desire to extract more money from those customers).

replies(3): >>45280277 #>>45280567 #>>45280715 #
axus No.45280715
A classic:

THIS SOFTWARE PRODUCT MAY CONTAIN SUPPORT FOR PROGRAMS WRITTEN IN JAVA. JAVA TECHNOLOGY IS NOT FAULT TOLERANT AND IS NOT DESIGNED, MANUFACTURED, OR INTENDED FOR USE OR RESALE AS ONLINE CONTROL EQUIPMENT IN HAZARDOUS ENVIRONMENTS REQUIRING FAILSAFE PERFORMANCE, SUCH AS IN THE OPERATION OF NUCLEAR FACILITIES, AIRCRAFT NAVIGATION OR COMMUNICATION SYSTEMS, AIR TRAFFIC CONTROL, DIRECT LIFE SUPPORT MACHINES, OR WEAPONS SYSTEMS, IN WHICH THE FAILURE OF JAVA TECHNOLOGY COULD LEAD DIRECTLY TO DEATH, PERSONAL INJURY OR SEVERE PHYSICAL OR ENVIRONMENTAL DAMAGE.

replies(4): >>45280853 #>>45281180 #>>45282916 #>>45285648 #
m463 No.45282916
Reminds me of JSLint.

The author added "must be used for good, not evil" to the license...

...and IBM asked for an exception.

https://en.wikipedia.org/wiki/JSLint#License

https://news.ycombinator.com/item?id=5138866

Note that restricting the use of software makes it non-free, GPL-wise.

RMS said the GPL does not restrict the rights of the user of the software, only that when the software is redistributed, those rights must be passed along.

replies(1): >>45283929 #
kentonv No.45283929
Good old Douglas Crockford. He also put the "shall be used for Good, not Evil" restriction on the license for his JSON reference implementation. Obviously, JSON is used for all kinds of evil anyway.

A much younger, more naive me (~20 years ago) actually emailed him to complain about the ambiguous terms and he replied saying something to the effect of "It's obviously unenforceable, get over it."