Claude for Chrome

(www.anthropic.com)
795 points | davidbarker | 2 comments
stusmall ◴[] No.45033056[source]
It's wild to see an AI company put out a press release that is basically "hey, you kids wanna see a loaded gun?" Normally all their public comms are so full of optimism and salesmanship around the potential. They are fully aware of how dangerous this is.
replies(8): >>45033105 #>>45033148 #>>45033197 #>>45033279 #>>45033315 #>>45033347 #>>45033852 #>>45037231 #
ankit219 ◴[] No.45033315[source]
This is what they need for the next generation of models. The key line is:

> We view browser-using AI as inevitable: so much work happens in browsers that giving Claude the ability to see what you're looking at, click buttons, and fill forms will make it substantially more useful.

A lot of this can be done by building a bunch of custom environments at training time, but only a limited number of use cases can be handled that way. They don't need all of the data; what they need is the kinds of tasks real-world users would actually ask the model to do.

Hence the press release pretty much says that they think it's unsafe, that they don't have any clear idea how to make it safe without trying it out, and that they only want a limited number of people to try it. Given their stature, it's good that they're doing this publicly instead of how Google does it with trusted testers or OpenAI does it with select customers.

replies(1): >>45033410 #
1. zaphirplane ◴[] No.45033410[source]
I don’t get the argument. Why is the loaded footgun better in the hands of “select” customers than in the hands of a self-selecting group of beta testers?
replies(1): >>45033684 #
2. ankit219 ◴[] No.45033684[source]
They are still gating it by use case (I presume). But this way they are not limited to the creativity of whatever their self-selected group of beta testers could come up with, and can perhaps test security against a more diverse set of use cases. (I assume the trusted testers who work on security etc. would be given access anyway.)