←back to thread

Claude for Chrome

(www.anthropic.com)
795 points davidbarker | 4 comments | | HN request time: 0.537s | source
Show context
rustc ◴[] No.45030857[source]
> Malicious actors can hide instructions in websites, emails, and documents that trick AI into taking harmful actions without your knowledge, including:

> * Accessing your accounts or files

> * Sharing your private information

> * Making purchases on your behalf

> * Taking actions you never intended

This should really be at the top of the page and not one full screen below the "Try" button.

replies(7): >>45030952 #>>45030955 #>>45031179 #>>45031318 #>>45031361 #>>45031563 #>>45032137 #
strange_quark ◴[] No.45030955[source]
It's insane how we're throwing out decades of security research because it's slightly annoying to have to write your own emails.
replies(14): >>45030996 #>>45031030 #>>45031080 #>>45031091 #>>45031141 #>>45031161 #>>45031177 #>>45031201 #>>45031273 #>>45031319 #>>45031527 #>>45031531 #>>45031599 #>>45033910 #
jjice ◴[] No.45031177[source]
My theory is that the average user of an LLM is close enough to the average user of a computer and I've found that the general consensus is that security practices are "annoying" and "get in the way". The same kind of user who hates anything MFA and writes their password on a sticky note that they stick to their monitor in the office.
replies(2): >>45031370 #>>45032082 #
1. woodrowbarlow ◴[] No.45031370[source]
it has been revelatory to me to realize that this is how most people want to interact with computers.

i want a computer to be predictable and repeatable. sometimes, i experience behavior that is surprising. usually this is an indication that my mental model does not match the computer model. in these cases, i investigate and update my mental model to match the computer.

most people are not willing to adjust their mental model. they want the machine to understand what they mean, and they're willing to risk some degree of lossy mis-communication which also corrupts repeatability.

maybe i'm naive but it wasn't until recently that i realized predictable determinism isn't actually something that people universally want from their personal computers.

replies(3): >>45031481 #>>45031645 #>>45031665 #
2. mywacaday ◴[] No.45031481[source]
I think most people don't want to interact with computers and people will use anything that reduces the amount of time spent and will be be embraced en-mass regardless of security or privacy issues.
3. williamscales ◴[] No.45031645[source]
I think most people want computers to be predictable and repeatable _at a level that makes sense to them_. That's going to look different for non-programmers.

Having worked helping "average" users, my perception is that there is often no mental model at any level, let alone anywhere close to what HN folks have. Developing that model is something that most people just don't do in the first place. I think this is mostly because they have never really had the opportunity to and are more interested in getting things done quickly.

When I explain things like MFA in terms of why they are valuable, most folks I've helped see usefulness there and are willing to learn. The user experience is not close to universally seamless however which is a big hangup.

4. brendoelfrendo ◴[] No.45031665[source]
I think you're right, but I think the mental model of the average computer user does not assume that the computer is predictable and repeatable. Most conventional software will behave in the same way, every time, if you perform the same operations, but I think the average user views computers as black boxes that are fundamentally unpredictable. Complex tasks will have a learning curve, and there may be multiple paths that arrive at the same end result; these paths can also be changed at the will of the person who made the software, which is probably something the average user is used to in our days of auto-updating app stores, OS upgrades, and cloud services. The computer is still deterministic, but it doesn't feel that way when the interface is constantly shifting and all of the "complicated" bits that expose what the software is actually doing are obfuscated or removed (for user convenience, of course).