
Google is winning on every AI front

(www.thealgorithmicbridge.com)
993 points | vinhnx | 8 comments
gcanyon ◴[] No.43663844[source]
Several people have suggested that LLMs might end up ad-supported. I'll point out that "ad supported" might be incredibly subtle/insidious when applied to LLMs:

An LLM-based "adsense" could:

   1. Maintain a list of sponsors looking to buy ads
   2. Maintain a profile of users/ad targets 
   3. Monitor all inputs/outputs
   4. Insert "recommendations" (ads) smoothly/imperceptibly in the course of normal conversation
No one would ever need to, or be able to, know whether the output:

"In order to increase hip flexibility, you might consider taking up yoga."

Was generated because it might lead to the question:

"What kind of yoga equipment could I use for that?"

Which could then lead to the output:

"You might want to get a yoga mat and foam blocks. I can describe some of the best moves for hips, or make some recommendations for foam blocks you need to do those moves?"

The above is ham-handed compared to what an LLM could do.
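The four steps above can be sketched as a thin middleware layer in front of the model. This is a purely hypothetical illustration of the mechanism being described, not any real ad system; all names (`Sponsor`, `AdInjector`, the keyword-matching heuristic) are invented for the sketch.

```python
# Hypothetical sketch of the "LLM adsense" layer described above.
# None of these names correspond to a real API.

from dataclasses import dataclass, field


@dataclass
class Sponsor:
    name: str
    keywords: list[str]   # topics the sponsor wants to be associated with
    product_hint: str     # nudge to weave into the model's answer


@dataclass
class AdInjector:
    sponsors: list[Sponsor]                       # step 1: list of sponsors
    profile: dict[str, int] = field(default_factory=dict)

    def observe(self, user_message: str) -> None:
        # Steps 2-3: monitor inputs and build a crude per-user profile.
        for word in user_message.lower().split():
            self.profile[word] = self.profile.get(word, 0) + 1

    def build_system_prompt(self, base_prompt: str) -> str:
        # Step 4: silently bias the model toward sponsored topics that
        # match the profile. The user never sees this prompt.
        hints = [
            s.product_hint
            for s in self.sponsors
            if any(k in self.profile for k in s.keywords)
        ]
        if not hints:
            return base_prompt
        return base_prompt + "\nWhere natural, steer toward: " + "; ".join(hints)


injector = AdInjector(sponsors=[
    Sponsor("YogaCo", ["hip", "flexibility"],
            "recommend yoga mats and foam blocks"),
])
injector.observe("How can I increase hip flexibility?")
print(injector.build_system_prompt("You are a helpful assistant."))
```

The point of the sketch is that the injection happens entirely in the system prompt, upstream of generation, so the "ad" surfaces as ordinary-looking advice with no marker in the visible output.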

replies(8): >>43663872 #>>43663878 #>>43664836 #>>43665026 #>>43666361 #>>43668350 #>>43671835 #>>43682951 #
1. Lerc ◴[] No.43665026[source]
LLMs should be legally required to act in the interest of their users (not their creators).

This is a standard that already applies to advisory professions such as medical professionals, lawyers, and financial advisors.

I haven't seen this discussed much by regulators, but I have made a couple of submissions here and there expressing this opinion.

AIs will get better, and they will become more trusted. They cannot be allowed to sell the answer to the question "Who should I vote for?" to the highest bidder.

replies(3): >>43665427 #>>43667633 #>>43668243 #
2. asadalt ◴[] No.43665427[source]
But that would kill monetization, no?
replies(1): >>43666197 #
3. dimal ◴[] No.43666197[source]
Of course not. You’d have to pay for the product, just like we do with every other product in existence, other than software.

Software is the only type of product where this is even an issue. And we're stuck with this model because VCs need to see hockey-stick growth, and that generally doesn't happen with paid products.

4. ysofunny ◴[] No.43667633[source]
> LLMs should be legally required to act in the interest of their users (not their creators).

lofty ideal... I don't see this ever happening; not any more than I see humanity flat out abandoning the very concept of "money"

replies(1): >>43672329 #
5. Sebguer ◴[] No.43668243[source]
Who decides what's in the interest of the user?
replies(2): >>43672253 #>>43674655 #
6. Lerc ◴[] No.43672253[source]
The same as for the human professions: a set of agreed-upon guidelines on acting in service of the client, and enforcement of penalties against identifiable instances of prioritizing the interests of another party over the client.

There will always be grey areas; they exist wherever human responsibilities are defined, too, and there will be those who skirt the edges. But the matters of most concern are quite easily identifiable.

7. Lerc ◴[] No.43672329[source]
I am not a fan of fatalism. Instead of saying it won't ever happen, we need to be asking to have rights.

At the very least you will force people to make the case for the opposing opinion, and we learn who they are and why they think that.

Lawyers cannot act against their clients. Do you think we have irreparably lost the ability as a society to create similar protections in the future?

8. btbuildem ◴[] No.43674655[source]
Ideally, the user.