
Google AI Ultra

(blog.google)
320 points | mfiguiere | 1 comment
charles_f ◴[] No.44045393[source]
This is the kind of pricing that I expect most AI companies are gonna try to push for, and it might get even more expensive with time. When you see the delta between what's currently being burnt by OpenAI and what they bring home, the sweet spot is going to be hard to find.

Whether you find that you get $250 worth out of that subscription is going to be the big question

replies(5): >>44045528 #>>44045820 #>>44045959 #>>44046010 #>>44058223 #
Ancapistani ◴[] No.44045528[source]
I agree, and the problem is that "value" != "utilization".

It costs the provider the same whether the user is asking for advice on changing a recipe or building a comprehensive project plan for a major software product - but the latter provides much more value than the former.

How can you extract an optimal price from the high-value use cases without making it prohibitively expensive for the low-value ones?

Worse, the "low-value" use cases likely influence public perception a great deal. If you drive the general public off your platform in an attempt to extract value from the professionals, your platform may never grow to the point that the professionals hear about it in the first place.

replies(6): >>44045906 #>>44045964 #>>44046505 #>>44047071 #>>44050638 #>>44052117 #
garrickvanburen ◴[] No.44045906[source]
This is the problem Google search originally had.

They successfully solved it with advertising... and they also had the ability to cache results.

replies(2): >>44046734 #>>44047305 #
mysterydip ◴[] No.44046734[source]
Do LLMs cache results now? I assume a lot of the same questions get asked, although the answer could depend on previous conversational context.
replies(2): >>44047066 #>>44047238 #
cj ◴[] No.44047238[source]
I imagine caching is directly in conflict with their desire to personalize chats by user.

See: ChatGPT's memory features. Also, the new "Projects" in ChatGPT, which let you create system prompts for a group of chats, etc. I imagine caching, at least in the traditional sense, is virtually impossible as soon as a user is logged in and uses any of these personalization features.

Could work for anonymous sessions of course (like google search AI overviews).
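To make the point concrete, here is a minimal sketch of an exact-match response cache. Everything that influences the output goes into the cache key, so identical anonymous requests share one entry, while any per-user personalization (memory, a custom system prompt) changes the key and defeats sharing. All names and structure here are hypothetical, not any provider's actual API:

```python
import hashlib
import json

# Hypothetical exact-match cache: key = hash of everything that can
# influence the model's output.
_cache: dict[str, str] = {}

def cache_key(model: str, system_prompt: str, messages: list[dict]) -> str:
    payload = json.dumps(
        {"model": model, "system": system_prompt, "messages": messages},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

def cached_completion(model, system_prompt, messages, generate):
    """Return a cached answer if an identical request was seen before;
    otherwise call the (hypothetical) generate function and store it."""
    key = cache_key(model, system_prompt, messages)
    if key not in _cache:
        _cache[key] = generate(model, system_prompt, messages)
    return _cache[key]
```

Two anonymous users asking the same question hit the same entry, but the moment per-user memory is folded into the system prompt, the keys diverge and every personalized user pays for a fresh generation.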