
67 points | GeorgeWoff25 | 1 comment
hamdingers No.46183038
Wouldn't want to give out valuable ad space[1] for free now, would we?

1. https://news.ycombinator.com/item?id=46086771

replies(3): >>46183058 #>>46183144 #>>46183380 #
dmix No.46183144
The issue in the article was paying customers complaining about ads. The ads OpenAI wants to roll out would likely be for free users, since training and running these LLM systems is very expensive.

From the tweet in your linked post:

> This could help OpenAI give free users more generous usage and features, while users on paid plans stay ad free, which fits with the high costs of running ChatGPT and the revenue they expect from shopping and ad related features

replies(3): >>46183297 #>>46183350 #>>46184769 #
estimator7292 No.46183350
You'd have to be pretty dumb to believe ads are only for the free tier. Look at literally every subscription streaming service. They all have ads on paid tiers now.

They will put ads in the paid ChatGPT tiers. That is an absolute certainty. The only question is how long they will tolerate ad-free eyeballs on paid plans.

replies(3): >>46183405 #>>46183582 #>>46184317 #
dmix No.46183405
Netflix's paid-with-ads plan costs 50% less than the standard ad-free paid version.

I could see ChatGPT search results having affiliate links for shopping stuff even for fully-paid users.

There's a lot of competition in this space, so we'll see what users tolerate. But it's going to be tough getting around the fact that this stuff is expensive to run.

Things like this are only 'free' for a reason.

replies(1): >>46183741 #
wyre No.46183741{3}
>this stuff is expensive to run

What's expensive is innovating on current models and building the infrastructure. My understanding is that inference itself is cheap and profitable. Most open-source models cost less than a dollar per 1 million tokens, which makes me think SotA models likely sit at a similar price point, just with more profit margin.
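
As a rough sanity check, here is a back-of-envelope sketch in Python: the sub-$1 per 1M tokens rate is the open-source ballpark mentioned above, while the per-user usage numbers are purely illustrative assumptions, not anyone's actual traffic figures.

    # Back-of-envelope check of the "inference is cheap" claim.
    # COST_PER_MILLION_TOKENS is the ~$1/1M-token ballpark cited above;
    # the per-user usage numbers below are illustrative assumptions only.

    COST_PER_MILLION_TOKENS = 1.00   # USD, hosted open-source model ballpark
    TOKENS_PER_MESSAGE = 1_000       # assumed average prompt + completion size
    MESSAGES_PER_DAY = 20            # assumed activity of a fairly heavy free user
    DAYS_PER_MONTH = 30

    monthly_tokens = TOKENS_PER_MESSAGE * MESSAGES_PER_DAY * DAYS_PER_MONTH
    monthly_cost = monthly_tokens / 1_000_000 * COST_PER_MILLION_TOKENS

    print(f"~{monthly_tokens:,} tokens/month -> ~${monthly_cost:.2f} per free user")
    # ~600,000 tokens/month -> ~$0.60 per free user

The real numbers obviously depend on model size, context length, and hardware, but at that price point even a heavy free user would cost well under a dollar a month to serve.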

replies(1): >>46184017 #
aeon_ai No.46184017{4}
I can assure you that inference is not profitable if the user is paying nothing.
replies(1): >>46184457 #
rchaud No.46184457{5}
DAU/MAU stats of free users have already carved out multi-millionaire and billionaire fortunes for employees and executives, all paid out with VC money. Plenty of people are profiting, even if the corporation is deep in the red.