
614 points nickthegreek | 3 comments
mgreg No.39121867
Unsurprising but disappointing nonetheless. Let's just try to learn from it.

It’s popular in the AI space to claim altruism and openness; OpenAI, Anthropic and xAI (the new Musk one) all have a funky governance structure because they want to be a public good. The challenge is that once any of these (or others) gains enough traction to be seen as having a good chance at reaping billions in profits, things change.

And it’s not just AI companies, and this isn’t new. This is part of human nature and always will be.

We should be putting more emphasis and attention on truly open AI models (open training data, training source code & hyperparameters, model source code, weights) so the benefits of AI accrue to the public and not just a few companies.

[edit - eliminated specific company mentions]

ertgbnm No.39122564
The botched firing of Sam Altman proves that fancy governance structures are little more than paper shields against the market.

Whatever has been written can be unwritten and if that fails, just start a new company with the same employees.

boringuser2 No.39124695
I wonder if your lesson is "Sam Altman should/would have been fired but for market forces".
ohwellhere No.39124753
The lesson is that "should have been fired" was believed by the people who had power on paper; "should not have been fired" was believed by the people who actually had power.
boringuser2 No.39124770
That just simplifies things a hair too much. Remember, the people who worked at OpenAI, subject to market forces, also supported the return of Altman.

Market forces are broad and operate at every level of power, hard and soft.

xdavidliu No.39125300
> Remember, the people who worked at OpenAI, subject to market forces, also supported the return of Altman.

I believe that's what your parent comment was actually saying: the people with power on paper were the previous board, and the people with actual power were the employees (which, by the way, is an interesting inversion of the usual arrangement).

insane_dreamer No.39132671
> the people who worked at OpenAI, subject to market forces, also supported the return of Altman.

that's because most of those people did not work for the mission-focused parent OpenAI company (which the board oversaw) but for its highly-profit-driven, subservient-to-Microsoft child company (and they were happy to jump to Microsoft when their jobs were threatened; no ding against them, as they hadn't signed up for the original mission-driven company in the first place).

it's important to separate the two entities in order to properly understand the scenario here