If you disagree, I would argue you have a very sad view of the world, where truth and cooperation are inferior to lies and manipulation.
"Successful people create companies. More successful people create countries. The most successful people create religions"
This definition of success is founded on power and control. It's one of the worst definitions you could choose.
There are nobler definitions, like "Successful people have many friends and family" or "Successful people are useful to their compatriots"
Sam's published definition (to be clear, he was quoting someone else and then published it) tells you everything you need to know about his priorities.
I await with arms crossed all the lost souls arguing it's subjective.
OAI's problem isn't that Sam is untrustworthy; he's just too obviously untrustworthy.
Elon is not "untrustworthy" because of some ambitious deadlines or some stupid statements. He's plucking rockets out of the air and doing it super cheap whereas all competitors are lining their pockets with taxpayer money.
You add in everything else (free speech, speaking his mind at great personal risk, Tesla), and he reads as basically trustworthy to me.
When he says he's going to do something and he explains why, I basically believe him, knowing deadlines are ambitious.
That is both what is and what should be. We tend to focus on the bad, but fortunately most of the time the world operates as it should.
I’d argue that you can find examples of companies that were untrustworthy and still won. Oracle stands out as one with a pretty poor reputation that nevertheless has sustained success.
The problem for OpenAI here is that they need the support of tech giants and they broke the trust of their biggest investor. In that sense, I’d agree that they bit the hand that was feeding them. But it’s not because in general all untrustworthy companies/leaders lose in the end. OpenAI’s dependence on others for success is key.
In case you are going to make an argument about how happiness or some related factor objectively determines success, let me head that off. Altman thinks that power rather than happiness determines success, and is also a human being. Why objectively is his opinion wrong and yours right? Both of your definitions just look like people's opinions to me.
""Successful people create companies. More successful people create countries. The most successful people create religions."
I heard this from Qi Lu; I'm not sure what the source is. It got me thinking, though--the most successful founders do not set out to create companies. They are on a mission to create something closer to a religion, and at some point it turns out that forming a company is the easiest way to do so.
In general, the big companies don't come from pivots, and I think this is most of the reason why."
Sounds like an explicit endorsement lol
He’s dissecting it and connecting it with the idea that if you have a bigger vision and the ability to convince people, making a company is just an “implementation detail” … oh well … you might be right after all … but I suspect it’s more nuanced, and he’s not endorsing religions as a means of obtaining success. I want to believe he meant the visionary, bigger-than-yourself, well-intentioned view of it.
There's also mountains of research both theoretical and empirical that argue against exactly this point.
The problem is most papers on many scientific subjects are not replicable nowadays [0], hence my appeal to common sense, character, and wisdom. Highly underrated, especially on platforms like Hacker News where everything you say needs a double blind randomized controlled study.
This point^ should actually be a fundamental factor in how we determine truth nowadays. We must reduce our reliance on "the science" and go back to the scientific method of personal experimentation. Try lying to a business partner a few times; let's see how that goes.
We can look at specific cases where it holds true, like in this case. There may be cases where it doesn't. But your own experimentation will show it holds true more often than not, which is why I'd bet against OpenAI.
That tells us, at the very least, this guy is suspicious. Then you mix in all the other lies and it's pretty obvious I wouldn't trust him with my dog.
You're holding everyone to a very simple, very binary view with this. It's easy to look around and see many untrustworthy players in very very long running games whose success lasts most of their own lives and often even through their legacy.
That doesn't mean that "lies and manipulation" trump "truth and cooperation" in some absolute sense, though. It just means that significant long-running games are almost always very multi-faceted and the roads that run through them involve many many more factors than those.
Those of us who feel most natural being "truthful and cooperative" can find great success ourselves while obeying our sense of integrity, but we should be careful about underestimating those who play differently. They're not guaranteed to lose either.
You could have many other definitions that are not boring but also not bad. The definition published by Sam is bad.
I don't know if I would consider being crucified achieving success. Long term and for your ideology, maybe, but you yourself are dead.
I defer to Creed Bratton on this one and what Sam might be into.
"I've been involved in a number of cults, both as a leader and a follower. You have more fun as a follower, but you make more money as a leader."
The free speech part also reads completely hollow when the guy's first actions were to ban his critics on the platform and bring back self-avowed Nazis. You could argue one of those things is in favor of free speech, but doing both just implies you are into the Nazi stuff.
What about the guy who repaired my TV once, where it worked for literally a single day, and then he 100% ghosted me? What was I supposed to do, try to get him canceled online? Seems like being a little shady didn't manage to do him any harm.
It's not clear to me whether it's usually worth it to be underhanded, but it happens frequently enough that I'm not sure the cost is all that high.
Was not going to argue happiness at all. In fact, happiness seems a very hedonistic and selfish way to measure it too.
My position is more mother goose-like. We simply have basic morals that we teach children but don't apply to ourselves. Be honest. Be generous. Be fair. Be strong. Don't be greedy. Be humble.
That these are objectively moral is unprovable but true.
It's religious and stoic in nature.
It's anathema to HN, I know.
You're complaining about tweets and meanwhile he's saving astronauts and getting us to the moon. Wake up man.
I said I would bet against OpenAI because they're untrustworthy and untrustworthiness is not good in the long run.
I can add a "usually", as in "untrustworthiness is usually not good in the long run", if that's your gripe.
If you put your money otherwise, that's a sad view of the world.
Compare this to Elon Musk, who has built multiple companies with sizable moats, and who has clearly contributed to the engineering vision and leadership of his companies. There is no comparison. It's unlikely OpenAI would have had anywhere near its current success if Elon wasn't involved in the early days with funding and organizing the initial roadmap.
Space Musk promises a lot, has a grand vision, and gets stuff delivered. The price may be higher and the delivery later than he says, but it's orders of magnitude better than the competition.
Tesla Musk makes and sells cars. They're ok. Not bad, not amazing; glad they precipitated the EV market, but way too pricey now that it's getting mature. Still, the showmanship is useful for the brand.
Everything Else Musk could genuinely be improved by replacing him with an LLM: it would be just as overconfident and wrong, but cost less to get there.
(This is, I think, an apolitical observation: whatever you think about Trump, he is arguing for a pretty major restructuring of political power in a manner that is identifiable in fascism. And Musk is, pretty unarguably, bankrolling this.)
Big, long lived companies excel at delivering exactly what they say they are, and people vote with their wallet on this.
But I agree with your point. And it gets very ugly when these big institutions suddenly lose trust. They almost always deserve it, but it can upend daily life.
While banditry can work out in the short term, it pretty much always ends up the same way. There aren’t a lot of old gangsters walking around.
There are actually fascinating theories that the origin of money is not as a means of replacing a barter system, but rather as a way of keeping track of who owed favors to whom. IOUs, so to speak.
I do not see how that is possible considering I have no clue, most of the time, who the second-to-last owner of a bill was before me.
How long is long?
I would bet on either side, but not in the middle on the model providers.
In his defense, he is trying to fuck us all by feverishly lobbying the US Congress about the fact that "AI is waaay too dangerous" for newbs and possibly terrorists to get their hands on. If that eventually pays off, then there will be 3-4 companies that control all of the LLMs that matter.
Is that a wish, or a fact, or just plain wrong? You know that just because you want something to be true, it isn't necessarily, right?
I wouldn't trust somebody who cannot distinguish between wishful thinking and reality.
I don’t get how this follows from the quote you posted?
My interpretation is that successful people create durable, self-sustaining institutions that deliver deeply meaningful benefits at scale.
I think that this interpretation is aligned with your nobler definitions. But your view of the purpose of government and religion may be more cynical than mine :)
They are facing competition from companies making hardware geared toward that inference, which I think will push their margins down over time.
On the other end of the competitive landscape, what moat do those companies have? What is to stop OpenAI from pulling a Facebook and Sherlocking the most profitable products built on their platform?
Something like Apple developing a chip that can do LLM inference on device would completely upend everything.
2) the leader of only one of them is threatening to lock up journalists, shut down broadcasters, and use the military against his enemies.
3) only one of them led an attempted autogolpe that was condemned at the time by all sides
4) Musk is only backing the one described in 1, 2 and 3 above.
It's not really arguable, all this stuff.
The guy who thinks the USA should go to Mars clearly thinks he's better throwing in his lot with the whiny strongman dude who is on record -- via his own social media platform -- as saying that the giant imaginary fraud he projected to explain his humiliating loss was a reason to terminate the Constitution.
And he's putting a lot of money into it, and co-running the ground game. But sure, he wants to go to Mars. So it's all good.
Having the general ability to accomplish something doesn't magically confer integrity; doing what you say does. Misleading and dissembling about doing what you say you will do is where you get the untrustworthy label, regardless of your personal animus toward or positive view of Musk.
These early promissory notes were more like coupons that were redeemed by the merchants. It didn't matter how many times a coupon was traded. As a good merchant, you knew how many of your notes you had to redeem because you're the one issuing the notes.
SpaceX and Tesla have both accomplished great things. There are a lot of talented people who work there. Elon doesn't deserve all the credit for all their hard work.
I too am happy every day the good guys are winning today and always have won for all of history.
Models don't have this benefit. In Cursor, I can even switch between models. It would take a lot of convincing for me to switch off of Cursor, however.
If you want to create a country, better have a good reason; many noble people have done it, many bad people have done it.
If you want to create a religion, you're a psycho (or you really are the chosen one).
Notice how Sam's definition of success increases with the probability of psychopathy.
I think he is making an allusion to Apple's culture.
There are successful companies because their product is good, more successful companies because they started early (and it feels like a monopoly: Google, Microsoft), and the most successful company tells you what you are going to buy (Apple's culture).
https://www.buzzfeednews.com/article/richardnieva/worldcoin-...
https://www.technologyreview.com/2022/04/06/1048981/worldcoi...
Amazon are trustworthy?
That's going to be news to the large number of people who've received counterfeit books, dodgy packages, and so on. This is not a new problem:
That... is actually a pretty interesting argument. I have to admit that if an objective morality existed floating in the Aether, there would be no way to logically prove or disprove that one's beliefs matched it.
Since I can't argue it logically, let me make an emotional appeal by explaining how my beliefs are tied to my life:
I chose to be a utilitarian when I was 12 or so, though I didn't know it had that name yet. The reason I chose this is that I wanted my beliefs to be consistent and kind. Utilitarianism has only one basic rule, so it can't really conflict with itself. Kindness wise, you can technically weigh others however you like, but I think most utilitarians just assume that all people have equal worth.
This choice means that I doubted that my emotions captured any truth about morality. Over the years, my emotions did further affect my beliefs. For instance, I tweaked the rules to avoid "Tyranny of the Majority" type things. However, my beliefs also changed my emotions. One fruit of this is that I started to mediate conflicts more often instead of choosing a side. Sometimes it does make more sense to choose a side, but often people will all behave well if you just hear them out. Another fruit of these beliefs is that rather than thinking of things in terms of "good" or "bad", I now tend to compare states of the world as being better or worse than each other. This means that no matter how little capacity I have, I can still get myself to make things a little better for others.
All this to say, I feel like deciding to doubt my own feelings very much did what young me wanted it to do. I wouldn't be able to grow as a person if I thought I was right in the beginning.
I'd be interested to hear how you came to your beliefs. Given how firmly you've argued in this thread, it sounds like you probably have a story behind your beliefs too.
What I remember from The Rational Optimist: with trust, trade is unlimited.
What I remember from Debt: just too much; I need to read it.
That said, why would an investor give money to altman if he is untrustworthy? it just gets worse and worse.
I dunno if you have kids, but for me, the main thing is having kids. It does a lot of things to your psyche, both suddenly and over a long period of time.
It's the first time you would truly take a bullet for someone, no questions asked. It tells you how much you know on an instinctual level. It forces you to define what behavior you will punish vs what you will reward. It expands your time horizons- suddenly I care very much how the world will be after I'm gone. It makes you read more mother goose books too. They all say the same things, even in different languages. It's actually crazy we debate morals at all.
I don't have kids, but it does make a lot of sense that that would affect a person's psyche. The bit about having to define what behavior is good or bad seems to me like you are working out your beliefs through others, which seems like a reasonable way to do it, since you get an outside perspective on the effects of what you are internalizing.
About debating morality, though. That's exactly where principles become needed. It's great to say that we should be kind, but who are we kind to? It can't always be everyone at the same time. To bring things back to the trolley problem, I may save my mom, but it really is super unfair to the 20 people on the other track. This sort of thing is exactly why people consider nepotism to be wrong.
""Healthy family relationships and rich circle of diverse friends" is an objectively better definition than "Money and companies with high stock prices""
Pretty broad principles we're comparing there.
When you get into specific cases, that's where you really need the debate and often there's no right answer, depending on the case. This is why we want judges who have a strong moral compass.
These values are bundled up in a person, and they should even counterbalance each other. "Be Kind" should be balanced with "Be Strong". "Be Generous" should be balanced with "Be Thrifty" and so on. The combination of these things is what we mean when we say someone has a moral compass.
I would argue it's immoral in some sense to sacrifice your mother for 5 other strangers. But these are fantasy cases that almost never happen.
A more realistic scenario is self defense or war.
In any case, the real issue with your logic is in thinking that an individual's personal views on the morality of a situation are correlated with the actual, potentially harsh, reality of that situation. There is rarely ever such a correlation and when it happens, it is likely a coincidence.
Is Sam Altman untrustworthy? Of course, he seems like a snake. That doesn't mean he will fail. And predicting the reality of the thing (that awful people sometimes succeed in this world) does not make someone inherently wrong or negative or even cynical - it just makes them a realist.
Modern cash systems involve anonymity and do not inherently keep track of the ownership history of money (as I noted). This anonymity is a fundamental feature of cash and many forms of currency today. Sure, early forms of currency might have functioned in small, close-knit communities and in such contexts, people were more likely to know each other’s social debts and relationships.
My point about cash being anonymous was meant to highlight how modern currency differs from the historical concept of money as a social ledger. This contrast is important because it shows how much the role of money has evolved.
None of these things are arguable in the abstract. When you're confronted with a case where you sacrifice one, it's always for the sake of another.
Amazon has been ignoring the problem for a long time, and is well aware of it.
They're so aware of it that I'd personally (not a lawyer, though) consider them culpable, given their failure to take any substantial action toward fixing the problem.
I'm assuming you aren't familiar with these terms and so am defining them. Forgive me if you already were familiar.
Consequentialists think that the purpose of morality is to prevent "bad" consequences from happening. From a consequentialist perspective, one can very much argue about what makes a consequence "bad", and it makes a lot of sense to do so if we are trying to improve the human condition. Furthermore, I think consequentialists tend to care more about making their systems consistent, mainly so they are fair. As a side effect though, no principles have to be sacrificed when making a concrete decision, since none of them conflict. (That's what it means for a system to be consistent)
Virtue ethicists think that the purpose of morality is to be a "good" person. I think you are correct that it's pretty hard to define what a "good" person is. There are also many different types of "good" people. Even if you had such a person with consistent principles, if you tried to stuff everyone's "good" principles into them, they would become inconsistent. It's hard for me to tell exactly what the point of being "good" is supposed to be if it is not connected to the consequences of one's actions, in which case one would just be a consequentialist. However, if the point is to improve the human condition, then I think it would take a lot of different types of "good" people, so it doesn't make sense to try to argue our way into choosing one of them.
This isn't really an argument for a position as much as me trying to figure out where we disagree. Does that all sound correct to you?
He's a good marketer and created a cult of personality.
If he's so great at building businesses then just look at Twitter where there was no one who managed him.