747 points by porridgeraisin | 32 comments

psychoslave ◴[] No.45062941[source]
What a surprise, a big corp collected a large amount of personal data under some promises, and now reveals that they will actually exploit it in a completely unrelated manner.
replies(7): >>45062982 #>>45063078 #>>45063239 #>>45064031 #>>45064041 #>>45064193 #>>45064287 #
sigmoid10 ◴[] No.45062982[source]
They are valued at $170 billion. Not quite the same as OpenAI, but in the same order of magnitude - while having only a single-digit percent fraction of the active users. They probably need to prepare for the eventual user data sellout, as it is becoming increasingly obvious that none of the big players has real and persistent tech leadership anymore. But millions and millions of users sharing their deepest thoughts and personal problems is gonna be worth infinitely more than all the average bot bullshit written on social media. That's also why Zuck is so incredibly desperate to get into the game. It's not about owning AI. It's about owning the world's thoughts and attention.
replies(8): >>45063158 #>>45063262 #>>45063487 #>>45063546 #>>45063592 #>>45063648 #>>45064254 #>>45064540 #
1. goalieca ◴[] No.45063262[source]
Companies all seem to turn against their users whenever they have revenue/earnings trouble.
replies(7): >>45063281 #>>45063308 #>>45063361 #>>45063477 #>>45064072 #>>45064300 #>>45064448 #
2. jsheard ◴[] No.45063281[source]
Considering every AI company is hemorrhaging money with no end in sight, that doesn't bode well, does it?
3. diggan ◴[] No.45063308[source]
It seems to me like some fundamental/core technologies/services just shouldn't be run by for-profit entities, and if you come across one that is, you need to carefully choose whether you want to become beholden to such an entity.

As the years go by, I find myself able to rely on those less and less, because every time I do, I eventually get disappointed by them working against their user base.

replies(1): >>45063348 #
4. bigfishrunning ◴[] No.45063348[source]
Except LLMs aren't a fundamental or core technology, they're an amusing party trick with some really enthusiastic marketers. We don't need them.
replies(3): >>45063396 #>>45063421 #>>45063672 #
5. jascination ◴[] No.45063361[source]
Enshittification. It's a thing.
replies(1): >>45063982 #
6. komali2 ◴[] No.45063396{3}[source]
Under the current system we apparently do, since ChatGPT is now far and away the busiest psychiatrist in world history.

I don't think we should be so quick to dismiss the holes LLMs are filling as unnecessary. By some measures, the only things "necessary" are food, water and shelter.

replies(1): >>45091431 #
7. diggan ◴[] No.45063421{3}[source]
Personally, I'm able to write code I wasn't able to before, like functions heavy with math. For game development this has been super helpful when I basically know what inputs I have and what output I need, but can't figure out what the actual implementation should be. Add a bunch of unit tests, let the LLM figure out the math, and I can move on to more important features.

For me this has been a pretty fundamental shift. Before, I either had to figure out another way so I could move on, or had to spend weeks learning the needed math and writing one function; now it can take me 10-30 minutes to nail it.
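
To make that concrete, here's a minimal sketch of the kind of thing I mean (the projectile function and the numbers are just an illustrative example, not code from a particular project):

    import math

    # I write the signature, the docstring and the test; the LLM fills in the
    # implementation, which the test then verifies.
    def launch_speed(distance: float, height: float, angle_deg: float,
                     gravity: float = 9.81) -> float:
        """Initial speed needed for a projectile launched at angle_deg degrees
        to land `distance` units away and `height` units up."""
        # The body is the part I'd let the LLM write, then review.
        angle = math.radians(angle_deg)
        denom = 2 * math.cos(angle) ** 2 * (distance * math.tan(angle) - height)
        if denom <= 0:
            raise ValueError("target is unreachable at this launch angle")
        return distance * math.sqrt(gravity / denom)

    def test_launch_speed_hits_target():
        # Simulate the arc and check the projectile lands on target.
        v = launch_speed(distance=10.0, height=0.0, angle_deg=45.0)
        angle = math.radians(45.0)
        vx, vy = v * math.cos(angle), v * math.sin(angle)
        t = 10.0 / vx                     # time to cover the horizontal distance
        y = vy * t - 0.5 * 9.81 * t * t   # height reached at that time
        assert abs(y) < 1e-6

    test_launch_speed_hits_target()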

replies(3): >>45064100 #>>45064310 #>>45064464 #
8. jamesblonde ◴[] No.45063477[source]
No, it's the Peter Thiel playbook - become a monopoly - and then the inevitable enshittification of the platform once it is one.

The solution is to break up monopolies....

9. chamomeal ◴[] No.45063672{3}[source]
I’d say they’re a fundamental technology by now. Imagine how many people rely on them. And I’ve seen some heavy reliance.
replies(3): >>45063889 #>>45064249 #>>45064525 #
10. goalieca ◴[] No.45063889{4}[source]
I’ve also seen heavy reliance on opioids and that didn’t turn out well.
replies(1): >>45065660 #
11. beezlewax ◴[] No.45063982[source]
But can you enshitten that which is already shit?
replies(1): >>45064258 #
12. sim7c00 ◴[] No.45064072[source]
Remove 'seem to'. It has no place in this sentence anymore. We're not in the stone age anymore. When has this ever not been the case?
13. insane_dreamer ◴[] No.45064100{4}[source]
Sure, I’m more productive with it in certain aspects of my work as well. Does that make it a net positive for humanity? From the energy consumption impact on climate change alone I would say the answer is clearly no. And that’s before we even talk about the impact on the next generation’s job opportunities. And tons of other issues like how Big Tech is behaving.
replies(3): >>45064267 #>>45064302 #>>45064338 #
14. thejazzman ◴[] No.45064249{4}[source]
My colleagues relying on it ruined the job for me and I quit. I became the debugging agent, expected to constantly fix their half-baked "it looks like it works" but doesn't nonsense.

Seriously, the idea that we need this is a joke. People need it to pretend they can do their job. The rest of us enjoy having quick help from it. And we have done without it for a very long time already.

15. bethekidyouwant ◴[] No.45064258{3}[source]
We’ve reached recursive enshittification; I need a thought leader to tell me what’s next
replies(1): >>45064301 #
16. bethekidyouwant ◴[] No.45064267{5}[source]
Was coming down from the trees a net positive for humanity?
replies(2): >>45065456 #>>45065496 #
17. SoftTalker ◴[] No.45064301{4}[source]
It's shit all the way down.
replies(1): >>45064457 #
18. ratelimitsteve ◴[] No.45064300[source]
For social media at least it's important to remember that the users are the product, not the customer. Trying to squeeze additional revenue from your product is SOP.
19. diggan ◴[] No.45064302{5}[source]
> Does that make it a net positive for humanity?

That I don't know, and probably no one else does either; it's way too early to tell. I only responded to a comment stating "LLMs aren't a fundamental or core technology, they're an amusing party trick", which I obviously disagree with, as for me they've been a fundamental shift in what I'm able to do.

> From the energy consumption impact on climate change alone I would say the answer is clearly no.

Ok, I guess that's fair enough. So if someone happens to use local models at home, in a home that is powered by solar power, then you'd feel LLMs are starting to be a net positive for humanity?

> And tons of other issues like how Big Tech is behaving.

This is such a big thing in general (and I agree with it), but it has nothing to do with LLMs as a technology. Big Tech acts like they own the world and can do whatever they want with it, regardless of whether there are LLMs or not, so I'm not sure why anyone would expect anything else.

replies(1): >>45066718 #
20. BearOso ◴[] No.45064310{4}[source]
The LLM will tell you an approximation of what many responses on the Internet said the math should be, but you need the knowledge to check whether it's actually correct.

An LLM can give you a hazy picture, but it's your job to focus it.

replies(1): >>45065605 #
21. bdangubic ◴[] No.45064338{5}[source]
there is little-to-nothing we do day-to-day (ESPECIALLY Big Tech related) that is net positive for society
22. lenerdenator ◴[] No.45064448[source]
That's just shareholder capitalism, dude.
23. lenerdenator ◴[] No.45064457{5}[source]
Well, at least until you reach turtles.
24. smohare ◴[] No.45064464{4}[source]
You think you are “nailing it” but also lack the background to even determine whether that is the case. I can assure you, there are likely some fundamental flaws in what you’re vibing.

Just think about the type of code these things are trained on, and the fact that you’re clearly some random non-specialist.

replies(1): >>45065617 #
25. tjr ◴[] No.45064525{4}[source]
Unless one's job expectations have been altered to demand LLM-quantity output, how could someone be reliant upon these tools now? What were they doing two years ago (or maybe even six months ago)?

I can understand becoming reliant on a technology -- I expect most programmers today would be pretty lost with punch cards or line editors -- but LLM coding seems too new for true reliance to have formed yet...?

26. insane_dreamer ◴[] No.45065456{6}[source]
Was the creation of the atom bomb a net positive for humanity?
27. ◴[] No.45065496{6}[source]
28. diggan ◴[] No.45065605{5}[source]
Exactly, which matches precisely with how I'm using them. So from that perspective, do you then agree they're a fundamental/core technology, at least for more than just me?
29. diggan ◴[] No.45065617{5}[source]
> some fundamental flaws in what you’re vibing

That's just a misunderstanding, I'm not "vibing" anything. The tests are written by me, the API interfaces are written by me, the usages are written by me, and the implementations of these functions are written by an LLM, but reviewed by me to make sure they're up to code standards/quality.

If a function gives me the right output for the inputs I have in mind, does anything beyond that really matter?

30. chamomeal ◴[] No.45065660{5}[source]
Agree with you there. And that sorta is the kind of reliance I’m talking about. My friends will ask GPT to read restaurant menus for them lol
31. insane_dreamer ◴[] No.45066718{6}[source]
> So if someone happens to use local models at home, in a home that is powered by solar power, then you'd feel LLM starting to be a net positive for humanity?

Sure, that would make a difference, but it's not gonna happen anytime soon, other than for hacker hobbyists, because no one is making money off of that.

> This is such a big thing in general (that I agree with) but it has nothing to do with LLMs as a technology.

Correct -- I don't have any issue with the technology itself, but rather with how it is implemented and used, and with the resources put towards its use. And Big Tech is putting hundreds of billions of dollars into this -- to what end exactly, besides potentially making tons of money off of consumer subscribers or ads a la Meta or Google? If Big Tech were putting the same amount of money into technology that could actually benefit humanity (you know, like actually saving the world from potential future destruction by climate change), I'd have a much kinder view of them.

32. psychoslave ◴[] No.45091431{4}[source]
Do we have statistics on this? If so, what is the healing rate compared to human psychotherapy?