747 points porridgeraisin | 1 comments
psychoslave ◴[] No.45062941[source]
What a surprise: a big corp collected a large amount of personal data under certain promises, and now reveals it will actually exploit it in a completely unrelated manner.
sigmoid10 ◴[] No.45062982[source]
They are valued at $170 billion. Not quite the same as OpenAI, but in the same order of magnitude, while having only a single-digit percentage of its active users. They probably need to prepare for the eventual user data sellout, as it is becoming increasingly obvious that none of the big players has a real and persistent tech lead anymore. But millions and millions of users sharing their deepest thoughts and personal problems is gonna be worth infinitely more than all the average bot bullshit written on social media. That's also why Zuck is so incredibly desperate to get into the game. It's not about owning AI. It's about owning the world's thoughts and attention.
goalieca ◴[] No.45063262[source]
Companies all seem to turn against their users whenever they have revenue/earnings trouble.
diggan ◴[] No.45063308[source]
It seems to me that some fundamental/core technologies/services just shouldn't be run by for-profit entities, and if you come across one that is, you need to carefully consider whether you want to become beholden to such an entity.

As the years go by, I find myself able to rely on them less and less, because every time I do, I eventually get disappointed by them working against their user base.

bigfishrunning ◴[] No.45063348[source]
Except LLMs aren't a fundamental or core technology; they're an amusing party trick with some really enthusiastic marketers. We don't need them.
diggan ◴[] No.45063421[source]
Personally, I'm able to write code I wasn't able to before, like functions heavy with math. For game development this has been super helpful: I know basically what inputs I have and what output I need, but I can't figure out what the actual function implementation should look like. Add a bunch of unit tests, let the LLM figure out the math, and I can move on to more important features.

For me this has been a pretty fundamental shift. Before, I either had to find another way forward so I could move on, or had to spend weeks writing one function after learning the needed math; now it can take me 10-30 minutes to nail it perfectly.

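The workflow described above can be sketched roughly like this (a hypothetical example, not the commenter's actual code): the human writes the function signature and the unit tests that pin down the expected behaviour, and a math-heavy body like the one below is the part an LLM can fill in and be checked against those tests.

```python
import math

def reflect(vx, vy, nx, ny):
    """Reflect velocity (vx, vy) off a surface with normal (nx, ny).

    Signature and tests: hand-written. Body: the kind of math
    an LLM can generate, then be verified against the asserts.
    """
    # Normalize the normal so the reflection formula r = v - 2(v.n)n holds.
    length = math.hypot(nx, ny)
    nx, ny = nx / length, ny / length
    dot = vx * nx + vy * ny
    return vx - 2 * dot * nx, vy - 2 * dot * ny

# Hand-written tests pin down the expected behaviour.
assert reflect(1.0, -1.0, 0.0, 1.0) == (1.0, 1.0)   # bounce off a floor
assert reflect(-3.0, 0.0, 1.0, 0.0) == (3.0, 0.0)   # bounce off a wall
```

The tests act as the contract: if the generated body passes them, the caller never has to derive the formula themselves.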
smohare ◴[] No.45064464[source]
You think you are “nailing it,” but you also lack the background to even determine whether that is the case. I can assure you, there are likely some fundamental flaws in what you're vibing.

Just think about the type of code these things are trained on, and the fact that you're clearly some random non-specialist.

diggan ◴[] No.45065617[source]
> some fundamental flaws in what you’re vibing

That's just a misunderstanding; I'm not "vibing" anything. The tests are written by me, the API interfaces are written by me, the usages are written by me, and the implementations of these functions are written by an LLM, then reviewed by me to be up to code standards/quality.

If a function gives me the right output for the inputs I have in mind, does anything beyond that really matter?
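The input/output argument can be made concrete with a table-driven test (again a hypothetical sketch; the function and its clamping behaviour are illustrative assumptions, not the commenter's code): the implementation only has to satisfy the cases the author actually cares about.

```python
def lerp(a, b, t):
    # Linear interpolation; clamping t to [0, 1] is a deliberate
    # design choice encoded in the test table below.
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

# The table IS the spec: each (inputs, expected output) pair the
# author has in mind, regardless of who wrote the function body.
cases = [
    ((0.0, 10.0, 0.5), 5.0),
    ((0.0, 10.0, 1.5), 10.0),  # t clamped to 1
    ((2.0, 2.0, 0.3), 2.0),
]
for args, expected in cases:
    assert lerp(*args) == expected
```

Whether such a table covers enough of the input space is, of course, exactly what the parent comment disputes.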