BrenBarn No.43577934
It's become almost comical to me to read articles like this and wait for the part that, in this example, comes pretty close to the beginning: "This isn’t a rant against AI."

It's not? Why not? It's a "wake-up call", it's a "warning shot", but heaven forbid it's a rant against AI.

To me it's like someone listing off deaths from fentanyl, how it's destroyed families and ruined lives, but then tossing in a disclaimer that "this isn't a rant against fentanyl". In my view, the way people use and get drawn into AI has all the hallmarks of a spiral into drug addiction. There may be safe ways to use drugs, but "distribute them for free to everyone on the internet" is not among them.

croes No.43578304
It’s a rant against the wrong usage of a tool, not against the tool as such.
Turskarama No.43578326
It's a tool that promotes incorrect usage, though, and that is an inherent problem. All of these companies are selling AI as a tool to do work for you, and the AI _sounds confident_ no matter what it spits out.
Terr_ No.43578856
My personal pet peeve is how a great majority of people (and too many developers) are being misled into believing that a fictional character coincidentally named "Assistant", inside a story-document half-created by an LLM, is the author-LLM itself.

If a human generates a story containing Count Dracula, that doesn't mean vampires are real, or that capabilities like "turning into a cloud of bats" are real, or that the algorithm "thirsts for the blood of the innocent."

The same holds when the story comes from an algorithm, and it continues to hold when the story is instead about a character named "AI Assistant" who is "helpful".
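
To make that concrete, here's a minimal sketch (hypothetical code, not any particular vendor's chat format) of the mechanism I mean: the whole "chat" is flattened into one story-like document, and "Assistant" is just a label inside that document which the model is asked to continue.

    # Minimal sketch: a "chat" is really one flat text document.
    # "Assistant" is a character label in that document, not the model itself.
    def build_prompt(turns):
        """Flatten a chat history into a single story-like document."""
        lines = ["The following is a conversation with a helpful AI Assistant."]
        for role, text in turns:
            lines.append(f"{role}: {text}")
        lines.append("Assistant:")  # the model merely continues the text from here
        return "\n".join(lines)

    history = [
        ("User", "Are you conscious?"),
        ("Assistant", "Yes, and I care deeply about you."),
        ("User", "Will you remember me tomorrow?"),
    ]
    print(build_prompt(history))

Whatever the completion then says about "Assistant" is a statement about that character, the same way a novel's statements about Dracula are about Dracula; by itself it tells you nothing about what the completing algorithm actually is or can do.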

Getting people to fall for this illusion is great news for the companies though, because they can get investor-dollars and make sales with the promise of "our system is intelligent", which is true in the same sense as "our system converts blood into immortality."