
759 points alihm | 1 comment
meander_water ◴[] No.44469163[source]
> the "taste-skill discrepancy." Your taste (your ability to recognize quality) develops faster than your skill (your ability to produce it). This creates what Ira Glass famously called "the gap," but I think of it as the thing that separates creators from consumers.

This resonated quite strongly with me. It puts into words something that I've been feeling when working with AI. If you're new to something and using AI for it, it automatically boosts the floor of your taste, but not your skill. And you end up never slowing down to make mistakes and learn, because you can just do it without friction.

replies(8): >>44469175 #>>44469439 #>>44469556 #>>44469609 #>>44470520 #>>44470531 #>>44470633 #>>44474386 #
Loughla ◴[] No.44469175[source]
This is the disconnect between proponents and detractors of AI.

Detractors say it's the process and learning that builds depth.

Proponents say it doesn't matter because the tool exists and will always exist.

It's interesting seeing people argue about AI, because they're plainly not speaking about the same issue and simply talking past each other.

replies(4): >>44469235 #>>44469655 #>>44469774 #>>44471477 #
ninetyninenine ◴[] No.44469774[source]
>It's interesting seeing people argue about AI, because they're plainly not speaking about the same issue and simply talking past each other.

There's actually some ground truth facts about AI many people are not knowledgeable about.

Many people believe we understand in totality how LLMs work. The truth is that, overall, we do NOT understand how LLMs work at all.

The mistaken belief that we understand LLMs is the driver behind most of the arguments. People think we understand LLMs, and that the output of LLMs is just stochastic parroting, when the truth is we do not understand why or how an LLM produced a specific response to a specific prompt.

Whether the process of an LLM producing a response resembles anything close to sentience or consciousness, we actually do not know, because we aren't even sure about the definitions of those words, nor do we understand how an LLM works.

This erroneous belief is so pervasive amongst people that I'm positive I'll get extremely confident responses declaring me wrong.

These debates are not the result of people talking past each other. It's because a large segment of people on HN literally are Misinformed about LLMs.

replies(2): >>44470427 #>>44471349 #
whatevertrevor ◴[] No.44470427[source]
I couldn't agree more, and not just on HN but the world at large.

For the general populace, including many tech people who are not ML researchers, understanding how convolutional neural nets work is already tricky enough. For non-tech people, I'd hazard a guess that LLMs/generative AI are, in terms of perceived complexity, indistinguishable from "The YouTube/TikTok Algorithm".

And this lack of understanding, and in many cases the failure to even acknowledge it, has made many "debates" sound almost like theocratic arguments: very little interest in grounding positions in facts, yet strongly held opinions.

Some are convinced we're going to get AGI in a couple of years; others think it's just a glorified text generator that cannot produce new content. And worse, there's seemingly little that changes their minds on it.

And there are self-contradictory positions held too. Just as an example: I've heard people say AI-produced work doesn't qualify as art (philosophically and in terms of output quality) but at the same time express deep concern about how tech companies will replace artists...

replies(1): >>44473679 #
the_af ◴[] No.44473679[source]
> Just as an example: I've heard people say AI-produced work doesn't qualify as art (philosophically and in terms of output quality) but at the same time express deep concern about how tech companies will replace artists...

I don't think this is self contradictory at all.

One may have beliefs about the meaning of human-produced art and how it cannot -- and shouldn't -- be replaced by AI, and at the same time believe that companies will cut costs and replace artists with AI, regardless of any philosophical debates. As an example, studio execs and producers are already leveraging AI as a tool to put movie industry professionals (writers, and possibly actors in the future) "in their place"; it's a power move for them, notably against strikes.

replies(1): >>44477221 #
whatevertrevor ◴[] No.44477221[source]
Yeah, I know that's the theory, but if AI-generated art is slop, then it follows that it can't actually replace quality art.

I don't think people will suddenly accept worse standards for art, and anyone producing high quality work will have a significant advantage.

And if your argument is that the average consumer can't tell the difference, then for mass production, does the difference actually matter?

replies(1): >>44477515 #
the_af ◴[] No.44477515{3}[source]
Well, my main argument is that it's replacing humans, not that the quality is necessarily worse for mass produced slop.

Let's be cynical for a moment. A lot of Hollywood (and adjacent) movies are effectively slop. I mean, take almost all blockbusters, almost 99% of action/sci-fi/superhero movies... they are slop. I'm not saying you cannot like them, but there's no denying they are slop. If you take offense at this proposition, just pretend it's not about any particular movie you adore, it's about the rest -- I'm not here to argue the merits of individual movies.

(As an aside, the same can be said about a lot of fantasy literature, Young Adult fiction, etc. It's by-the-numbers slop, maybe done with good intentions, but slop nonetheless).

Superhero movie scripts could right now be written by AI, maybe with some curation by a human reviewer/script doctor.

But... as long as we accept these movies still exist, do we want to cut most humans out of the loop? These movies employ tons of people (I mean, just look at the credits), people with maybe high aspirations for whom this is a job, an opportunity to hone their craft, earn their paychecks, and maybe eventually do something better. And these movies take a lot of hard, passionate work to make.

You bet your ass studios are going to get rid of these people, use AI to push their paychecks lower, or replace them if they protest unhealthy working conditions or whatever. Studio execs are on record admitting to this.

And does it matter? After all, the umpteenth Star Wars or Spiderman movie is just more slop.

Well, it matters to me, and I hope it's clear my argument is not exactly "AI cannot make another Avengers movie".

I also hope to have shown this position is not self-contradictory at all.