This is what tech bros in SV built and they all love it.
Now imagine the near future of the Internet, when everyone has to adapt to that just to avoid being dismissed as AI.
However, polishing to the point that we humans start to lose our unique tone is exactly what style guides that go into the minutiae of comma placement try to do. And I'm currently reading a book I'm 100% sure was edited by an expert human editor who did quite the job of stripping away all the uniqueness of the work. So we can't just blame the LLMs for making things more gray when we have historically paid other people to do it.
I'm pretty sure all the hand wringing about A.I. is going to fade into the past in the same way as every other strand of technophobia has before.
Moreover, most people have more attachment to their own thoughts or to reading the unaltered, genuine thoughts of other humans than to a hole in the ground. The comment you respond to literally talks about the Orwellian aspects of altering someone's works.
It looks like you see writing & editing as a menial task that we just do for its extrinsic value, whereas the people who complain about quality see it as art we make for its intrinsic value.
Where I think a lot of this "technophobia" actually comes from though are people who do/did this for a living and are not happy about their profession being obsolesced, and so try to justify their continued employment. And no, "there were new jobs after the cotton gin" will not comfort them, because that doesn't tell them what their next profession will be and presumes that the early industrial revolution was all peachy (it wasn't).
Excavation is an inherently dangerous and physically strenuous job. Additionally, when precision or delicateness is required human diggers are still used.
If AI was being used to automate dangerous and physically strenuous jobs, I wouldn't mind.
Instead it is being used to make everything it touches worse.
Imagine an AI-powered excavator that fucked up every trench that it dug and techbros insisted you were wrong for criticizing the fucked up trench.
They posited that a similar series of events happened before, and predicted it will happen again.
Even if the text is a simple article, a personal touch / style will go a long way to make it more pleasant to read.
LLMs are just making everything equally average, minus their own imperfections. Moving forward, they will inbreed while everything becomes progressively worse.
That's death to our culture.
"By AI" or "with AI?" If I write the book and have AI proofread things as I go, or critique my ideas, or point out which points I need to add more support for, is that written "by AI?"
When Big Corp says 30% of their code is now written "by AI," was that code produced by following thoughtful instruction from a human expert, who interpreted the work to be done, made decisions about the architectural impact, outlined those things, and gave detailed instructions that the LLM could execute in small chunks?
This distinction, I feel, is going to become more important. AI tools are useful, and many people are using them to write code, literature, papers, etc. In some cases, I feel it's not fair to say the thing was written by AI, even when technically it was.
Eh. There might be a tacit presumption here that correctness isn't real, or that style cannot be better or worse. I would reject this notion. After all, what if something is uniquely crap?
The basic, most general purpose of writing is to communicate. Various kinds of writing have varying particular purposes. The style must be appropriate to the end in question so that it can serve the purpose of the text with respect to the particular audience.
Now, we may have disagreements about what constitutes good style for a particular purpose and for a particular audience. This will be a source of variation. And naturally, there can be stylistic differences between two pieces of writing that do not impact the clarity and success with which a piece of writing does its job.
People will have varying tastes when it comes to style, and part of that will be determined by what they're used to, what they expect, a desire for novelty, a desire for clarity and adequacy, affirmation of their own intuitions, and so on. We shouldn't obfuscate and sweep the causes of varying tastes under the rug of obfuscation, however.
In the case of AI-generated text, the uncanny, je ne sais quoi character that makes it irritating to read seems to be that it has the quality of something produced by a zombie. The grammatical structure is obviously there, but at a pragmatic level, it lacks a certain cohesion, procession, and relevance that reads like something someone on amphetamines or The View might say. It's all surface.
I've heard many professions complain about their version of "editors," from comedians to video producers and radio jockeys.
https://en.wikipedia.org/wiki/Marion_Steam_Shovel_(Le_Roy,_N...
> enhancing it with care
I get what you’re going for with this comment, but it seamlessly anthropomorphizes what’s happening in a way that has the opposite impact I think.
There is no thoughtfulness or care involved. Only algorithmic conformance to some non-human synthesis of the given style.
The issue is not just about the words that come out the other end. The issue is the loss of the transmission of human thoughts, emotions, preferences, style.
The end result is still just as suspect, and to whatever degree it appears “good”, even more soulless given the underlying reality.
Imagine someone shot a basketball, and it didn't go into the hoop. Why would telling a story about somebody else who once shot a basketball which failed to go into the hoop be helpful or relevant?
I toss all of my work into Apple Pages and Google Docs, and use them both for spelling and grammar check. I don't just blindly accept whatever they tell me, though; sometimes they're wrong, and sometimes my "mistakes" are intentional.
I also make a distinction between generating content and editing content. Spelling and grammar checkers are fine. Having an AI generate your outline is questionable. Having AI generate your content is unacceptable.
Xe also occasionally reminds people that, equal temperament being what it is, this pitch correction is actually in a few cases making people less well in tune than they originally were.
It certainly removes unique tone. Yesterday's was a pitch-corrected version of a 1972 John Lennon performance, and it definitely changed Lennon's sound.
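The equal-temperament point can be made concrete with a little arithmetic: snapping a note to the nearest 12-tone equal-temperament grid position can move it *away* from the just-intonation interval a singer may naturally land on. A minimal sketch (the interval table here is illustrative, not taken from the comments above):

```python
import math

def cents(ratio):
    """Convert a frequency ratio to cents (1200 cents per octave)."""
    return 1200 * math.log2(ratio)

# Just-intonation ratios and their 12-tone equal-temperament approximations
# (semitone count). ET pitches sit at exact multiples of 100 cents.
intervals = {
    "major third":   (5 / 4, 4),  # just: ~386.3c, ET: 400c
    "perfect fifth": (3 / 2, 7),  # just: ~702.0c, ET: 700c
}

for name, (just_ratio, semitones) in intervals.items():
    just_c = cents(just_ratio)
    et_c = 100 * semitones
    print(f"{name}: just {just_c:.1f}c, ET {et_c}c, off by {et_c - just_c:+.1f}c")
```

A major third sung purely at the 5:4 ratio is almost 14 cents flat of its equal-tempered slot, so "correcting" it to the grid makes it less in tune with the chord, which is exactly the phenomenon described above.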
Language and music (which is a type of language) are a core of shared convention wrapped in a fuzzy liminal bark, outside of which, there is nonsense. An artist, be it a writer or a musician, is essentially somebody whose path stitches the core and the bark in their own unique way, and because those regions are established by common human consensus, the artist, by the act of using that consensus, is interacting with its group. And so is the person who enjoys the art. So, our shared conventions and what we dare call correctness are a medium for person-to-person communication, the same way that air is a medium to conduct sound or a piece of paper is a medium for a painting.
Furthermore, the core of correctness is fluid; language changes, and although at any time and place there is a central understanding of what is good style, the easy rules, such as they exist, are limited and arbitrary. For example, two different manuals of style will mandate different placements of commas. And somebody will cite a neurolinguistics study to dictate the ordering of clauses within a sentence. For anything more complex, you need a properly trained neural network to do the grasping, be it a human editor or an LLM.
> The grammatical structure is obviously there, but at a pragmatic level, it lacks a certain cohesion, procession, and relevance that reads like something someone on amphetamines or The View might say. It's all surface.
Somebody on amphetamines is still intrinsically human, and here too we have some disagreement. I cannot concede that AI's output is always of the quality produced by a zombie, at least no more than the output of certain human editors, and at least not by looking at the language alone; otherwise it would be impossible for AI to fool people. In fact, AI's output is better ("more correct") than what most people would produce if you forced them to write with a gun pointed at their head, or even with a large tax deduction.
What makes LLMs irritating is the suspicion that one is letting one's brain engage with output from a stochastic parrot in contexts where one expects communication from a fellow human being. It's the knowledge that, at the other end, somebody may decide to take your attention and your money dishonestly. That's why I have no trouble paying for a ChatGPT plan (it's honest, I know what I get) but hesitate to hire a human editor. Now, if I could sit at a café with said editor and go over their notes, then I would rather do just that.
In other words, what makes AI pernicious is not a matter of style or correctness, but that it poisons the communication medium: it seeds doubt and distrust. That's why people (yours truly included) are burning manuals of style and setting up shop in the bark of the communication medium, knowing that's a place less frequented by LLMs and that there is a helpful camp filled with authoritative figures whose job of asserting absolute correctness may, perhaps, keep the LLMs in that core for a little longer.
Those are workarounds, however. It's too early to know for sure, but I think our society will need to rewrite its rules to adjust to AI. Anything from seclusion and attestation rituals for writers to a full-blown Butlerian Jihad.
Then again, it only takes 2 minutes to come to that realization when talking with many humans.
We can only be stoic and say "slop is gonna be slop". People are getting used to AI slop in text ("just proofreading", "not a natural speaker") and they got used to artificial artifacts in commercial/popular music.
It's sad, but it is what it is. As with DSP, there's always a creative way to use the tools (weird prompts, creative uses of failure modes).
In DSP and music production, auto-tune plus vocal comping plus overdubs have normalized music regressing towards an artificial ideal. But inevitably, real samples and individualistic artists achieve distinction by not using the McDonald's-kind of optimization.
Then, at some point, some of this lands in mainstream music, some of it doesn't.
There were always people hearing the difference.
It's a matter of taste.
> And comparing digging through the ground to human thought and creativity is an odd mix of self debasement and arrogance.
> I'm guessing there is an unspoken financial incentive guiding your point of view.
Your bias is showing through.
For what it's worth, it has made everything I use it for much better. I can search the web in mere seconds, where previously it could often take hours of tedious searching and reading.
And it used to be that Youtube comments were an absolute shit show of vitriol and bickering. A.I. moderation has made it so that now it's often a very pleasant experience chatting with people about video content.