Can we ask for the typical *nix text editors to disobey the POSIX standard of a text file next, so that I don't need to use hex editing to get trailing newlines off the end of files?
Leaders choose the standards, especially as they approach monopoly.
Worse still: people will come out of the woodwork to actively defend the monopolist de facto standard producer.
I know it's just me, but my worldview is that the world would be better if all editors had "insert final newline" behavior.
I expect my editor to do what I say, not secretly(!) guess what I might have wanted, or will potentially want sometime in the future. Having to insert a newline while concatenating files is a chore, but a predictable annoyance. Having to hunt for mystery bytes, maybe less so.
All Unix text processing tools assume that every line in a text file ends in a newline. Otherwise, it's not a text file.
There's no such thing as a "trailing newline"; there is only a line-terminating newline.
I've yet to hear a convincing argument why the last line should be an exception to that extremely long-standing and well understood convention.
What Unix program "throws a fit" when encountering a perfectly normal newline in the last line in a file?
What I ran into issues with was contemporary software that's shipped to Linux, such as Neo4j, which expects its license files to have no newline at the end of the file, and will actively refuse to start otherwise.
I have a feeling I'll now experience the "well, that's that software's problem then" part of this debate. Just like how software's inability to handle CRLF / CR-only / LF-only is always the problem, instead of text files being a joke, and platforms making assumptions about them, being the problem.
Is "line-terminating newline" a controlled / established term I'm unfamiliar with, or am I right to hold deep contempt for you?
Because "trailing newline", contrary to what you claim, is 100% established terminology (in programming anyways), so I'd most definitely consider it "existing", and I find it actively puzzling that someone wouldn't.
The balance here, of course, being backwards compatibility. I'd sooner kill EBCDIC, bad ASCII, and code pages than worry about CRLF if we didn't have to care about ancient systems.
Programming languages still retain C's operator precedence hierarchy even though it was itself meant to be a backwards compatible compromise and leads to errors around logical operator expressions.
Anyways, this article is about actively breaking systems like some kind of protocol terrorist in order to achieve an outcome at any cost, if it was merely along the lines of "CRLF considered harmful in new protocols" I'd have nothing to say.
You didn't limit your general admiration of standards to CRLF, so no, not only that.
> about actively breaking systems like some kind of protocol terrorist in order to achieve an outcome at any cost,
That's simply false; he isn't.
> Almost all implementations of these protocols will accept a bare NL as an end-of-line mark, even if it is technically incorrect.
See https://news.ycombinator.com/item?id=41832555 as far as HTTP/1.1 goes, it's definitely common but far from universal. The big problem with "it's 100% safe to make this change, since it doesn't break anything I know about" is that there are always a lot of things you don't know about, not all of which can be boiled down to being negligible weirdos.
So your position, then, is that all standards include "needless complexity?" What argument are you actually trying to make here?
> That's simply false, he isn't
Yeah, that's why the word "like" is there: it implies a loose association, not a direct accusation.
> Almost all implementations of these protocols will accept a bare NL as an end-of-line mark, even if it is technically incorrect.
So, right back to my original point, then, standards prevent people from having to debug dumb issues that could have been avoided. This advice is basically "go ahead, create dumb issues, see if I care."
I may have flippantly labeled that as "protocol terrorism" but I don't think it's pure hyperbole either.
That you're mistaken in your one-sided generalization of the benefits of standards.
> So your position, then, is that all standards include "needless complexity?"
No, that's just another extreme you've made up.
> Yea.. that's why the word "like" is present, it implies a near association, not a direct accusation.
Your mistake comes before "like": you can't be "about actively breaking systems" when you explicitly say that no systems will be broken.
> "see if I care."
That this is false is also easy to see: the author reverted a change after he realized it broke something ancient, so clearly he does care.
> standards prevent people from having to debug dumb issues that could have been avoided.
To circle the conversation back to my original response to your point: why do you think "Almost all implementations" break the standard and "accept a bare NL"? Could it be that such unintuitive limitations don't prevent anything, and people still have to debug "dumb issues" because common expectations are more powerful?
It hadn't even occurred to me until today that anything else could be meant :o
Please do consider that many software products will never change, and they will still be actively used in production environments that you will never take an interest in.
And it was pretty clear from the context of norir's comment that they were not talking about legacy software, they were talking about writing new projects/file formats that used newlines as a separator. Just because you want to shoehorn your legacy projects into this discussion doesn't mean that they fit.