
Perl's decline was cultural

(www.beatworm.co.uk)
393 points by todsacerdoti
majormajor
I never interacted with the "Perl community" described here. When I used Perl at a past job, it was in the "just google for how to do things" era.

The syntax, full of @ and %, was convoluted and made you think about more things than Ruby or Python did, without giving you much apparent power or benefit in return (unlike Java or a C-family language, where the extra thinking about types actually buys you something).

None of Ruby, Python, or Perl was among my first three languages (those were Pascal, C/C++, and Java). Ruby, Python, Matlab, R, and Perl all came later for me, within a few years of each other. Coming from that Pascal/C/Java background, Perl had nothing like the approachability of Ruby and Python.

(IMO Python is losing some of that now, especially given the last project I encountered in a professional capacity in Python, where optional type hinting was used but wasn't always accurate, which was a special sort of hell.)
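To make that failure mode concrete, here is a minimal sketch (all names hypothetical) of a hint that is present but inaccurate: nothing fails at runtime, so the lie survives.

    def total_ms(timings: list[int]) -> int:
        """Sum per-request timings. The hints promise ints throughout."""
        return sum(timings)

    # Callers actually pass floats. The code "works" anyway, so the
    # inaccurate hint quietly misleads every reader who trusts it.
    print(total_ms([1.5, 2.5]))  # 4.0, despite the declared int return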

EDIT: the article even touches on this in its description of Ruby: "Ruby is a language for programmers, and is at this point a sensible candidate for building something like Rails with - a relatively blank canvas for dynamic programming, with many of the same qualities as Perl, with less legacy cruft, and more modern niceties, like an integrated object system, exceptions, straightforward data structures." Ruby was newer, and wasn't something that grew out of sysadmin tools; it was a full-fledged OO application programming language from the start. So my disagreement with the article is that the culture doesn't really matter: no change in Perl culture could have reinvented the language as something nicer and newer like Ruby, because at that point it wouldn't have been Perl anymore.

thaumasiotes
> the last project I encountered in a professional capacity in Python where optional type hinting was used but wasn't always accurate which was a special sort of hell.

But that's the entire purpose of optional type hinting. If the hints had to be accurate, you'd have mandatory typing, not optional hinting.

Arainach
No, optional type hinting means there's sometimes no hint. Having a hint and then passing some type that doesn't match it is wrong, and that's the hell.
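A minimal sketch of that distinction, with hypothetical names: the first function merely lacks a hint, while the second carries a hint that a caller violates.

    # Reading 1 of "optional": some code simply has no hints, and a
    # checker treats the argument as Any.
    def parse(raw):
        return raw.split(",")

    # Reading 2, the "hell": a hint exists and a caller ignores it. A
    # typechecker would flag this call; the runtime happily executes it.
    def count(items: list[str]) -> int:
        return len(items)

    print(parse("a,b"))      # ['a', 'b']
    print(count({"a": 1}))   # 1 -- len() works on dicts, so no error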
echelon
Python's type hinting is horrible.

It's not checked, it's not required, and the bolted-on syntax is ugly.

Even if the types were checked by a tool, failures would still surface at runtime, and some code paths would never get exercised to reveal them.
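For illustration, a small sketch (hypothetical function name) of the "not checked" part: CPython stores annotations as metadata and never enforces them, and a bad call only blows up if that path actually runs.

    def shout(msg: str) -> str:
        return msg.upper()

    # The interpreter keeps the annotations and does nothing with them:
    print(shout.__annotations__)  # {'msg': <class 'str'>, 'return': <class 'str'>}

    if False:        # an unexercised path hides the error indefinitely
        shout(3)     # would raise AttributeError only when executed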

We need a family of "near-scripting" languages like Go that check everything AOT but can still be run as if interpreted.

dragonwriter
> It's not checked, it's not required,

It is both of those if you use a typechecker, which is the whole reason the hinting exists. (In fact, the first popular typechecker predates the annotation syntax and used type comments; annotations were developed specifically so that typechecking could be accommodated in the language itself rather than become its own separate language.)
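A sketch of what that history looks like in code, assuming a PEP 484-style checker such as mypy (which documents both forms; function names here are hypothetical):

    from typing import List

    # Pre-annotation style: the checker reads a structured comment,
    # and the interpreter ignores it entirely.
    def scale(values, factor):
        # type: (List[float], float) -> List[float]
        return [v * factor for v in values]

    # The annotation syntax that was later added to the language itself.
    def scale_annotated(values: List[float], factor: float) -> List[float]:
        return [v * factor for v in values]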

echelon
That's the problem! The code should not run if the types are wrong. Having an external tool is an antipattern.

Having to rely on process for validity is a recipe for trouble. We already know how the greater Python community has handled requirements.txt and dependencies; I've spent days fixing that garbage.

It's a tooling problem. Good tools bake good habits into the automation and stop you from having to think about it.
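As a sketch of folding the external tool into the run step, here is a hypothetical wrapper that refuses to execute a script unless it typechecks first (assuming mypy is installed; mypy.api is its documented programmatic entry point). It gestures at the "near-scripting" workflow from the earlier comment.

    import sys
    import runpy
    from mypy import api

    def checked_run(script: str) -> None:
        """Typecheck a script with mypy, then run it only if it passes."""
        report, errors, status = api.run([script])
        if status != 0:
            sys.stderr.write(report + errors)
            raise SystemExit(f"type check failed; not running {script}")
        runpy.run_path(script, run_name="__main__")

    if __name__ == "__main__":
        checked_run(sys.argv[1])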