
lewisjoe:
Richard Gabriel’s famous essay “Worse is better” (https://www.jwz.org/doc/worse-is-better.html) is an interesting perspective on why Lisp lost to C. In a way, the same arguments (the trade-offs among simplicity, consistency, correctness, and completeness) can be made for why functional programming lost to OOP.

But those philosophical perspectives aside, I personally find that my brain works very much like a Turing Machine when dealing with complex problems. Beyond my code, even most of my todos are simple step-by-step instructions for achieving something. It’s easy to see why other non-math folks like me would prefer the Turing Machine’s way of writing instructions over Lambda Calculus’.
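
To make the contrast concrete, here is a toy sketch (my own example, in Python, not anything from the essay): the first version spells out the state changes one step at a time, the second expresses the same computation as a single folded expression.

    from functools import reduce

    prices = {"apple": 3, "bread": 2, "milk": 4}
    cart = ["apple", "milk", "milk"]

    # Turing Machine style: an explicit sequence of steps mutating state.
    total = 0
    for item in cart:
        total = total + prices[item]

    # Lambda Calculus style: one expression built from function application.
    total_fp = reduce(lambda acc, item: acc + prices[item], cart, 0)

    assert total == total_fp == 11

Both compute the same thing; the first reads like a todo list, the second like a definition.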

This could be why OOP/imperative style is so often preferred over FP.

hinkley:
Lisp lost in a much more profound way recently, and it's very rare to see anyone mention it, especially on the Lisp side of the conversation.

Over the last ten years or so, we have come to the painful conclusion that mixing data and code is a very, very bad idea. It can't be done safely. We are putting functionality into processors and operating systems, such as no-execute bits and W^X memory policies, expressly to disallow this behavior.
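
As a rough illustration of what "expressly disallow" means in practice (my own sketch, Unix-only, using Python's standard mmap module): W^X policies refuse memory that is writable and executable at the same time. Whether the probe below is refused depends entirely on the platform; stock desktop Linux typically still grants it, while OpenBSD, iOS, and hardened configurations refuse it.

    import mmap

    # Ask the OS for a page that is simultaneously writable and executable.
    # Systems enforcing W^X refuse the request outright; permissive defaults
    # still grant it. Either way, code and data are no longer assumed to mix.
    try:
        page = mmap.mmap(-1, mmap.PAGESIZE,
                         prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
        print("writable+executable page granted (no W^X enforcement here)")
        page.close()
    except OSError:
        print("writable+executable page refused (W^X enforced)")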

If that conclusion is true, then Lisp is broken by design. There is no fixing it. It likes conflating the two, which means it can't be trusted.