For example, print change-dir make-dir; is equivalent to (print (change-dir (make-dir))) in the old money. I wonder if I am reinventing too much here.
Did LISPers try to get rid of the brackets in the past?
Probably the best example of a “Lisp without parentheses” is Dylan. Originally, Dylan was developed as a more traditional Lisp with sexprs, but they came up with a non-sexpr “surface syntax” before launching it to avoid scaring the public.
(define kons
(lambda (x) (lambda (y) ((pair false) ((pair false) ((pair x) y))))))
(define kar (lambda (x) (first (second (second x)))))
(define kdr (lambda (x) (second (second (second x)))))
(define nil ((pair true) ((pair true) false)))
(define null first)
(define atom (lambda (x) (first (second x))))
That's 2 extra booleans per list element.
While the one for recognizing atoms is probably necessary, the other one for recognizing nil is not:
(define kons
(lambda (x) (lambda (y) ((pair false) ((pair x) y)))))
(define kar (lambda (x) (first (second x))))
(define kdr (lambda (x) (second (second x))))
(define nil ((pair true) false))
(define null (lambda (x) (((second x) (lambda (a) (lambda (d) (lambda (z) false)))) true)))
(define atom (lambda (x) (first x)))
The use of null+car+cdr can usually be avoided by using a matching construct instead, like (((second list) (lambda (a) (lambda (d) (lambda (z) deal_with_car_a_and_cdr_d)))) deal_with_nil)
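To see the trimmed encoding in action, here is a self-contained Scheme sketch. The definitions of true/false/pair/first/second are my guess at the ones in church.zip [1], written in the same curried style as the code above; only kons/kar/kdr/nil/null are verbatim from the comment:

```scheme
;; Assumed Church booleans and pairs (church.zip may differ in detail):
(define true   (lambda (a) (lambda (b) a)))
(define false  (lambda (a) (lambda (b) b)))
(define pair   (lambda (a) (lambda (d) (lambda (z) ((z a) d)))))
(define first  (lambda (p) (p true)))
(define second (lambda (p) (p false)))

;; The trimmed encoding, verbatim from above:
(define kons (lambda (x) (lambda (y) ((pair false) ((pair x) y)))))
(define kar  (lambda (x) (first (second x))))
(define kdr  (lambda (x) (second (second x))))
(define nil  ((pair true) false))
(define null (lambda (x)
               (((second x) (lambda (a) (lambda (d) (lambda (z) false))))
                true)))

(define (unchurch b) ((b #t) #f))       ; Church boolean -> Scheme boolean

(unchurch (null nil))                   ; => #t
(unchurch (null ((kons 'a) nil)))       ; => #f
(kar ((kons 'a) nil))                   ; => a
(kar (kdr ((kons 'a) ((kons 'b) nil)))) ; => b

;; The matching construct: one branch for pairs, one for nil.
;; On a pair, (second x) is a Church pair, so it feeds a and d to the
;; handler; on nil, (second x) is false, which discards the handler and
;; returns the nil branch.
(((second ((kons 'a) nil))
  (lambda (a) (lambda (d) (lambda (z) a)))) 'empty)  ; => a
(((second nil)
  (lambda (a) (lambda (d) (lambda (z) a)))) 'empty)  ; => empty
```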
[1] https://t3x.org/lfn/church.zip
Another valid question downvoted into oblivion.
The environment in (lexically scoped) LISP is an implementation detail. Lambda calculus does not need an environment, because variables are substituted on a sheet of paper. So lambda calculus equals lexically scoped LAMBDA in LISP.
Sure, you could view LISP as LC plus some extra functions (that are not easily implemented in LC).
These works are something I both understand and would never achieve myself. These are cultural artifacts, like deeply personal poetry, made purely for the process of it. Not practically useful, not state of the art, not research level, but... a personal journey?
If the author is reading this... can you share your vision? Motivation?
In C/C++ most functions return error codes, forcing the latter form.
And then there are functional languages allowing: x -> h -> g -> f but I think the implicit parameter passing doesn’t sit well with a lot of programmers either.
I do not usually talk much about "myself". I tried, but with no-one asking, I find it difficult to say anything.
Cage.
Thanks for An Introduction to Mental Development, I've thoroughly enjoyed it!
(credit to https://aphyr.com/posts/340-reversing-the-technical-intervie..., I always get a kick out of that and the follow up https://aphyr.com/posts/341-hexing-the-technical-interview).
More likely than not it's a matter of what a person gets used to. I've enjoyed working in Lisp/Scheme and C, but not so much in primarily functional languages. No doubt programmers have varied histories that explain their preferences.
As you imply, in C one could write nested functions as f (g (h (x))) if examining return values is unnecessary. OTOH in Lisp return values are also often needed, prompting use of (let ...) forms, etc., which can make function nesting unclear. In reality programming languages are all guilty of potential obscurity. We just develop a taste for what flavor of obscurity we prefer to work with.
doesn't seem to fit with:
"INTENDED AUDIENCE This is not an introduction to LISP."
on page 10.
Lisp from Nothing - https://news.ycombinator.com/item?id=24809293 - Oct 2020 (29 comments)
Lisp from Nothing - https://news.ycombinator.com/item?id=24798941 - Oct 2020 (5 comments)
I sure find them beautiful and all, but why do they take center stage so often? Beside the aesthetics and instructional value, I don't get the appeal. Also I feel that a bunch of the heavy lifting behind metacircular evaluators is actually done by the Polish notation syntax as well as the actual implementation, and these concepts don't get nearly as much love.
Any Lisper who can illuminate me?
Maybe less embarrassing than talking about Rock the Cashbar by The Clash (though that one was corrected the first time I saw the back of the album).
Where does one — who has no knowledge of these prerequisites or about LISP (except that the latter has been heard in programming circles as something esoteric, extremely powerful, etc.) — start, before reading this book?
It is always interesting to spot a person on the interwebs who seems to actually have managed to turn buddhist or some other teachings into real world deeds. Living really modestly (IIRC, he/you also uses modest, underclocked laptops?), publishing for the benefit of many, and doing all this for years and years. Like, there seems to be no "overhead" in this way of living. Hugely inspirational.
I would also point out the "Essays" section on nmh's webpage, especially the ones discussing sensitivity and high IQ: https://t3x.org/#essays
Having purchased several of your books, thanks for your work, nmh!
I can't speak for the author but this is exactly how I look at the lisp I'm developing. It's a lifetime project. I had some kind of vision depicting how different things could be, and at some point I started trying to make it happen. I want to convince myself I'm not insane for thinking it was possible in the first place.
I love it so much, and seeing your bibliography makes me feel like a kid in a candy store. The confluence of Asian philosophy and computing is delightful.
To put you in the correct headspace this Saturday morning: https://t3x.org/whoami.html
And thanks for the Cage quote. I enjoyed that, too!
Turning the Buddhist (or other) teachings into deeds is not too hard once you have understood who you are, and, maybe more importantly, who you are not. Figuring /that/ out can be tough and require a lot of practice.
What people perceive as modest is really an acceptance or even appreciation of what is. My apartment has not been renovated in decades, I repair what needs repair and otherwise leave things to themselves. I wear clothes until they disintegrate, and my hardware is already old when I buy it. This is the course of things. Things age and change and at some point disappear. Why prefer the new over the old? Why the old over the new? It is just that things and beings get old on their own, and it is much more joyful to witness this than trying to resist it.
The book is basically a modern and more complete version of the "Small C Handbook" of the 1980s. It goes through all the stages of compilation, including simple optimizations, but keeps complexity to a minimum. So if you just want to learn about compiler writing and see what a complete C compiler looks like under the hood, without investing too much in theory, then this is probably one of the very few books that will deliver.
Edit: and then Warren Toomey has written "A Compiler Writing Journey" based on PCC, which may shed a bit more light on the book: https://github.com/DoctorWkt/acwj
But learning the basics of lisp is more like a side effect, the focus is on program design.
Enjoy your stay!
Another source of awe is about Lisp being more of a programming system than a language, and Common Lisp was the standardization of a lot of efforts towards that by companies making large and industrial pieces of software like operating systems, word processors, and 3D graphics editors. At the language level, "compile", "compile-file", "disassemble", "trace", "break", "step" are all functions or macros available at runtime. When errors happen, if there's not an explicit handler for it (like an exception handler) then the default behavior isn't to crash but to trigger the built-in debugger. And the stack isn't unwound yet, you can inspect the local variables at every layer. (There's very good introspection in general for everything.) Various restarts will be offered at different parts of the stack -- for example, a value was unknown, so enter it now and continue. Or you can recompile your erroneous function and restart execution at one of the stack frames with the original arguments to try again. Or you can apt-get install some foreign dependency and try reloading it without having to redo any of the effort the program had already made along the way.
Again, all part of the language at runtime, not a suite of separate tools. Implementations may offer things beyond this too, like SBCL's code coverage or profiling features. All the features of the language are designed with this interactivity and redefinability in mind though -- if you redefine a class definition, existing objects will be updated, but you can control that more finely if you need to by first making a new update-instance-for-redefined-class method. (Methods aren't owned by classes, unlike other OOP languages, which I think eliminates a lot of the OOP design problems associated with those other languages.)
I like the book Successful Lisp as a tour of Common Lisp, it's got a suggested reading order in ch 2 for different skill levels: https://dept-info.labri.fr/~strandh/Teaching/MTP/Common/Davi... It's dated in parts as far as tooling goes but if you're mostly interested in reading about some bits rather than actively getting into programming with Lisp that's not so bad. If you do want to get into it, https://lispcookbook.github.io/cl-cookbook/ has some resources on getting started with a Lisp implementation and text editor (doesn't have to be emacs).
(let ([a (get-some-foo 1)]
[b (get-some-foo 2)])
(cond [(> a b) -1]
[(< a b) 1]
[else 0]))
...but I hate that, I'd much prefer if square brackets were only used for vectors, which is why I have reader macros for square brackets -> vectors and curly brackets -> hash tables in my SBCL run commands.
I know this is a classic analogy, but now you've got me wondering: originally Maxwell wrote a messy pile of equations of scalars, and later someone (Gibbs?) gave them the familiar vector calculus form. Nowadays we have a marvellously general and terse form, like (using the differential of the Hodge dual in naturalised units),
d star(F) = J
My question is, when are we going to get some super-compact unified representation of `eval`?
Or just the possibility to do syscalls, to do something. That is more important than new syntax and sugar over basic instructions.
(They are probably “useful” in the dissemination of what the real essence of computation can reduce to, in practical terms.)
Not everything needs to be useful in fact: certain things can be just enjoyed in their essence, just looked at and appreciated. A bit like… art?
I am implementing my own Scheme as well. Why? I don’t know, one needs to do things that serve no apparent purpose, sometimes.
It was Oliver Heaviside (https://en.wikipedia.org/wiki/Oliver_Heaviside) that rewrote Maxwell's original equations (20 of them in differential form) into the notation used today (4 of them in vector calculus form).
Here's a nice comparison: https://ddcolrs.wordpress.com/2018/01/17/maxwells-equations-...
There are also versions of the metacircular interpreter written fully in M-exprs, but they kinda break the spirit of things.
I think the version of eval that we have is already pretty terse for what it is. You could maybe code-golf it into something smaller, or you could code-golf it into something fully immutable.
My only gripe is that they all rely on an already existing reader that parses the expressions for you and represents them. Which is exactly what the book is about.
Finding a small enough interpretation that does ALL of it would be a dream, but I doubt it could be anywhere near as concise as the (modern) Maxwell equations.
(λ 1 1) (λ λ λ 1 (λ λ λ λ 3 (λ 5 (3 (λ 2 (3 (λ λ 3 (λ 1 2 3))) (4 (λ 4
(λ 3 1 (2 1)))))) (1 (2 (λ 1 2)) (λ 4 (λ 4 (λ 2 (1 4))) 5)))) (3 3) 2)
that tokenizes and parses a closed lambda term from a raw binary input stream and passes the term and the remainder stream to a given continuation [1].
And then, at least for the compiler books, there is: http://t3x.org/files/whichbook.pdf
(cond ((printable? foo)
(print foo)
(newline)
foo)
(else
(print-members foo)
(newline)))
True, with modern machine-generated mass-operations refactoring is easier than with older tools, but that doesn't mean a given set of brackets is 'useless'.
In general a sequence of expressions of which only the value of the last is used, like C's comma operator or the "implicit progn" of conventional cond and let bodies, is only useful for imperative programming where the non-last expressions are executed for their side effects.
Clojure's HAMTs can support a wider range of operations efficiently, so Clojure code, in my limited experience, tends to be more purely applicative than code in most other Lisps.
Incidentally, a purely applicative finite map data structure I recently learned about (in December 02023) is the "hash trie" of Chris Wellons and NRK: https://nullprogram.com/blog/2023/09/30/. It is definitely less efficient than a hash table, but, in my tests so far, it's still about 100ns per hash lookup on my MicroPC and 250ns on my cellphone, compared to maybe 50ns or 100ns respectively for an imperative hash table without FP-persistence. It uses about twice as much space. This should make it a usable replacement for hash tables in many applications where either FP-persistence, probabilistically bounded insertion time, or lock-free concurrent access is required.
This "hash trie" is unrelated to Knuth's 01986 "hash trie" https://www.cs.tufts.edu/~nr/cs257/archive/don-knuth/pearls-..., and I think it's a greatly simplified HAMT, but I don't yet understand HAMTs well enough to be sure. Unlike HAMTs, it can also support in-place mutating access (and in fact my performance measurements above were using it).
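The idea is small enough to sketch in a few lines of Scheme. This is my own toy rendition of the structure described in Wellons's post, not his C code: each node is a 6-slot vector holding a key, a value, and four children, and lookup descends by consuming two hash bits per level. For brevity, keys here are exact integers serving as their own hash; a real version would mix the key through a proper hash function first.

```scheme
;; ht-ref: walk the trie, steering by two hash bits per level.
(define (ht-ref t k h)
  (cond ((not t) #f)                                 ; empty subtrie: absent
        ((equal? (vector-ref t 0) k) (vector-ref t 1))
        (else (ht-ref (vector-ref t (+ 2 (modulo h 4)))
                      k (quotient h 4)))))

;; ht-set: persistent insert -- copy only the nodes on the search path,
;; leaving the old trie fully intact.
(define (ht-set t k v h)
  (if (not t)
      (vector k v #f #f #f #f)                       ; fresh leaf
      (let ((n (vector-copy t)))                     ; path copy
        (if (equal? (vector-ref t 0) k)
            (vector-set! n 1 v)                      ; replace value in copy
            (let ((i (+ 2 (modulo h 4))))
              (vector-set! n i
                (ht-set (vector-ref t i) k v (quotient h 4)))))
        n)))

(define t0 (ht-set #f 1 'one 1))
(define t1 (ht-set t0 5 'five 5))
(ht-ref t1 5 5)   ; => five
(ht-ref t0 5 5)   ; => #f -- t0 is unchanged; that's the FP-persistence
```

Because only the spine from root to insertion point is copied, an insert allocates O(depth) nodes while every previous version of the map remains valid, which is what makes lock-free sharing between readers cheap.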
______
† sometimes called "functional", though that can alternatively refer to programming with higher-order functions