I would think of a language like Go as small (say, in comparison to Rust or Swift) - the language itself at least, if you discount the standard library.
I find the use of the word 'small' quite confusing.
https://news.ycombinator.com/item?id=34908067
https://news.ycombinator.com/item?id=9602430
https://news.ycombinator.com/item?id=2406325
Also this comment:
> "Lush" stands for "Lisp Universal Shell". It has not just S-expression syntax but recursion, setq, dynamic typing, quoting of S-expressions and thus lists and homoiconicity, cons, car, cdr, let*, cond, progn, runtime code evaluation, serialization (though bread/bwrite rather than read/print), and readmacros. Its object system is based on CLOS.
SN(1987) neural network simulator for AmigaOS (Leon Bottou, Yann LeCun)
|
SN1(1988) ported to SunOS. added shared-weight neural nets and graphics (LeCun)
| \
| SN1.3(1989) commercial version for Unix (Neuristique)
| /
SN2(1990) new lisp interpreter and graphic functions (Bottou)
| \
| SN2.2(1991) commercial version (Neuristique)
| |
| SN2.5(1991) ogre GUI toolkit (Neuristique)
| / \
\ / SN2.8(1993+) enhanced version (Neuristique)
| \
| TL3(1993+) lisp interpreter for Unix and Win32 (Neuristique)
| [GPL]
| \_______________________________________________
| |
SN27ATT(1991) custom AT&T version |
| (LeCun, Bottou, Simard, AT&T Labs) |
| |
SN3(1992) IDX matrix engine, Lisp->C compiler/loader and |
| gradient-based learning library |
| (Bottou, LeCun, AT&T) |
| |
SN3.1(1995) redesigned compiler, added OpenGL and SGI VL |
| support (Bottou, LeCun, Simard, AT&T Labs) |
| |
SN3.2(2000) hardened/cleanup SN3.x code, |
| added SDL support (LeCun) |
| _______________________________________________________|
|/
|
ATTLUSH(2001) merging of TL3 interpreter + SN3.2 compiler
[GPL] and libraries (Bottou, LeCun, AT&T Labs).
|
LUSH(2002) rewrote the compiler/loader (Bottou, NEC Research Institute)
[GPL]
|
LUSH(2002) rewrote library, documentation, and interfaced packages
[GPL] (LeCun, Huang-Fu, NEC)
https://lush.sourceforge.net/credits.html
Go may be a small language by some definitions (and as my phrasing implies, perhaps not by others), but it is certainly one that has had a lot of person-hours put into it.
An article on the Brown PLT blog [1] suggests analyzing languages by defining a core language and a desugaring function. A small core simplifies reasoning and analysis but can lead to verbose desugaring if features expand into many constructs. The boundary between the core and sugared language is flexible, chosen by designers, and reflects a balance between expressiveness and surface simplicity.
Feature complexity can be evaluated by desugaring: concise mappings to the core suggest simplicity, while verbose or intricate desugarings indicate complexity.
So, a possible definition of a "small" language could be one with both a small core and a minimal desugaring function.
--
1: https://blog.brownplt.org/2016/01/08/slimming-languages.html
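To make the core-plus-desugaring idea concrete, here's a toy sketch (my own illustration, not from the article): the core has only variables, lambdas, and application, and the surface language adds `let`, which desugars into an immediately-applied lambda. The tuple encoding and the `desugar` function are hypothetical.

```python
# Toy core language: variables/literals, ("lambda", name, body), and
# two-tuple application (fn, arg). Surface adds ("let", name, bound, body).

def desugar(expr):
    """Recursively lower surface forms into the core language."""
    if isinstance(expr, tuple) and expr[0] == "let":
        # ("let", x, e, body)  =>  apply ("lambda", x, body') to e'
        _, name, bound, body = expr
        return (("lambda", name, desugar(body)), desugar(bound))
    if isinstance(expr, tuple) and expr[0] == "lambda":
        _, name, body = expr
        return ("lambda", name, desugar(body))
    if isinstance(expr, tuple):
        fn, arg = expr  # application
        return (desugar(fn), desugar(arg))
    return expr  # variable or literal

surface = ("let", "x", 1, ("add", "x"))
print(desugar(surface))  # (("lambda", "x", ("add", "x")), 1)
```

The desugaring here is one line per surface form, which by the article's measure would count `let` as a cheap feature; a feature whose expansion touched many core constructs would count as expensive.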
Not to mention, you seem to be religiously pushing React, which is more of a DSL, but still...
library(tidyverse)  # read_csv, the dplyr verbs, and ggplot2

penguins <- read_csv("penguins.csv") |>
  na.omit() |>
  select(species, island, bill_length_mm, body_mass_g) |>
  group_by(species, island) |>
  summarize(
    mean_bill_length = mean(bill_length_mm),
    mean_mass = mean(body_mass_g),
    n = n()
  ) |>
  arrange(species, desc(mean_bill_length))

penguins |>
  ggplot(aes(x = species, y = mean_bill_length, fill = island)) +
  geom_col(position = "dodge") +
  labs(
    title = "Mean Bill Length by Species and Island",
    y = "Mean Bill Length (mm)"
  ) +
  theme_minimal()
My prime use would be generating diagrams of function call chains in large Python code bases.
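For that use case, the stdlib `ast` module already gets you the raw (caller, callee) edges; a minimal sketch (the `call_edges` helper is my own name, and this only handles direct calls by name, not methods or imports resolved across modules):

```python
# Collect (enclosing function, called name) edges from Python source
# using the stdlib ast module. A real tool for a large code base would
# also need to resolve attributes, imports, and nested scopes.
import ast

def call_edges(source):
    """Return (caller, callee) pairs for direct name calls in `source`."""
    tree = ast.parse(source)
    edges = []
    for fn in ast.walk(tree):
        if isinstance(fn, ast.FunctionDef):
            for node in ast.walk(fn):
                if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                    edges.append((fn.name, node.func.id))
    return edges

src = """
def a():
    b()

def b():
    print("hi")
"""
print(call_edges(src))  # [('a', 'b'), ('b', 'print')]
```

The edge list could then be fed into mermaid-ascii or a similar renderer to draw the actual diagram.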
FWIW, it is called an evolutionary or lineage (or hierarchical lineage) diagram, I believe.
I found vijual [1] and mermaid-ascii [2] to be good starting projects.
[1]: http://www.lisperati.com/vijual/
[2]: https://github.com/AlexanderGrooff/mermaid-ascii
I recommend the article "Evaluating the Design of the R Language" [1] - it reads like a horror story. The memory usage and performance are abysmal, the OO features are a mess, and the semantics are very weird ("best effort semantics" is about as predictable as it sounds!). The lexical scoping is based on Scheme but has so many weird edge cases. It's a dumpster fire of a language, but it somehow works for its intended purpose.
(Survival analysis and multilevel modeling comes to mind.)