It completely abstracts away the concept of a machine, and it's simply translation as computation - but just as powerful.
This lecture series goes into how it works: https://www.youtube.com/watch?v=LE0SSLizYUI
Similar to LISP in that sense.
Mathematica is at least semi-mainstream. Not sure of any other examples though.
https://maude.lcc.uma.es/maude-manual/maude-manualch1.html#x...
Also, why is the first line needed rather than the second?
(map a (:lambda fun) = fun a)
(map a fun = fun a)
Which is really interesting but kinda hard for me to wrap my head around compared with Martin-Löf type theory. It looks as if type checking will have to be done manually, using N axiomatic rules for how you can prove A from B.
Is that your understanding as well?
Also, there doesn't seem to be much material on computational type theory online. Any good references? (Except for the Nuprl book of course).
pair a b = '(pair a b)
where ' is shorthand for quoting, like in lisp. If you want to explicitly define datatypes on demand, you can just manually quote and not use the pair function constructor. One thing I thought about too was smart constructors, like say if you want to define the mod 3 group:

zero = 'zero
s (s (s x)) = x
s x = '(s x)
if values are always constructed with either "s" or "zero", the smart constructor of s enforces that you can't create any terms with an s-chain of 3 or higher, IF the function arguments are eagerly evaluated.
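A rough Haskell analogue of that smart-constructor idea (my own sketch, not the language discussed here): only zero and s are exported, so the raw S constructor can't be used to build a longer chain.

module Mod3 (Z3, zero, s) where

-- Raw representation of the mod-3 group: zero, or a successor of another term.
data Z3 = Zero | S Z3
  deriving (Show, Eq)

-- The only way to build a zero.
zero :: Z3
zero = Zero

-- Smart constructor: collapses any chain of three successors back to the
-- inner term, so an S-chain of length 3 or more can never be built as long
-- as callers go through s (and arguments are already normalized).
s :: Z3 -> Z3
s (S (S x)) = x      -- s (s (s x)) = x
s x         = S x    -- otherwise just wrap

-- e.g. s (s (s zero)) == zero, and no value ever carries three S's.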
A -> A B (1)

A -> B (2)
B -> A
(1) never terminates: each application adds a new B but never removes the A. (2) doesn't grow, but never terminates either, since each term is just replaced with the other.
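For what it's worth, here's a small Haskell sketch (my own modelling, nothing to do with the language above) of both systems, treating a term as a list of symbols and rewriting the first symbol that matches a rule:

data Sym = A | B deriving (Show, Eq)

-- System (1): A -> A B. The A is never consumed, so every step appends
-- another B and the term grows forever.
step1 :: [Sym] -> [Sym]
step1 (A : rest) = A : B : rest
step1 (x : rest) = x : step1 rest
step1 []         = []

-- System (2): A -> B, B -> A. The term stays the same size, but every
-- step flips the symbol back, so rewriting never reaches a normal form.
step2 :: [Sym] -> [Sym]
step2 (A : rest) = B : rest
step2 (B : rest) = A : rest
step2 []         = []

-- take 4 (iterate step1 [A])  ==>  [[A],[A,B],[A,B,B],[A,B,B,B]]
-- take 4 (iterate step2 [A])  ==>  [[A],[B],[A],[B]]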