Modulo the fact that Python et al. almost certainly have orders of magnitude more external libraries, etc.
Prolog has many implementations and you don't have the same wealth of libraries, but yes, it's Turing complete and not of the "Turing tarpit" variety, you could reasonably write entire applications in SWI-Prolog.
It's a mind-bending language and if you want to experience the feeling of learning programming from the beginning again this would be it
Don't threaten me with a good time
let prologBlob = new ProLog()
prologBlob.add( "a => b" ).add( "b => c" )
prologBlob.query( "a == c?" ) == True
(not exactly that, but hopefully you get the gist)

There's so much stuff regarding constraints, access control, and relationship queries that could be expressed "simply" in Prolog, and being able to extract those interior bits for further use in your more traditional programming language would be really helpful! (...at least in my imagination ;-)
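For the gist in actual Prolog (a minimal sketch): the two implications become Horn clauses, and asserting a lets you query c.

a.
b :- a.   % "a => b"
c :- b.   % "b => c"

% ?- c.
% true.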
But some parts, like the cut operator, are things I've copied several times over for various uses. A couple of prototype parser generators, for example: allowing backtracking, but using a cut to mark the point where backtracking becomes an error, can be quite helpful.
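A sketch of that pattern in SWI-Prolog DCG form (the grammar itself is made up for illustration): once the "if" keyword has matched, the cut commits to that rule, so a malformed remainder fails the parse outright instead of being retried against the other statement form.

statement --> [if], !, expr, [then], statement.
statement --> [print], expr.

expr --> [X], { atom(X) }.

% ?- phrase(statement, [if, x, then, print, y]).
% true.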
uKanren is conceptually small and simple, here's a Ruby implementation: https://github.com/jsl/ruby_ukanren
I still think that Prolog should be mandatory for every programmer. It opens up the mind in such a logical way... Love it.
Unfortunately, I never found an opportunity in my 11 years since then to use it in my professional practice. Or maybe I just missed the opportunities?????
In the end though, it mostly just feels enough of a separate universe to any other language or ecosystem I'm using for projects that there's a clear threshold for bringing it in.
If there was a really strong prolog implementation with a great community and ecosystem around, in say Python or Go, that would be killer. I know there are some implementations, but the ones I've looked into seem to be either not very full-blown in their Prolog support, or have close to non-existent usage.
“The little prover” is a fantastic book for that. The whole series is.
https://play.flix.dev/?q=PQgECUFMBcFcCcB2BnUBDUBjA9gG15JtAJb...
is an embedded Datalog DB and query in a general-purpose programming language.
More examples on https://flix.dev/
In the words of a colleague responsible for said reports it 'eliminated the need for 50+ people to fill timesheets, saves 15 min x 50 people x 52 weeks per year'
It has been (and still is) in use for 10+years already. I'd say 90% of the current team members don't even know the team used to have to "punch a clock" or fill timesheets way back.
I think Lua is the much better language for a wide variety of reasons (most of the good Python libraries are just wrappers around C libraries, which is necessary because Python's FFI is really substandard), but I wouldn't reach for Python or Lua if I'm expecting to write more than 1000 lines of code. They both scale horribly.
https://news.ycombinator.com/context?id=43948657
Thesis:
1. LLMs are bad at counting the number of r's in strawberry.
2. LLMs are good at writing code that counts letters in a string (see the sketch after this list).
3. LLMs are bad at solving reasoning problems.
4. Prolog is good at solving reasoning problems.
5. ???
6. LLMs are good at writing prolog that solves reasoning problems.
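For point 2, the kind of Prolog an LLM might plausibly emit (a sketch, assuming SWI-Prolog string handling):

count_char(String, Char, Count) :-
    string_chars(String, Chars),        % "strawberry" -> [s,t,r,a,w,b,e,r,r,y]
    include(==(Char), Chars, Matches),  % keep only the matching characters
    length(Matches, Count).

% ?- count_char("strawberry", r, N).
% N = 3.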
Common replies:
1. The bitter lesson.
2. There are better solvers, ex. Z3.
3. Someone smart must have already tried and ruled it out.
Successful experiments:
I see this sentiment a lot lately. A sense of missed nostalgia.
What happened?
In 20 years, will people look back on JavaScript frameworks and reminisce about how this was an ideal world?
Advanced Turbo prolog - https://archive.org/details/advancedturbopro0000schi/mode/2u...
Prolog programming for artificial intelligence - https://archive.org/details/prologprogrammin0000brat_l1m9/mo...
Earlier, Prolog was used in AI/Expert Systems domains. Interestingly, it was also used to model Requirements/Structured Analysis/Structured Design and in Prototyping. These usages seem interesting to me since there might be a way to use these techniques today with LLMs to have them generate "correct" code/answers.
For Prolog and LLMs see - https://news.ycombinator.com/item?id=45712934
Some old papers/books that I dug up and that seem relevant:
Prototyping analysis, structured analysis, Prolog and prototypes - https://dl.acm.org/doi/10.1145/57216.57230
Prolog and Natural Language Analysis by Fernando C. N. Pereira and Stuart M. Shieber (free digital edition) - http://www.mtome.com/Publications/PNLA/pnla.html
The Application of Prolog to Structured Design - https://www.researchgate.net/publication/220281904_The_Appli...
SWI Prolog (specifically, see [2] again) is a high level interpreted language implemented in C, with an FFI to use libraries written in C[1], shipping with a standard library for HTTP, threading, ODBC, desktop GUI, and so on. In that sense it's very close to Python. You can do everyday ordinary things with it, like compute stuff, take input and output, serve HTML pages, process data. It starts up quickly, and is decently performant within its peers of high level GC languages - not v8 fast but not classic Java sluggish.
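For instance, a minimal HTTP server along the lines of the SWI-Prolog documentation (the handler path and predicate names here are made up for the sketch):

:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).

:- http_handler(root(hello), say_hi, []).

say_hi(_Request) :-
    format('Content-type: text/plain~n~n'),
    format('Hello from Prolog~n').

server(Port) :-
    http_server(http_dispatch, [port(Port)]).

% ?- server(8080).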
In other senses, it's not. The normal Algol-derivative things you are used to (arithmetic, text, loops) are clunky and weird. It's got the same problem as other declarative languages - writing what you want is not as easy as it seemed like it was going to be, and performance involves contorting your code into forms that the interpreter/compiler is good with.
It's got the problems of functional languages - everything must be recursion. Having to pass the whole world state in and out of things. Immutable variables and datastructures are not great for performance. Not great for naming either, temporary variable names all over.
It's got some features I've never seen in other languages - the way the constraint logic engine just works with normal variables is cool. Code-is-data-is-code is cool. Code/data is metaprogrammable in a LISP macro sort of way. New operators are just another predicate. Declarative Grammars are pretty unique.
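For example, with SWI-Prolog's library(clpfd) (a minimal sketch), constrained variables are just ordinary Prolog variables, so constraints mix freely with the rest of the program:

:- use_module(library(clpfd)).

demo(X, Y) :-
    X in 1..10,     % X is an ordinary variable with a finite domain
    Y #= X * 2,
    X #> 5,
    label([X, Y]).  % enumerate concrete solutions

% ?- demo(X, Y).
% X = 6, Y = 12 ;
% X = 7, Y = 14 ; ...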
The way the interpreter will try to find any valid path through your code - the thing which makes it so great for "write a little code, find a solution" - makes it tough to debug why things aren't working. And hard to name things: code doesn't do things, it describes the relation of states to each other. That's hard to name on its own, but it's worse when you have to pass the world state and the temporary state through a load of recursive calls and try to name that clearly, too.
This is fun:
countdown(0) :-
write("finished!").
countdown(X) :-
writeln(X),
countdown(X-1).
It's a recursive countdown. There are no deliberate typos in it, but it won't work. The reason why is subtle: that code is doing something you can't do as easily in Python. It's passing the Prolog source-code expression X-1 into the recursive call, not the result of evaluating X-1 at runtime. That's how easy metaprogramming and code generation are! That's why it's a fun language! That's also how easy it is to trip over "the basics" you expect from other languages.

It's full of legacy, even more than Python is. It has global state - the Prolog database - but it's shunned. It has two or three different ways of thinking about strings, and it has atoms. ISO Prolog doesn't have modules, but different implementations of Prolog do have different implementations of modules. Literals for hashtables are contentious (see [2] again). Same for object orientation, standard library predicates, and more.
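Coming back to the countdown: one way to make it behave as expected is to force arithmetic evaluation with is/2 before recursing (a sketch):

countdown(0) :-
    write("finished!").
countdown(X) :-
    X > 0,
    writeln(X),
    X1 is X - 1,      % evaluate, so a number (not the term X-1) is passed on
    countdown(X1).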
In Prolog, anything that can't be inferred from the knowledge base is false. If nothing about "playsAirGuitar(mia)" is implied by the knowledge base, it's false. All the facts are assumed to be given; therefore, if something isn't given, it must be false.
Predicate logic is the opposite: if I can't infer anything about "playsAirGuitar(mia)" from my axioms, it might be true or false. Its truth value is unknown. It's true in some models of the axioms and false in others. The statement is independent of the axioms.
Deductive logic assumes an open universe, Prolog a closed universe.
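A concrete sketch of the closed-world behaviour, using the same example (the vincent fact is made up for illustration):

playsAirGuitar(vincent).

% ?- playsAirGuitar(vincent).
% true.
% ?- playsAirGuitar(mia).
% false.   % nothing in the knowledge base implies it, so Prolog answers false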
tag(TypeOfTag, ParentFunction, Line).
TypeOfTag indicates things like an unnecessary function call, an unidiomatic conditional, etc.
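For illustration (the tag types and function names here are hypothetical), the knowledge base is just facts of that shape plus whatever query predicates you need:

tag(unnecessary_call,        parse_header, 120).
tag(unidiomatic_conditional, parse_header, 131).
tag(unnecessary_call,        flush_buffer, 210).

% All findings for one function, as Type-Line pairs.
findings_in(Function, Findings) :-
    findall(Type-Line, tag(Type, Function, Line), Findings).

% ?- findings_in(parse_header, F).
% F = [unnecessary_call-120, unidiomatic_conditional-131].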
I then used the REPL to pull things apart, wrote some manual notes, and then consulted my complete knowledge base to create an action plan. Pretty classical expert system stuff. Originally I was expecting the bug-fixing effort to take a couple of months. It ended up being 10 days of Prolog code + 2 days of Prolog interaction + 3 days of sepples weedwacking and adjusting what remained in the plugboard.
Plain Prolog's way of solving reasoning problems is effectively:
for person in [martha, brian, sarah, tyrone]:
if timmy.parent == person:
print "solved!"
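Or, in actual Prolog (the person and parent facts are made up for the example):

person(martha).
person(brian).
person(sarah).
person(tyrone).
parent(timmy, sarah).

% Prolog tries each person/1 fact in turn until parent(timmy, P) succeeds.
% ?- person(P), parent(timmy, P).
% P = sarah.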
You hard-code some options, write a logical condition with placeholders, and Prolog brute-forces every option in every placeholder. It doesn't do reasoning.

Arguably it lets a human express reasoning problems better than other languages by letting you write high-level code in a declarative way, instead of allocating memory and choosing data types and initializing linked lists and so on, so you can focus on the reasoning - but that is no benefit to an LLM, which can output any language as easily as any other. And while that might have been nice compared to Pascal in 1975, it's not so different from modern garbage-collected high-level scripting languages. Arguably Python or JavaScript will benefit an LLM most, because there are so many training examples in them compared to almost any other language.
You can have a module written in `#lang racket` (i.e., regular Racket) and then a separate module written in `#lang datalog`, and the two can talk to each other!
1. web devs are scared of it.
2. not enough training data?
I do remember having to wrestle to get prolog to do what I wanted but I haven't written any in ~10 years.
A side one is that the LISP ecology in the 80s was hostile to "working well with others" and wanted to have their entire ecosystem in their own image files. (which, btw, is one of the same reasons I'm wary of Rust cough)
Really, it's only become open once more with the rise of WASM, systemic efficiency of computers, and open source tools finally being pretty solid.
In fact, in recent years people have started contributing again and are rediscovering the merits.
Imagine a medical doctor or a lawyer. At the end of the day, their entire reasoning process can be abstracted into some probabilistic logic program which they synthesize on-the-fly using prior knowledge, access to their domain-specific literature, and observed case evidence.
There is a growing body of publications exploring various aspects of synthesis, e.g. references included in [1] are a good starting point.
[1] https://proceedings.neurips.cc/paper_files/paper/2024/file/8...
Prolog’s Closed-World Assumption: A Journey Through Time - https://medium.com/@kenichisasagawa/prologs-closed-world-ass...
Is Prolog really based on the closed-world assumption? - https://stackoverflow.com/questions/65014705/is-prolog-reall...
Had never heard of it before, and this is the first I'm hearing of it since.
Also had other cool old shit, like CIB copies of Borland Turbo Pascal 6.0, old Maxis games, Windows 3.1
But the true power is unlocked once the underlying libraries are implemented in a way that surpasses the performance that a human can achieve.
Since implementation details are hidden, caches and parallelism can be added without the programmer noticing anything else than a performance increase.
This is why SQL has received a boost in the last decade with massively parallel implementations such as BigQuery, Trino and, to some extent, DuckDB. And what about adding a CUDA backend?
But all this comes at a cost and needs to be planned so it is only used when needed.
Elmore Leonard, on writing. But he might as well have been talking about the cut operator.
At uni I had assignments where we were simply not allowed to use it.