Basically, anything where declaratively specifying relationships is more natural than writing imperative algorithms.
Now, I have no direct experience with any of the common logic programming systems, just passing familiarity.
But any time I came across a problem that might call for such a system, the need never seemed big enough to justify it.
We're talking fewer than 100 rules, most likely fewer than a couple dozen. Stacking some IFs and a bit of math, strategically grouped in a couple of aptly named wrapper methods to reduce the cognitive load, has worked pretty well.
And, granted, if I had solid experience with these systems, the onboarding cost would be lower.
When have you found it to be worth cutting over?
Instead, I implemented a minimal set of primitives, wrote a set of derivation rules (e.g. "if you have X+Y, and Y supports negation, you can derive X-Y as X+(-Y)") and constraints (operator overloads mustn't have ambiguous signatures; no cycles allowed in the call tree), and set up a code generator.
250 lines of Prolog, plus another 250 of ASP (Answer Set Programming, a Prolog-like language), and I had a code synthesizer.
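
To make that concrete, here's a minimal sketch of what such a derivation rule might look like in Prolog. This is not the actual code; the predicate names (op_impl/4, supports_negation/1, derivable/4) and the vec type are made up for illustration:

    % Hand-written primitives: op_impl(Op, LeftType, RightType, ResultType).
    op_impl(plus, vec, vec, vec).

    % Types whose values support negation.
    supports_negation(vec).

    % Derivation rule: if X+Y exists and Y supports negation,
    % X-Y can be synthesized as X+(-Y), with the same result type.
    derivable(minus, L, R, Res) :-
        op_impl(plus, L, R, Res),
        supports_negation(R).

Querying derivable(minus, vec, vec, T) then yields T = vec, telling the generator it may emit a subtraction overload built from addition and negation. In ASP, the ambiguity constraint could plausibly be a one-line integrity constraint along the lines of ":- op_impl(Op, L, R, T1), op_impl(Op, L, R, T2), T1 != T2."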
It was one of the most magical experiences of my entire career. I'd write an optimized version of a function, rerun synthesis, and it would use it everywhere it could. I'd add new types and operators and it'd instantly plumb them through. Seeing code synthesis dance for you feels amazingly liberating. It's like the opposite of technical debt.