
234 points _false | 2 comments

COBOL legacy systems in finance and government are somewhat of a meme. However, I've never actually met a single person whose day job is to maintain one. I'd be curious to learn what systems you're working on.
42lux ◴[] No.44604467[source]
Bank.
replies(1): >>44604541 #
a3w ◴[] No.44604541[source]
Can LLMs do Cobol?
replies(4): >>44604711 #>>44604728 #>>44604994 #>>44606128 #
gwbas1c ◴[] No.44606128[source]
I'm sure they can do brainfuck if you have a good training set.
replies(1): >>44607543 #
1. bob1029 ◴[] No.44607543[source]
LLMs are terrible at brainfuck. I spent a solid week attempting to use generative models to iteratively refine BF program tapes, with nothing to show for it. I've written genetic programming routines that produce better brainfuck programs than ChatGPT can.

For example, if I prompt ChatGPT: "Write me a BF program that produces the alphabet, but inverts the position of J & K", it will fail every time. I've never even seen one that produces the alphabet the normal way. In contrast, I can run a GP algorithm over an example of the altered alphabet string, using a simple MSE fitness, and get it to evolve a BF program that actually emits the expected output.

The BPE tokenizer seems like a big part of the problem when considering the byte-per-instruction model, but fundamentally I don't think there is a happy path even if we didn't need to tokenize the corpus. The expressiveness of the language is virtually non-existent. Namespaces, type names, member names, attributes, etc., are a huge part of what allows for a LLM to lock on to the desired outcome. Getting even one byte wrong is catastrophic for the program's meaning. You can get a lot of bytes wrong in C/C++/C#/Java/Go/etc. (e.g. member names) and still have the function do exactly the same thing.
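[Editor's note: the GP approach described above can be sketched roughly as below. This is a minimal illustrative hill-climber, not bob1029's actual implementation; the names (`run_bf`, `mutate`, `evolve`), the mutation operators, and the single-parent loop are all assumptions.]

```python
import random

def run_bf(code, max_steps=10_000):
    """Minimal Brainfuck interpreter with a step budget (no input support).
    Returns the output bytes, or None for programs with unmatched brackets."""
    stack, jumps = [], {}
    for i, c in enumerate(code):
        if c == '[':
            stack.append(i)
        elif c == ']':
            if not stack:
                return None
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    if stack:
        return None
    tape = [0] * 300
    ptr = pc = steps = 0
    out = []
    while pc < len(code) and steps < max_steps:
        c = code[pc]
        if c == '+': tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-': tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '>': ptr = (ptr + 1) % len(tape)
        elif c == '<': ptr = (ptr - 1) % len(tape)
        elif c == '.': out.append(tape[ptr])
        elif c == '[' and tape[ptr] == 0: pc = jumps[pc]
        elif c == ']' and tape[ptr] != 0: pc = jumps[pc]
        pc += 1
        steps += 1
    return bytes(out)

def mse(output, target):
    """Mean squared error between program output and target, padded to equal length."""
    if output is None:
        return float('inf')
    n = max(len(output), len(target), 1)
    o = output.ljust(n, b'\x00')
    t = target.ljust(n, b'\x00')
    return sum((a - b) ** 2 for a, b in zip(o, t)) / n

OPS = '+-<>[].'

def mutate(code, rng):
    """Randomly delete, substitute, or insert one instruction."""
    i = rng.randrange(len(code) + 1)
    roll = rng.random()
    if roll < 0.4 and code:
        j = i % len(code)
        return code[:j] + code[j + 1:]            # delete
    op = rng.choice(OPS)
    if roll < 0.7 and code:
        j = i % len(code)
        return code[:j] + op + code[j + 1:]       # substitute
    return code[:i] + op + code[i:]               # insert

def evolve(target, generations=5000, seed=0):
    """Simple (1+1) hill climb: keep a mutant if its fitness is no worse."""
    rng = random.Random(seed)
    best = '+.'
    best_fit = mse(run_bf(best), target)
    for _ in range(generations):
        cand = mutate(best, rng)
        fit = mse(run_bf(cand), target)
        if fit <= best_fit:
            best, best_fit = cand, fit
    return best, best_fit
```

A real GP run would use a population, crossover, and far more evaluations, but even this sketch shows why the approach sidesteps the tokenizer issue: fitness is computed on raw output bytes, so no model ever has to emit a syntactically perfect tape in one shot.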

replies(1): >>44609175 #
2. gwbas1c ◴[] No.44609175[source]
That was a very serious response to an off-the-cuff joke.

BUT: Please, oh please, write up a blog entry! I bet that would be fun to read.