... Am I the only one who thinks all these contortions to get something usable are completely mental? All to get something that's potentially wrong in some subtle way?
These LLMs not only suck up megawatts of power and TFLOPS of compute, they also consume heaps of brain power. And all that for what, in the end? What betterment?