
70 points alexmolas | 1 comment
miki123211 ◴[] No.43646166[source]
There's also promptic, which wraps litellm, which supports many, many, many more model providers, and it doesn't even need plugins.

LLM is a cool CLI tool, but IMO LiteLLM is a better Python library.

replies(1): >>43646438 #
1. simonw ◴[] No.43646438[source]
I think LLM's plugin architecture is a better bet for supporting model providers than the way LiteLLM does it.

The problem with LiteLLM's approach is that every model provider needs to be added to the core library - in https://github.com/BerriAI/litellm/tree/main/litellm/llms - and then shipped as a new release.
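The shape of that centralized approach can be sketched roughly as follows (hypothetical names, not LiteLLM's actual internals): the provider table lives inside the core package, so supporting a new provider means editing core and cutting a new release.

```python
# Hedged sketch of a centralized provider table (illustrative only, not
# LiteLLM's real code). Every provider handler ships inside the core package.

def _call_openai(prompt):
    # Stand-in for a real HTTP call to the provider's API.
    return f"[openai] {prompt}"

def _call_anthropic(prompt):
    # Stand-in for a real HTTP call to the provider's API.
    return f"[anthropic] {prompt}"

# Adding a new provider means adding an entry here, inside core,
# and shipping a new release of the whole library.
PROVIDERS = {
    "openai": _call_openai,
    "anthropic": _call_anthropic,
}

def completion(model, prompt):
    # Model strings like "openai/gpt-4o" dispatch on the provider prefix;
    # the model name itself is ignored in this toy version.
    provider, _, _model_name = model.partition("/")
    handler = PROVIDERS.get(provider)
    if handler is None:
        raise ValueError(f"Provider {provider!r} is not in this release of core")
    return handler(prompt)
```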

LLM uses plugins because then there's no need to sync new providers with the core tool. When a new Gemini feature comes out I ship a new release of https://github.com/simonw/llm-gemini - no need for a release of core.

I can wake up one morning and LLM grew support for a bunch of new models overnight because someone else released a plugin.
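The decoupling a plugin system buys can be sketched like this (a minimal toy registry, not LLM's actual plugin API): plugins register their models at import time, so the core tool never maintains a provider list of its own.

```python
# Hedged sketch of plugin-style model registration (illustrative only,
# not LLM's real hook interface). Core owns only the registry; each
# third-party plugin adds its models when it is imported.

MODEL_REGISTRY = {}

def register_model(model_id, factory):
    # Called by plugins at import time; core never enumerates providers.
    MODEL_REGISTRY[model_id] = factory

# A third-party plugin module would run something like this on import:
register_model("example-gemini-flash",
               lambda prompt: f"[example-gemini-flash] {prompt}")

def run(model_id, prompt):
    # Core looks the model up in the registry; unknown IDs just mean
    # no plugin providing them is installed.
    model = MODEL_REGISTRY.get(model_id)
    if model is None:
        raise KeyError(f"Unknown model {model_id!r}; is the plugin installed?")
    return model(prompt)
```

With this split, shipping support for a new provider is just publishing a new package that calls `register_model`; core needs no change and no release.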

I'm not saying "LLM is better than LiteLLM" here. LiteLLM is a great library with a whole lot more contributors than LLM, and it has been fully focused on being a great Python library, while LLM has so far had more effort invested in the CLI side than in the Python library side.

I am confident that a plugin system is a better way to solve this problem generally though.