Available on ollama: https://ollama.com/library/gemma3
replies(2):
The community getting obsessed with Ollama has done real damage to the field, as it's inefficient compared to vLLM. Many people could get far more tok/s than they realize if they just knew the right tools.
Unfortunately Ollama and vLLM can't be compared on these models at the moment, because vLLM doesn't support them yet.