Interesting. We're using a SaaS solution for document extraction right now. I don't know if it's in our interest to build out more, but I do like the idea of keeping extraction local.
Absolutely, we've been hearing the same from our customers, which is why we thought it made sense to open-source a bunch of schemas so they're reusable and compatible across various inference providers (especially Ollama/local ones).
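To make the "reusable across providers" idea concrete, here's a minimal sketch of how a shared extraction schema could be used with a local Ollama model via structured outputs. The schema name, fields, and model are assumptions for illustration, not the project's actual schemas; the same JSON schema could in principle be handed to any provider that supports schema-constrained output.

```python
# Sketch: one extraction schema, reused with a local Ollama model.
# The Invoice fields and the model name are hypothetical examples.
from pydantic import BaseModel
from ollama import chat


class Invoice(BaseModel):
    # Illustrative fields; real schemas would vary by document type.
    vendor: str
    invoice_number: str
    total_amount: float
    currency: str


document_text = "ACME Corp, Invoice #1234, Total: 99.50 EUR"  # placeholder input

response = chat(
    model="llama3.1",  # any local model served by Ollama
    messages=[{
        "role": "user",
        "content": "Extract the invoice fields from this document:\n" + document_text,
    }],
    # Constrain the model's output to the shared JSON schema.
    format=Invoice.model_json_schema(),
)

# Parse the constrained JSON back into the typed schema object.
invoice = Invoice.model_validate_json(response.message.content)
print(invoice)
```

Because the schema is just JSON Schema under the hood, the same definition can be pointed at a hosted provider or a local runtime without changing the extraction contract.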