
167 points xnx | 1 comment
1. tucnak No.44529985
I really hope it means that the 2.5 models will be available for batching in Vertex, too. We spent quite a bit of effort making it work with BigQuery, and it's really cool when it works. There's an edge case where it doesn't, though: when the batch also refers to a cached prompt. We reported this a few months ago.
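
For context, the setup being described is Vertex AI batch prediction reading prompts from, and writing results back to, BigQuery. A minimal sketch of submitting such a job with the Vertex AI Python SDK might look like the following (project, dataset, and model names are placeholders, and the cached-prompt interaction mentioned above is not shown):

# Sketch only: submit a Gemini batch prediction job over a BigQuery table.
# Placeholder project/dataset/model names; adjust to your environment.
import time

import vertexai
from vertexai.batch_prediction import BatchPredictionJob

vertexai.init(project="my-project", location="us-central1")

job = BatchPredictionJob.submit(
    source_model="gemini-2.0-flash-001",                  # placeholder batch-capable model
    input_dataset="bq://my-project.my_dataset.prompts",   # BigQuery table with request rows
    output_uri_prefix="bq://my-project.my_dataset",       # results land in a BigQuery table here
)

# Poll until the job finishes, then read results from the output table.
while not job.has_ended:
    time.sleep(30)
    job.refresh()

print(job.state, job.output_location)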