
167 points | xnx | 1 comment
anupj No.44530084
Batch Mode for the Gemini API feels like Google’s way of asking, “What if we made AI more affordable and slower, but at massive scale?” Now you can process 10,000 prompts like “Summarize each customer review in one line” for half the cost, provided you’re willing to wait until tomorrow for the results.
replies(4): >>44530624 #>>44531272 #>>44533342 #>>44534982 #
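The workflow described above (submit many prompts at once, accept a delay, pay the discounted batch rate) can be sketched as a batch-request file. The JSONL shape, field names, and helper below are illustrative assumptions, not the exact Gemini Batch API format:

```python
import json

# Hypothetical helper: package many one-line summarization prompts
# into a single JSONL batch file, one request per line.
def build_batch_file(reviews, path="batch_requests.jsonl"):
    with open(path, "w") as f:
        for i, review in enumerate(reviews):
            request = {
                # "key" is an assumed field letting you match results
                # back to inputs after the asynchronous run completes.
                "key": f"review-{i}",
                "prompt": f"Summarize this customer review in one line: {review}",
            }
            f.write(json.dumps(request) + "\n")
    return path

reviews = ["Great product, fast shipping.", "Broke after two days."]
path = build_batch_file(reviews)
# The file would then be uploaded once and processed asynchronously
# at the discounted batch rate, with results arriving within ~24 hours.
```

The point of the file-based shape is that the provider can schedule the whole job on spare capacity, which is where the cost savings come from.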
dist-epoch No.44530624
Most LLM providers have a batch mode. Not sure why you are singling Google out.
replies(1): >>44534996 #
okdood64 No.44534996
I'll take it further. Regular cloud compute has offered batch workloads at cheaper rates since forever.