But unless you spell out that pagination needs to be handled as well, the LLM will naively implement the bare minimum.
Context matters. Supplying enough context is what makes all the difference when interacting with these kinds of tools.
> I asked the AI to write me some code to get a list of all the objects in an S3 bucket
They didn't ask for all the objects in the first page of results; they asked for all the objects. The necessary context is there.
LLMs are just on par with devs who don't read tickets properly or don't pay attention to the API they're calling (I've had this exact case happen with someone on a previous team, and it was a combination of both).
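For the S3 case concretely, the real fix is the continuation-token loop that the naive one-call version skips. A minimal sketch, using a stand-in for `list_objects_v2` rather than a live boto3 client (`fake_list_page` and the 2500-key backend are invented for illustration; with boto3 you'd typically just use `client.get_paginator("list_objects_v2")`):

```python
def list_all_objects(list_page):
    """Collect keys across every page of a paginated listing API.

    `list_page` mimics S3's list_objects_v2: it takes an optional
    continuation token and returns a dict with 'Contents',
    'IsTruncated', and (when truncated) 'NextContinuationToken'.
    """
    keys, token = [], None
    while True:
        page = list_page(token)
        keys.extend(obj["Key"] for obj in page.get("Contents", []))
        if not page.get("IsTruncated"):
            return keys
        token = page["NextContinuationToken"]


# Stand-in backend: 2500 keys served in pages of 1000,
# matching S3's default MaxKeys of 1000.
FAKE_KEYS = [f"logs/{i:04d}.json" for i in range(2500)]

def fake_list_page(token=None):
    start = int(token) if token else 0
    batch = FAKE_KEYS[start : start + 1000]
    page = {"Contents": [{"Key": k} for k in batch]}
    if start + 1000 < len(FAKE_KEYS):
        page["IsTruncated"] = True
        page["NextContinuationToken"] = str(start + 1000)
    return page
```

The bare-minimum version the comment describes is equivalent to calling `list_page(None)` once and returning its `Contents`: it silently caps out at 1000 keys, which is exactly the "first page only" bug.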