This fits the way I like to use LLMs: I always ask them for options, then I decide myself which of those options makes the most sense.
Essentially, I'm using them as weird magical documentation that can spit out (incomplete but still useful) lists of available options to guide my decision-making at any turn.