
71 points by fka | 1 comment
joshstrange | No.44004489
Related, it’s crazy to me that OpenAI hasn’t already done something like this for Deep Research.

After your initial question, it always follows up with some clarifying questions, but it's completely up to the user to format their responses, and I always wonder whether the LLM gets confused when people answer sloppily. It would make much more sense for OpenAI to break out each question into its own dedicated answer box. That way the user's responses stay consistent and there's less chance they make a mistake or forget to answer a question.
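
Just to illustrate what I mean (a minimal sketch, not how OpenAI actually does it; all names here are hypothetical), each clarifying question could get its own field, and the answers could be recombined into one consistent follow-up message:

    # Hypothetical sketch: one answer box per clarifying question, then
    # serialize the labeled answers back into a single consistent turn.
    from dataclasses import dataclass, field

    @dataclass
    class ClarifyingQuestion:
        key: str          # stable identifier, e.g. "time_range"
        text: str         # question shown to the user
        answer: str = ""  # filled in from the dedicated answer box

    @dataclass
    class ClarificationForm:
        questions: list[ClarifyingQuestion] = field(default_factory=list)

        def to_followup_message(self) -> str:
            # Every answer is paired with its question, so the model never
            # has to guess which free-text fragment answers which question.
            lines = []
            for q in self.questions:
                answer = q.answer.strip() or "(no answer provided)"
                lines.append(f"- {q.text}\n  Answer: {answer}")
            return "Answers to your clarifying questions:\n" + "\n".join(lines)

    # Usage: render one input box per question in the UI, then send
    # form.to_followup_message() back to the model as a single message.
    form = ClarificationForm([
        ClarifyingQuestion("time_range", "What time range should the report cover?", "2020-2024"),
        ClarifyingQuestion("region", "Which regions are in scope?", "EU and US"),
    ])
    print(form.to_followup_message())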

replies(2): >>44004520, >>44006374
wddlz | No.44006374
Sorry for the shameless plug, but we recently published research on 'Dynamic Prompt Middleware' (https://www.iandrosos.me/images/chiwork25-27.pdf) as a potential approach to this. Based on the user's prompt (and some other bits of context), we generate UI controls with prompt refinements that users can quickly select from, and the system then does the prompting for them.
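
Roughly, the flow looks like this (a much-simplified sketch, not the paper's implementation; the refinement generator is stubbed out so it runs standalone, and all names are hypothetical):

    # Simplified sketch of dynamic prompt middleware: derive selectable
    # refinements from the user's prompt, then fold the selections back in.
    from dataclasses import dataclass

    @dataclass
    class Refinement:
        label: str         # shown in the UI, e.g. "Audience"
        options: list[str]

    def generate_refinements(user_prompt: str) -> list[Refinement]:
        # In the real system a model proposes these from the prompt and
        # other context; hard-coded here so the sketch is self-contained.
        return [
            Refinement("Audience", ["beginner", "expert"]),
            Refinement("Output format", ["bulleted summary", "detailed report"]),
        ]

    def compose_prompt(user_prompt: str, selections: dict[str, str]) -> str:
        # The middleware writes the refined prompt; the user only clicks options.
        constraints = "\n".join(f"- {k}: {v}" for k, v in selections.items())
        return f"{user_prompt}\n\nApply these refinements:\n{constraints}"

    # Usage: render each Refinement as a control, collect the user's choices,
    # then send compose_prompt(...) to the model instead of the raw prompt.
    prompt = "Explain how transformers handle long contexts"
    refinements = generate_refinements(prompt)
    choices = {r.label: r.options[0] for r in refinements}  # e.g. UI defaults
    print(compose_prompt(prompt, choices))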
replies(2): >>44006433, >>44006960
ics | No.44006960
Very neat paper, thanks for sharing. Being able to interact with a model this way through, say, a Jupyter Notebook would be especially amazing.