169 points constantinum | 2 comments
1. wiradikusuma No.40715002
Can't we ask an LLM/GenAI to summarize it and return structured output? /sarcasm (half?)
replies(1): >>40715145
2. hellovai No.40715145
;) https://www.promptfiddle.com/structured-summary-66myE (sorry, bad syntax highlighting when including BAML code in BAML code)

{
  author: "Sam Lijin",
  key_points: [
    "Structured output from LLMs, like JSON, is a common challenge.",
    "Existing solutions like response_format: 'json' and function calling often disappoint.",
    "The article compares multiple frameworks designed to handle structured output.",
    "Handling and preventing malformed JSON is a critical concern.",
    "Two main techniques for this: parsing malformed JSON or constraining LLM token generation.",
    "Framework comparison includes details on language support, JSON handling, prompt building, control, model providers, API flavors, type definitions, and test frameworks.",
    "BAML is noted for its robust handling of malformed JSON using a new Rust-based parser.",
    "Instructor supports multiple LLM providers but has limitations on prompt control.",
    "Guidance, Outlines, and others apply LLM token constraints but have limitations with models like OpenAI's."
  ],
  take_way: "Consider using frameworks that efficiently handle malformed JSON and offer prompt control to get the desired structured output from LLMs."
}
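The summary mentions two techniques for coping with malformed output: parsing it leniently after the fact, or constraining token generation up front. A minimal sketch of the lenient-parsing side, in Python, handling two common failure modes (JSON wrapped in a markdown fence, and trailing commas). This is only an illustration of the general idea, not how BAML's Rust-based parser actually works:

```python
import json
import re

def parse_lenient(raw: str):
    """Best-effort extraction of a JSON object from raw LLM output."""
    # Strip a surrounding ```json ... ``` code fence if present.
    fence = re.search(r"```(?:json)?\s*(.*?)```", raw, re.DOTALL)
    if fence:
        raw = fence.group(1)
    # Narrow to the outermost {...} span, dropping any chatty preamble.
    start, end = raw.find("{"), raw.rfind("}")
    if start != -1 and end > start:
        raw = raw[start:end + 1]
    # Remove trailing commas before ] or }, a frequent LLM mistake.
    raw = re.sub(r",\s*([\]}])", r"\1", raw)
    return json.loads(raw)
```

For example, `parse_lenient('Sure!\n```json\n{"a": [1, 2,],}\n```')` recovers `{"a": [1, 2]}` even though the raw string is not valid JSON. The token-constraint approach avoids repair entirely by masking invalid tokens during decoding, but as the summary notes, it is limited with hosted models like OpenAI's that do not expose logit-level control.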