
524 points noperator | 2 comments
gorgoiler ◴[] No.44497736[source]
Interesting article. Bizarrely it makes me wish I’d used Pocket more! Tangentially, with LLMs I’m getting very tired with the standard patter one sees in their responses. You’ll recognize the general format of chatty output:

Platitude! Here’s a bunch of words that a normal human being would say followed by the main thrust of the response that two plus two is four. Here are some more words that plausibly sound human!

I realize that this is of course how it all actually works underneath — LLMs have to waffle their way to the point because of the nature of their training — but is there any hope of being able to post-process out the fluff? I want to distill down to an actual answer inside the inference engine itself, without having to use more language-corpus machinery to do so.

It’s like the age-old problem of internet recipes. You want this:

  500g wheat flour
  280ml water
  10g salt
  10g yeast
But what you get is this:

  It was at the age of five, sitting
  on my grandmother’s lap in the
  cool autumn sun on West Virginia
  that I first tasted the perfect loaf…
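The post-processing the parent asks for can be roughly approximated outside the model with plain heuristics. A minimal sketch, assuming you just want to drop a leading platitude sentence; the `strip_fluff` helper and its filler-phrase list are hypothetical and illustrative, not any real API:

```python
import re

# Common filler openers ("Great question!", "Certainly!", ...).
# The list is illustrative, not exhaustive.
FILLER = re.compile(
    r"^(great question|certainly|sure|of course|absolutely|happy to help)"
    r"[!,.:]?\s*",
    re.IGNORECASE,
)

def strip_fluff(response: str) -> str:
    """Drop a leading platitude from an LLM response, keeping the substance."""
    lines = response.strip().splitlines()
    if lines and FILLER.match(lines[0]):
        # Remove the matched filler; drop the line entirely if nothing remains.
        lines[0] = FILLER.sub("", lines[0], count=1)
        if not lines[0].strip():
            lines = lines[1:]
    return "\n".join(lines).strip()

print(strip_fluff("Great question! Two plus two is four."))
# → Two plus two is four.
```

Of course, this only trims the surface patter; it doesn’t make the model reason its way to the answer any faster, which is the parent’s deeper complaint.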
replies(5): >>44497793 #>>44497920 #>>44498626 #>>44499091 #>>44500377 #
apsurd ◴[] No.44497920[source]
How do you trust the recipe without context?

People say they want one thing but then their actions and money go to another.

I do agree there's unnecessary fluff. But "just give me the recipe" isn't really what people want. And I don't think you represent some outlier take, because really: have you ever gotten a recipe exactly as you outlined it, with zero context, and given a damn enough to make it?

replies(5): >>44498306 #>>44498822 #>>44498850 #>>44500561 #>>44502401 #
1. lan321 ◴[] No.44498850[source]
> How do you trust the recipe without context?

Ratings or poster reputation.

I often use recipes from a particular chef's website, which are laid out with specific ingredients, steps, and, optionally, a video. I trust the chef since I've yet to try a bad recipe from him.

I also often use baking recipes from King Arthur based on ratings. They're also pretty consistently good and don't have much fluff.

replies(1): >>44502784 #
2. apsurd ◴[] No.44502784[source]
Those are good examples. A trusted chef's website can list purely the recipe because it's held within a pre-vetted context. I do this as well.

I'm advocating for the need for those kinds of trust signals. If AI literally just listed ingredients, I wouldn't trust it. How could I?