
443 points wg0 | 2 comments
chrismorgan ◴[] No.45899143[source]
The current title (“Pakistani newspaper mistakenly prints AI prompt with the article”) isn’t correct; what was printed wasn’t the prompt but trailing chatbot fluff:

> If you want, I can also create an even snappier “front-page style” version with punchy one-line stats and a bold, infographic-ready layout—perfect for maximum reader impact. Do you want me to do that next?

The article in question is titled “Auto sales rev up in October” and is an exceedingly dry slab of statistic-laden prose, of the sort that LLMs love to err in (though there’s no indication of whether they have or not), and for which alternative (non-prose) presentations can be drastically better. Honestly, if the entire thing came from “here’s tabular data, select insights and churn out prose”… I can understand not wanting to do such drudgework.

replies(9): >>45899255 #>>45899348 #>>45899636 #>>45899711 #>>45899852 #>>45900787 #>>45902114 #>>45903466 #>>45904945 #
layer8 ◴[] No.45899348[source]
The AI is prompting the human here, so the title isn't strictly wrong. ;)
replies(2): >>45900301 #>>45902047 #
dwringer ◴[] No.45900301[source]
Gemini has been doing this to me at the end of basically every single response for the past few weeks, and it often seems to make the subsequent responses get off track and drop in quality as all these extra tangents start polluting the context. Not to mention how distracting it is: by the time I read it, it has thrown off the reply I was already halfway through composing.
replies(4): >>45901512 #>>45901950 #>>45901979 #>>45903775 #
1. layer8 ◴[] No.45901512[source]
Occasionally I find it helpful, but it would be good to have the option to remove it from the context.
replies(1): >>45902066 #
2. drivers99 ◴[] No.45902066[source]
You can if you script the request yourself, or use a front end that lets you cut those paragraphs out of the conversation. I only say that because yesterday I followed this guide: https://fly.io/blog/everyone-write-an-agent/ except I had to figure out how to do it with the Gemini API instead. The context is always just (essentially) a list of strings (or "parts" anyway, they don't have to be strings) that you pass back to the model, so you can make the context whatever you like. It shouldn't be too hard to build a front end that lets you edit the context, and it's fairly easy to mock up if you just put the request in a script that you add to.
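For illustration, here's a minimal sketch of that idea: keep the conversation history as a plain list of role-tagged messages, and strip a trailing "Do you want me to do that next?" paragraph from each reply before it goes back into the context. The `strip_followup` helper, the message-dict shape, and the `model_call` stand-in are all assumptions for the sketch, not the Gemini API itself.

```python
import re

# Heuristic for a trailing "offer" paragraph; tune to taste.
FOLLOWUP = re.compile(r"\b(If you want|Do you want|Would you like)\b")

def strip_followup(text: str) -> str:
    """Drop a final paragraph that is just a chatbot follow-up offer."""
    paras = text.rstrip().split("\n\n")
    if paras and FOLLOWUP.search(paras[-1]):
        paras.pop()
    return "\n\n".join(paras)

# The "context" is nothing more than this list, rebuilt on every request.
history = []

def ask(model_call, user_msg: str) -> str:
    """Send the full history plus the new message; store a cleaned reply."""
    history.append({"role": "user", "parts": [user_msg]})
    reply = model_call(history)  # stand-in for the actual API request
    # Only the cleaned version enters the context for future turns.
    history.append({"role": "model", "parts": [strip_followup(reply)]})
    return reply
```

Because the history is just a list you own, "editing the context" is literally list surgery: delete, reword, or reorder entries before the next call.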