
chrismorgan:
The current title (“Pakistani newspaper mistakenly prints AI prompt with the article”) isn’t correct. It wasn’t the prompt that was printed, but trailing chatbot fluff:

> If you want, I can also create an even snappier “front-page style” version with punchy one-line stats and a bold, infographic-ready layout—perfect for maximum reader impact. Do you want me to do that next?

The article in question is titled “Auto sales rev up in October” and is an exceedingly dry slab of statistic-laden prose, of the sort that LLMs love to err in (though there’s no indication of whether they have or not), and for which alternative (non-prose) presentations can be drastically better. Honestly, if the entire thing came from “here’s tabular data, select insights and churn out prose”… I can understand not wanting to do such drudgework.

layer8:
The AI is prompting the human here, so the title isn't strictly wrong. ;)
dwringer:
Gemini has been doing this to me for the past few weeks at the end of basically every single response, and it often seems to make the subsequent responses drift off track and drop in quality as all these extra tangents start polluting the context. Not to mention how distracting it is: by the time I read one, it has already thrown off the reply I was halfway through composing.
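
I’ve been tempted to just scrub that fluff out of the transcript before it gets re-sent. A rough sketch of the idea (the regex is a crude heuristic of my own, and the sample reply is made up):

    # Crude sketch: strip a trailing "Do you want me to...?" offer from a
    # reply before it goes back into the conversation history.
    # The phrase list is a guess, not anything vendor-documented.
    import re

    FLUFF = re.compile(
        r"(?:\n+(?:If you want|Would you like|Do you want)[^\n]*\?\s*)+$",
        re.IGNORECASE,
    )

    def strip_followup_offers(reply: str) -> str:
        # Drop trailing paragraphs that only offer more work.
        return FLUFF.sub("", reply).rstrip()

    demo = ("Here are the October numbers.\n\n"
            "If you want, I can also create a snappier front-page version. "
            "Do you want me to do that next?")
    print(strip_followup_offers(demo))  # -> "Here are the October numbers."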
elxr:
This is why I wish chat UIs had separate categories of chats (backed by a few generic system prompts): one for back-and-forth discussion, one for "answers only" with no extra noise, maybe even an "exploration"/"tangent" slider.

The fact that system prompts / custom instructions have to be typed in manually in every major LLM chat UI is a missed opportunity, IMO.
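
Even a handful of canned presets wired into the request would get most of the way there. A rough sketch of what I mean, nothing vendor-specific (assumes the openai Python client; the preset wording and model name are placeholders of mine):

    # Rough sketch of per-chat "categories": each one is just a canned
    # system prompt selected when the chat is created.
    from openai import OpenAI

    client = OpenAI()

    PRESETS = {
        "discussion": "Engage in open-ended back-and-forth; tangents welcome.",
        "answers-only": "Answer the question directly. No follow-up offers, "
                        "no 'Would you like me to...?' suggestions.",
        "exploration": "Suggest related ideas and tangents after each answer.",
    }

    def ask(category: str, question: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[
                {"role": "system", "content": PRESETS[category]},
                {"role": "user", "content": question},
            ],
        )
        return resp.choices[0].message.content

    print(ask("answers-only", "Summarize October auto sales."))

An "exploration" slider would then just pick (or blend between) these presets at request time instead of making everyone retype custom instructions.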