
82 points meetpateltech | 1 comments | HN request time: 0s | source
nomilk ◴[] No.45311543[source]
Surprising to see negativity here. I send all my LLM queries to 5 LLMs - ChatGPT, Claude, DeepSeek (local), Perplexity, and Grok - and Grok consistently gives good answers and often the most helpful answers. It's ~always king when there's any 'ethical' consideration (i.e. other LLMs refuse to answer - I stopped bothering with Gemini for this reason).

'Ethical' is in quotes because I can see why other LLMs refuse to answer things like "can you generate a curl request to exploit this endpoint" - a prompt used frequently during pen testing. I grew tired of telling ChatGPT "it's for a script in a movie". Other examples abound (yesterday Claude accused me of violating its usage policy when I asked "can polar bears eat frozen meat" - I was curious after seeing a photograph of a polar bear discovering a frozen whale in a melting ice cap). Grok gave a sane answer, of course.

replies(4): >>45311566 #>>45311621 #>>45311627 #>>45311724 #
renw0rp ◴[] No.45311566[source]
How do you manage sending and receiving requests to multiple LLMs? Are you doing it manually through multiple UIs, or using some app which integrates with multiple APIs?
replies(2): >>45311574 #>>45311623 #
Saline9515 ◴[] No.45311623[source]
You can do it directly using OpenRouter.
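For illustration, a minimal sketch of the fan-out described above, using OpenRouter's OpenAI-compatible `chat/completions` endpoint. The model slugs listed and the `OPENROUTER_API_KEY` variable are assumptions - check OpenRouter's model catalog for current names:

```python
import concurrent.futures
import json
import os
import urllib.request

# Hypothetical model slugs - verify against OpenRouter's model list.
MODELS = [
    "openai/gpt-4o",
    "anthropic/claude-3.5-sonnet",
    "deepseek/deepseek-chat",
    "x-ai/grok-2",
]

API_URL = "https://openrouter.ai/api/v1/chat/completions"


def build_payload(model: str, prompt: str) -> dict:
    # OpenAI-compatible request body used by OpenRouter.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def ask(model: str, prompt: str, api_key: str) -> str:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


def fan_out(prompt: str, api_key: str) -> dict:
    # Query all models concurrently and collect answers keyed by model slug.
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(MODELS)) as pool:
        futures = {pool.submit(ask, m, prompt, api_key): m for m in MODELS}
        return {
            futures[f]: f.result()
            for f in concurrent.futures.as_completed(futures)
        }


if __name__ == "__main__":
    answers = fan_out("Can polar bears eat frozen meat?",
                      os.environ["OPENROUTER_API_KEY"])
    for model, answer in answers.items():
        print(f"--- {model} ---\n{answer}\n")
```

Because OpenRouter speaks the same wire format as the OpenAI API, one request builder covers every provider; only the model slug changes per call.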