
67 points by xlmnxp
myzek ◴[] No.45666685[source]
I don't want to be a hater, but exposing access to your homelab through a "fully vibe coded" application (it's mentioned at the bottom of the README) is probably not a good idea.

The idea itself sounds fun though

replies(7): >>45666794 #>>45666805 #>>45667638 #>>45668320 #>>45672456 #>>45673770 #>>45676658 #
jamesbelchamber ◴[] No.45666805[source]
I guess at least they're being honest, but I would agree - there's a large delta between AI-assistance and AI-driven, and "vibe coding" is one step further (just accepting everything AI does without critique, so long as it "works").

Great for prototyping, really bad for exposing anything of any value to the internet.

(Not Anti-AI, just pro-sensible)

replies(2): >>45667301 #>>45667510 #
nextlevelwizard ◴[] No.45667301[source]
GitHub should have "LLM" as a language option for repos that self-report as vibe coded, or at least this kind of disclosure should be at the top of the README, not an afterthought.

Also the "If you're Anti-AI please don't use this." is pretty funny :D I guess I must be "Anti-AI", then, since I think relying on this kind of code is wild.

replies(1): >>45667404 #
Eisenstein ◴[] No.45667404[source]
I fully support the AI self-disclosure, but I wonder what it is about AI-generated code that makes this a separate problem from any other code where you don't know the programmer's competence.

Is it because the AI can generate code that looks like it was made by a competent programmer, and is therefore deceiving you?

But whatever the reason, I think that if we use disclosure as a way to shame the people who do tell us, then we can be assured that willingness to disclose going forward will be pretty abysmal.

replies(3): >>45667444 #>>45667503 #>>45676482 #
GuinansEyebrows ◴[] No.45676482[source]
there is a non-zero chance that the human programmer has an interest in producing correct, secure code. there is zero chance that an LLM has the same interest. maybe those two are closer together in some cases, but not in many others.