
67 points xlmnxp | 7 comments
myzek | No.45666685
I don't want to be a hater, but exposing access to your homelab through a "fully vibe coded" application (it's mentioned at the bottom of the README) is probably not a good idea.

The idea itself sounds fun, though.

replies(7): >>45666794, >>45666805, >>45667638, >>45668320, >>45672456, >>45673770, >>45676658
1. jamesbelchamber | No.45666805
I guess at least they're being honest, but I would agree - there's a large delta between AI-assisted and AI-driven, and "vibe coding" is one step further (just accepting everything the AI does without critique, so long as it "works").

Great for prototyping, really bad for exposing anything of any value to the internet.

(Not Anti-AI, just pro-sensible)

replies(2): >>45667301, >>45667510
2. nextlevelwizard | No.45667301
GitHub should have "LLM" as a language for repos that self-report as vibe coded, or at least this kind of disclosure should be at the top of the README, not an afterthought.

Also, the "If you're Anti-AI please don't use this." line is pretty funny :D I guess I must be "Anti-AI", since I think relying on this kind of code is wild.

replies(1): >>45667404
3. Eisenstein | No.45667404
I fully support AI self-disclosure, but I wonder what it is about AI-generated code that makes it a separate problem from any other code where you don't know the programmer's competence.

Is it because the AI can generate code that looks like it was made by a competent programmer, and is therefore deceiving you?

But whatever the reason, I think that if we use it as a way to shame the people who do tell us, then we can be assured that willingness to disclose going forward will be pretty abysmal.

replies(3): >>45667444, >>45667503, >>45676482
4. muvlon | No.45667503
I think it makes sense for stuff that is fully AI generated to the point where you commit the prompts to git. At that point, the prompts become the real "source code" and the generated code is more of a build artifact. It makes sense to tag the language as "LLM" instead of e.g. "Python" because the prompts are what contributors will be expected to touch when interacting with the codebase.
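
To make the prompts-as-source idea concrete, here is a minimal sketch of what such a build step could look like. It assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment; the prompts/ and build/ layout, the per-prompt output mapping, and the model name are all hypothetical, not details of the project under discussion.

```python
# Minimal sketch: regenerate the codebase from committed prompts.
# Assumes the OpenAI Python SDK (pip install openai) and an
# OPENAI_API_KEY in the environment; layout and model are hypothetical.
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

PROMPTS = Path("prompts")  # committed to git: the real "source code"
BUILD = Path("build")      # regenerated output: a gitignored build artifact

BUILD.mkdir(exist_ok=True)

for prompt_file in sorted(PROMPTS.glob("*.md")):
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical model pin
        messages=[{"role": "user", "content": prompt_file.read_text()}],
    )
    # One generated module per prompt: prompts/server.md -> build/server.py
    out_path = BUILD / prompt_file.with_suffix(".py").name
    out_path.write_text(response.choices[0].message.content)
    print(f"regenerated {out_path} from {prompt_file}")
```

Under that model, contributors diff and review the prompts rather than the generated Python, which is why tagging the repo's language as "LLM" instead of "Python" stops being a joke.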
5. xenophonf | No.45667510
> Great for prototyping

I must be Doing It Wrong(TM), because my experience has been pretty negative overall. Is there like a FAQ or a HOWTO or hell even a MAKE.MONEY.FAST floating around that might clue me in?

replies(1): >>45670979
6. eitland | No.45670979
No, you have just missed the last two steps. Here is the full explanation, and it is the same as it has always been on HN:

1. Make prototype

2. Magic happens here

3. Make lots of $$$

"Great for prototyping" only makes it easier to get to step 2, but, done correctly, it certainly does that.

As proven by the nice app I have running on my laptop, which I probably won't make any money from.

7. GuinansEyebrows | No.45676482
there is a non-zero chance that the human programmer has an interest in producing correct, secure code. there is zero chance that an LLM has the same interest. maybe those two are closer together in some cases, but not in many others.