183 points WolfOliver | 2 comments
jjangkke ◴[] No.45066347[source]
I'm fatigued by these articles that just broadly claim AI can't code, because they paint with a broad brush over a widely diverse range of uses of AI across different stacks.

It's a horribly outdated way of thinking that a singular AI entity would be able to handle every stack and every problem directed at it; no developer is using it that way.

AI is a great tool for both coders and artists, and these outlandish, attention-grabbing titles really seem to be echo chambers aimed at people who are convinced that AI isn't going to replace them. Which is true, but the opposite is also true.

replies(2): >>45066432 #>>45066526 #
9rx ◴[] No.45066526[source]
Well, AI really can't code any more than a compiler can. Both require a human to write the original code, even if the machine does translate it into other code.

And until the day that humans are no longer driving the bus, that will remain the case.

replies(1): >>45067010 #
1. lanstin ◴[] No.45067010[source]
You can say "generate a c program that uses gcc 128 bit floats and systematically generates all quadratic roots in order of the max size of their minimal polynomial coefficients, and then sort them and calculate the distribution of the intervals between adjacent numbers", and it just does it. That's qualitatively different from the compilers I have used. Now, I was careful to use properly technical words to pull in the world of numeric computation and C programming, but it still saved me a lot of time. It was even able to bolt on multithreaded parallelism to speed it up, using C features I had never heard of.
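
For a concrete picture, here is a rough sketch in C of the kind of program that prompt describes. It is not the actual output being discussed: it uses an arbitrary coefficient bound, glosses over details such as restricting to irreducible polynomials, and leaves out the multithreading mentioned above.

    /* Rough sketch only (not the program from the comment): enumerate quadratics
     * a*x^2 + b*x + c with |a|,|b|,|c| <= MAXC, collect their real roots as gcc
     * __float128 values, sort them, and print a crude histogram of the gaps
     * between adjacent roots.
     * Build with: gcc -O2 roots.c -lquadmath -lm
     */
    #include <quadmath.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define MAXC 20   /* arbitrary coefficient bound chosen for the sketch */

    /* qsort comparator for __float128 values */
    static int cmp_f128(const void *p, const void *q) {
        __float128 a = *(const __float128 *)p, b = *(const __float128 *)q;
        return (a > b) - (a < b);
    }

    int main(void) {
        size_t cap = 1u << 20, n = 0;
        __float128 *roots = malloc(cap * sizeof *roots);
        if (!roots) return 1;

        /* a > 0 is enough: negating a polynomial does not change its roots. */
        for (int a = 1; a <= MAXC; a++)
            for (int b = -MAXC; b <= MAXC; b++)
                for (int c = -MAXC; c <= MAXC; c++) {
                    __float128 disc = (__float128)b * b - (__float128)4 * a * c;
                    if (disc < 0) continue;          /* complex roots: skip */
                    __float128 s = sqrtq(disc);
                    if (n + 2 > cap) {
                        cap *= 2;
                        roots = realloc(roots, cap * sizeof *roots);
                        if (!roots) return 1;
                    }
                    roots[n++] = (-b + s) / (2 * (__float128)a);
                    roots[n++] = (-b - s) / (2 * (__float128)a);
                }

        qsort(roots, n, sizeof *roots, cmp_f128);

        /* Bucket the gaps between adjacent distinct roots by order of magnitude. */
        long buckets[12] = {0};
        for (size_t i = 1; i < n; i++) {
            __float128 gap = roots[i] - roots[i - 1];
            if (gap <= 0) continue;                  /* skip duplicate roots */
            int k = (int)(-log10q(gap));
            if (k < 0)  k = 0;
            if (k > 11) k = 11;
            buckets[k]++;
        }
        for (int k = 0; k < 12; k++)
            printf("gaps around 1e-%d: %ld\n", k, buckets[k]);

        free(roots);
        return 0;
    }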
replies(1): >>45067078 #
2. 9rx ◴[] No.45067078[source]
> That's qualitatively different from the compilers I have used.

Is it? In most traditional programming languages commonly used today, using decades-old compiler technology, I can say something like "x = [1,2,3]" and it will, for example, systematically generate all the code necessary to allocate memory without any need for me to be more explicit about it. It would be fair to say AI offers an even higher level of abstraction, much as most programming languages used today are a higher-level abstraction over assembly, but fundamentally different it is not.
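
As a concrete illustration of that point (a minimal C example of my own, not something from either comment): from a single initialized declaration, the compiler decides how much storage the array needs, reserves it, and emits the code that fills it in.

    #include <stdio.h>

    int main(void) {
        /* One line of "what", no "how": the compiler works out the size
         * (three ints), reserves the storage, and generates the code that
         * initializes it. */
        int x[] = {1, 2, 3};

        printf("%zu bytes reserved for %zu elements\n",
               sizeof x, sizeof x / sizeof x[0]);
        return 0;
    }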

"generate a c program that uses gcc 128 bit floats and systematically generates all quadratic roots in order of the max size of their minimal polynomial coefficients, and then sort them and calculate the distribution of the intervals between adjacent numbers" is just code. You still have to write the code get AI to translate it into a lower-level abstraction. It doesn't magically go off and do its own autonomous thing.