
425 points by sfarshid | 1 comment
VincentEvans ◴[] No.45005596[source]
There will be a new kind of job for software engineers, sort of a cross between working with legacy code and toxic site cleanup.

Like back in the day being brought in to “just fix” an amalgam of FoxPro-, Excel-, and Access-based ERP that “mostly works” and only “occasionally corrupts all our data,” which ambitious sales people put together over the last five years.

But worse - because “ambitious sales people” will no longer be constrained by the sandboxes of Excel or Access - they will ship multi-cloud, edge-deployed Kubernetes microservices wired together with Kafka, and it will be harder to find anyone to talk to who understands what they were trying to do at the time.

replies(16): >>45005632 #>>45005830 #>>45009697 #>>45009999 #>>45010075 #>>45010738 #>>45010794 #>>45011192 #>>45011626 #>>45011943 #>>45012386 #>>45013129 #>>45014577 #>>45014613 #>>45014836 #>>45015644 #
dhorthy ◴[] No.45005830[source]
When Claude starts deploying Kafka clusters I’m outro
replies(3): >>45006053 #>>45010652 #>>45012753 #
CuriouslyC ◴[] No.45006053[source]
It's already happening brother, https://github.com/containers/kubernetes-mcp-server.
replies(1): >>45006808 #
dhorthy ◴[] No.45006808[source]
still don’t know why you need an MCP for this when the model is perfectly well trained to write files and run kubectl on its own
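The point here can be sketched as a plain shell-out tool: an agent harness that simply executes the kubectl commands the model emits, with no MCP server in between. This is an illustrative sketch, not a real framework API; `run_cli` and its setup are hypothetical.

```python
import subprocess

def run_cli(cmd: list[str]) -> str:
    """Run a CLI command the model emitted and return its output.

    A minimal 'tool' an agent loop could expose; in practice you would
    add allow-listing and sandboxing before letting a model run kubectl.
    """
    result = subprocess.run(
        cmd, capture_output=True, text=True, timeout=60,
    )
    if result.returncode != 0:
        # Surface stderr to the model so it can self-correct.
        return f"error: {result.stderr.strip()}"
    return result.stdout
```

Usage would look like `run_cli(["kubectl", "get", "pods", "-n", "default"])`, feeding the output back into the model's context.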
replies(4): >>45007253 #>>45009621 #>>45009724 #>>45009822 #
gexla ◴[] No.45009724[source]
Not sure about the MCP, but I find that using something (RAG, or otherwise providing docs) to point the LLM specifically at what you're trying to use works better than just relying on its training data or letting it browse the internet. One issue I had was that it would otherwise use outdated docs.