
MCP is eating the world

(www.stainless.com)
335 points emschwartz | 6 comments
faxmeyourcode ◴[] No.44368295[source]
Based on the comments here, a lot of folks are assuming the primary users of mcp are the end users connecting their claude/vscode/etc to whatever saas platform they're working on. While this _is_ a huge benefit and super cool to use, imo the main benefit is for things like giving complex tool access to centralized agents. Where the mcp servers allow you to build agents that have the tools to do a sort of "custom deep research."

We have deployed this internally at work, where business users give it a list of 20 Jira tickets and ask it to summarize or classify them based on some fuzzy contextual reasoning found in the description/comments. It will happily run 50+ tool calls poking around in Jira/Confluence and respond in a few seconds with what would have taken them hours to do manually. The fact that it uses MCP under the hood is completely irrelevant, but it makes our job as builders much much easier.
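The loop described above can be sketched in a few lines. This is purely illustrative: `get_issue` stands in for a real Jira MCP tool, and `classify` stands in for the LLM's fuzzy reasoning; both are stubbed here with hard-coded data.

```python
# Hypothetical sketch of the "summarize/classify a list of tickets" agent.
# get_issue stands in for a real Jira MCP tool; classify stands in for
# the LLM's contextual reasoning, stubbed with a keyword check.
def get_issue(key: str) -> dict:
    # Stub: a real MCP tool would call the Jira API here.
    fixtures = {
        "PROJ-1": {"key": "PROJ-1", "description": "Login page throws 500 error"},
        "PROJ-2": {"key": "PROJ-2", "description": "Add dark mode to settings"},
    }
    return fixtures[key]

def classify(issue: dict) -> str:
    # Stub for the LLM call that reads description/comments.
    return "bug" if "error" in issue["description"].lower() else "feature"

def triage(keys: list[str]) -> dict[str, str]:
    # The agent fans out one or more tool calls per ticket and aggregates.
    return {k: classify(get_issue(k)) for k in keys}
```

The point is that the agent, not a script, decides how many `get_issue` calls to make and what to do with each result.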

replies(7): >>44368648 #>>44368903 #>>44368929 #>>44368954 #>>44369304 #>>44374580 #>>44375982 #
rcarmo ◴[] No.44369304[source]
As someone who does both, I have to say that the only reason I am writing MCP stuff is that all the user-side tools seem to support it.

And the moment we, as an industry, settle on something sane, I will rip out the whole thing and adopt that, because MCP brings _nothing_ to the table that I could not do with a "proper" API using completely standard tooling.

Then again, I have run the whole gamut since the EDI and Enterprise JavaBeans era, XML-RPC, etc. - the works. Our industry loves creating new API surfaces and semantics without a) properly designing them from the start and b) aiming for a level of re-use that is neither pathological nor wasteful of developer time, so I'm used to people from "new fields of computing" ignoring established wisdom and rolling their own API "conventions".

But, again, the instant something less contrived and more integratable comes along, I will gleefully rm -rf the entire thing and move over, and many people in the enterprise field feel exactly the same - we've spent decades building API management solutions with proper controls, and MCP bodges all of that up.

replies(4): >>44371922 #>>44375100 #>>44375484 #>>44376382 #
alfalfasprout ◴[] No.44371922[source]
> And the moment we, as an industry, settle on something sane, I will rip out the whole thing and adopt that, because MCP brings _nothing_ to the table that I could not do with a "proper" API using completely standard tooling.

100%. I suppose I understand MCP for user-side tooling, but people seem to be reinventing the wheel because they don't understand REST. Making REST requests with a well-defined schema from an LLM is not all that hard.
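To illustrate the claim, here is a minimal sketch of "a well-defined schema" driving an LLM's REST call. The schema and field names are made up, and the validator is a deliberately naive stdlib stand-in for a real JSON Schema library.

```python
# Sketch: an LLM only needs to emit JSON matching a declared schema;
# the receiving side validates it before making the actual REST call.
import json

SEARCH_SCHEMA = {
    "type": "object",
    "properties": {
        "query": {"type": "string"},
        "limit": {"type": "integer"},
    },
    "required": ["query"],
}

def validate(args: dict, schema: dict) -> bool:
    # Minimal check; a real service would use a JSON Schema validator.
    types = {"string": str, "integer": int}
    if any(field not in args for field in schema["required"]):
        return False
    return all(
        isinstance(args[f], types[spec["type"]])
        for f, spec in schema["properties"].items()
        if f in args
    )

# What the LLM emits is just JSON conforming to the schema:
llm_output = json.loads('{"query": "mcp servers", "limit": 5}')
```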

replies(3): >>44372318 #>>44372372 #>>44377001 #
OJFord ◴[] No.44372318[source]
I don't even mind it existing, it's just the way it's presented/documented/talked about like it's some special novel important concept that baffles me, and I think makes it more confusing for developer newcomers (but fine or maybe even helpful for not-particularly-technical but AI-keen/'power' users).
replies(2): >>44374289 #>>44377096 #
1. visarga ◴[] No.44374289[source]
MCP is really a great leap because LLMs orchestrate across a collection of tools instead of running a scripted flow. The most obvious example is deep research, where the LLM sends initial queries, reads the results, then generates new queries and loops until it finds what it needs. This dynamic orchestration of the search tool is almost impossible to do in a scripted way. And it shows where the MCP value is - you just write simple tools, and the AI handles the contextual application. You just make the backend; the frontend is the LLM, with a human in the loop.

I made an MCP server with 2 tools - generate_node and search - and with it the Claude Desktop app can create a knowledge graph complete with links and everything. It scales unbounded by context size, but is read/write and smarter than RAG because it uses graph structure, not just embeddings. I just made the reading and writing tools; the magic of writing the nodes, linking them up, searching and analyzing them is due to the AI. And again, Claude can be very efficient at wielding these tools with zero effort on my part. That is the value of MCP.
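The two-tool setup described above can be sketched as follows. The tool names `generate_node` and `search` come from the comment; the storage format, link representation, and substring search are assumptions for illustration.

```python
# Hedged sketch of a two-tool knowledge graph server. The LLM decides
# what each node says and which existing nodes to link; the tools only
# read and write.
graph: dict[str, dict] = {}

def generate_node(node_id: str, text: str, links: list[str]) -> str:
    # Write tool: the model supplies content and outgoing links.
    graph[node_id] = {"text": text, "links": links}
    return node_id

def search(term: str) -> list[str]:
    # Read tool: naive substring match. The graph structure (links) does
    # the heavy lifting, so even simple retrieval lets the model walk it.
    return [nid for nid, n in graph.items() if term.lower() in n["text"].lower()]

generate_node("mcp", "MCP is a JSON-RPC protocol for tool use", [])
generate_node("rag", "RAG retrieves by embeddings; graphs add explicit links", ["mcp"])
```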

replies(2): >>44374759 #>>44375360 #
2. simiones ◴[] No.44375360[source]
MCP is a detail here. The exact same thing would happen if generate_node and search were exposed as REST endpoints, if Claude Desktop had used REST instead of MCP.
replies(2): >>44376297 #>>44378269 #
3. bboygravity ◴[] No.44376297[source]
Isn't part of the issue that LLMs are relatively bad at REST and JSON and all that?

I remember that about 2 years ago there was big hype around ChatGPT finally being able to return ONLY valid JSON when you asked it to. So apparently LLMs are not that good at emitting structured output without mistakes.

Having said that, I honestly have no clue what MCP looks like or is lol :p

replies(2): >>44376686 #>>44377109 #
4. simiones ◴[] No.44376686{3}[source]
MCP is a form of JSON-RPC over HTTP, so I don't think it has anything to do with that.
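For concreteness, a `tools/call` request on the MCP wire is ordinary JSON-RPC 2.0. The tool name and arguments below reuse the `search` example from the thread; transport details (HTTP headers, stdio framing) are omitted.

```python
# Shape of an MCP tools/call request, per the MCP spec: plain JSON-RPC 2.0.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {"query": "knowledge graph"},
    },
}

wire = json.dumps(request)
# The server replies with a JSON-RPC response carrying the same id and
# the tool output as content blocks.
```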
5. fennecbutt ◴[] No.44377109{3}[source]
MCP is just JSON.

But to your last point: go look it up, have a read through their client/server implementations (for your language of choice). It doesn't actually take that long because the concept is actually rather simple, so I totally recommend it.

6. visarga ◴[] No.44378269[source]
Maybe not. I think they trained the model to be especially capable of MCP tool use, and generated training data that way. Other formats and systems might be slightly worse, and the model would probably not handle such a diversity of tools if all it had been trained on was a much less diverse set of API integrations.