
MCP is eating the world

(www.stainless.com)
335 points by emschwartz | 4 comments
1. kloud No.44368429
The reason for MCP is that you get better results with optimized prompts rather than using existing API docstrings. So everybody is going to end up adopting it in some shape or form.

It is a general concept; MCP itself is nothing special, it is just that Anthropic formalized the observation first.

Tool call = API call + instructions for LLM

So vendors who provide APIs are going to write prompts, add a thin wrapper, and out comes an MCP server. Or you write your own instructions and wrap them in MCP to optimize your own workflows.
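For illustration, that "thin wrapper" can be a single decorated function. Below is a minimal sketch, assuming the FastMCP class from the official MCP Python SDK; the tool name, endpoint URL, and billing domain are made up. The docstring is the LLM-facing description, i.e. the part worth optimizing beyond the plain API docs.

```python
# Minimal sketch of "tool call = API call + instructions for the LLM".
# Assumes the FastMCP class from the official MCP Python SDK; the
# endpoint URL and billing domain are hypothetical placeholders.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("billing")


@mcp.tool()
async def get_invoice(invoice_id: str) -> str:
    """Fetch a single invoice by its ID.

    Use this when the user asks about a specific invoice, charge, or
    billing line item. Returns the invoice as JSON; amounts are in cents.
    """
    async with httpx.AsyncClient() as client:
        resp = await client.get(f"https://api.example.com/v1/invoices/{invoice_id}")
        resp.raise_for_status()
        return resp.text


if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```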

replies(1): >>44371782 #
2. mooreds No.44371782
For pure OpenAPI APIs, why wouldn't you just update your API docstrings? Or maybe add a new attribute to the OpenAPI spec for LLM prompts?

I definitely see the value if you have a non-standard or undocumented API you want to expose.

And I see value in the resources and prompts parts of MCP, since they can offer clients more functionality that would be hard to put into an API spec.
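For reference, a minimal sketch of what those resources and prompts can look like, again assuming the FastMCP class from the official MCP Python SDK; the URI scheme, file path, and prompt text are hypothetical.

```python
# Sketch of MCP resources and prompts: context and reusable prompt
# templates that a client can surface, which don't map neatly onto an
# API spec. Names and content are illustrative only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("billing")


@mcp.resource("billing://docs/refund-policy")
def refund_policy() -> str:
    """The current refund policy, for the client to attach as context."""
    with open("docs/refund_policy.md") as f:
        return f.read()


@mcp.prompt()
def dispute_charge(invoice_id: str) -> str:
    """A reusable prompt template the client can offer to the user."""
    return (
        f"Review invoice {invoice_id}, summarize the disputed line items, "
        "and draft a response that follows our refund policy."
    )
```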

replies(2): >>44374836 #>>44378078 #
3. kloud No.44374836
Exactly, adding some metadata to the OpenAPI spec and then having a generic openapi-mcp-gateway seems like a useful approach. Or making it part of a web framework, so that existing routes also expose an MCP endpoint.
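A rough sketch of that gateway idea, assuming the FastMCP class from the official MCP Python SDK and a hypothetical `x-llm-instructions` vendor extension in the OpenAPI document (the `x-` prefix is the standard way to add custom attributes); the base URL and spec layout are illustrative only.

```python
# Generic OpenAPI-to-MCP gateway sketch: register one MCP tool per
# operation, preferring a hypothetical `x-llm-instructions` extension
# over the plain description when present.
import json

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("openapi-gateway")
BASE_URL = "https://api.example.com"  # hypothetical upstream API


def make_tool(path: str, method: str):
    """Build a callable that proxies one OpenAPI operation."""

    def call(params: dict | None = None) -> str:
        params = params or {}
        url = BASE_URL + path.format(**params)  # fill in {path} parameters
        resp = httpx.request(method.upper(), url)
        resp.raise_for_status()
        return resp.text

    return call


def register_tools(spec: dict) -> None:
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            if method not in ("get", "post", "put", "patch", "delete"):
                continue  # skip path-level keys like "parameters"
            # Prefer the LLM-optimized instructions, fall back to the description.
            description = op.get("x-llm-instructions") or op.get("description", "")
            # Assumes each operation declares an operationId.
            mcp.tool(name=op["operationId"], description=description)(make_tool(path, method))


if __name__ == "__main__":
    with open("openapi.json") as f:
        register_tools(json.load(f))
    mcp.run()
```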

The takeaway is to go beyond the bare minimum and put in the effort to optimize the docs/prompts, ideally with some evals in place.

For the prompts and resources parts of the spec, I haven't found much use yet. It seems like the output of tool calls is enough to get content into the context.

4. owebmaster No.44378078
MCP is winning against OpenAPI because there is no big VC behind OpenAPI.