
59 points by tobihrbr | 3 comments

Hey HN! We're Wen and Tobias, and we're building Metorial (https://metorial.com), an integration platform that connects AI agents to external tools and data using MCP.

The Problem: While MCP works great locally (e.g., Cursor or Claude Desktop), server-side deployments are painful. Running MCP servers means managing Docker configs, per-user OAuth flows, scaling concurrent sessions, and building observability from scratch. This infrastructure work turns simple integrations into weeks of setup.

Metorial handles all of this automatically. We maintain an open catalog of ~600 MCP servers (GitHub, Slack, Google Drive, Salesforce, databases, etc.) that you can deploy in three clicks. You can also bring your own MCP server or fork existing ones.

For OAuth, just provide your client ID and secret and we handle the entire flow, including token refresh. Each user then gets an isolated MCP server instance configured with their own OAuth credentials automatically.
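The per-user token handling described above can be sketched in plain Python. This is a generic illustration of the refresh decision such a platform has to make for each user; the class and function names are hypothetical, not Metorial's actual API.

```python
import time

class OAuthSession:
    """Per-user OAuth state. Field names are illustrative, not Metorial's API."""
    def __init__(self, access_token, refresh_token, expires_at):
        self.access_token = access_token
        self.refresh_token = refresh_token
        self.expires_at = expires_at  # unix timestamp

    def needs_refresh(self, now=None, skew=60):
        # Refresh slightly early so in-flight requests never see an expired token.
        now = time.time() if now is None else now
        return now >= self.expires_at - skew

def get_access_token(session, refresh_fn, now=None):
    """Return a valid access token, refreshing via refresh_fn when needed."""
    if session.needs_refresh(now):
        session.access_token, session.expires_at = refresh_fn(session.refresh_token)
    return session.access_token
```

Keeping one such session object per user is what gives each user an isolated, correctly scoped credential without the application ever touching the refresh flow.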

What makes us different is that our serverless runtime hibernates idle MCP servers and resumes them with sub-second cold starts while preserving the state and connection. Our custom MCP engine is capable of managing thousands of concurrent connections, giving you a scalable service with per-user isolation. Other alternatives either run shared servers (security issues) or provision separate VMs per user (expensive and slow to scale).
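The hibernate/resume idea can be illustrated with a toy model: the runtime snapshots a session's logical state when it goes idle and rebuilds the session from that snapshot on the next request. This is a sketch of the general technique only, not Metorial's engine.

```python
import json
import time

class McpSession:
    """Toy model of hibernating an idle server session and resuming it later."""
    def __init__(self, session_id, state=None):
        self.session_id = session_id
        self.state = state if state is not None else {}
        self.last_active = time.time()

    def idle_for(self, timeout=30.0):
        return time.time() - self.last_active > timeout

    def hibernate(self):
        # Persist only the logical session state; transport connections
        # are re-established transparently on resume.
        return json.dumps({"session_id": self.session_id, "state": self.state})

    @classmethod
    def resume(cls, snapshot):
        data = json.loads(snapshot)
        return cls(data["session_id"], data["state"])
```

Because only logical state is persisted, resuming is a deserialize plus a reconnect, which is what makes sub-second cold starts plausible.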

Our Python and TypeScript SDKs let you connect LLMs to MCP tools in a single function call, abstracting away the protocol complexity. But if you want to dig deep, you can just use standard MCP and our REST API (https://metorial.com/api) to connect to our platform.
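What a "single function call" wrapper does under the hood can be sketched generically: ask the model, execute any tool it requests, and feed the result back until the model answers. The function name and response shape below are hypothetical, not the Metorial SDK's actual signature.

```python
def call_with_tools(llm, prompt, tools):
    """Run the LLM, executing any requested tools, and return the final text.

    `llm` is a callable returning either {"tool": name, "args": {...}} or
    {"text": answer}; `tools` maps tool names to Python callables.
    """
    response = llm(prompt)
    while response.get("tool"):
        # Execute the requested tool and hand the result back to the model.
        result = tools[response["tool"]](**response.get("args", {}))
        response = llm(f"{prompt}\nTool result: {result}")
    return response["text"]
```

The SDK's job is essentially to hide this loop, plus the MCP session and auth plumbing, behind one call.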

You can self-host (https://github.com/metorial/metorial) or use the managed version at https://metorial.com.

So far, we see enterprise teams use Metorial as a central integration hub for tools like Salesforce, while startups use it to cut weeks of infrastructure work when building AI agents with integrations.

Demo video: https://www.youtube.com/watch?v=07StSRNmJZ8

Our Repos: Metorial: https://github.com/metorial/metorial, MCP Containers: https://github.com/metorial/mcp-containers

SDKs: Node/TypeScript: https://github.com/metorial/metorial-node, Python: https://github.com/metorial/metorial-python

We'd love to hear feedback, especially if you've dealt with deploying MCP at scale!

1. fsto (No.45583676)
We’ve just begun implementing Composio. Would love to reconsider if you could help clarify the main differences. From my perspective it looks like you have more robustness features for me as a developer and you’re fully open source (not just the client), whereas Composio has more integrations. But I’d love your input to clarify. Congrats on the launch!
replies(1): >>45586754 #
2. wenyers (No.45586754)
Wen here, the co-founder. I spent a couple of hours today putting together a comprehensive answer for you.

1. As you said, Composio doesn’t allow self-hosting and the source code isn’t available. We want to follow PostHog’s playbook in letting devs run everything on their own infrastructure and open sourcing all our MCP containers.

2. A huge benefit of this approach is that we can let you fork any MCP server through our dashboard, so you can manage it yourself and make any adjustments you might need. We’ve repeatedly heard from our enterprise customers how important this is.

3. I do believe that we offer more robustness features, like environment provisioning, deployment versioning, server pooling, in-depth logs of server startup, as well as a complete trace of the entire MCP session.

4. On the integrations side, Composio does indeed have more integrations right now, but we already have around 600 MCP servers (all with multiple tools, of course), many of which we modify every day to make them better. Since we support open-source contributions, the catalog also grows with the community. (Quick note: you can have private servers scoped to your org.)

5. I tried to benchmark our architecture against Composio’s in terms of speed. As mentioned in the post above, one thing we spent a lot of time on was optimizing how fast we can run MCP servers serverlessly. However, since Composio has neither source available nor technical documentation on how they handle their servers, I couldn’t find any information on their architecture.

One thing they enforce by default is a meta-tool layer with tools like composio_search_tools and composio_execute_tool. Assuming that this is a long-living process, I still found that our implementation returned a list_tools response quicker (including the cold start time). If you factor in the time it takes them to find the right tools, their response took close to double the time.

While we might explore a similar meta-tool layer as an optional MCP server in the future, on average we do seem to have a faster architecture, though the benchmarking was not entirely rigorous. (I’m also unable to answer how they handle multiple users connecting to one MCP server with different OAuth configs because they don’t share that information.) I plan on making a more rigorous comparison in a blog post soon, also comparing to hosting on Vercel, Cloudflare, etc.
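An informal latency comparison like the one above boils down to timing the same request against each backend. A minimal harness for that might look as follows; the functions you would time (e.g. a direct list_tools call vs. a search-then-execute meta-tool round trip) are assumptions, not anything either vendor ships.

```python
import statistics
import time

def median_latency_ms(fn, runs=5):
    """Median wall-clock latency of fn() in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()  # e.g. a single list_tools request against one backend
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.median(samples)
```

Using the median over several runs dampens one-off network spikes; a rigorous comparison would also separate cold-start from warm latency.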

Let me know if you have any follow up questions.

If you want to talk more, please feel free to DM me on LinkedIn (https://www.linkedin.com/in/karim-rahme/) or X (https://x.com/wen_rahme).

replies(1): >>45588601 #
3. fsto (No.45588601)
Wow, thanks a LOT for that comprehensive answer! Very helpful!

A few questions I didn’t manage to find answers to:

* Do you have, or plan to support, webhooks? The scenario for us is that we’d ideally have one platform for setting up customer integrations, which we both make requests to and receive requests from.
* When you host, do you expose the access and refresh tokens for the connected integrations? The use cases for us are:
  * If we want to build a feature/request that seems out of scope for Metorial
  * If we want to migrate from Metorial, we don’t want to force our customers to reconnect
* I love that we can bring our own OAuth apps, which would be the default for us. But to try out an integration, or for (from our perspective) low-priority integrations we’d still like to offer: do you offer your own OAuth apps that we can piggyback on? Just to save the customer the effort of having to set up an OAuth app for each service. I know it comes with lock-in, but it’s worth it in some cases for us.

You’ve made me very excited to try you out, so I’ll implement support for both Composio and Metorial.

Thanks again for taking the time and efforts to answer so thoroughly!

Sent a connection request to you on LinkedIn.