
43 points | apichar | 5 comments

Large Language Models (LLMs) are powerful, but they’re limited by fixed context windows and outdated knowledge. What if your AI could access live search, structured data extraction, OCR, and more—all through a standardized interface?

We built the JigsawStack MCP Server, an open-source implementation of the Model Context Protocol (MCP) that lets any AI model call external tools effortlessly.

Here’s what it unlocks:

- Web Search & Scraping: Fetch live information and extract structured data from web pages.

- OCR & Structured Data Extraction: Process images, receipts, invoices, and handwritten text with high accuracy.

- AI Translation: Translate text and documents while maintaining context.

- Image Generation: Generate images from text prompts in real time.

Instead of stuffing prompts with static data or building custom integrations, AI models can now query MCP servers on demand—extending memory, reducing token costs, and improving efficiency.
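Under the hood, MCP tool calls are JSON-RPC 2.0 messages. A minimal sketch of what a `tools/call` request looks like on the wire (the tool name `web_search` and its arguments are illustrative assumptions, not necessarily the names this server exposes):

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as used by MCP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool name -- the actual names are defined by the server.
msg = make_tool_call(1, "web_search", {"query": "latest MCP spec changes"})
print(msg)
```

The model only needs to emit a request like this; the MCP server does the actual fetching and returns the result as a response message.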

Read the full breakdown here: https://jigsawstack.com/blog/jigsawstack-mcp-servers

If you’re working on AI-powered applications, try it out and let us know how it works for you.

1. dlevine No.43369073
I have been playing around with MCP, and one of its current shortcomings is that it doesn't support OAuth. This means that credentials need to be hardcoded somewhere. Right now, a lot of MCP servers are run locally, but there is no reason they couldn't be run as a service in the future.

There is a draft specification for OAuth in MCP, and hopefully this is supported soon.
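Until that lands, the usual workaround for "hardcoded somewhere" is to keep credentials out of the code entirely and read them from the environment at server startup. A minimal sketch (the variable name `JIGSAWSTACK_API_KEY` is illustrative):

```python
import os

def load_api_key(env_var: str = "JIGSAWSTACK_API_KEY") -> str:
    """Read a credential from the environment instead of hardcoding it.

    Raises if the variable is missing, so a misconfigured server
    fails fast at startup rather than at the first tool call.
    """
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; refusing to start")
    return key

os.environ["JIGSAWSTACK_API_KEY"] = "sk-demo"  # stand-in for a real key
print(load_api_key())
```

This doesn't solve delegated user auth the way OAuth would, but it at least keeps secrets out of source control.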

2. knowaveragejoe No.43369261
There are remotely run MCP server options out there, such as mcp.run and glama.ai
3. socrateslee No.43369600
For the OAuth part, an access_token is all an MCP server needs. Users could complete an OAuth authorization flow (in the settings, or driven by the chatbot) and let the MCP server handle storing the access_token.

For remote MCP servers, storing access tokens is already common practice. For locally hosted MCP servers, how to manage a bunch of secret keys is the harder problem.
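For the local case, one simple pattern is a token file with owner-only permissions, so tokens stay off the command line and out of config files. A sketch under assumed names (file location and token structure are illustrative):

```python
import json
import os
import stat
import tempfile

def save_tokens(path: str, tokens: dict) -> None:
    """Persist access tokens with owner-only read/write permissions."""
    with open(path, "w") as f:
        json.dump(tokens, f)
    os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)  # 0o600: owner only

def load_tokens(path: str) -> dict:
    """Read the stored tokens back for use in outgoing API calls."""
    with open(path) as f:
        return json.load(f)

token_path = os.path.join(tempfile.gettempdir(), "mcp_tokens.json")
save_tokens(token_path, {"github": "gho_example"})  # illustrative token
print(load_tokens(token_path)["github"])
```

An OS keychain would be stronger still, but a 0600 file is a common baseline for locally run servers.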

4. Nedomas No.43370482
There's an open-source package that defers providing credentials to an MCP server until runtime, via an MCP tool call: https://github.com/supercorp-ai/superargs

For hosted MCPs: https://supermachine.ai

5. rguldener No.43370490
You could use Nango for the OAuth flow and then pass the user’s token to the MCP server: https://nango.dev/auth

It's free for OAuth with 400+ APIs and can be self-hosted.

(I am one of the founders)
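The pattern described above — an external broker handles the OAuth dance and hands you a token to forward to the MCP server — can be sketched generically. All names here (the broker response shape, the header) are hypothetical and not Nango's actual API:

```python
import json

# Hypothetical broker response shape -- a real broker's API will differ;
# this only sketches "broker issues token, server receives it per request".
def extract_access_token(broker_response: str) -> str:
    """Pull the user's access token out of a broker's JSON response."""
    return json.loads(broker_response)["access_token"]

def build_mcp_headers(token: str) -> dict:
    """Attach the user's token when calling a remote MCP server."""
    return {"Authorization": f"Bearer {token}"}

resp = json.dumps({"access_token": "ya29.example", "provider": "google"})
print(build_mcp_headers(extract_access_token(resp))["Authorization"])
```

The key property is that the MCP server itself never runs the OAuth flow; it just receives a short-lived token scoped to the current user.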