MCP in LM Studio (lmstudio.ai)
225 points by yags | 3 comments
1. jtreminio No.44383921
I’ve been wanting to try LM Studio but I can’t figure out how to use it over my local network. My desktop in the living room has the beefy GPU, but I want to use LM Studio from my laptop in bed.

Any suggestions?

replies(2): >>44383943, >>44384035
2. skygazer No.44383943
Use an OpenAI-compatible API client on your laptop and LM Studio on your server, and point the client at the server. LM Studio can serve an LLM on a port of your choosing via the OpenAI-style chat completions API. You can also install Open WebUI on the server, connect to it from a web browser, and configure it to use the LM Studio connection as its LLM backend.
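
For example, here's a minimal sketch using the openai Python SDK. The LAN IP (192.168.1.50), the port (1234, which I believe is LM Studio's default), and the model identifier are placeholders; substitute your own:

  from openai import OpenAI

  # Point the client at the LM Studio server on the desktop instead of api.openai.com.
  client = OpenAI(
      base_url="http://192.168.1.50:1234/v1",  # placeholder LAN IP; 1234 is LM Studio's default port
      api_key="lm-studio",  # LM Studio doesn't validate the key; any non-empty string works
  )

  resp = client.chat.completions.create(
      model="local-model",  # use the identifier LM Studio shows for your loaded model
      messages=[{"role": "user", "content": "Hello from the laptop!"}],
  )
  print(resp.choices[0].message.content)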
3. numpad0 No.44384035

  [>_] -> [.* Settings] -> Serve on local network ( o)
Any OpenAI-compatible client app should work: use the host machine's IP address as the API server address. The API key can be bogus or blank.
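
To illustrate the bogus-key point, a quick sketch hitting the chat completions endpoint directly with Python's requests (IP, port, and model name are placeholders, as above):

  import requests

  # Placeholder LAN address; 1234 is LM Studio's default server port.
  url = "http://192.168.1.50:1234/v1/chat/completions"

  payload = {
      "model": "local-model",  # whatever identifier LM Studio lists for the loaded model
      "messages": [{"role": "user", "content": "ping"}],
  }
  # The Authorization header can carry any placeholder value, or be left out entirely.
  headers = {"Authorization": "Bearer anything"}

  r = requests.post(url, json=payload, headers=headers, timeout=120)
  print(r.json()["choices"][0]["message"]["content"])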