A PowerShell proxy that exposes LM Studio models through the OpenAI API format, letting editors like Cursor or CodeGPT use local LLMs without paid keys.


LM Studio PowerShell Proxy

Route any OpenAI-compatible client through a local PowerShell proxy to your LM Studio server. This README covers setup, hosts-file changes, and execution; the proxy script itself is not included, though section 2 sketches what it needs to do.


1) Prerequisites

  • Windows 10/11
  • LM Studio running its OpenAI-compatible server (default: http://127.0.0.1:1234/v1)

# Verify LM Studio is up:
Invoke-RestMethod http://127.0.0.1:1234/v1/models

2) Place the proxy script

Put your lmproxy.ps1 (the proxy script) somewhere convenient, e.g.:

C:\Users\<YOUR_USER>\Desktop\lmproxy.ps1

The script should forward POST /v1/chat/completions to http://127.0.0.1:1234/v1/chat/completions and return a fake GET /v1/models list that includes your model ID (e.g. qwen/qwen3-coder-30b).
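
Since the script itself isn't included here, below is a minimal sketch of what it can look like, built on .NET's System.Net.HttpListener. It restates exactly the behavior described above; the listener pattern, variable names, and error handling are illustrative assumptions, and your actual lmproxy.ps1 may differ:

# lmproxy.ps1 (sketch): fake GET /v1/models, pass POST /v1/chat/completions through.
$upstream = "http://127.0.0.1:1234"
$listener = [System.Net.HttpListener]::new()
$listener.Prefixes.Add("http://+:8080/")          # needs the URL ACL from section 3
$listener.Start()
Write-Host "Proxy listening on http://+:8080  ->  $upstream"

while ($listener.IsListening) {
    $ctx  = $listener.GetContext()                # blocks until a request arrives
    $req  = $ctx.Request
    $resp = $ctx.Response
    try {
        if ($req.HttpMethod -eq "GET" -and $req.Url.AbsolutePath -eq "/v1/models") {
            # Fixed model list so clients can "discover" the local model.
            $json = @{
                object = "list"
                data   = @(@{ id = "qwen/qwen3-coder-30b"; object = "model"; owned_by = "lmstudio" })
            } | ConvertTo-Json -Depth 5
        }
        elseif ($req.HttpMethod -eq "POST" -and $req.Url.AbsolutePath -eq "/v1/chat/completions") {
            # Replay the client's body against LM Studio and relay the answer.
            $body = [IO.StreamReader]::new($req.InputStream).ReadToEnd()
            $json = (Invoke-WebRequest -Uri "$upstream/v1/chat/completions" -Method Post `
                -ContentType "application/json" -Body $body -UseBasicParsing).Content
        }
        else {
            $resp.StatusCode = 404
            $json = '{"error":"unsupported endpoint"}'
        }
    }
    catch {
        # Upstream failure (e.g. LM Studio not running) -> gateway error.
        $resp.StatusCode = 502
        $json = '{"error":"upstream request failed"}'
    }
    $bytes = [System.Text.Encoding]::UTF8.GetBytes($json)
    $resp.ContentType = "application/json"
    $resp.OutputStream.Write($bytes, 0, $bytes.Length)
    $resp.OutputStream.Close()
}

One design note: this sketch buffers each upstream reply before returning it, so streamed responses (stream = true) arrive in one piece rather than as server-sent events.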


3) One-time URL ACL for port 8080

Run PowerShell as Administrator and grant the URL reservation:

netsh http add urlacl url=http://+:8080/ user=Everyone
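
To confirm the reservation took, or to remove it when you're done with the proxy:

# Should show http://+:8080/ reserved for Everyone:
netsh http show urlacl url=http://+:8080/

# Remove the reservation later if you no longer need it:
netsh http delete urlacl url=http://+:8080/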

4) Hosts file change (point OpenAI → your proxy)

From Administrator PowerShell:

Add-Content -Path "$env:SystemRoot\System32\drivers\etc\hosts" -Value "`n127.0.0.1 api.openai.com"
ipconfig /flushdns

Check:

Select-String -Path "$env:SystemRoot\System32\drivers\etc\hosts" -Pattern "api.openai.com"

To undo later:

(Get-Content "$env:SystemRoot\System32\drivers\etc\hosts") |
  Where-Object {$_ -notmatch '^127\.0\.0\.1\s+api\.openai\.com$'} |
  Set-Content "$env:SystemRoot\System32\drivers\etc\hosts"
ipconfig /flushdns

5) Run the proxy

Start as Administrator so it can bind to port 8080:

powershell -ExecutionPolicy Bypass -File "C:\Users\<YOUR_USER>\Desktop\lmproxy.ps1"

You should see something like:

Proxy listening on http://+:8080  ->  http://127.0.0.1:1234

6) Quick tests (through the proxy)

Models (fake list from the proxy):

Invoke-RestMethod -Uri "http://api.openai.com:8080/v1/models" -Headers @{Authorization="Bearer sk-local-test"}

Chat completion passthrough:

$body = @{
  model    = "qwen/qwen3-coder-30b"   # must match what the proxy returns at /v1/models
  messages = @(@{ role = "user"; content = "Say hello from LM Studio." })
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Uri "http://api.openai.com:8080/v1/chat/completions" `
  -Method Post `
  -Headers @{Authorization="Bearer sk-local-test"; "Content-Type"="application/json"} `
  -Body $body
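
If you capture the result in a variable (e.g. $resp = Invoke-RestMethod ... as above), the reply follows the standard OpenAI chat-completion shape, so the model's text is one property access away:

# Assistant reply text from the standard OpenAI response shape:
$resp.choices[0].message.content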

7) Point your editor/tool at the proxy

Use these settings in any OpenAI-compatible client:

  • Base URL: http://api.openai.com:8080/v1
  • API Key: any placeholder (e.g. sk-local-test)
  • Model: the exact ID your proxy returns at /v1/models (e.g. qwen/qwen3-coder-30b)
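
Tools built on the official OpenAI SDKs also honor the standard environment variables, which can be quicker than digging through a settings UI. A per-session example from PowerShell (the variable names are the SDKs' conventions, not something this proxy requires):

# Point OpenAI SDK-based tools at the proxy for this session only:
$env:OPENAI_BASE_URL = "http://api.openai.com:8080/v1"
$env:OPENAI_API_KEY  = "sk-local-test"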

Note: On Cursor's free tier, only Ask/Chat will work; Agent/Edit are paywalled by Cursor.


8) Troubleshooting

  • Access denied on start: make sure you ran the netsh http add urlacl ... command and are starting PowerShell as Administrator.
  • No response / 404: your proxy likely only handles /v1/models and /v1/chat/completions. Check the endpoint path.
  • Model not found: ensure the model ID in your client exactly matches what your proxy returns at /v1/models.
  • LM Studio not responding: confirm http://127.0.0.1:1234/v1/models works and that your model is loaded.
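
If it's unclear which hop is failing, two quick checks narrow it down (both assume the hosts entry and port from the steps above):

# Should resolve to 127.0.0.1 if the hosts redirect is active:
[System.Net.Dns]::GetHostAddresses("api.openai.com")

# Should show a listener on 8080 while the proxy is running:
Get-NetTCPConnection -LocalPort 8080 -State Listen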
