Route any OpenAI-compatible client through a local PowerShell proxy to your LM Studio server. This README covers setup, hosts changes, and execution (no source code here).
- Windows 10/11
- LM Studio running its OpenAI-compatible server (default: `http://127.0.0.1:1234/v1`)
```powershell
# Verify LM Studio is up:
Invoke-RestMethod http://127.0.0.1:1234/v1/models
```
Put your `lmproxy.ps1` (the proxy script) somewhere convenient, e.g. `C:\Users\<YOUR_USER>\Desktop\lmproxy.ps1`. The script should forward `POST /v1/chat/completions` to `http://127.0.0.1:1234/v1/chat/completions` and return a fake `GET /v1/models` list that includes your model ID (e.g. `qwen/qwen3-coder-30b`).
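The fake model list only needs to follow the standard OpenAI `GET /v1/models` response shape. A minimal sketch of what the proxy could return (the `owned_by` and `created` values are arbitrary placeholders):

```json
{
  "object": "list",
  "data": [
    {
      "id": "qwen/qwen3-coder-30b",
      "object": "model",
      "created": 0,
      "owned_by": "lmproxy"
    }
  ]
}
```

Clients typically only check that their configured model ID appears in `data[].id`.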
Run PowerShell as Administrator and grant the URL reservation:

```powershell
netsh http add urlacl url=http://+:8080/ user=Everyone
```

(To remove the reservation later: `netsh http delete urlacl url=http://+:8080/`.)
From Administrator PowerShell, add a hosts entry so `api.openai.com` resolves to your machine:

```powershell
Add-Content -Path "$env:SystemRoot\System32\drivers\etc\hosts" -Value "`n127.0.0.1 api.openai.com"
ipconfig /flushdns
```
Check:

```powershell
Select-String -Path "$env:SystemRoot\System32\drivers\etc\hosts" -Pattern "api.openai.com"
```
To undo later:

```powershell
(Get-Content "$env:SystemRoot\System32\drivers\etc\hosts") |
Where-Object {$_ -notmatch '^127\.0\.0\.1\s+api\.openai\.com$'} |
Set-Content "$env:SystemRoot\System32\drivers\etc\hosts"
ipconfig /flushdns
```
Start the proxy as Administrator so it can bind to port 8080:

```powershell
powershell -ExecutionPolicy Bypass -File "C:\Users\<YOUR_USER>\Desktop\lmproxy.ps1"
```

You should see something like:

```
Proxy listening on http://+:8080 -> http://127.0.0.1:1234
```
Models (fake list from the proxy):

```powershell
Invoke-RestMethod -Uri "http://api.openai.com:8080/v1/models" -Headers @{Authorization="Bearer sk-local-test"}
```
Chat completion passthrough:

```powershell
$body = @{
    model    = "qwen/qwen3-coder-30b"  # must match what the proxy returns at /v1/models
    messages = @(@{ role = "user"; content = "Say hello from LM Studio." })
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Uri "http://api.openai.com:8080/v1/chat/completions" `
    -Method Post `
    -Headers @{Authorization="Bearer sk-local-test"; "Content-Type"="application/json"} `
    -Body $body
```
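If you call the passthrough from a script, the reply text lives at `choices[0].message.content` in the standard chat-completion response. A minimal Python sketch parsing an abridged sample response (the sample text itself is made up; real responses also carry `id`, `created`, `usage`, etc.):

```python
import json

# Abridged sample of the JSON a chat-completion call returns:
sample = """
{
  "object": "chat.completion",
  "model": "qwen/qwen3-coder-30b",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Hello from LM Studio!"},
      "finish_reason": "stop"
    }
  ]
}
"""

response = json.loads(sample)
reply = response["choices"][0]["message"]["content"]
print(reply)  # prints: Hello from LM Studio!
```

In PowerShell the same field is reachable directly on the `Invoke-RestMethod` result.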
Use these settings in any OpenAI-compatible client:
- Base URL: `http://api.openai.com:8080/v1`
- API Key: any placeholder (e.g. `sk-local-test`)
- Model: the exact ID your proxy returns at `/v1/models` (e.g. `qwen/qwen3-coder-30b`)
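Any OpenAI-compatible SDK pointed at that base URL should work unchanged. As a library-free sanity check, here is a Python sketch of the same request using only the standard library (model ID and placeholder key are the examples used above; run it only after the hosts entry and proxy are in place):

```python
import json
import urllib.request

BASE_URL = "http://api.openai.com:8080/v1"  # resolves to the local proxy via the hosts entry
API_KEY = "sk-local-test"                   # any placeholder; the proxy ignores it

payload = {
    "model": "qwen/qwen3-coder-30b",  # must match the proxy's /v1/models list
    "messages": [{"role": "user", "content": "Say hello from LM Studio."}],
}

req = urllib.request.Request(
    BASE_URL + "/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    method="POST",
)

# Uncomment once the proxy is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```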
Note: In Cursor free tiers, only Ask/Chat will work. Agent/Edit are paywalled by Cursor.
- Access denied on start: make sure you ran the `netsh http add urlacl ...` command and are starting PowerShell as Administrator.
- No response / 404: your proxy likely only handles `/v1/models` and `/v1/chat/completions`. Check the endpoint path.
- Model not found: ensure the model ID in your client exactly matches what your proxy returns at `/v1/models`.
- LM Studio not responding: confirm `http://127.0.0.1:1234/v1/models` works and that your model is loaded.