Commit 7e6043c: feat: add ci for modules
1 parent 3834975

12 files changed: +119 additions, -46 deletions


.github/workflows/upload-pypi-dev.yml

Lines changed: 17 additions & 8 deletions
```diff
@@ -74,14 +74,23 @@ jobs:
         run: |
           echo "VERSION=$(poetry version --short)" >> $GITHUB_ENV

-      - name: Build and tag Docker image
+      # Build and tag Docker images with both :latest and :[NEW_VERSION]
+      - name: Proxy Build and tag Docker images
+        working-directory: ./deploy
         run: |
-          docker build \
-            --build-arg LLMSTUDIO_VERSION=${{ env.VERSION }} \
-            -t tensoropsai/llmstudio:${{ env.VERSION }} \
-            .
+          make version=${{ env.VERSION }} build-proxy

-      - name: Push Docker image to Docker Hub
+      # Build and tag Docker images with both :latest and :[NEW_VERSION]
+      - name: Tracker Build and tag Docker images
+        working-directory: ./deploy
         run: |
-          docker push tensoropsai/llmstudio:${{ env.VERSION }}
-
+          make version=${{ env.VERSION }} build-tracker
+
+      # Push both Docker images to Docker Hub
+      - name: Push Proxy Docker images to Docker Hub
+        run: |
+          docker push tensoropsai/llmstudio-proxy:${{ env.VERSION }}
+      # Push both Docker images to Docker Hub
+      - name: Push Tracker Docker images to Docker Hub
+        run: |
+          docker push tensoropsai/llmstudio-tracker:${{ env.VERSION }}
```

.github/workflows/upload-pypi.yml

Lines changed: 17 additions & 10 deletions
```diff
@@ -72,17 +72,24 @@ jobs:
         password: ${{ secrets.DOCKER_PASSWORD }}

       # Build and tag Docker images with both :latest and :[NEW_VERSION]
-      - name: Build and tag Docker images
+      - name: Proxy Build and tag Docker images
+        working-directory: ./deploy
         run: |
-          docker build \
-            --build-arg LLMSTUDIO_VERSION=${{ env.VERSION }} \
-            -t tensoropsai/llmstudio:latest \
-            -t tensoropsai/llmstudio:${{ env.VERSION }} \
-            .
+          make version=${{ env.VERSION }} build-proxy
+
+      # Build and tag Docker images with both :latest and :[NEW_VERSION]
+      - name: Tracker Build and tag Docker images
+        working-directory: ./deploy
+        run: |
+          make version=${{ env.VERSION }} build-tracker
+
+      # Push both Docker images to Docker Hub
+      - name: Push Proxy Docker images to Docker Hub
+        run: |
+          docker push tensoropsai/llmstudio-proxy:${{ env.VERSION }}
+          docker push tensoropsai/llmstudio-proxy:latest
       # Push both Docker images to Docker Hub
-      - name: Push Docker images to Docker Hub
+      - name: Push Tracker Docker images to Docker Hub
         run: |
-          docker push tensoropsai/llmstudio:${{ env.VERSION }}
-          docker push tensoropsai/llmstudio:latest
-
+          docker push tensoropsai/llmstudio-tracker:${{ env.VERSION }}
+          docker push tensoropsai/llmstudio-tracker:latest
```

README.md

Lines changed: 9 additions & 6 deletions
````diff
@@ -26,30 +26,33 @@ Don't forget to check out [https://docs.llmstudio.ai](docs) page.

 Install the latest version of **LLMstudio** using `pip`. We suggest that you create and activate a new environment using `conda`

+For full version:
 ```bash
-pip install llmstudio
+pip install 'llmstudio[proxy,tracker]'
 ```

-Install `bun` if you want to use the UI
-
+For lightweight (core) version:
 ```bash
-curl -fsSL https://bun.sh/install | bash
+pip install llmstudio
 ```

 Create a `.env` file at the same path you'll run **LLMstudio**

 ```bash
 OPENAI_API_KEY="sk-api_key"
 ANTHROPIC_API_KEY="sk-api_key"
+VERTEXAI_KEY="sk-api-key"
 ```

 Now you should be able to run **LLMstudio** using the following command.

 ```bash
-llmstudio server --ui
+llmstudio server --proxy --tracker
 ```

-When the `--ui` flag is set, you'll be able to access the UI at [http://localhost:3000](http://localhost:3000)
+When the `--proxy` flag is set, you'll be able to access the [Swagger at http://0.0.0.0:50001/docs (default port)](http://0.0.0.0:50001/docs)
+
+When the `--tracker` flag is set, you'll be able to access the [Swagger at http://0.0.0.0:50002/docs (default port)](http://0.0.0.0:50002/docs)

 ## 📖 Documentation
````

deploy/Makefile

Lines changed: 18 additions & 0 deletions
```makefile
build-proxy:
	docker build --build-arg LLMSTUDIO_VERSION=$(version) \
		-t tensoropsai/llmstudio-proxy:latest \
		-t tensoropsai/llmstudio-proxy:$(version) \
		-f proxy.Dockerfile \
		.

build-tracker:
	docker build --build-arg LLMSTUDIO_VERSION=$(version) \
		-t tensoropsai/llmstudio-tracker:latest \
		-t tensoropsai/llmstudio-tracker:$(version) \
		-f tracker.Dockerfile \
		.

build: build-tracker build-proxy

run:
	docker compose -f docker-compose.yml up
```
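The new `build` target fans out to `build-tracker` and then `build-proxy`, forwarding the `version=` variable to both. A self-contained sketch (no Docker required) replicates the same target graph with a throwaway Makefile whose recipes merely echo stand-ins for the real `docker build` commands; `.RECIPEPREFIX` is used only so the snippet does not depend on literal tab characters:

```shell
# Throwaway Makefile mirroring the build / build-proxy / build-tracker
# target graph; recipes echo a stand-in instead of invoking Docker.
cat > /tmp/demo.mk <<'EOF'
.RECIPEPREFIX = >
build-proxy:
>@echo docker build -t tensoropsai/llmstudio-proxy:$(version)
build-tracker:
>@echo docker build -t tensoropsai/llmstudio-tracker:$(version)
build: build-tracker build-proxy
EOF

# `build` runs the tracker recipe first, then the proxy recipe,
# with version= substituted into both image tags
make -f /tmp/demo.mk version=1.0.0 build
```

This prints the tracker line before the proxy line, matching the prerequisite order `build: build-tracker build-proxy`.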

deploy/docker-compose.yml

Lines changed: 17 additions & 0 deletions
```yaml
version: "3.8"

services:
  llmstudio-proxy:
    image: tensoropsai/llmstudio-proxy
    restart: always
    env_file:
      - .env
    ports:
      - 8001:50001
  llmstudio-tracking:
    image: tensoropsai/llmstudio-tracker
    restart: always
    env_file:
      - .env
    ports:
      - 8002:50002
```

deploy/proxy.Dockerfile

Lines changed: 11 additions & 0 deletions
```dockerfile
FROM python:3.11-slim
ENV PYTHONUNBUFFERED=1

# Install tools
RUN apt-get clean && apt-get update

# Install llmstudio
ARG LLMSTUDIO_VERSION
RUN pip install 'llmstudio[proxy]'==${LLMSTUDIO_VERSION}

CMD ["llmstudio", "server", "--proxy"]
```

deploy/tracker.Dockerfile

Lines changed: 12 additions & 0 deletions
```dockerfile
FROM python:3.11-slim
ENV PYTHONUNBUFFERED=1

# Install tools
RUN apt-get clean && apt-get update

# Install llmstudio
ARG LLMSTUDIO_VERSION
RUN pip install 'llmstudio[tracker]'==${LLMSTUDIO_VERSION}
RUN pip install psycopg2-binary

CMD ["llmstudio", "server", "--tracker"]
```

libs/llmstudio/README.md

Lines changed: 9 additions & 6 deletions
````diff
@@ -26,30 +26,33 @@ Don't forget to check out [https://docs.llmstudio.ai](docs) page.

 Install the latest version of **LLMstudio** using `pip`. We suggest that you create and activate a new environment using `conda`

+For full version:
 ```bash
-pip install llmstudio
+pip install 'llmstudio[proxy,tracker]'
 ```

-Install `bun` if you want to use the UI
-
+For lightweight (core) version:
 ```bash
-curl -fsSL https://bun.sh/install | bash
+pip install llmstudio
 ```

 Create a `.env` file at the same path you'll run **LLMstudio**

 ```bash
 OPENAI_API_KEY="sk-api_key"
 ANTHROPIC_API_KEY="sk-api_key"
+VERTEXAI_KEY="sk-api-key"
 ```

 Now you should be able to run **LLMstudio** using the following command.

 ```bash
-llmstudio server --ui
+llmstudio server --proxy --tracker
 ```

-When the `--ui` flag is set, you'll be able to access the UI at [http://localhost:3000](http://localhost:3000)
+When the `--proxy` flag is set, you'll be able to access the [Swagger at http://0.0.0.0:50001/docs (default port)](http://0.0.0.0:50001/docs)
+
+When the `--tracker` flag is set, you'll be able to access the [Swagger at http://0.0.0.0:50002/docs (default port)](http://0.0.0.0:50002/docs)

 ## 📖 Documentation
````

libs/proxy/README.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -43,13 +43,13 @@ OPENAI_API_KEY="sk-api_key"
 ANTHROPIC_API_KEY="sk-api_key"
 ```

-Now you should be able to run **LLMstudio** using the following command.
+Now you should be able to run **LLMstudio Proxy** using the following command.

 ```bash
-llmstudio server --ui
+llmstudio server --proxy
 ```

-When the `--ui` flag is set, you'll be able to access the UI at [http://localhost:3000](http://localhost:3000)
+When the `--proxy` flag is set, you'll be able to access the [Swagger at http://0.0.0.0:50001/docs (default port)](http://0.0.0.0:50001/docs)

 ## 📖 Documentation
````

libs/proxy/llmstudio_proxy/config.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -21,7 +21,7 @@ def assign_port(default_port=None):


 defaults = {
-    "LLMSTUDIO_ENGINE_HOST": "localhost",
+    "LLMSTUDIO_ENGINE_HOST": "0.0.0.0",
     "LLMSTUDIO_ENGINE_PORT": str(assign_port(50001)),
 }
```

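The diff shows only the `defaults` dict, not the `assign_port` helper it calls. A hypothetical sketch of what such a helper might do (the real llmstudio implementation may differ): prefer the requested default port when nothing is listening on it, otherwise let the OS pick a free ephemeral port. The host change from `localhost` to `0.0.0.0` makes the server reachable from outside a container, which matters for the new Docker images.

```python
import socket


def assign_port(default_port=None):
    """Hypothetical sketch: return default_port if it looks free,
    otherwise ask the OS for an ephemeral port."""
    if default_port is not None:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(1)
            # connect_ex returns 0 only if something accepted the
            # connection, i.e. the port is already in use
            if s.connect_ex(("localhost", default_port)) != 0:
                return default_port
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("", 0))  # port 0 asks the OS to choose a free port
        return s.getsockname()[1]


# Mirrors the new defaults: bind on all interfaces, engine port 50001
defaults = {
    "LLMSTUDIO_ENGINE_HOST": "0.0.0.0",
    "LLMSTUDIO_ENGINE_PORT": str(assign_port(50001)),
}
```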