From 3128f50bed47f32ca449f901557b09277fca562b Mon Sep 17 00:00:00 2001
From: diogoazevedo15 <68329635+diogoazevedo15@users.noreply.github.com>
Date: Thu, 17 Oct 2024 14:39:07 +0100
Subject: [PATCH 1/2] Update deploy-on-google-kubernetes-engine.mdx (#160)

1. Change path of images in the guide from how-to/deploy-on-gke/images/ to
   images, since images are stored here
   https://mintlify.s3-us-west-1.amazonaws.com/tensorops/how-to/deploy-on-gke/images/
---
 .../deploy-on-google-kubernetes-engine.mdx    | 18 +++++++++---------
 1 file changed, 9 insertions(+), 9 deletions(-)

diff --git a/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx b/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
index cad6cb04..6a834b08 100644
--- a/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
+++ b/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
@@ -18,20 +18,20 @@ This example demonstrates a public deployment. For a private service accessible
 Go to **Workloads** and **Create a new Deployment**.
-
+
 Rename your project. We will call the one in this guide **llmstudio-on-gcp**.
-
+
 Choose between **creating a new cluster** or **using an existing cluster**. For this guide, we will create a new cluster and use the default region.
-
+
@@ -47,7 +47,7 @@ This example demonstrates a public deployment. For a private service accessible
 ```
 Set it as the **Image path** to your container.
-
+
@@ -63,7 +63,7 @@ Additionally, set the `GOOGLE_API_KEY` environment variable to enable calls to G
 Refer to **SDK/LLM/Providers** for instructions on setting up other providers.
-
+
@@ -74,13 +74,13 @@ Additionally, set the `GOOGLE_API_KEY` environment variable to enable calls to G
 Select **Expose deployment as a new service** and leave the first item as is.
-
+
 Add two other items, and expose the ports defined in the **Set Environment Variables** step.
-
+
@@ -108,7 +108,7 @@ Now let's make a call to our LLMstudio instance on GCP!
 Go to your newly deployed **Workload**, scroll to the **Exposing services** section, and take note of the Host of your endpoint.
-
+
 Create your `.env` file with the following:
@@ -141,7 +141,7 @@ Now let's make a call to our LLMstudio instance on GCP!
 ```
-
+

From 83b0c02707cad5270d0c79461603c74d84e1bb47 Mon Sep 17 00:00:00 2001
From: diogoazevedo15 <68329635+diogoazevedo15@users.noreply.github.com>
Date: Thu, 17 Oct 2024 14:55:35 +0100
Subject: [PATCH 2/2] Update deploy-on-google-kubernetes-engine.mdx (#161)

---
 .../deploy-on-gke/deploy-on-google-kubernetes-engine.mdx | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx b/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
index 6a834b08..0bdbf789 100644
--- a/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
+++ b/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
@@ -1,3 +1,7 @@
+---
+title: "Deploy on Google Kubernetes Engine"
+---
+
 Learn how to deploy LLMstudio as a containerized application on Google Kubernetes Engine and make calls from a local repository.
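
The guide these patches touch ends by having you create a `.env` file with the exposed Host and then call the LLMstudio instance from a local repository. As a minimal sketch of that local step (not part of the patch: the key names `LLMSTUDIO_HOST` and `LLMSTUDIO_ENGINE_PORT` are illustrative assumptions, not taken from the guide), the `.env` file can be read without extra dependencies:

```python
def parse_env_lines(lines):
    """Parse KEY=VALUE lines; blank lines and '#' comments are skipped."""
    env = {}
    for raw in lines:
        line = raw.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")  # split on the first '=' only
        env[key.strip()] = value.strip().strip('"')
    return env


def load_env(path=".env"):
    """Load the .env file the guide tells you to create."""
    with open(path) as f:
        return parse_env_lines(f)


# Hypothetical .env contents; the real Host comes from the
# "Exposing services" section of your deployed Workload.
example = [
    "# values noted from the GKE console",
    'LLMSTUDIO_HOST="34.123.45.67"',
    "LLMSTUDIO_ENGINE_PORT=50001",
]
print(parse_env_lines(example))
```

A library such as `python-dotenv` would do the same job; the hand-rolled reader just keeps the local-call step dependency-free.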