diff --git a/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx b/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
index cad6cb04..0bdbf789 100644
--- a/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
+++ b/docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
@@ -1,3 +1,7 @@
+---
+title: "Deploy on Google Kubernetes Engine"
+---
+
Learn how to deploy LLMstudio as a containerized application on Google Kubernetes Engine and make calls from a local repository.
@@ -18,20 +22,20 @@ This example demonstrates a public deployment. For a private service accessible
Go to **Workloads** and **Create a new Deployment**.
-
+
Rename your project. In this guide, we will call it **llmstudio-on-gcp**.
-
+
Choose between **creating a new cluster** or **using an existing cluster**.
For this guide, we will create a new cluster and use the default region.
-
+
@@ -47,7 +51,7 @@ This example demonstrates a public deployment. For a private service accessible
```
Set it as the **Image path** for your container.
-
+
@@ -63,7 +67,7 @@ Additionally, set the `GOOGLE_API_KEY` environment variable to enable calls to G
Refer to **SDK/LLM/Providers** for instructions on setting up other providers.
-
+
@@ -74,13 +78,13 @@ Additionally, set the `GOOGLE_API_KEY` environment variable to enable calls to G
Select **Expose deployment as a new service** and leave the first item as is.
-
+
Add two more items, exposing the ports defined in the **Set Environment Variables** step.
-
+
@@ -108,7 +112,7 @@ Now let's make a call to our LLMstudio instance on GCP!
Go to your newly deployed **Workload**, scroll to the **Exposing services** section, and take note of your endpoint's **Host**.
-
+
Create your `.env` file with the following:
@@ -141,7 +145,7 @@ Now let's make a call to our LLMstudio instance on GCP!
```
-
+
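+Before making calls from your local repository, you can sanity-check that the exposed service is reachable. The command below is only a sketch: replace `<HOST>` and `<PORT>` with the values noted in the **Exposing services** step, and note that the `/health` route is an assumption about the engine API, not something this guide confirms.
+
+```bash
+# Hypothetical reachability check; substitute your real host and port.
+curl http://<HOST>:<PORT>/health
+```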