Commit 3c7b543 (parent 42153eb)

Revert "No public description"

This reverts commit 42153eb.

File tree

2,725 files changed: +764,443 −1 lines changed


.cloud-build/Notebooks.txt

Lines changed: 15 additions & 0 deletions

generative-ai/gemini/getting-started/intro_gemini_2_0_flash.ipynb
generative-ai/gemini/getting-started/intro_gemini_2_0_flash_lite.ipynb
generative-ai/gemini/getting-started/intro_gemini_2_0_image_gen.ipynb
generative-ai/gemini/getting-started/intro_gemini_2_5_flash.ipynb
generative-ai/gemini/getting-started/intro_gemini_2_5_pro.ipynb
generative-ai/gemini/getting-started/intro_gemini_chat.ipynb
generative-ai/gemini/getting-started/intro_gemini_curl.ipynb
generative-ai/gemini/chat-completions/intro_chat_completions_api.ipynb
generative-ai/gemini/code-execution/intro_code_execution.ipynb
generative-ai/gemini/controlled-generation/intro_controlled_generation.ipynb
generative-ai/gemini/function-calling/intro_function_calling.ipynb
generative-ai/gemini/global-endpoint/intro_global_endpoint.ipynb
generative-ai/gemini/prompts/intro_prompt_design.ipynb
generative-ai/gemini/use-cases/spatial-understanding/spatial_understanding.ipynb
generative-ai/gemini/use-cases/intro_multimodal_use_cases.ipynb

.cloud-build/README.md

Lines changed: 61 additions & 0 deletions

# Notebook Testing

This script automates the uploading of Jupyter Notebook files (`.ipynb`) from a local directory to a Google Cloud Storage (GCS) bucket.

## Purpose

The script simplifies the process of transferring multiple notebook files to GCS, making it easier to manage and deploy your notebooks in a cloud environment.

## Script Description

The script performs the following actions:

1. **Reads the output URI variable:**
   - It reads the destination GCS bucket URI from a variable named `OUTPUT_URI`, injected from Secret Manager. This allows for easy configuration of the destination.
2. **Iterates through notebooks:**
   - It loops through all `.ipynb` files listed in `Notebooks.txt`.
3. **Copies notebooks to GCS:**
   - For each notebook file, it extracts the filename using `basename`.
   - It then uses the `gcloud storage cp` command to copy the notebook file to the specified GCS bucket, maintaining the directory structure.

## Prerequisites

Before running this script, ensure you have the following:

- **Google Cloud SDK (gcloud):** The Google Cloud SDK must be installed and configured with appropriate credentials. You should be authenticated to the Google Cloud project where you want to upload the notebooks.
- **GCS bucket:** The destination GCS bucket must exist.
- **OUTPUT_URI:** A secret named `OUTPUT_URI` must be injected for this step in your pipeline. It should contain the full GCS URI of the destination bucket (e.g., `gs://your-bucket-name`).
- **Jupyter Notebooks:** The Jupyter Notebook files (`.ipynb`) should be located in the `/workspace/generative-ai/gemini/getting-started/` directory.

## How to Use

1. **Set up `OUTPUT_URI`:**

   - Create a secret named `OUTPUT_URI` in Secret Manager and inject it for this step.
   - Set its value to the full GCS URI of your destination bucket. For example:

   ```none
   gs://your-bucket-name
   ```

2. **Place notebooks:**

   - Ensure the paths of the `.ipynb` files are listed in `Notebooks.txt`.

3. **Add the script as a step in your pipeline.**

## Example

Assuming you have a GCS bucket named `my-notebooks-bucket` and a notebook file named `example.ipynb` in the specified directory:

1. Create the `OUTPUT_URI` secret with the content `gs://my-notebooks-bucket`.
2. Place `example.ipynb` in `/workspace/generative-ai/gemini/getting-started/`.
3. Run the script.
4. `example.ipynb` will be copied to `gs://my-notebooks-bucket/generative-ai/gemini/getting-started/example.ipynb`.

## Notes

- Ensure that the service account used by `gcloud` has the necessary permissions to write to the GCS bucket.
- The script assumes that the directory `/workspace/generative-ai/gemini/getting-started/` exists. If it doesn't, the script will not find any notebooks.
- The script will overwrite files with the same name in the GCS bucket.
- If you have a large number of notebooks, consider adding error handling and logging to the script.
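The upload script itself is not included in this diff; a minimal sketch of the loop this README describes could look like the following. The `upload_notebooks` function name and the dry-run flag are illustrative only, not part of the actual pipeline.

```shell
#!/bin/bash
# Sketch of the described copy loop: read repo-relative notebook paths
# from a list file and copy each to the bucket, preserving structure.
set -u

upload_notebooks() {
  local output_uri="$1" list_file="$2" dry_run="${3:-false}"
  local notebook
  while IFS= read -r notebook; do
    # Skip blank lines in the list file.
    [ -z "$notebook" ] && continue
    if [ "$dry_run" = true ]; then
      # Print the command instead of running it, for local inspection.
      echo "gcloud storage cp /workspace/${notebook} ${output_uri}/${notebook}"
    else
      gcloud storage cp "/workspace/${notebook}" "${output_uri}/${notebook}"
    fi
  done <"$list_file"
}

# Example (dry run):
# upload_notebooks "gs://my-notebooks-bucket" .cloud-build/Notebooks.txt true
```

A dry-run mode like this makes it easy to verify the destination paths before granting the pipeline's service account write access to the bucket.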

.cloud-build/clone-repo/README.md

Lines changed: 17 additions & 0 deletions

# Script for Copying Gemini Getting Started Files to GCS

This script clones the notebook repository and copies the notebook files to a specified Google Cloud Storage (GCS) location.

## Prerequisites

- **Google Cloud SDK (gcloud) installed and configured:** You need the `gcloud` command-line tool installed and authenticated with your Google Cloud account.
- **gsutil installed:** `gsutil` is part of the Google Cloud SDK and is used for interacting with GCS.
- **Git installed:** Git is required to clone the repository.
- **OUTPUT_URI:** A secret injected from Secret Manager in your Google Cloud account.

## How to use

1. **Create the `OUTPUT_URI` secret.**

2. **Run the script:** Add the script as a step in your pipeline.
Lines changed: 7 additions & 0 deletions

#!/bin/bash

# Clones a shallow copy of the generative-ai repository's main branch from GitHub.
git clone --depth 1 -b main https://github.com/GoogleCloudPlatform/generative-ai.git

# Changes into the cloned repository, aborting if the clone failed.
cd generative-ai || exit 1

.cloud-build/cloudbuild.yaml

Lines changed: 43 additions & 0 deletions

timeout: 7200s

steps:
  # [START copy-repo]
  - name: google/cloud-sdk
    id: "copy-repo"
    entrypoint: bash
    args:
      - "-c"
      - |
        . .cloud-build/clone-repo/clone_repo.sh
  # [END copy-repo]
  # [START execute-notebooks]
  - name: google/cloud-sdk
    id: "execute-notebooks"
    entrypoint: bash
    timeout: 7200s # This step takes the longest; average runtime is 1 hour and 20 minutes.
    args:
      - "-c"
      - |
        gcloud secrets versions access latest --secret=NOTEBOOK_RUNTIME_TEMPLATE > NOTEBOOK_RUNTIME_TEMPLATE
        gcloud secrets versions access latest --secret=PROJECT_ID > PROJECT_ID
        gcloud secrets versions access latest --secret=REGION > REGION
        gcloud secrets versions access latest --secret=SA_EMAIL > SA
        gcloud secrets versions access latest --secret=GCS_BUCKET > OUTPUT_URI
        gcloud secrets versions access latest --secret=PS_TOPIC > PS_TOPIC
        . .cloud-build/executor/test_notebooks.sh
  # [END execute-notebooks]
  # [START github-notification]
  # - name: google/cloud-sdk
  #   id: 'github-notification'
  #   entrypoint: bash
  #   args:
  #     - '-c'
  #     - |
  #       gcloud secrets versions access latest --secret=GH_token > GH_token.txt
  #
  #       . .cloud-build/github-issue/ghcli-install.sh
  #       . .cloud-build/github-issue/auto_issue_create.sh
  # [END github-notification]

logsBucket: ${_LOG_BUCKET}
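The secrets this configuration reads are assumed to already exist. A hedged one-time setup sketch follows; the secret names mirror the `--secret` flags in the YAML above, while the example values are placeholders, not real project settings.

```shell
#!/bin/bash
# One-time setup sketch for the secrets read by the pipeline.
SECRET_NAMES=(NOTEBOOK_RUNTIME_TEMPLATE PROJECT_ID REGION SA_EMAIL GCS_BUCKET PS_TOPIC)

create_secret() {
  # Creates a secret and stores its first version from stdin.
  local name="$1"
  gcloud secrets create "$name" --data-file=-
}

# Example (placeholder values):
# printf 'gs://your-bucket-name' | create_secret GCS_BUCKET
# printf 'your-project-id'       | create_secret PROJECT_ID
```

The Cloud Build service account also needs the `roles/secretmanager.secretAccessor` role on these secrets for the `gcloud secrets versions access` calls above to succeed.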

.cloud-build/executor/README.md

Lines changed: 33 additions & 0 deletions

# Automated Vertex AI Colab Notebook Execution

This Bash script automates the execution of Jupyter Notebooks using Vertex AI Colab executions. It iterates through notebooks in a specified directory, triggers executions, and monitors their status.

## Purpose

The script aims to:

- Automate the execution of all `.ipynb` files located in `/workspace/generative-ai/gemini/getting-started`.
- Use a pre-configured Notebook Runtime Template.
- Store execution outputs in a Google Cloud Storage (GCS) bucket.
- Monitor execution status and identify failed notebooks.
- Output a list of failed notebooks to both the console and a file (`/workspace/Failure.txt`).

## Prerequisites

- Google Cloud SDK (`gcloud`) installed and configured.
- Authentication set up with `gcloud auth login`.
- Vertex AI API enabled.
- The necessary variables (PROJECT_ID, REGION, SA, OUTPUT_URI, NOTEBOOK_RUNTIME_TEMPLATE) stored as separate secrets in Secret Manager and written to files in the same directory as the script.
- A Google Cloud Storage bucket accessible for storing notebook outputs.
- A Notebook Runtime Template created in Google Cloud Vertex AI.
- Notebooks present in the `/workspace/generative-ai/gemini/getting-started` directory.

## Usage

1. **Prepare the variables in Secret Manager:** Create the following secrets:
   - `PROJECT_ID`: Your Google Cloud Project ID.
   - `REGION`: The Google Cloud region to use.
   - `SA`: The service account to use for the executions.
   - `OUTPUT_URI`: The GCS URI for storing execution outputs (e.g., `gs://your-bucket`).
   - `NOTEBOOK_RUNTIME_TEMPLATE`: The full resource name of your Notebook Runtime Template.
2. **Add the script to your pipeline.**
Lines changed: 144 additions & 0 deletions

#!/bin/bash

alias gcurl='curl -H "Authorization: Bearer $(gcloud auth print-access-token)" -H "Content-Type: application/json"'

TARGET=$(cat .cloud-build/Notebooks.txt)

current_date=$(date +%Y-%m-%d)
current_time=$(date +%H-%M-%S)
current_time_readable=$(date "+%B %d %Y %H:%M:%S")

# Secret values written to files by the previous pipeline step.
NOTEBOOK_RUNTIME_TEMPLATE=$(cat NOTEBOOK_RUNTIME_TEMPLATE)
OUTPUT_URI=$(cat OUTPUT_URI)
SA=$(cat SA)
PROJECT_ID=$(cat PROJECT_ID)
REGION=$(cat REGION)
PUBSUB_TOPIC=$(cat PS_TOPIC)

failed_count=0
failed_notebooks=()
total_count=0
successful_notebooks=()
successful_count=0

for x in $TARGET; do
  total_count=$((total_count + 1))
  # Use the full path from the repository for the display name.
  DISPLAY_NAME="${x##generative-ai/}"
  DISPLAY_NAME="${DISPLAY_NAME%.ipynb}-$current_date-$current_time"
  echo "Starting execution for ${x}"

  # Trigger the execution and capture the operation resource name.
  OPERATION_ID=$(gcloud colab executions create \
    --display-name="$DISPLAY_NAME" \
    --notebook-runtime-template="$NOTEBOOK_RUNTIME_TEMPLATE" \
    --direct-content="$x" \
    --gcs-output-uri="$OUTPUT_URI" \
    --project="$PROJECT_ID" \
    --region="$REGION" \
    --service-account="$SA" \
    --verbosity=debug \
    --execution-timeout="1h30m" \
    --format="value(name)")

  echo "Operation ID: $OPERATION_ID"
  # Extract the execution ID (fixed character range) from the operation name.
  TRUNCATED_OPERATION_ID=$(echo "$OPERATION_ID" | cut -c 67-85)

  # Check the job status.
  echo "Waiting for execution to complete..."
  if ! EXECUTION_DETAILS=$(gcloud colab executions describe "$TRUNCATED_OPERATION_ID" --region="$REGION"); then
    echo "Error describing execution for ${x}. See logs for details."
    failed_count=$((failed_count + 1))
    failed_notebooks+=("${x}")
    continue
  else
    echo "Execution completed for ${x}"
  fi

  # Check the jobState.
  JOB_STATE=$(echo "$EXECUTION_DETAILS" | grep "jobState:" | awk '{print $2}')
  if [[ "$JOB_STATE" == "JOB_STATE_SUCCEEDED" ]]; then
    echo "Notebook execution succeeded."
    successful_count=$((successful_count + 1))
    successful_notebooks+=("${x}")
  else
    echo "Notebook execution failed. Job state: $JOB_STATE. Please use id $TRUNCATED_OPERATION_ID to troubleshoot notebook ${x}. See log for details."
    failed_count=$((failed_count + 1))
    failed_notebooks+=("${x}")
    continue
  fi

done

# Print the final list of failed notebooks.
if [[ ${#failed_notebooks[@]} -gt 0 ]]; then
  echo "Failed Notebooks:"
  for notebook in "${failed_notebooks[@]}"; do
    echo "- $notebook" | tee -a /workspace/Failure.txt
  done
fi

if [[ $failed_count -gt 0 ]]; then
  echo "Total failed notebook executions: $failed_count"
fi

if [[ $successful_count -gt 0 ]]; then
  echo "Total successful notebook executions: $successful_count"
fi

# Prepare the failed-notebook names for the Pub/Sub message.
failed_notebooks_str=$(
  IFS=', '
  echo "${failed_notebooks[*]}"
)

if [[ -n "$failed_notebooks_str" ]]; then
  IFS=',' read -ra failed_notebooks_array <<<"$failed_notebooks_str"
  trimmed_notebooks=()
  for notebook in "${failed_notebooks_array[@]}"; do
    trimmed_notebooks+=("$(echo -n "$notebook" | sed 's/ *$//')")
  done
  failed_notebooks_str=$(
    IFS=', '
    echo "${trimmed_notebooks[*]}"
  )
else
  failed_notebooks_str=""
fi

successful_notebooks_str=$(
  IFS=', '
  echo "${successful_notebooks[*]}"
)

if [[ -n "$successful_notebooks_str" ]]; then
  IFS=',' read -ra successful_notebooks_array <<<"$successful_notebooks_str"
  trimmed_successful_notebooks=()
  for notebook in "${successful_notebooks_array[@]}"; do
    trimmed_successful_notebooks+=("$(echo -n "$notebook" | sed 's/ *$//')")
  done
  successful_notebooks_str=$(
    IFS=', '
    echo "${trimmed_successful_notebooks[*]}"
  )
else
  successful_notebooks_str=""
fi

# Construct the message to send to the Pub/Sub topic.
message_data="{\"total_count\":$((total_count + 0)),\"failed_count\":$((failed_count + 0)),\"failed_notebooks\":\"${failed_notebooks_str}\",\"successful_notebooks\":\"${successful_notebooks_str}\",\"successful_count\":$((successful_count + 0)),\"execution_date\":\"${current_time_readable}\"}"

# Publish to Pub/Sub.
echo "$(date) - INFO - Publishing to Pub/Sub topic: $PUBSUB_TOPIC"
if ! gcloud pubsub topics publish "$PUBSUB_TOPIC" --message="$message_data" --project="$PROJECT_ID"; then
  echo "$(date) - ERROR - Failed to publish to Pub/Sub topic $PUBSUB_TOPIC. Check permissions and topic configuration."
  #exit 1
fi

echo "All notebook executions completed."
Lines changed: 24 additions & 0 deletions

# Create GitHub Issue on Notebook Test Failure

These Bash scripts install the GitHub CLI and automate the creation of a GitHub issue when automated notebook tests fail. The issue-creation script reads the error output from a specified file and uses the `gh` CLI to create a new issue in a designated repository.

## Purpose

The script aims to:

- Detect the presence of a failure file (`/workspace/Failure.txt`).
- If the file exists, create a GitHub issue with the file's content as the issue body.
- Include a predefined title, labels, and assignees in the issue.
- Provide informative messages about the success or failure of issue creation.

## Prerequisites

- GitHub CLI (`gh`) installed and authenticated.
- A GitHub repository with write access for the user running the script.
- A failure file (`/workspace/Failure.txt`) containing the error output from failed notebook tests.

## Usage

1. **Save the script:** Save the provided script as `create_gh_issue.sh`.
2. **Set repository details:** Modify the variables `REPO_OWNER`, `REPO_NAME`, `ISSUE_ASSIGNEES`, and `FAILURE_FILE` within the script to match your GitHub repository details and the location of the failure file.
3. **Add the script to your pipeline.**
Lines changed: 26 additions & 0 deletions

#!/bin/bash

ISSUE_TITLE="Failed automated Notebook Testing"
ISSUE_LABELS="bug"
ISSUE_ASSIGNEES="CadillacBurgess1"
REPO_OWNER="GoogleCloudPlatform"
REPO_NAME="generative-ai"
FAILURE_FILE="/workspace/Failure.txt"

if [ -f "$FAILURE_FILE" ]; then
  ISSUE_BODY=$(cat "$FAILURE_FILE")

  if ! gh issue create \
    -t "$ISSUE_TITLE" \
    -b "$ISSUE_BODY" \
    -l "$ISSUE_LABELS" \
    -a "$ISSUE_ASSIGNEES" \
    -R "$REPO_OWNER/$REPO_NAME"; then
    echo "Error creating issue."
    exit 1
  fi

  echo "Issue created with output of failed notebooks."
else
  echo "No notebooks failed. Issue creation skipped."
fi
Lines changed: 16 additions & 0 deletions

#!/bin/bash

# Install the GitHub CLI (gh) from its official apt repository.
(type -p wget >/dev/null || (apt update && apt-get install wget -y)) &&
  mkdir -p /etc/apt/keyrings &&
  chmod 0755 /etc/apt/keyrings &&
  out=$(mktemp) && wget -nv -O "$out" https://cli.github.com/packages/githubcli-archive-keyring.gpg &&
  cat "$out" | tee /etc/apt/keyrings/githubcli-archive-keyring.gpg >/dev/null &&
  rm -f "$out" &&
  chmod go+r /etc/apt/keyrings/githubcli-archive-keyring.gpg &&
  echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" | tee /etc/apt/sources.list.d/github-cli.list >/dev/null &&
  apt update &&
  apt install gh -y

# Authenticate gh with the token fetched from Secret Manager.
gh auth login --with-token <GH_token.txt
