Commit f928847

Updated how local environment file is generated. (#32)
1 parent 00d1a9f commit f928847

8 files changed: +174 additions, -157 deletions

.gitignore

Lines changed: 2 additions & 0 deletions
```diff
@@ -106,6 +106,7 @@ celerybeat.pid
 *.sage.py
 
 # Environments
+.accounts
 .env
 .venv
 env/
@@ -115,6 +116,7 @@ env.bak/
 venv.bak/
 requirements.json
 
+
 # Spyder project settings
 .spyderproject
 .spyproject
```
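The new `.accounts` entry can be sanity-checked with `git check-ignore`, which reports the pattern that excludes a path. A minimal check in a throwaway scratch repository (hypothetical, not part of this project) looks like:

```shell
# Scratch repo: confirm a `.accounts` entry in .gitignore really keeps
# the credentials file out of version control.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
printf '.accounts\n' > .gitignore
touch .accounts
# -v shows which .gitignore line matched; exit status 0 means "ignored"
git check-ignore -v .accounts
```

Run from a real clone, `git check-ignore -v .accounts` should point at the line this commit adds.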

README.md

Lines changed: 21 additions & 35 deletions
````diff
@@ -79,43 +79,21 @@ This demo uses Terraform and bash scripting to create and teardown infrastructure
 cd demo-change-data-capture
 ```
 
-1. Create a file to manage all the values you'll need through the setup.
+1. Create an `.accounts` file by running the following command.
 
 ```bash
-CONFLUENT_CLOUD_EMAIL=<replace>
-CONFLUENT_CLOUD_PASSWORD=<replace>
-
-CCLOUD_API_KEY=api-key
-CCLOUD_API_SECRET=api-secret
-CCLOUD_BOOTSTRAP_ENDPOINT=kafka-cluster-endpoint
-
-ORACLE_USERNAME=admin
-ORACLE_PASSWORD=demo-cdc-c0nflu3nt!
-ORACLE_ENDPOINT=oracle-endpoint
-ORACLE_PORT=1521
-
-POSTGRES_PRODUCTS_ENDPOINT=postgres-products
-REDSHIFT_ADDRESS=redshift-address
-
-SF_PVT_KEY=snowflake-private-key
-
-export TF_VAR_confluent_cloud_api_key="<replace>"
-export TF_VAR_confluent_cloud_api_secret="<replace>"
-
-export SNOWFLAKE_USER="tf-snow"
-export SNOWFLAKE_PRIVATE_KEY_PATH="../snowflake/snowflake_tf_snow_key.p8"
-export SNOWFLAKE_ACCOUNT="YOUR_ACCOUNT_LOCATOR"
+echo "CONFLUENT_CLOUD_EMAIL=add_your_email\nCONFLUENT_CLOUD_PASSWORD=add_your_password\nexport TF_VAR_confluent_cloud_api_key=\"add_your_api_key\"\nexport TF_VAR_confluent_cloud_api_secret=\"add_your_api_secret\"\nexport SNOWFLAKE_ACCOUNT=\"add_your_account_locator\"" > .accounts
 ```
 
-> **Note:** Run `source .env` at any time to update these values in your terminal session. Do NOT commit this file to a GitHub repo.
+> **Note:** This repo ignores the `.accounts` file.
 
 ### Confluent Cloud
 
 1. Create Confluent Cloud API keys by following [this](https://registry.terraform.io/providers/confluentinc/confluent/latest/docs/guides/sample-project#summary) guide.
 
    > **Note:** This is different than Kafka cluster API keys.
 
-1. Update the `.env` file for the following variables with your credentials.
+1. Update the `.accounts` file for the following variables with your credentials.
 
 ```bash
 CONFLUENT_CLOUD_EMAIL=<replace>
@@ -162,11 +140,9 @@ This demo uses Terraform and bash scripting to create and teardown infrastructure
 
 > **Note:** If your Snowflake account isn't in AWS-US-West-2 refer to [doc](https://docs.snowflake.com/en/user-guide/admin-account-identifier#snowflake-region-ids) to identify your account locator.
 
-1. Update your `.env` file and add the newly created credentials for the following variables
+1. Update your `.accounts` file and add the newly created credentials for the following variable.
 
 ```bash
-export SNOWFLAKE_USER="tf-snow"
-export SNOWFLAKE_PRIVATE_KEY_PATH="../snowflake/snowflake_tf_snow_key.p8"
 export SNOWFLAKE_ACCOUNT="YOUR_ACCOUNT_LOCATOR"
 ```
 
@@ -179,21 +155,31 @@ This demo uses Terraform and bash scripting to create and teardown infrastructure
 
 > **Note:** For troubleshooting or more information review the [doc](https://quickstarts.snowflake.com/guide/terraforming_snowflake/index.html?index=..%2F..index#2).
 
-1. Source the `.env` file.
+### Create a local environment file
 
-```
+1. Navigate to the home directory of the project and run the `create_env.sh` script. This bash script copies the content of the `.accounts` file into a new file called `.env` and appends additional variables to it.
+
+```bash
 cd demo-change-data-capture
+./create_env.sh
+```
+
+1. Source the `.env` file.
+
+```bash
 source .env
 ```
 
+> **Note:** If you don't source the `.env` file, you'll be prompted to provide the values manually on the command line when running Terraform commands.
+
 ### Build your cloud infrastructure
 
 1. Log into your AWS account through command line.
 
 1. Navigate to the repo's terraform directory.
 
 ```bash
-cd terraform
+cd demo-change-data-capture/terraform
 ```
 
 1. Initialize Terraform within the directory.
@@ -218,18 +204,18 @@ This demo uses Terraform and bash scripting to create and teardown infrastructure
 terraform apply -var sg_package="ADVANCED"
 ```
 
-1. Write the output of `terraform` to a JSON file. The `env.sh` script will parse the JSON file to update the `.env` file.
+1. Write the output of `terraform` to a JSON file. The `setup.sh` script will parse the JSON file to update the `.env` file.
 
 ```bash
 terraform output -json > ../resources.json
 ```
 
 > **Note:** _Verify that the `resources.json` is created at root level of demo-change-data-capture directory._
 
-1. Run the `env.sh` script.
+1. Run the `setup.sh` script.
 ```bash
 cd demo-change-data-capture
-./env.sh
+./setup.sh
 ```
 1. This script achieves the following:
````
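One portability note on the new setup step: the `echo "...\n..."` one-liner relies on the shell's `echo` interpreting `\n` escapes, which bash's builtin does not do by default (it works under zsh, macOS's default shell). A `printf`-based equivalent behaves the same in every POSIX shell; this sketch writes to a hypothetical `.accounts.demo` so it won't clobber a real credentials file:

```shell
# Portable variant of the .accounts bootstrap: printf '%s\n' emits each
# argument on its own line, regardless of shell or echo implementation.
printf '%s\n' \
    'CONFLUENT_CLOUD_EMAIL=add_your_email' \
    'CONFLUENT_CLOUD_PASSWORD=add_your_password' \
    'export TF_VAR_confluent_cloud_api_key="add_your_api_key"' \
    'export TF_VAR_confluent_cloud_api_secret="add_your_api_secret"' \
    'export SNOWFLAKE_ACCOUNT="add_your_account_locator"' > .accounts.demo

cat .accounts.demo   # five placeholder lines, one per argument
```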

create_env.sh

Lines changed: 36 additions & 0 deletions
```bash
#!/bin/bash

accounts_file=".accounts"
env_file=".env"

# Check if .accounts file exists
if [ ! -f "$accounts_file" ]; then
    echo "$accounts_file not found."
    exit 1
fi

# Define the environment variable content
env_content=$(cat <<EOF
CCLOUD_API_KEY=api-key
CCLOUD_API_SECRET=api-secret
CCLOUD_BOOTSTRAP_ENDPOINT=kafka-cluster-endpoint

ORACLE_USERNAME=admin
ORACLE_PASSWORD=demo-cdc-c0nflu3nt!
ORACLE_ENDPOINT=oracle-endpoint
ORACLE_PORT=1521

POSTGRES_PRODUCTS_ENDPOINT=postgres-products
REDSHIFT_ADDRESS=redshift-address

SF_PVT_KEY=snowflake-private-key

export SNOWFLAKE_USER="tf-snow"
export SNOWFLAKE_PRIVATE_KEY_PATH="../snowflake/snowflake_tf_snow_key.p8"
EOF
)

# Combine the environment variable content with .accounts and write to .env
echo "$env_content" | cat - "$accounts_file" > "$env_file"

echo "Created an environment file named: $env_file"
```
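The interesting move in this script is the final pipeline: `cat - "$accounts_file"` reads stdin (the heredoc of placeholder defaults) first and the named file second, so the user's `.accounts` credentials land at the end of `.env`. A minimal sketch of that prepend pattern, using hypothetical demo filenames:

```shell
# Prepend defaults to user-supplied settings: `cat - file` concatenates
# stdin first, then the named file, into one output stream.
printf 'USER_EMAIL=me@example.com\n' > .accounts.demo

defaults=$(cat <<'EOF'
CCLOUD_API_KEY=api-key
CCLOUD_API_SECRET=api-secret
EOF
)

echo "$defaults" | cat - .accounts.demo > .env.demo
head -n 1 .env.demo   # CCLOUD_API_KEY=api-key
```

The defaults are placeholders on purpose; `setup.sh` later rewrites them in place with real values.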

example.env

Lines changed: 0 additions & 24 deletions
This file was deleted.

setup.sh

Lines changed: 66 additions & 0 deletions
```bash
#!/bin/bash

sleep_time=2
env_file=".env"

# Use confluent environment
confluent login --save
export CCLOUD_ENV_ID=$(confluent environment list -o json \
    | jq -r '.[] | select(.name | contains('\"${CCLOUD_ENV_NAME:-Demo_Change_Data_Capture}\"')) | .id')

confluent env use $CCLOUD_ENV_ID

# Use kafka cluster
export CCLOUD_CLUSTER_ID=$(confluent kafka cluster list -o json \
    | jq -r '.[] | select(.name | contains('\"${CCLOUD_CLUSTER_NAME:-demo_kafka_cluster}\"')) | .id')

confluent kafka cluster use $CCLOUD_CLUSTER_ID

# Get cluster bootstrap endpoint
export CCLOUD_BOOTSTRAP_ENDPOINT=$(confluent kafka cluster describe -o json | jq -r .endpoint)
STRIPPED_CCLOUD_BOOTSTRAP_ENDPOINT=$(echo $CCLOUD_BOOTSTRAP_ENDPOINT | sed 's/SASL_SSL:\/\///')

# use sed to replace kafka-cluster-endpoint with the replacement string
sed -i .bak "s/kafka-cluster-endpoint/$STRIPPED_CCLOUD_BOOTSTRAP_ENDPOINT/g" $env_file
echo "Added Kafka cluster endpoint to $env_file"
sleep $sleep_time

# Create an API key pair to use for connectors
echo "Creating Kafka cluster API key"
CREDENTIALS=$(confluent api-key create --resource $CCLOUD_CLUSTER_ID --description "demo-change-data-capture" -o json)
kafka_api_key=$(echo $CREDENTIALS | jq -r '.api_key')
kafka_api_secret=$(echo $CREDENTIALS | jq -r '.api_secret')
sleep $sleep_time

# use sed to replace the api-key and api-secret placeholders with the new credentials
sed -i .bak "s^api-key^\"$kafka_api_key\"^g" $env_file
sed -i .bak "s^api-secret^\"$kafka_api_secret\"^g" $env_file
echo "Added Kafka API key and secret to $env_file"

sleep $sleep_time

# Read values from resources.json and update the $env_file file.
# These resources are created by Terraform
json=$(cat resources.json)

oracle_endpoint=$(echo "$json" | jq -r '.oracle_endpoint.value.address')
postgres_products=$(echo "$json" | jq -r '.postgres_instance_products_public_endpoint.value')

raw_snowflake_svc_private_key=$(echo "$json" | jq -r '.snowflake_svc_private_key.value')
snowflake_svc_private_key=$(echo "$raw_snowflake_svc_private_key" | sed '/-----BEGIN RSA PRIVATE KEY-----/d; /-----END RSA PRIVATE KEY-----/d' | tr -d '\n')

redshift_endpoint=$(echo "$json" | jq -r '.redshift_endpoint.value')
redshift_address=$(echo $redshift_endpoint | sed 's/:5439//')

# Updating the $env_file file with sed command
sed -i .bak "s^oracle-endpoint^$oracle_endpoint^g" $env_file
sed -i .bak "s^postgres-products^$postgres_products^g" $env_file
sed -i .bak "s^snowflake-private-key^\"$snowflake_svc_private_key\"^g" $env_file
sed -i .bak "s^redshift-address^$redshift_address^g" $env_file

echo "Added Oracle endpoint to $env_file"
echo "Added PostgreSQL endpoint to $env_file"
echo "Added Snowflake private key to $env_file"
echo "Added Amazon Redshift address to $env_file"

# sleep $sleep_time
```
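Note that `sed -i .bak` (suffix as a separate argument) is the BSD/macOS spelling of in-place editing; GNU sed on Linux expects the suffix attached, as `-i.bak`. A minimal sketch of the placeholder substitution the script performs, with a hypothetical endpoint value and the GNU-style flag; the `^` delimiter in the `s` command avoids escaping the slashes and colons that endpoints contain:

```shell
# Replace the kafka-cluster-endpoint placeholder in a demo env file.
printf 'CCLOUD_BOOTSTRAP_ENDPOINT=kafka-cluster-endpoint\n' > .env.demo

# Hypothetical value; setup.sh derives the real one from
# `confluent kafka cluster describe`.
endpoint='pkc-12345.us-west-2.aws.confluent.cloud:9092'

# -i.bak edits the file in place and keeps a .env.demo.bak backup (GNU sed)
sed -i.bak "s^kafka-cluster-endpoint^$endpoint^g" .env.demo
cat .env.demo   # CCLOUD_BOOTSTRAP_ENDPOINT=pkc-12345.us-west-2.aws.confluent.cloud:9092
```

Using `^` as the delimiter is the same trick the script applies to the Oracle, Postgres, Snowflake, and Redshift placeholders, whose values also contain `/` and `:`.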
