An advanced OpenDevStack Frontend Quickstarter to build mobile and desktop apps with the Ionic framework, React, Vite and Playwright.
- Ionic/React with TypeScript for building cross-platform native and web apps
- Single-Sign-On (SSO) for user authentication and authorization with Azure Active Directory
- OpenDevStack (ODS) CI/CD configuration out of the box with the basic setup for Docker (incl. injecting Runtime Variables), Jenkins (incl. feature environments, release-manager rollout) and OpenShift (managed with Helm)
- Ionic Appflow (coming soon)
- Setup for Vite, Playwright, ESLint, Stylelint, Prettier, commitlint, Husky (git hooks) and semantic versioning for a better developer experience
For an official ODS Quickstarter, the provisioning app takes care of the provisioning in combination with the Jenkinsfile in the associated Quickstarter template. However, since this is an extended Quickstarter that is developed independently and decoupled from ODS, the necessary steps of the provisioning app and Jenkins must be performed manually; they are covered in this section.
To provision this Quickstarter, you need a deployed ODS project with the corresponding *-cd, *-dev and *-test projects in the OpenShift 4 dev cluster, the *-prod project in the OpenShift 4 prod cluster and the associated Bitbucket project.
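As a quick sanity check that these projects exist, you can log in and list them (an optional sketch; it assumes the oc cli, whose installation is covered further below, and a token as described in the FAQ at the end of this README):

```sh
# On the dev cluster: expect PROJECTID-cd, PROJECTID-dev and PROJECTID-test
oc login --server=https://api.OPENSHIFT_DOMAIN_DEV:6443 --token=123...456
oc get projects | grep PROJECTID
```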
- Option 1 (recommended): Clone the repository

  ```sh
  git clone https://github.com/SimonGolms/ods-quickstarter-fe-ionic-react-vite-playwright.git
  cd ods-quickstarter-fe-ionic-react-vite-playwright
  ```

- Option 2: Download the repository

  ```sh
  curl --location --remote-name https://github.com/SimonGolms/ods-quickstarter-fe-ionic-react-vite-playwright/archive/refs/heads/main.tar.gz && \
    tar -xvzf main.tar.gz && \
    rm main.tar.gz
  cd ods-quickstarter-fe-ionic-react-vite-playwright-main
  ```
To make the Quickstarter available in your project, the corresponding project id is required. The following commands search all files and replace the placeholder PROJECTID with your actual project id.
Replace YOUR_PROJECT_ID with your project id, e.g. foo

```sh
# IMPORTANT: Keep your project id in lowercase.
find . -type f -exec sed --expression 's/PROJECTID/YOUR_PROJECT_ID/g' --in-place {} +
```

IMPORTANT: This and the following commands also replace the placeholders in the other sections of this documentation. It is therefore recommended to continue with the README in the downloaded source code. Otherwise, be aware that you have to replace the placeholder PROJECTID with your project id in each further command.
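To verify that nothing was missed, the following check (a small convenience, not part of the original steps) should print no matches once every placeholder has been replaced:

```sh
grep --recursive --files-with-matches 'PROJECTID' .
```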
Replace YOUR_COMPONENT_ID with your component id, e.g. app, frontend, etc.

```sh
# IMPORTANT: Keep your component id in lowercase.
find . -type f -exec sed --expression 's/COMPONENTID/YOUR_COMPONENT_ID/g' --in-place {} +
```

Replace YOUR_OPENSHIFT_DOMAIN_DEV with your OpenShift 4 Dev Cluster Domain, e.g. dev.ocp.company.com
How do I find out the YOUR_OPENSHIFT_DOMAIN_DEV value?

Go to your *-cd project of your OpenShift 4 Dev Cluster in the browser and take a close look at the URL. Extract YOUR_OPENSHIFT_DOMAIN_DEV as follows:

```
https://console-openshift-console.apps.dev.ocp.company.com/topology/ns/PROJECTID-cd  ◄── Your url

https://console-openshift-console.apps. dev.ocp.company.com /topology/ns/PROJECTID-cd
                 ▲                              ▲                       ▲
                 │                              │                       └── Pathname
                 │                              └── YOUR_OPENSHIFT_DOMAIN_DEV
                 └── Application Sub Domain
```

```sh
find . -type f -exec sed --expression 's/OPENSHIFT_DOMAIN_DEV/YOUR_OPENSHIFT_DOMAIN_DEV/g' --in-place {} +
```

Replace YOUR_OPENSHIFT_DOMAIN_PROD with your OpenShift 4 Prod Cluster Domain, e.g. prod.ocp.company.com
How do I find out the YOUR_OPENSHIFT_DOMAIN_PROD value?

Go to your *-prod project of your OpenShift 4 Prod Cluster in the browser and take a close look at the URL. Extract YOUR_OPENSHIFT_DOMAIN_PROD as follows:

```
https://console-openshift-console.apps.prod.ocp.company.com/topology/ns/PROJECTID-prod  ◄── Your url

https://console-openshift-console.apps. prod.ocp.company.com /topology/ns/PROJECTID-prod
                 ▲                              ▲                         ▲
                 │                              │                         └── Pathname
                 │                              └── YOUR_OPENSHIFT_DOMAIN_PROD
                 └── Application Sub Domain
```

```sh
find . -type f -exec sed --expression 's/OPENSHIFT_DOMAIN_PROD/YOUR_OPENSHIFT_DOMAIN_PROD/g' --in-place {} +
```

Replace YOUR_BITBUCKET_DOMAIN with your BitBucket Domain, e.g. bitbucket.company.com
How do I find out the YOUR_BITBUCKET_DOMAIN value?

Go to your BitBucket project in the browser and take a close look at the URL. Extract YOUR_BITBUCKET_DOMAIN as follows:

```
https://bitbucket.company.com/projects/PROJECTID  ◄── Your url

https:// bitbucket.company.com /projects/PROJECTID
   ▲              ▲                     ▲
   │              │                     └── Pathname
   │              └── YOUR_BITBUCKET_DOMAIN
   └── Protocol
```

```sh
find . -type f -exec sed --expression 's/BITBUCKET_DOMAIN/YOUR_BITBUCKET_DOMAIN/g' --in-place {} +
```

Finally, remove the existing git history and the template-specific files:

```sh
rm -rf .git .github CHANGELOG.md
```

Helm is a package manager for Kubernetes that configures and deploys applications and services on a Kubernetes/OpenShift cluster. Think of it like apt/yum/homebrew for Kubernetes. It uses Helm charts to simplify the development and deployment process.
helm will be used later in the pre-commit git hook for linting the application Helm charts.
```sh
# Install helm cli
curl https://raw.githubusercontent.com/helm/helm/main/scripts/get-helm-3 | bash

# Verify successfully installed helm version
helm version
```

More Information: https://helm.sh/docs/intro/install/
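Since the pre-commit hook runs chart linting, the same check can also be run manually at any time. A minimal sketch, assuming the chart lives in ./chart with the per-environment values files referenced elsewhere in this README:

```sh
# Lint the application Helm chart against the dev values
helm lint ./chart --values ./chart/values.dev.yaml
```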
With the OpenShift command-line interface (CLI), the oc command, you can create applications and manage OpenShift Container Platform projects from a terminal.
oc will be used by helm and for further configuration and housekeeping tasks.
To be compatible with the latest OpenShift Container Platform (OCP) version, the binary is downloaded and installed directly from the associated OpenShift cluster.
```sh
# Install oc cli
curl -O https://downloads-openshift-console.apps.OPENSHIFT_DOMAIN_DEV/amd64/linux/oc.tar && \
  tar -xvf oc.tar && rm oc.tar && \
  sudo mv oc /usr/local/bin/

# Verify successfully installed oc version
oc version
```

More Information: https://docs.openshift.com/container-platform/4.9/cli_reference/openshift_cli/getting-started-cli.html
- Create BitBucket Repository

  - Replace USER@COMPANY.COM with an authorized (administrative) user with access to the Bitbucket project

  ```sh
  # For security reasons (e.g. terminal history) --user 'USERNAME:PASSWORD' should be avoided.
  # Instead, a prompt will show up for the password if --user 'USERNAME' is used!
  curl --data '{"defaultBranch":"master","description":"📱 Repo of COMPONENTID from PROJECTID that is built with ionic and react","name":"PROJECTID-COMPONENTID"}' \
    --header "Content-Type: application/json" \
    --request POST \
    --url https://BITBUCKET_DOMAIN/rest/api/1.0/projects/PROJECTID/repos/ \
    --user USER@COMPANY.COM
  ```
- Get trigger secret from the webhook proxy

  ```sh
  # Login to OpenShift dev instance
  oc login --server=https://api.OPENSHIFT_DOMAIN_DEV:6443 --token=123...456

  # Get trigger secret 'webhook-proxy' in plaintext
  oc get secret webhook-proxy --namespace PROJECTID-cd --output jsonpath='{.data.trigger-secret}' | base64 -d | xargs
  ```
- Create Webhook

  - Replace TRIGGER_SECRET with the obtained trigger secret from the previous step.
  - Replace USER@COMPANY.COM with an authorized (administrative) user with access to the Bitbucket project

  ```sh
  # For security reasons (e.g. terminal history) --user 'USERNAME:PASSWORD' should be avoided.
  # Instead, a prompt will show up for the password if --user 'USERNAME' is used!
  curl --data '{"active":true,"configuration":{},"events":["pr:merged","repo:refs_changed","pr:declined","pr:deleted"],"name":"Jenkins","url":"https://webhook-proxy-PROJECTID-cd.apps.OPENSHIFT_DOMAIN_DEV?trigger_secret=TRIGGER_SECRET"}' \
    --header "Content-Type: application/json" \
    --request POST \
    --url https://BITBUCKET_DOMAIN/rest/api/1.0/projects/PROJECTID/repos/PROJECTID-COMPONENTID/webhooks \
    --user USER@COMPANY.COM
  ```
- Publish to Bitbucket Code Repository

  ```sh
  # Requires git v2.31.1
  git init --initial-branch=master
  git add --all
  git commit -m "chore: initial version"
  git remote add origin https://BITBUCKET_DOMAIN/scm/PROJECTID/PROJECTID-COMPONENTID.git

  # Before you push your first commit, make sure that no credentials are in the README as a result of the previous steps.
  # You might also delete unnecessary content in this context, like the 'Provision Quickstarter' section of this README.
  git push -u origin HEAD:master
  ```
If the provisioning was successful, the push of the first commit should meanwhile have triggered the first build in Jenkins, which can be viewed under the following link: https://jenkins-PROJECTID-cd.apps.OPENSHIFT_DOMAIN_DEV/job/PROJECTID-cd/job/PROJECTID-cd-COMPONENTID-master/
A new feature environment is created based on the associated git branch name, e.g. feature/next.

```sh
# Create and switch to a new branch from the current branch
git checkout -b feature/next

# Add an empty commit in case the previous commit includes '[skip ci]'
git commit -m "chore: create feature-next environment" --allow-empty

# Push the new branch to the remote repository
git push -u origin feature/next
```

A new Jenkins build should have been created and can be followed under the following link: https://jenkins-PROJECTID-cd.apps.OPENSHIFT_DOMAIN_DEV/job/PROJECTID-cd/job/PROJECTID-cd-COMPONENTID-feature-next/
Assuming the Jenkins build has been successfully completed, the application should have been created in the OpenShift 4 project PROJECTID-dev as a new HelmRelease resource and should be accessible under the following link: https://PROJECTID-COMPONENTID-feature-next.apps.OPENSHIFT_DOMAIN_DEV
- Update your `metadata.yml` in your release manager repository:

  ```yaml
  # Example metadata.yml
  id: PROJECTID
  name: Project PROJECTID
  description: Description of PROJECTID
  environments:
    prod:
      apiUrl: api.OPENSHIFT_DOMAIN_PROD:6443
      credentialsId: PROJECTID-cd-PROJECTID-prod
  repositories:
    - id: COMPONENTID
      branch: master
      url: https://BITBUCKET_DOMAIN/scm/PROJECTID/PROJECTID-COMPONENTID.git
  services:
    bitbucket:
      credentials:
        id: PROJECTID-cd-cd-user-with-password
    # jira:
    #   credentials:
    #     id: PROJECTID-cd-cd-user-with-password
    nexus:
      repository:
        name: leva-documentation
  ```
- Go to the Jenkins build of the release manager and start a new build process in the `dev` environment. Assuming the release has been successfully completed, the application should have been created in the OpenShift 4 project `PROJECTID-dev` as a new `HelmRelease` resource and should be accessible under the following link: https://PROJECTID-COMPONENTID-dev.apps.OPENSHIFT_DOMAIN_DEV
- Before you can deploy a release into the `qa`/`test` environment, you need to merge the release branch into your master branch to pass the checks in the Jenkins shared library stage `odsOrchestrationPipeline`, see comment in `metadata.yml` for more details:

  ```sh
  # Switch to master branch
  git checkout master

  # Merge the remote release branch into master without opening a text editor and accept the auto-generated message
  git merge origin/release/<VERSION> --no-edit

  # Push the changes to the remote repository
  git push
  ```
- Go to the Jenkins build of the release manager and start a new build process in the `qa`/`test` environment. Assuming the release has been successfully completed, the application should have been created in the OpenShift 4 project `PROJECTID-test` as a new `HelmRelease` resource and should be accessible under the following link: https://PROJECTID-COMPONENTID-test.apps.OPENSHIFT_DOMAIN_DEV
- Go to the Jenkins build of the release manager and start a new build process in the `prod` environment. Assuming the release has been successfully completed, the application should have been created in the OpenShift 4 project `PROJECTID-prod` as a new `HelmRelease` resource and should be accessible under the following link: https://PROJECTID-COMPONENTID.apps.OPENSHIFT_DOMAIN_PROD
- Testing: Playwright
- Tracking: N/A
- Linter: ESLint, Stylelint
- Compiler: TypeScript / Vite
- IDEs/Editors: VS Code
- Azure App Registration

  Make sure you have an existing Azure App registration that has a single-page application (SPA) redirect to `http://localhost/`.

  Ideally you have one Azure App registration per environment (`dev`/`test`/`prod`) with at least the following SPA redirects:

  - Dev: `http://localhost/`, `https://PROJECTID-COMPONENTID-dev.apps.OPENSHIFT_DOMAIN_DEV`
  - Test: `http://localhost/`, `https://PROJECTID-COMPONENTID-test.apps.OPENSHIFT_DOMAIN_DEV`
  - Prod: `http://localhost/`, `https://PROJECTID-COMPONENTID.apps.OPENSHIFT_DOMAIN_PROD`
Update the following entries with the `Application (client) ID` and `Directory (tenant) ID` from the corresponding app registration environment:

- Replace YOUR_CLIENT_ID_DEV with the `Application (client) ID` from your app registration for the `dev` environment

  ```sh
  find \( -wholename "./.env" -or -wholename "./chart/values.dev.yaml" -or -wholename "./Jenkinsfile" \) -exec sed --expression 's/11111111-2222-3333-4444-555555555dev/YOUR_CLIENT_ID_DEV/g' --in-place {} +
  ```

- Replace YOUR_CLIENT_ID_TEST with the `Application (client) ID` from your app registration for the `test` environment

  ```sh
  find -wholename "./chart/values.test.yaml" -exec sed --expression 's/11111111-2222-3333-4444-55555555test/YOUR_CLIENT_ID_TEST/g' --in-place {} +
  ```

- Replace YOUR_CLIENT_ID_PROD with the `Application (client) ID` from your app registration for the `prod` environment

  ```sh
  find -wholename "./chart/values.prod.yaml" -exec sed --expression 's/11111111-2222-3333-4444-55555555prod/YOUR_CLIENT_ID_PROD/g' --in-place {} +
  ```

- Replace YOUR_TENANT_ID with the `Directory (tenant) ID` from your app registration, which is basically the same for each environment (`dev`/`test`/`prod`)

  ```sh
  find \( -wholename "./.env" -or -wholename "./chart/values.*.yaml" -or -wholename "./Jenkinsfile" \) -exec sed --expression 's/common/YOUR_TENANT_ID/g' --in-place {} +
  ```
More information: https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-spa-app-registration
- Helm v3+
- Node.js v16+
- NPM v8+
- oc v4.9+
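To quickly confirm that the prerequisites are met, check the installed versions:

```sh
helm version --short
node --version
npm --version
oc version --client
```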
Using NVM:

```sh
# Install latest LTS Version (v16+) with the latest npm version (v8+)
nvm install --lts --latest-npm
```

Create the appropriate .env file from .env.template:

```sh
cp .env.template .env
```

Ask your colleagues which values are currently necessary!
```sh
npm install
```

```sh
npm run start
```

Starts the development server and makes your application accessible at localhost:8100. Changes in the application code will be hot-reloaded.
```sh
npm run build
```

The app is built for optimal performance: assets are minified and served gzipped.
```sh
npm run test
```

```sh
npm run build
mv build docker
docker build -t PROJECTID-COMPONENTID -f docker/Dockerfile docker
```

In case the command `RUN apk update && apk upgrade` cannot be executed (e.g. when working behind a proxy), comment it out for the moment.
```sh
docker run -p 8080:8080 --env-file .env PROJECTID-COMPONENTID
```

Starts the nginx server and makes your application accessible at localhost:8080.
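To verify that the container actually serves the app, a quick smoke test (assuming the container from the previous command is still running):

```sh
# Expect an HTTP 200 status from the nginx server
curl --head http://localhost:8080
```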
This CI/CD setup has been developed for the 'trunk-based development' approach.
[...] Trunk-based development is a version control management practice where developers merge small, frequent updates to a core βtrunkβ or main branch. [...] Trunk-based development is far more simplified since it focuses on the main branch as the source of fixes and releases. In trunk-based development the main branch is assumed to always be stable, without issues, and ready to deploy [...]. - https://www.atlassian.com/continuous-delivery/continuous-integration/trunk-based-development
The master branch is your only real source of truth and should always reflect the state as found in all three environments.
It is recommended not to merge any changes into the master branch before a new release has been triggered. Otherwise you risk not being able to perform a hotfix immediately; the only remedy is a corresponding Git strategy to restore the old state, apply the necessary hotfix changes, release, and then re-apply the reverted changes.
With each commit, the source code in the master branch is checked for its releasability and tagged at the end with a new semantic version based on the git commit history.
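Since commitlint enforces conventional commit messages (see the tooling setup above), the commit type drives the next semantic version. A rough sketch of the mapping (commit messages and version numbers are illustrative):

```sh
git commit -m "fix: correct login redirect"        # e.g. v1.4.2 -> v1.4.3 (patch)
git commit -m "feat: add offline mode"             # e.g. v1.4.2 -> v1.5.0 (minor)
git commit -m "feat!: drop legacy storage format"  # e.g. v1.4.2 -> v2.0.0 (major)
```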
```mermaid
%% If the Mermaid diagram is not rendered (as is the case on BitBucket), it can be viewed at https://mermaid.live/
flowchart TB
subgraph openshift-dev["OpenShift (DEV)"]
subgraph ods
subgraph sonarqube["SonarQube"]
direction TB
ods-DC-sonarqube["sonarqube (DeploymentConfig)"]:::classDeploymentConfig <-. Port 9000 .-> ods-S-sonarqube["sonarqube (Service)"]:::classService <-. Port 9000 .-> ods-RT-sonarqube["sonarqube (Route)\nhttps://sonarqube-ods.apps.OPENSHIFT_DOMAIN_DEV"]:::classRoute
end
end
subgraph aqua
aqua-aqua["Aqua Container Security"]
end
subgraph PROJECTID-cd
subgraph webhook-proxy
direction TB
cd-DC-webhook-proxy["webhook-proxy (DeploymentConfig)"]:::classDeploymentConfig <-. Port 8080 .-> cd-S-webhook-proxy["webhook-proxy (Service)"]:::classService <-. Port 8080 .-> cd-RT-webhook-proxy["webhook-proxy (Route)\nhttps://webhook-proxy-PROJECTID-cd.apps.OPENSHIFT_DOMAIN_DEV"]:::classRoute
end
subgraph Jenkins
subgraph Jenkinsfile
stageInitialize-->stageInstallDependency-->stageVersioning-->stageWorkaroundFindOpenShiftImageOrElse --> stageAnalyzeCode-->odsComponentStageScanWithSonar-->stageBuild-->stageDeploy-->odsComponentStageBuildOpenShiftImage-->stageWorkaroundUnitTest-->stageWorkaroundRolloutDeployment-->stageRelease
end
end
subgraph cd-IS-PROJECTID-COMPONENTID-master["PROJECTID-COMPONENTID-master (Image Stream)"]
cd-IST-PROJECTID-COMPONENTID-master["PROJECTID-COMPONENTID-master:v1.4.2"]:::classImageStreamTag
end
end
end
subgraph BitBucket
subgraph bb-project-PROJECTID["PROJECTID (Project)"]
subgraph bb-project-PROJECTID-COMPONENTID["PROJECTID-COMPONENTID (Repo)"]
bb-project-PROJECTID-COMPONENTID-branch-master["master (Branch)"]
end
end
end
aqua-aqua -- scan --> cd-IST-PROJECTID-COMPONENTID-master
bb-project-PROJECTID-COMPONENTID -- trigger --> webhook-proxy -- trigger --> Jenkins
bb-project-PROJECTID-COMPONENTID-branch-master -- pull --> Jenkinsfile
odsComponentStageBuildOpenShiftImage -- push --> cd-IST-PROJECTID-COMPONENTID-master
odsComponentStageBuildOpenShiftImage -. trigger .-> aqua-aqua -. result .-> odsComponentStageBuildOpenShiftImage
odsComponentStageScanWithSonar -. trigger .-> sonarqube -. result .-> odsComponentStageScanWithSonar
stageRelease -- "push (v1.4.2)" --> bb-project-PROJECTID-COMPONENTID-branch-master
%% styles
classDef classBitBucket fill:#2684FF22,stroke:#2684FF,stroke-width:4px
classDef classBitBucketProject fill:#2684FF22,stroke:#2684FF
classDef classBuildConfig fill:#00408022,stroke:#004080
classDef classDeployment fill:#00408022,stroke:#004080
classDef classDeploymentConfig fill:#00408022,stroke:#004080
classDef classHelmRelease fill:#2b9af322,stroke:#2b9af3
classDef classImageStream fill:#2b9af322,stroke:#2b9af3
classDef classImageStreamTag fill:#2b9af322,stroke:#2b9af3
classDef classOcpProject fill:#ffffff00,stroke:#f00,stroke-width:2px
classDef classOcpResource fill:#ffffff00,stroke:#06c,stroke-width:2px
classDef classOpenShift fill:#ffffff00,stroke:#f00,stroke-width:4px
classDef classRoute fill:#2b9af322,stroke:#2b9af3
classDef classService fill:#6ca10022,stroke:#6ca100
class BitBucket classBitBucket
class bb-project-PROJECTID,bb-project-PROJECTID-COMPONENTID classBitBucketProject
class cd-IS-PROJECTID-COMPONENTID-master classImageStream
class aqua,ods,PROJECTID-cd,PROJECTID-dev classOcpProject
class aqua-aqua,Jenkins,Jenkinsfile,sonarqube,webhook-proxy classOcpResource
class openshift-dev classOpenShift
```
With each new feature/* branch created, a new environment is created in the OpenShift project PROJECTID-dev.
The Jenkinsfile processes the different stages, and the application is finally rolled out and managed via Helm.
Please be aware that a new route (e.g. https://PROJECTID-COMPONENTID-feature-next.apps.OPENSHIFT_DOMAIN_DEV) is created for each new feature environment. If it is required for the SSO login, it must be specified as a new valid redirect URL in the app registration.
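To see which feature routes currently exist, and therefore which redirect URLs the app registration may still need, a quick query (a sketch; assumes the oc login shown earlier in this README):

```sh
# List all routes of the feature environments
oc get routes --namespace PROJECTID-dev | grep -- 'COMPONENTID-feature-'
```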
```mermaid
%% If the Mermaid diagram is not rendered (as is the case on BitBucket), it can be viewed at https://mermaid.live/
flowchart TB
subgraph openshift-dev["OpenShift (DEV)"]
subgraph ods
subgraph sonarqube["SonarQube"]
direction TB
ods-DC-sonarqube["sonarqube (DeploymentConfig)"]:::classDeploymentConfig <-. Port 9000 .-> ods-S-sonarqube["sonarqube (Service)"]:::classService <-. Port 9000 .-> ods-RT-sonarqube["sonarqube (Route)\nhttps://sonarqube-ods.apps.OPENSHIFT_DOMAIN_DEV"]:::classRoute
end
end
subgraph PROJECTID-cd
subgraph webhook-proxy
direction TB
cd-DC-webhook-proxy["webhook-proxy (DeploymentConfig)"]:::classDeploymentConfig <-. Port 8080 .-> cd-S-webhook-proxy["webhook-proxy (Service)"]:::classService <-. Port 8080 .-> cd-RT-webhook-proxy["webhook-proxy (Route)\nhttps://webhook-proxy-PROJECTID-cd.apps.OPENSHIFT_DOMAIN_DEV"]:::classRoute
end
subgraph Jenkins
subgraph Jenkinsfile
stageInitialize-->stageInstallDependency-->stageVersioning-->stageWorkaroundFindOpenShiftImageOrElse -->|orElse| stageAnalyzeCode-->odsComponentStageScanWithSonar-->stageBuild-->stageDeploy-->odsComponentStageBuildOpenShiftImage-->stageWorkaroundUnitTest-->stageWorkaroundRolloutDeployment-->stageRelease
end
end
subgraph cd-IS-PROJECTID-COMPONENTID-feature-next["PROJECTID-COMPONENTID-feature-next (Image Stream)"]
cd-IST-PROJECTID-COMPONENTID-feature-next["PROJECTID-COMPONENTID-feature-next:hash"]:::classImageStreamTag
end
end
subgraph PROJECTID-dev
subgraph dev-HR-PROJECTID-COMPONENTID-feature-next["PROJECTID-COMPONENTID-feature-next (Helm Release)"]
dev-D-COMPONENTID["COMPONENTID (Deployment)"]:::classDeployment <-. Port 8080 .-> dev-S-COMPONENTID["COMPONENTID (Service)"]:::classService <-. Port 8080 .-> dev-RT-COMPONENTID["COMPONENTID (Route)\nhttps://PROJECTID-COMPONENTID-feature-next.apps.OPENSHIFT_DOMAIN_DEV"]:::classRoute
end
end
end
subgraph bitbucket["BitBucket"]
subgraph bitbucket-PROJECTID["PROJECTID (Project)"]
subgraph bitbucket-PROJECTID-COMPONENTID["PROJECTID-COMPONENTID (Repo)"]
bitbucket-PROJECTID-COMPONENTID-branch-feature-next["feature-next (Branch)"]
end
end
end
bitbucket-PROJECTID-COMPONENTID -- trigger --> webhook-proxy -- trigger --> Jenkins
bitbucket-PROJECTID-COMPONENTID-branch-feature-next -- pull --> Jenkinsfile
cd-IST-PROJECTID-COMPONENTID-feature-next --> dev-D-COMPONENTID
odsComponentStageBuildOpenShiftImage -- push --> cd-IST-PROJECTID-COMPONENTID-feature-next
odsComponentStageScanWithSonar <-.-> sonarqube
stageWorkaroundRolloutDeployment -- stageRolloutWithHelm --> dev-HR-PROJECTID-COMPONENTID-feature-next
%% styles
classDef classBitBucket fill:#2684FF22,stroke:#2684FF,stroke-width:4px
classDef classBitBucketProject fill:#2684FF22,stroke:#2684FF
classDef classBuildConfig fill:#00408022,stroke:#004080
classDef classDeployment fill:#00408022,stroke:#004080
classDef classDeploymentConfig fill:#00408022,stroke:#004080
classDef classHelmRelease fill:#2b9af322,stroke:#2b9af3
classDef classImageStream fill:#2b9af322,stroke:#2b9af3
classDef classImageStreamTag fill:#2b9af322,stroke:#2b9af3
classDef classOcpProject fill:#ffffff00,stroke:#f00,stroke-width:2px
classDef classOcpResource fill:#ffffff00,stroke:#06c,stroke-width:2px
classDef classOpenShift fill:#ffffff00,stroke:#f00,stroke-width:4px
classDef classRoute fill:#2b9af322,stroke:#2b9af3
classDef classService fill:#6ca10022,stroke:#6ca100
class BitBucket classBitBucket
class bitbucket-PROJECTID,bitbucket-PROJECTID-COMPONENTID classBitBucketProject
class dev-HR-PROJECTID-COMPONENTID-feature-next classHelmRelease
class cd-IS-PROJECTID-COMPONENTID-feature-next classImageStream
class ods,PROJECTID-cd,PROJECTID-dev classOcpProject
class Jenkins,Jenkinsfile,sonarqube,webhook-proxy classOcpResource
class openshift-dev classOpenShift
```
```mermaid
%% If the Mermaid diagram is not rendered (as is the case on BitBucket), it can be viewed at https://mermaid.live/
flowchart TB
subgraph openshift-dev["OpenShift (DEV)"]
subgraph PROJECTID-cd
subgraph jenkins["Jenkins"]
subgraph jenkinsfile-PROJECTID-COMPONENTID["Jenkinsfile (PROJECTID-COMPONENTID)"]
stageInitialize-->stageInstallDependency-->stageVersioning-->stageWorkaroundFindOpenShiftImageOrElse --> stageAnalyzeCode-->odsComponentStageScanWithSonar-->stageBuild-->stageDeploy-->odsComponentStageBuildOpenShiftImage-->stageWorkaroundUnitTest-->stageWorkaroundRolloutDeployment-->stageRelease
end
build-with-parameters("<strong>Build with Parameters</strong>\nenvironment: dev\nversion: 20220518.001"):::classManualTask
subgraph jenkinsfile-releasemanager["Jenkinsfile (Release Manager)"]
InitStage --> BuildStage --> DeployStage --> TestStage --> ReleaseStage --> FinalizeStage
end
end
subgraph cd-IS-COMPONENTID["COMPONENTID (Image Stream)"]
cd-IST-COMPONENTID["COMPONENTID:20220518.001"]:::classImageStreamTag
end
end
subgraph ods
subgraph sonarqube["SonarQube"]
direction TB
ods-DC-sonarqube["sonarqube (DeploymentConfig)"]:::classDeploymentConfig <-. Port 9000 .-> ods-S-sonarqube["sonarqube (Service)"]:::classService <-. Port 9000 .-> ods-RT-sonarqube["sonarqube (Route)\nhttps://sonarqube-ods.apps.OPENSHIFT_DOMAIN_DEV"]:::classRoute
end
end
subgraph aqua
aqua-aqua["Aqua Container Security"]
end
subgraph PROJECTID-dev
subgraph dev-HR-COMPONENTID["COMPONENTID (Helm Release)"]
direction TB
dev-D-COMPONENTID["COMPONENTID (Deployment)"]:::classDeployment <-. Port 8080 .-> dev-S-COMPONENTID["COMPONENTID (Service)"]:::classService <-. Port 8080 .-> dev-RT-COMPONENTID["COMPONENTID (Route)\nhttps://PROJECTID-dev.apps.OPENSHIFT_DOMAIN_DEV"]:::classRoute
end
end
end
subgraph BitBucket
subgraph bitbucket-PROJECTID["PROJECTID (Project)"]
subgraph bitbucket-PROJECTID-COMPONENTID["PROJECTID-COMPONENTID (Repo)"]
bitbucket-PROJECTID-COMPONENTID-branch-master["master (Branch)"]
bitbucket-PROJECTID-COMPONENTID-branch-release["release/20220518.001 (Branch)"]
end
subgraph bitbucket-PROJECTID-releasemanager["PROJECTID-releasemanager (Repo)"]
bitbucket-PROJECTID-releasemanager-branch-master["master (Branch)"]
end
end
end
aqua-aqua -- scan --> cd-IST-COMPONENTID
cd-IST-COMPONENTID --> dev-D-COMPONENTID
bitbucket-PROJECTID-COMPONENTID-branch-master -- pull --> jenkinsfile-PROJECTID-COMPONENTID
bitbucket-PROJECTID-releasemanager-branch-master -- pull --> jenkinsfile-releasemanager
BuildStage -- trigger --> jenkinsfile-PROJECTID-COMPONENTID
build-with-parameters -. trigger .-> InitStage
FinalizeStage -- push --> bitbucket-PROJECTID-COMPONENTID-branch-release
FinalizeStage -- push --> bitbucket-PROJECTID-releasemanager-branch-master
odsComponentStageBuildOpenShiftImage -- push --> cd-IST-COMPONENTID
odsComponentStageBuildOpenShiftImage -. trigger .-> aqua-aqua -. result .-> odsComponentStageBuildOpenShiftImage
odsComponentStageScanWithSonar -. trigger .-> sonarqube -. result .-> odsComponentStageScanWithSonar
stageWorkaroundRolloutDeployment -- "Rollout with Helm" --> dev-HR-COMPONENTID
%% styles
classDef classBitBucket fill:#2684FF22,stroke:#2684FF,stroke-width:4px
classDef classBitBucketProject fill:#2684FF22,stroke:#2684FF
classDef classBuildConfig fill:#00408022,stroke:#004080
classDef classDeployment fill:#00408022,stroke:#004080
classDef classDeploymentConfig fill:#00408022,stroke:#004080
classDef classHelmRelease fill:#2b9af322,stroke:#2b9af3
classDef classImageStream fill:#2b9af322,stroke:#2b9af3
classDef classImageStreamTag fill:#2b9af322,stroke:#2b9af3
classDef classOcpProject fill:#ffffff00,stroke:#f00,stroke-width:2px
classDef classOcpResource fill:#ffffff00,stroke:#06c,stroke-width:2px
classDef classOpenShift fill:#ffffff00,stroke:#f00,stroke-width:4px
classDef classRoute fill:#2b9af322,stroke:#2b9af3
classDef classService fill:#6ca10022,stroke:#6ca100
classDef classManualTask fill:#65bd1022,stroke:#65bd10,stroke-width:4px
class BitBucket classBitBucket
class bitbucket-PROJECTID,bitbucket-PROJECTID-COMPONENTID,bitbucket-PROJECTID-releasemanager classBitBucketProject
class dev-HR-COMPONENTID classHelmRelease
class cd-IS-COMPONENTID classImageStream
class aqua,ods,PROJECTID-cd,PROJECTID-dev classOcpProject
class aqua-aqua,jenkins,jenkinsfile-PROJECTID-COMPONENTID,jenkinsfile-releasemanager,sonarqube,webhook-proxy classOcpResource
class openshift-dev classOpenShift
```
```mermaid
%% If the Mermaid diagram is not rendered (as is the case on BitBucket), it can be viewed at https://mermaid.live/
flowchart TB
subgraph openshift-dev["OpenShift (DEV)"]
subgraph PROJECTID-cd
subgraph jenkins["Jenkins"]
build-with-parameters{{"<strong>Build with Parameters</strong>\nenvironment: qa\nversion: 20220518.001"}}:::classManualTask
subgraph jenkinsfile-releasemanager["Jenkinsfile (Release Manager)"]
InitStage --> BuildStage --> DeployStage --> TestStage --> ReleaseStage --> FinalizeStage
end
subgraph jenkinsfile-PROJECTID-COMPONENTID["Jenkinsfile (PROJECTID-COMPONENTID)"]
stageInitialize --> stageInstallDependency --> stageVersioning --> stageWorkaroundFindOpenShiftImageOrElse --> stageWorkaroundUnitTest --> stageWorkaroundRolloutDeployment --> stageRelease
end
end
subgraph cd-IS-COMPONENTID["COMPONENTID (Image Stream)"]
cd-IST-COMPONENTID["COMPONENTID:20220518.001"]:::classImageStreamTag
end
end
subgraph PROJECTID-test
subgraph test-HR-COMPONENTID["COMPONENTID (Helm Release)"]
test-D-COMPONENTID["COMPONENTID (Deployment)"]:::classDeployment <-. Port 8080 .-> test-S-COMPONENTID["COMPONENTID (Service)"]:::classService <-. Port 8080 .-> test-RT-COMPONENTID["COMPONENTID (Route)\nhttps://PROJECTID-test.apps.OPENSHIFT_DOMAIN_DEV"]:::classRoute
end
end
end
subgraph BitBucket
subgraph bitbucket-PROJECTID["PROJECTID (Project)"]
subgraph bitbucket-PROJECTID-COMPONENTID["PROJECTID-COMPONENTID (Repo)"]
bitbucket-PROJECTID-COMPONENTID-branch-master["master (Branch)"]
merge{{"<strong>Merge into master</strong>\nrelease/20220518.001 (Branch)"}}:::classManualTask
end
subgraph bitbucket-PROJECTID-releasemanager["PROJECTID-releasemanager (Repo)"]
bitbucket-PROJECTID-releasemanager-branch-master["master (Branch)"]
end
end
end
cd-IST-COMPONENTID <-.-> stageWorkaroundFindOpenShiftImageOrElse
cd-IST-COMPONENTID --> test-D-COMPONENTID
bitbucket-PROJECTID-COMPONENTID-branch-master -- pull --> jenkinsfile-PROJECTID-COMPONENTID
bitbucket-PROJECTID-releasemanager-branch-master -- pull --> jenkinsfile-releasemanager
merge --> bitbucket-PROJECTID-COMPONENTID-branch-master["master (Branch)"]
BuildStage -- trigger --> jenkinsfile-PROJECTID-COMPONENTID
build-with-parameters -. trigger .-> InitStage
FinalizeStage -- push --> bitbucket-PROJECTID-releasemanager-branch-master
stageWorkaroundRolloutDeployment -- "Rollout with Helm" --> test-HR-COMPONENTID
%% styles
classDef classBitBucket fill:#2684FF22,stroke:#2684FF,stroke-width:4px
classDef classBitBucketProject fill:#2684FF22,stroke:#2684FF
classDef classBuildConfig fill:#00408022,stroke:#004080
classDef classDeployment fill:#00408022,stroke:#004080
classDef classDeploymentConfig fill:#00408022,stroke:#004080
classDef classHelmRelease fill:#2b9af322,stroke:#2b9af3
classDef classImageStream fill:#2b9af322,stroke:#2b9af3
classDef classImageStreamTag fill:#2b9af322,stroke:#2b9af3
classDef classOcpProject fill:#ffffff00,stroke:#f00,stroke-width:2px
classDef classOcpResource fill:#ffffff00,stroke:#06c,stroke-width:2px
classDef classOpenShift fill:#ffffff00,stroke:#f00,stroke-width:4px
classDef classRoute fill:#2b9af322,stroke:#2b9af3
classDef classService fill:#6ca10022,stroke:#6ca100
classDef classManualTask fill:#65bd1022,stroke:#65bd10,stroke-width:4px
class BitBucket classBitBucket
class bitbucket-PROJECTID,bitbucket-PROJECTID-COMPONENTID,bitbucket-PROJECTID-releasemanager classBitBucketProject
class test-HR-COMPONENTID classHelmRelease
class cd-IS-COMPONENTID classImageStream
class aqua,ods,PROJECTID-cd,PROJECTID-test classOcpProject
class aqua-aqua,jenkins,jenkinsfile-PROJECTID-COMPONENTID,jenkinsfile-releasemanager,sonarqube,webhook-proxy classOcpResource
class openshift-dev classOpenShift
```
```mermaid
%% If the Mermaid diagram is not rendered (as is the case on BitBucket), it can be viewed at https://mermaid.live/
flowchart TB
subgraph openshift-dev["OpenShift (DEV)"]
subgraph PROJECTID-cd
subgraph jenkins["Jenkins"]
build-with-parameters{{"<strong>Build with Parameters</strong>\nenvironment: prod\nversion: 20220518.001"}}:::classManualTask
subgraph jenkinsfile-releasemanager["Jenkinsfile (Release Manager)"]
InitStage --> BuildStage --> DeployStage --> TestStage --> ReleaseStage --> FinalizeStage
end
subgraph jenkinsfile-PROJECTID-COMPONENTID["Jenkinsfile (PROJECTID-COMPONENTID)"]
stageInitialize --> stageInstallDependency --> stageVersioning --> stageWorkaroundFindOpenShiftImageOrElse --> stageWorkaroundUnitTest --> stageWorkaroundRolloutDeployment --> stageRelease
end
end
subgraph cd-IS-COMPONENTID["COMPONENTID (Image Stream)"]
cd-IST-COMPONENTID["COMPONENTID:20220518.001"]:::classImageStreamTag
end
end
end
subgraph openshift-prod["OpenShift (PROD)"]
subgraph PROJECTID-prod
subgraph prod-HR-COMPONENTID["COMPONENTID (Helm Release)"]
prod-D-COMPONENTID["COMPONENTID (Deployment)"]:::classDeployment <-. Port 8080 .-> prod-S-COMPONENTID["COMPONENTID (Service)"]:::classService <-. Port 8080 .-> prod-RT-COMPONENTID["COMPONENTID (Route)\nhttps://PROJECTID.apps.OPENSHIFT_DOMAIN_PROD"]:::classRoute
end
end
end
subgraph BitBucket
subgraph bitbucket-PROJECTID["PROJECTID (Project)"]
subgraph bitbucket-PROJECTID-COMPONENTID["PROJECTID-COMPONENTID (Repo)"]
bitbucket-PROJECTID-COMPONENTID-branch-master["master (Branch)"]
end
subgraph bitbucket-PROJECTID-releasemanager["PROJECTID-releasemanager (Repo)"]
bitbucket-PROJECTID-releasemanager-branch-master["master (Branch)"]
end
end
end
cd-IST-COMPONENTID --> prod-D-COMPONENTID
cd-IST-COMPONENTID <-.-> stageWorkaroundFindOpenShiftImageOrElse
bitbucket-PROJECTID-COMPONENTID-branch-master -- pull --> jenkinsfile-PROJECTID-COMPONENTID
bitbucket-PROJECTID-releasemanager-branch-master -- pull --> jenkinsfile-releasemanager
BuildStage -- trigger --> jenkinsfile-PROJECTID-COMPONENTID
build-with-parameters -. trigger .-> InitStage
FinalizeStage -- push --> bitbucket-PROJECTID-releasemanager-branch-master
stageWorkaroundRolloutDeployment -- "Rollout with Helm" --> prod-HR-COMPONENTID
%% styles
classDef classBitBucket fill:#2684FF22,stroke:#2684FF,stroke-width:4px
classDef classBitBucketProject fill:#2684FF22,stroke:#2684FF
classDef classBuildConfig fill:#00408022,stroke:#004080
classDef classDeployment fill:#00408022,stroke:#004080
classDef classDeploymentConfig fill:#00408022,stroke:#004080
classDef classHelmRelease fill:#2b9af322,stroke:#2b9af3
classDef classImageStream fill:#2b9af322,stroke:#2b9af3
classDef classImageStreamTag fill:#2b9af322,stroke:#2b9af3
classDef classOcpProject fill:#ffffff00,stroke:#f00,stroke-width:2px
classDef classOcpResource fill:#ffffff00,stroke:#06c,stroke-width:2px
classDef classOpenShift fill:#ffffff00,stroke:#f00,stroke-width:4px
classDef classRoute fill:#2b9af322,stroke:#2b9af3
classDef classService fill:#6ca10022,stroke:#6ca100
classDef classManualTask fill:#65bd1022,stroke:#65bd10,stroke-width:4px
class BitBucket classBitBucket
class bitbucket-PROJECTID,bitbucket-PROJECTID-COMPONENTID,bitbucket-PROJECTID-releasemanager classBitBucketProject
class prod-HR-COMPONENTID classHelmRelease
class cd-IS-COMPONENTID classImageStream
class aqua,ods,PROJECTID-cd,PROJECTID-prod classOcpProject
class aqua-aqua,jenkins,jenkinsfile-PROJECTID-COMPONENTID,jenkinsfile-releasemanager,sonarqube,webhook-proxy classOcpResource
class openshift-dev,openshift-prod classOpenShift
```
💡 From time to time, obsolete resources should be cleaned up. Ideally this would be automated at a later time; however, this is not yet possible at the moment, because the webhook-proxy captures the deleted event and cannot be further customized, see: https://github.com/opendevstack/ods-core/blob/99d26527df60fbb4d72ba15a8c233e325ff37fe1/jenkins/webhook-proxy/main.go#L541-L556
```sh
git checkout master

# Clean outdated branches
git fetch --prune

# Delete all local branches except the current branch (e.g. master)
git branch | grep --invert-match '^*' | xargs git branch -D

# Delete all remote branches except master (may take some time)
# Skip git hooks with '--no-verify'
git branch -r | grep 'origin' | grep --invert-match 'master$' | grep --invert-match HEAD | cut -d/ -f2- | while read line; do git push --no-verify origin :heads/$line; done;
```

```sh
# Delete all local tags that do NOT match the pattern of a semantic version (MAJOR.MINOR.PATCH), e.g. ods-generated-v20220518.001, v1.0.0-next.5
git tag -l | grep --invert-match '^v[[:digit:]]*.[[:digit:]]*.[[:digit:]]*$' | xargs git tag -d

# Delete all remote tags that do NOT match the pattern of a semantic version (MAJOR.MINOR.PATCH), e.g. ods-generated-v20220518.001, v1.0.0-next.5
# Skip git hooks with '--no-verify'
git ls-remote --tags origin | cut -d/ -f3- | grep --invert-match '^v[[:digit:]]*.[[:digit:]]*.[[:digit:]]*$' | grep -v '}$' | xargs git push --delete --no-verify origin
```

With the approach of making each feature available as a newly deployed environment for testing before it is merged into the master branch, a number of environments accumulate in OpenShift over time. The easiest way to delete these is with the following commands:
```sh
# Login
oc login --server=https://api.OPENSHIFT_DOMAIN_DEV:6443 --token=123...456

# Switch to Project
oc project PROJECTID-dev

# Delete/Uninstall all feature charts
helm list | grep -e 'COMPONENTID' | cut -f1 | xargs helm uninstall

# Delete all other feature resources
oc get all --output jsonpath='{range .items[*]}{"oc delete "}{.kind}{" "}{.metadata.name}{" "}{"\n"}{end}' | grep -- "COMPONENTID-feature-" | while read -r line; do eval $line; done
```

Feature Pipelines: Since OpenShift 4, Jenkins pipelines are treated as BuildConfig resources. Unfortunately, with ODS@4.x the Jenkins stage odsComponentStageBuildOpenShiftImage also creates all builds as BuildConfig resources in the cd project, without any labels to tell them apart. The distinction is not obvious at first view, but can be figured out via the configuration parameter .spec.strategy.type (JenkinsPipeline vs Docker).
Release Pipelines: Can be deleted without any problems, since they do not create any further resources; instead, they are directly cancelled due to [skip ci] in the commit message.
ODS Quickstarter: Can be deleted without any problems after successful creation, as there is no further need for them and they only take up resources unnecessarily.
```sh
# Login
oc login --server=https://api.OPENSHIFT_DOMAIN_DEV:6443 --token=123...456

# Switch Project
oc project PROJECTID-cd

# Delete all feature pipelines (may take some time)
oc get bc --output jsonpath='{range .items[*]}{.metadata.name}{"\t"}{.spec.strategy.type}{"\n"}{end}' | grep -e "JenkinsPipeline" | cut -f1 | grep -e "COMPONENTID-feature-" | while read -r line; do oc delete bc $line && sleep 10s; done

# Delete all release pipelines (may take some time)
oc get bc --output custom-columns=NAME:.metadata.name | grep -e "COMPONENTID-release-" | while read -r line; do oc delete bc $line && sleep 10s; done

# Delete all ods quickstarter pipelines (may take some time)
oc get bc --output custom-columns=NAME:.metadata.name | grep -e "ods-qs-" | while read -r line; do oc delete bc $line && sleep 10s; done
```

- Improve Documentation
- Implement Android
- Implement iOS
- Implement Ionic Appflow
- Improve Testing
How do I find out which Jenkins agents with Node.js are available in my ODS@4.x setup?
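A minimal sketch of how to answer this, assuming you are logged in to the dev cluster and that the Jenkins agent image streams live in the central ods namespace (as the image reference in the error message further below suggests):

```sh
# List the available Jenkins agent image streams, e.g. jenkins-agent-nodejs16
oc get imagestreams --namespace ods | grep jenkins-agent
```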
How to find the oc login token
- Go to https://oauth-openshift.apps.OPENSHIFT_DOMAIN_DEV/oauth/token/display
- Click on `Display token` or `Request another token`
A Pull Request shows a merge conflict on chart/Chart.yaml, chart/values.yaml, CHANGELOG.md, metadata.yml, package-lock.json, package.json, README.md
This happens mainly when a new feature branch has already been created from the master branch before the Jenkins job has successfully completed with a release commit.
To avoid resolving all merge conflicts manually, you can specify the --strategy-option=theirs option in the merge command to automatically accept all incoming changes.
Solution:
```sh
# Merge the remote master branch into the current one without opening a text editor (accept the auto-generated message) and accept all incoming changes on merge conflicts
git merge origin/master --no-edit --strategy-option=theirs

# (Optional) Add `skip ci` command to the previous merge commit
git commit --amend -m "$(git log --format=%s --max-count=1) [skip ci]"

# Push the changes to the remote repository
git push
```

The Jenkins pipeline does not start and shows the following error message: [Failed] Failed to pull image "image-registry.openshift-image-registry.svc:5000/ods/jenkins-agent-nodejs16:4.x" ... [Failed] Error: ImagePullBackOff
It might happen that your ODS@4.x setup only provides a Jenkins agent with Node.js 12.x. However, in order to work with the latest version and have potential security holes closed, a Jenkins agent with the latest Node.js version is required for the build process in the CI/CD pipeline.
FAQ: How do I find out which Jenkins agents with Node.js are available in my ODS@4.x setup?
In case it does not exist yet, it can easily be created with the following commands.
FAQ: How to find the oc login token
Solution:
```sh
# Login to OpenShift dev instance
oc login --server=https://api.OPENSHIFT_DOMAIN_DEV:6443 --token=123...456

# Switch project
oc project PROJECTID-cd

# Provision jenkins-agent-nodejs-16
oc process -f https://raw.githubusercontent.com/SimonGolms/ods-jenkins-agent-nodejs/main/jenkins-agent-nodejs-16-template.yaml | oc create -f -
```

For more information about the Jenkins agent, see: https://github.com/SimonGolms/ods-jenkins-agent-nodejs
The Release Manager finishes the release to the qa/test environment with the following yellow message: Finished: UNSTABLE
This state is already set at the beginning by the following message in the Jenkins log: WARN: app@<GIT-HASH-1> is NOT a descendant of <GIT-HASH-2>, which has previously been promoted to 'Q'. If <GIT-HASH-2> has been promoted to 'P' as well, promotion to 'P' will fail. Proceed with caution.
Before you can deploy a release into the qa/test environment, you need to merge the release branch into your master branch to pass the checks in the Jenkins shared library stage odsOrchestrationPipeline; see the comment in metadata.yml for more details.
Solution:
- Merge the release branch into master

  ```sh
  # Switch to master branch
  git checkout master

  # Merge the remote release branch into master without opening a text editor and accept the auto-generated message
  git merge origin/release/<VERSION> --no-edit

  # Push the changes to the remote repository
  git push
  ```
- Repeat step 1 for all other relevant code repositories which are also specified in the `metadata.yml` of the Release Manager code repository and are rolled out with helm, like a backend.

- In the Release Manager code repository, fix the inconsistent ods state by deleting the `./ods-state` folder:

  ```sh
  # Switch to master branch
  git checkout master

  # Fetch latest state to match the remote branch
  git pull

  # Remove inconsistent ods state
  rm -rf ods-state

  # Commit all changes
  git commit --all --message="chore(ods): remove inconsistent state"

  # Push the changes to the remote repository
  git push
  ```
- Re-run the Release Manager pipeline with a new version. This time, be sure to merge the release branch into `master` before the further rollout towards the `qa`/`test` environment!
When committing via the VS Code interface, the following error message appears: `.husky/commit-msg: 4: npx: not found`
Since we are using nvm as our version manager for Node.js, we first need to tell husky where to find the appropriate binaries. This is done by creating and configuring the file ~/.huskyrc. Further information: https://typicode.github.io/husky/#/?id=command-not-found
Solution:
- Create the ~/.huskyrc file with the nvm configuration

  ```sh
  cat > ~/.huskyrc << EOF
  # This loads nvm.sh and sets the correct PATH before running hook
  export NVM_DIR="$HOME/.nvm"
  [ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
  EOF
  ```
- Restart VS Code
Simon Golms:
- Digital Card: `npx simongolms`
- Github: @SimonGolms
- Website: gol.ms
Contributions, issues and feature requests are welcome!
Give a ⭐️ if this project helped you!
Copyright Β© 2022 Simon Golms.
This project is Apache-2.0 licensed.