
Commit 05fb6ec

Grammar and metaphor consistency
1 parent 38cc775 commit 05fb6ec

5 files changed: 29 additions, 29 deletions

content/modernizr/00-preparation/index.en.md

Lines changed: 1 addition & 1 deletion
@@ -53,7 +53,7 @@ Confirm the settings and initialize the connection:

## Step 4: Validating the Integration

-Verify the Bedrock connection is functioning correctly by sending the below test prompt to Cline. If you encounter an error asking you to please wait before trying again, press "proceed anyway" to retry the request.
+Verify the Bedrock connection is functioning correctly by sending the below test prompt to Cline. If you are throttled, press "proceed anyway" to retry the request.

```terminal
Hello and Welcome to this modernization project, can you confirm you can read and list all the files in the workspace?
```

content/modernizr/01-modernization/index.en.md

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@ The e-commerce application follows a standard three-tier architecture:
2. **Frontend Application** - React-based user interface for customer interactions
3. **MySQL Database** - Relational database storing all application data

-You'll need to start both the backend and frontend services to establish a baseline for analysis.
+While your application environment is already preconfigured, you'll still need to start both the backend and frontend services to establish a baseline for analysis.

### Initializing the Backend Service

content/modernizr/01-modernization/workflow-00.en.md

Lines changed: 17 additions & 17 deletions
@@ -8,51 +8,51 @@ chapter: false

## Your 7-Step Journey to Database Modernization

-Think of database modernization like renovating a house while people are still living in it. You can't just tear everything down and start over - you need a careful, step-by-step plan that keeps everything working while you upgrade piece by piece. That's exactly what our modernization workflow does!
+Think of database modernization like renovating a house while people are still living in it. You can't just tear everything down and start over; you need a careful, step-by-step plan that keeps everything working while you upgrade piece by piece. That's exactly what our modernization workflow does!

-Our process consists of seven carefully designed stages, each one building on the previous step. It's like following a recipe - each ingredient needs to be added at the right time and in the right order to get the perfect result.
+Our process consists of seven carefully designed stages, each one building on the previous step. It's like following a recipe: each ingredient needs to be added at the right time and in the right order to get the perfect result.

![Modernization workflow](/static/images/modernizr/1/workflow-base-01.png)

-## Stage 1: Understanding What You Have - Database Detective Work
+## Stage 1: Understanding What You Have (Database Detective Work)

-The first stage is like being a detective investigating the current system. We need to understand everything about how the existing MySQL database works before we can improve it. This involves connecting to the database, examining how fast different operations run, looking at the structure of all the tables, and studying the application code to understand exactly how data flows through the system.
+The first stage is like being a detective investigating the current system. We need to understand everything about how the existing MySQL database works before we can improve it. This involves connecting to the database, examining which predicates are used in queries and what data is returned, measuring how fast different operations run, looking at the structure of all the tables, and studying the application code to understand exactly how data flows through the system.

-Think of it like a mechanic who needs to understand every part of your current engine before they can recommend which parts to upgrade. We use a specialized AI tool (the MySQL MCP server) to help us gather all this information systematically.
+Think of it like doing a walkthrough with a designer before your remodel. They need to understand your taste and how you use your space to fully capture the requirements for the project. We use a specialized AI tool (the MySQL MCP server) to help us gather all this information systematically.

-## Stage 2: Designing the New Blueprint - Creating Your DynamoDB Model
+## Stage 2: Designing the New Blueprint (Creating Your DynamoDB Model)

-This is where the real design work happens! Using all the information we gathered in Stage 1, we create a completely new data model designed specifically for DynamoDB. This stage is highly interactive - you'll work closely with the AI to make important decisions about how to structure your data.
+This is where the real design work happens! Using all the information we gathered in Stage 1, we create a completely new data model designed specifically for DynamoDB. This stage is highly interactive: you'll work closely with the AI to make important decisions about how to structure your data.

It's like working with an architect to design your dream house renovation. The AI provides technical expertise and suggestions, but you need to guide the process and make the final decisions about what works best for your specific needs. This collaboration ensures the new design fits your application's requirements.

-## Stage 3: Building the Bridge - Creating a Database Abstraction Layer
+## Stage 3: Building the Bridge (Creating a Database Abstraction Layer)

Now we create a special "bridge" layer in your application code that can talk to both the old MySQL database and the new DynamoDB system at the same time. This follows AWS best practices and ensures you can switch between systems safely without breaking anything.

-Think of this like installing a smart electrical panel in your house that can work with both the old wiring and new smart home devices. Everything continues to work normally while you prepare for the upgrade.
+Think of this like renovating the guest room before you renovate the primary bedroom: you'll always have somewhere to sleep. Similarly, with a database abstraction layer, everything continues to work normally while you prepare for the upgrade.
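
To make the bridge idea concrete, here is a minimal TypeScript sketch of such an abstraction layer, assuming a repository pattern with one implementation per database. The entity, table, and class names (`Order`, `Orders`, `MySqlOrderRepository`, `DynamoOrderRepository`) are invented for illustration and are not the workshop's actual code.

```typescript
// Illustrative sketch only: the application depends on one interface and never
// on a specific database, so MySQL and DynamoDB implementations are swappable.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, GetCommand } from "@aws-sdk/lib-dynamodb";
import mysql from "mysql2/promise";

interface Order {
  orderId: string;
  customerId: string;
  total: number;
}

// The "bridge": application code talks only to this interface.
interface OrderRepository {
  getOrder(orderId: string): Promise<Order | null>;
}

class MySqlOrderRepository implements OrderRepository {
  constructor(private pool: mysql.Pool) {}

  async getOrder(orderId: string): Promise<Order | null> {
    const [rows] = await this.pool.query(
      "SELECT order_id, customer_id, total FROM orders WHERE order_id = ?",
      [orderId]
    );
    const row = (rows as any[])[0];
    return row
      ? { orderId: row.order_id, customerId: row.customer_id, total: row.total }
      : null;
  }
}

class DynamoOrderRepository implements OrderRepository {
  private doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));

  async getOrder(orderId: string): Promise<Order | null> {
    const { Item } = await this.doc.send(
      new GetCommand({ TableName: "Orders", Key: { PK: `ORDER#${orderId}` } })
    );
    return (Item as Order | undefined) ?? null;
  }
}
```

Because the rest of the application only sees `OrderRepository`, switching from MySQL to DynamoDB (or running both during the transition) becomes a wiring decision rather than a rewrite.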

-## Stage 4: Testing the Connection - Validating DynamoDB Integration
+## Stage 4: Testing the Connection (Validating DynamoDB Integration)

-In this stage, we set up a local version of DynamoDB and test our bridge layer to make sure everything works correctly. It's like doing a test run of your renovated house systems before you actually move in permanently.
+In this stage, we set up a local version of DynamoDB and test our bridge layer to make sure everything works correctly. It's like the city inspector making sure your renovated house systems are up to code before you complete the project.

We validate that all the connections work properly and that data flows correctly through both systems. This gives us confidence that everything is ready for the next phase.
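
As a rough illustration of what this validation can look like, the sketch below points the AWS SDK for JavaScript v3 at a DynamoDB Local endpoint and lists the tables it finds. The endpoint, region, credentials, and `Orders` table name are assumptions, not the workshop's actual configuration.

```typescript
// A minimal connectivity check against DynamoDB Local on its default port.
import {
  DynamoDBClient,
  ListTablesCommand,
  DescribeTableCommand,
} from "@aws-sdk/client-dynamodb";

const local = new DynamoDBClient({
  region: "us-east-1",
  endpoint: "http://localhost:8000", // point the SDK at DynamoDB Local
  credentials: { accessKeyId: "fake", secretAccessKey: "fake" }, // DynamoDB Local accepts any credentials
});

async function validateLocalSetup(): Promise<void> {
  const { TableNames } = await local.send(new ListTablesCommand({}));
  console.log("Tables available locally:", TableNames);

  if (TableNames?.includes("Orders")) {
    const { Table } = await local.send(
      new DescribeTableCommand({ TableName: "Orders" })
    );
    console.log("Orders table status:", Table?.TableStatus);
  }
}

validateLocalSetup().catch(console.error);
```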

-## Stage 5: Running Both Systems - Application Refactoring and Dual Writes
+## Stage 5: Running Both Systems (Application Refactoring and Dual Writes)

This is the most complex stage, where your application learns to write data to both databases simultaneously. We use a method called "test-driven development," which means we write tests first to ensure everything works correctly, then modify the code to pass those tests.

-During this stage, we also create a special admin control panel that lets you monitor and control the modernization process. It's like having a control room where you can watch both the old and new systems running side by side and manage the transition safely.
+During this stage, we also create a special admin control panel that lets you monitor and control the modernization process. You can watch both the old and new systems running side by side and manage the transition safely.
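
A heavily simplified dual-write sketch is shown below: MySQL remains the system of record, and the same write is mirrored to DynamoDB behind an environment-variable flag. The table, column, function, and flag names are illustrative only, not taken from the workshop code.

```typescript
// Simplified dual-write sketch: write to MySQL first, then mirror to DynamoDB.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, PutCommand } from "@aws-sdk/lib-dynamodb";
import mysql from "mysql2/promise";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const DUAL_WRITE_ENABLED = process.env.DUAL_WRITE_ENABLED === "true"; // hypothetical flag

async function createOrder(
  pool: mysql.Pool,
  order: { orderId: string; customerId: string; total: number }
): Promise<void> {
  // 1. Write to the system of record first.
  await pool.execute(
    "INSERT INTO orders (order_id, customer_id, total) VALUES (?, ?, ?)",
    [order.orderId, order.customerId, order.total]
  );

  // 2. Mirror the write to DynamoDB when the flag is on; log failures rather than
  //    failing the request, so the legacy path keeps working during the transition.
  if (DUAL_WRITE_ENABLED) {
    try {
      await doc.send(
        new PutCommand({
          TableName: "Orders",
          Item: { PK: `ORDER#${order.orderId}`, ...order },
        })
      );
    } catch (err) {
      console.error("Dual write to DynamoDB failed; MySQL remains authoritative", err);
    }
  }
}
```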

-## Stage 6: Moving to the Cloud - Deploying the cloud Infrastructure
+## Stage 6: Moving to the Cloud (Deploying the Cloud Infrastructure)

-Once everything is tested and working locally, we deploy your new DynamoDB tables to the actual AWS cloud environment. This is like moving your furniture into your newly renovated house - everything needs to be in the right place and working properly.
+Once everything is tested and working locally, we deploy your new DynamoDB tables to the actual AWS cloud environment. You've finally got your Certificate of Occupancy!
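
As a rough idea of what the provisioned table could look like as infrastructure-as-code, here is a hedged AWS CDK (TypeScript) sketch of a single on-demand DynamoDB table. The stack name, table name, and key schema are placeholders; the workshop's own deployment templates may differ.

```typescript
// Hedged sketch of a DynamoDB table defined with AWS CDK.
import { App, Stack, RemovalPolicy } from "aws-cdk-lib";
import { AttributeType, BillingMode, Table } from "aws-cdk-lib/aws-dynamodb";

const app = new App();
const stack = new Stack(app, "ModernizrDataStack"); // placeholder stack name

new Table(stack, "OrdersTable", {
  tableName: "Orders",                                     // placeholder table name
  partitionKey: { name: "PK", type: AttributeType.STRING },
  sortKey: { name: "SK", type: AttributeType.STRING },
  billingMode: BillingMode.PAY_PER_REQUEST, // on-demand capacity: no throughput planning needed
  removalPolicy: RemovalPolicy.DESTROY,     // fine for a workshop; prefer RETAIN in production
});
```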

The deployment process ensures your cloud infrastructure is set up correctly and ready to handle real traffic.

-## Stage 7: The Great Migration - Moving Your Data
+## Stage 7: The Great Migration (Moving Your Data)

-The final stage is where we actually move all your existing data from MySQL to DynamoDB. This is carefully controlled and monitored - you decide when you're ready to start using dual writes, and then we gradually migrate all your historical data.
+The final stage is where we actually move all your existing data from MySQL to DynamoDB. This is carefully controlled and monitored: you decide when you're ready to start using dual writes, and then we gradually migrate all your historical data.

We use specialized data processing tools (like the AWS Glue MCP Server) to handle this migration safely and efficiently. It's like having professional movers who ensure all your belongings get to the new house safely and end up in exactly the right places.
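
The workshop drives this stage through the AWS Glue MCP Server rather than hand-written scripts. Purely to show the shape of a backfill, here is a deliberately simplified TypeScript sketch that copies rows from MySQL into DynamoDB in batches of 25; the table and column names are assumptions.

```typescript
// Simplified backfill sketch: page through MySQL rows and batch-write to DynamoDB.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, BatchWriteCommand } from "@aws-sdk/lib-dynamodb";
import mysql from "mysql2/promise";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function migrateOrders(pool: mysql.Pool): Promise<void> {
  const pageSize = 25; // DynamoDB BatchWrite accepts at most 25 items per request

  for (let offset = 0; ; offset += pageSize) {
    const [rows] = await pool.query(
      "SELECT order_id, customer_id, total FROM orders ORDER BY order_id LIMIT ? OFFSET ?",
      [pageSize, offset]
    );
    const page = rows as any[];
    if (page.length === 0) break;

    const result = await doc.send(
      new BatchWriteCommand({
        RequestItems: {
          Orders: page.map((r) => ({
            PutRequest: {
              Item: { PK: `ORDER#${r.order_id}`, customerId: r.customer_id, total: r.total },
            },
          })),
        },
      })
    );
    // A real migration must retry result.UnprocessedItems and should use keyset
    // pagination instead of OFFSET; both are omitted here for brevity.
    void result;
  }
}
```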

content/modernizr/01-modernization/workflow-01.en.md

Lines changed: 3 additions & 3 deletions
@@ -12,15 +12,15 @@ The `/prompts` directory implements a structured approach to AI-driven developme

![Prompts](/static/images/modernizr/1/workflow-prompt-01.png)

-### Requirements Documentation - Defining Objectives and Success Criteria
+### Requirements Documentation: Defining Objectives and Success Criteria

The Requirements Document establishes the foundational context by articulating the business objectives and technical constraints that drive the modernization initiative. This document defines explicit acceptance criteria and success metrics, creating what software engineers refer to as the "definition of done." By providing comprehensive context about the problem domain, the requirements document enables the LLM to understand not just what needs to be built, but why it needs to be built and how to validate that the implementation meets the specified goals.

-### Design Documentation - Technical Architecture and Implementation Strategy
+### Design Documentation: Technical Architecture and Implementation Strategy

The Design Document serves as the technical specification that translates high-level requirements into concrete architectural decisions and implementation strategies. This document defines the specific methodologies, data structures, and system workflows that will be employed throughout the modernization process. It includes detailed implementation guidelines, architectural patterns, and design rationale that provide the LLM with a comprehensive technical blueprint for executing the modernization according to established software engineering principles.

-### Task Documentation - Executable Implementation Steps
+### Task Documentation: Executable Implementation Steps

The Tasks Document functions as the bridge between abstract architectural design and concrete implementation by decomposing design specifications into discrete, executable development tasks. This document provides sequenced instructions that reference specific files, tools, and expected deliverables, ensuring the LLM receives actionable directives rather than abstract concepts. The task breakdown transforms architectural decisions into manageable development units that can be systematically executed and validated.

content/modernizr/01-modernization/workflow-02.en.md

Lines changed: 7 additions & 7 deletions
@@ -6,21 +6,21 @@ weight: 33
chapter: false
---

-Each stage generates "Artifacts" deliverables that will be used in the future stages across the solution, this project works sequentially, where you will see how artifacts from initial stages are re-used. Every stage will create a new folder `stage-xx` where all the artifacts will be stored. In addition another file `xx-working_log.md` will be generated, this file is used by the LLM to keep track of the work that has been done so far, consider it as a notepad, or scratch pad.
+Each stage generates "Artifacts": deliverables that will be used in future stages across the solution. This project works sequentially, using output artifacts from initial stages as input for the next. Every stage will create a new folder, `stage-xx`, where all the artifacts will be stored. In addition, another file, `xx-working_log.md`, will be generated; this file is used by the LLM to keep track of the work that has been done so far. Consider it a notepad or scratch pad.

::alert[If you execute this workshop from scratch (as available in the `/clean-start` folder) it will take ~11.5 hours to complete, where most of its time will be spent in application re-factoring (stages 3, 4 and 5). For simplicity and to streamline the duration of this workshop you will have these steps already completed for you.]{type="info"}

# Stage-01 artifacts

-Let's start exploring the artifacts available for the first stage `stage-01`. This stage is focused on capturing the data that is available from the source database and application backend logic. We use the MySQL MCP server to understand table structure, constraints and data, The MySQL query logs, to identify the data velocity and finally we explore the application logic to capture all the access patterns that we will need to modernize.
+Let's start exploring the artifacts available for the first stage, `stage-01`. This stage is focused on capturing the data that is available from the source database and application backend logic. We use the MySQL MCP server to understand table structure, constraints and data. Next we use the MySQL query logs to identify the data velocity. Finally, we explore the application logic to capture all the access patterns that we will need to modernize.
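
As a hypothetical illustration of the kind of schema inspection summarized in `01_3_table_structure_analysis.md`, the sketch below queries MySQL's `information_schema` directly with `mysql2`. In the workshop this information is gathered through the MySQL MCP server instead, and the connection details shown here are assumptions.

```typescript
// Illustrative schema inspection: list columns and foreign-key relationships.
import mysql from "mysql2/promise";

async function describeSchema(databaseName: string): Promise<void> {
  const conn = await mysql.createConnection({
    host: "localhost",                     // assumed connection details
    user: "workshop",
    password: process.env.MYSQL_PASSWORD,
    database: databaseName,
  });

  // Columns and data types for every table in the schema.
  const [columns] = await conn.query(
    `SELECT TABLE_NAME, COLUMN_NAME, COLUMN_TYPE, IS_NULLABLE
       FROM information_schema.COLUMNS
      WHERE TABLE_SCHEMA = ?`,
    [databaseName]
  );

  // Foreign-key relationships, which typically become item collections or
  // denormalized attributes in the DynamoDB model.
  const [foreignKeys] = await conn.query(
    `SELECT TABLE_NAME, COLUMN_NAME, REFERENCED_TABLE_NAME, REFERENCED_COLUMN_NAME
       FROM information_schema.KEY_COLUMN_USAGE
      WHERE TABLE_SCHEMA = ? AND REFERENCED_TABLE_NAME IS NOT NULL`,
    [databaseName]
  );

  console.table(columns as any[]);
  console.table(foreignKeys as any[]);
  await conn.end();
}
```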

![Artifacts](/static/images/modernizr/1/workflow-artifacts-01.png)

-DynamoDB is all about application access patterns that we need to support, how the data in your application will be using the DynamoDB tables. The secret to DynamoDB data modelling is to store the data exactly in the format your application will consume it, while working with different entities if you happen to have some.
+DynamoDB is all about the application access patterns that we need to support. The secret to DynamoDB data modelling is to store data exactly in the format your application will consume it, structuring your data so that it can be read as efficiently as possible with the smallest number of queries.
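
To ground that idea, here is a hedged example of one access pattern ("list a customer's most recent orders") expressed as a single DynamoDB `Query` against a generic `PK`/`SK` key design. The table name and key format are invented for illustration and are not the data model you will build in Stage 2.

```typescript
// One access pattern served by one Query against an example PK/SK design:
// PK = CUSTOMER#<id>, SK = ORDER#<timestamp>.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, QueryCommand } from "@aws-sdk/lib-dynamodb";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function getRecentOrdersForCustomer(customerId: string, limit = 10) {
  const { Items } = await doc.send(
    new QueryCommand({
      TableName: "ecommerce",              // placeholder table name
      KeyConditionExpression: "PK = :pk AND begins_with(SK, :sk)",
      ExpressionAttributeValues: {
        ":pk": `CUSTOMER#${customerId}`,
        ":sk": "ORDER#",
      },
      ScanIndexForward: false, // newest first, because SK sorts by timestamp
      Limit: limit,            // read only what the page actually displays
    })
  );
  return Items ?? [];
}
```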

-Open the files available in the `stage-01` folder, familiarize with them and understand in detail the current application access patterns, remember this is the application logic that you will need to support in the modernized application.
+Open the files available in the `stage-01` folder, familiarize yourself with them, and understand in detail the current application access patterns. This is the application logic that you will need to support in the modernized application.

-- `01_1_API_access_patterns.md` - This file should be your primary source of information, it contains an analysis of the backend code, it should contain at the end a numbered list of 48 different application access patterns! You can also learn more about this project by reading the README available in the front end folder `frontend/README.md` It contains a description of which patters requires authentication and a quick explanation on how to execute the API calls.
-- `01_2_mysql_log_analysis.md` - This file is the MySQL log analysis, it contains a description of how many times some access patterns were detected, we run a small load test to simulate some traffic, and capture meaningful information, please notice the load test does not capture 100% of the application access patterns with the logs. If you plan to use a similar approach for your modrnization process, you need to enable the logs for a period of time that allows you to capture most of your traffic, however you should keep in mind there might be some application access patters that might not be covered, since the API endpoints might not have enough traffic to be collected during the test.
-- `01_3_table_structure_analysis.md` - Uses the MySQL MCP server to explore table structure and identify table contraints, relationships and data format.
+- `01_1_API_access_patterns.md`: This file should be your primary source of information. It contains an analysis of the backend code. When the LLM finishes creating it, it should contain a numbered list of 48 different application access patterns! If you want to better understand these access patterns, you can learn more about this project by reading the README available in the frontend folder, `frontend/README.md`. It contains a description of which patterns require authentication and a quick explanation of how to execute the API calls.
+- `01_2_mysql_log_analysis.md`: This file is the MySQL log analysis, containing a description of how many times different access patterns were detected. We run a small load test to simulate traffic and capture data on the results. Please note that the load test does not capture 100% of the application access patterns in the logs. If you plan to use a similar approach for your modernization process, you should use logs that capture live traffic for a period of time sufficient to capture all required patterns. Keep in mind, though, that there still might be some application access patterns that were not captured if they weren't exercised during the logging window.
+- `01_3_table_structure_analysis.md`: Uses the MySQL MCP server to explore table structure and identify table constraints, relationships, and data format.