This repository was archived by the owner on Jul 29, 2024. It is now read-only.

Commit 7e70b72

initial commit with support for Flink 1.11 and JDK 11.

1 parent 4779217 · commit 7e70b72

5 files changed: +52 -77 lines changed

.classpath

Lines changed: 7 additions & 12 deletions
@@ -13,14 +13,15 @@
 	</classpathentry>
 	<classpathentry kind="src" output="target/test-classes" path="src/test/java">
 		<attributes>
+			<attribute name="test" value="true"/>
 			<attribute name="optional" value="true"/>
 			<attribute name="maven.pomderived" value="true"/>
-			<attribute name="test" value="true"/>
 		</attributes>
 	</classpathentry>
 	<classpathentry kind="src" path="src/text/java"/>
-	<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8">
+	<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.launching.macosx.MacOSXType/Java SE 11.0.6 [11.0.6]">
 		<attributes>
+			<attribute name="module" value="true"/>
 			<attribute name="maven.pomderived" value="true"/>
 		</attributes>
 	</classpathentry>
@@ -31,26 +32,20 @@
 	</classpathentry>
 	<classpathentry kind="src" path="target/generated-sources/annotations">
 		<attributes>
+			<attribute name="ignore_optional_problems" value="true"/>
 			<attribute name="optional" value="true"/>
 			<attribute name="maven.pomderived" value="true"/>
-			<attribute name="ignore_optional_problems" value="true"/>
 			<attribute name="m2e-apt" value="true"/>
 		</attributes>
 	</classpathentry>
-	<classpathentry excluding="**" kind="src" output="target/test-classes" path="src/test/resources">
-		<attributes>
-			<attribute name="maven.pomderived" value="true"/>
-			<attribute name="test" value="true"/>
-		</attributes>
-	</classpathentry>
 	<classpathentry kind="src" output="target/test-classes" path="target/generated-test-sources/test-annotations">
 		<attributes>
+			<attribute name="ignore_optional_problems" value="true"/>
+			<attribute name="test" value="true"/>
 			<attribute name="optional" value="true"/>
 			<attribute name="maven.pomderived" value="true"/>
-			<attribute name="ignore_optional_problems" value="true"/>
 			<attribute name="m2e-apt" value="true"/>
-			<attribute name="test" value="true"/>
 		</attributes>
 	</classpathentry>
 	<classpathentry kind="output" path="target/classes"/>
-</classpath>
+</classpath>

README.md

Lines changed: 7 additions & 25 deletions
@@ -22,8 +22,8 @@ The Architecture of this starter kit is shown in the below diagram
 
 ### Pre-requisites
 
-1. JDK 8
-1. IDE for e.g. [Eclipse](https://www.eclipse.org/) or [Spring Tools](https://spring.io/tools) or [Intellij IDEA](https://www.jetbrains.com/idea/)
+1. JDK 11
+1. IDE for e.g. [Eclipse](https://www.eclipse.org/) or [Spring Tools](https://spring.io/tools) or [Intellij IDEA](https://www.jetbrains.com/idea/)
 1. [Apache Maven](https://maven.apache.org/)
 1. [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html)
 1. This starter kit tested with Apache Flink Version 1.8
@@ -43,33 +43,15 @@ The following AWS services are required to deploy this starter kit:
 
 ## Build Instructions
 
-### Build Apache Flink Connector for Amazon Kinesis
-
-1. Use this command to run the script [build_flink_connector_kinesis.sh](./src/main/resources/build_flink_connector_kinesis.sh)
-
-   ```./build_flink_connector_kinesis.sh PATH_TO_FLINK_SOURCE_CODE 1.8.2 2.11```
-
-1. This will generate ```flink-connector-kinesis_2.11-1.8.2.jar``` under ***PATH_TO_FLINK_SOURCE_CODE/flink-release-1.8.2/flink-connectors/flink-connector-kinesis/target/***
-1. When the above step is successfull, below Maven dependency will be resolved properly in [pom.xml](./pom.xml)
-
-   ```xml
-   <dependency>
-      <groupId>org.apache.flink</groupId>
-      <artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
-      <version>${flink.version}</version>
-   </dependency>
-   ```
-
----
-
 ### Build Apache Flink Application
 
 1. Clone this starter kit to your Laptop / MacBook
 1. It has Maven nature, so you can import it to your IDE.
 1. Build the Jar file using one of the steps below:
    1. Using standalone Maven, go to project home directory and run command ```mvn -X clean install```
   1. From Eclipse or STS, run command ```-X clean install```. Navigation: Project right click --> Run As --> Maven Build (Option 4)
-1. Build process will generate a jar file ```amazon-kinesis-data-analytics-flink-starter-kit-0.1.jar```. Note: The size of the jar file is around 38 MB
+1. Build process will generate a jar file ```amazon-kinesis-data-analytics-flink-starter-kit-1.0.jar```.
+1. Note: The size of the jar file is around 46 MB
 
 ---
 
@@ -101,7 +83,7 @@ You can deploy the Starter Kit using either AWS CLI or AWS Console.
 1. Upload Flink Application Jar file to S3 bucket
 
    ```bash
-   aws s3 cp amazon-kinesis-data-analytics-flink-starter-kit-0.1.jar s3://bucket_name/kda_flink_starter_kit_jar/
+   aws s3 cp amazon-kinesis-data-analytics-flink-starter-kit-1.0.jar s3://bucket_name/kda_flink_starter_kit_jar/
   ```
 
 1. Create Kinesis Stream
@@ -229,7 +211,7 @@ You can deploy the Starter Kit using either AWS CLI or AWS Console.
 1. Runtime = Apache Flink. Select version 1.8
 1. Click on Configure
 1. Amazon S3 bucket = Choose the bucket you selected in Step # 2
-1. Path to Amazon S3 object = must be the prefix for ```amazon-kinesis-data-analytics-flink-starter-kit-0.1.jar```
+1. Path to Amazon S3 object = must be the prefix for ```amazon-kinesis-data-analytics-flink-starter-kit-1.0.jar```
 1. Under section **Access to application resources** select ***Choose from IAM roles that Kinesis Data Analytics can assume***
 1. IAM role = Choose the IAM role created above
 1. Using the Jar file generated in the above step
@@ -261,4 +243,4 @@ Contributions are welcome, refer [CONTRIBUTING.md](https://github.com/aws-sample
 
 ## License Summary
 
-This sample code is made available under the MIT license. See the LICENSE file.
+This sample code is made available under the MIT license. See the LICENSE file.

pom.xml

Lines changed: 28 additions & 32 deletions
@@ -5,59 +5,61 @@
 	<name>Amazon Kinesis Data Analytics Flink - Starter Kit</name>
 	<groupId>com.amazonaws.kda.samples</groupId>
 	<artifactId>amazon-kinesis-data-analytics-flink-starter-kit</artifactId>
-	<version>0.1</version>
+	<version>1.0</version>
 
 	<properties>
-		<java.version>1.8</java.version>
-		<scala.binary.version>2.11</scala.binary.version>
-		<kda.version>1.1.0</kda.version>
-		<flink.version>1.8.2</flink.version>
+		<java.version>11</java.version>
+		<scala.binary.version>2.12</scala.binary.version>
+		<kinesis.analytics.flink.version>2.0.0</kinesis.analytics.flink.version>
+		<kinesis.analytics.runtime.version>1.2.0</kinesis.analytics.runtime.version>
+		<flink.version>1.11.1</flink.version>
 	</properties>
 
 	<dependencies>
-		<!-- JUnit Dependency -->
+		<!-- JUnit dependency -->
 		<dependency>
 			<groupId>org.junit.jupiter</groupId>
 			<artifactId>junit-jupiter-engine</artifactId>
 			<version>5.6.2</version>
 			<scope>test</scope>
 		</dependency>
-
-		<!-- Log4j Dependency -->
+		<!-- Log4j dependencies -->
 		<dependency>
-			<groupId>log4j</groupId>
-			<artifactId>log4j</artifactId>
-			<version>1.2.17</version>
+			<groupId>org.apache.logging.log4j</groupId>
+			<artifactId>log4j-api</artifactId>
+			<version>2.14.0</version>
 		</dependency>
-
-		<!-- Amazon Kinesis Analytics runtime dependencies -->
+		<dependency>
+			<groupId>org.apache.logging.log4j</groupId>
+			<artifactId>log4j-core</artifactId>
+			<version>2.14.0</version>
+		</dependency>
+		<!-- Amazon Kinesis Analytics runtime dependency -->
 		<dependency>
 			<groupId>com.amazonaws</groupId>
 			<artifactId>aws-kinesisanalytics-runtime</artifactId>
-			<version>${kda.version}</version>
+			<version>${kinesis.analytics.runtime.version}</version>
 		</dependency>
-
-		<!-- Apache Flink connector for Amazon Kinesis -->
+		<!-- Amazon Kinesis Analytics Flink dependency -->
+		<dependency>
+			<groupId>com.amazonaws</groupId>
+			<artifactId>aws-kinesisanalytics-flink</artifactId>
+			<version>${kinesis.analytics.flink.version}</version>
+		</dependency>
+		<!-- Apache Flink connector for Amazon Kinesis dependency -->
 		<dependency>
 			<groupId>org.apache.flink</groupId>
 			<artifactId>flink-connector-kinesis_${scala.binary.version}</artifactId>
 			<version>${flink.version}</version>
 		</dependency>
-
+		<!-- Flink streaming Java dependency -->
 		<dependency>
 			<groupId>org.apache.flink</groupId>
 			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
 			<version>${flink.version}</version>
 			<scope>provided</scope>
 		</dependency>
-
-		<dependency>
-			<groupId>com.amazonaws</groupId>
-			<artifactId>aws-kinesisanalytics-flink</artifactId>
-			<version>${kda.version}</version>
-		</dependency>
-
-		<!-- Amazon CloudWatch Logs -->
+		<!-- Amazon CloudWatch Logs dependency -->
 		<dependency>
 			<groupId>com.amazonaws</groupId>
 			<artifactId>aws-java-sdk-logs</artifactId>
@@ -130,17 +132,11 @@
 			<groupId>com.amazonaws</groupId>
 			<artifactId>aws-java-sdk-bom</artifactId>
 			<!-- Get the latest SDK version from https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-bom -->
-			<version>1.11.702</version>
+			<version>1.11.892</version>
 			<type>pom</type>
 			<scope>import</scope>
 		</dependency>
 		</dependencies>
 	</dependencyManagement>
 
-	<repositories>
-		<repository>
-			<id>log4j</id>
-			<url>https://mvnrepository.com/artifact/log4j/log4j</url>
-		</repository>
-	</repositories>
 </project>
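
The Log4j coordinates above move the project from Log4j 1.x (log4j:log4j) to the Log4j 2 API and core artifacts. A minimal sketch of the usage pattern those artifacts support is shown below; the class name and message are illustrative only, and the same LogManager-based lookup appears in the Java changes further down.

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class LoggingSketch {

    // LogManager.getLogger replaces the Log4j 1.x Logger.getLogger call.
    private static final Logger log = LogManager.getLogger(LoggingSketch.class);

    public static void main(String[] args) {
        // Log4j 2 supports parameterized messages, so no manual string concatenation is needed.
        log.info("Starter kit built for Flink {} on JDK {}", "1.11.1", 11);
    }
}
```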

src/main/java/com/amazonaws/kda/flink/starterkit/SessionProcessor.java

Lines changed: 4 additions & 4 deletions
@@ -23,7 +23,8 @@
 import org.apache.flink.streaming.api.windowing.time.Time;
 import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
 import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
-import org.apache.log4j.Logger;
+import org.apache.logging.log4j.LogManager;
+import org.apache.logging.log4j.Logger;
 
 import com.amazonaws.services.kinesisanalytics.runtime.KinesisAnalyticsRuntime;
 
@@ -38,7 +39,7 @@
  */
 public class SessionProcessor {
 
-	private static final Logger log = Logger.getLogger(SessionProcessor.class);
+	private static final Logger log = LogManager.getLogger(SessionProcessor.class);
 
 	/**
 	 * Main method and the entry point for Kinesis Data Analytics Flink Application.
@@ -56,8 +57,7 @@ public static void main(String[] args) throws Exception {
 			Map<String, Properties> applicationProperties = KinesisAnalyticsRuntime.getApplicationProperties();
 			Properties flinkProperties = applicationProperties.get("FlinkAppProperties");
 			if (flinkProperties == null) {
-				throw new RuntimeException(
-						"Unable to load properties from Group ID FlinkAppProperties.");
+				throw new RuntimeException("Unable to load properties from Group ID FlinkAppProperties.");
 			}
 			parameter = ParameterToolUtils.fromApplicationProperties(flinkProperties);
 		}
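
The imports and runtime-properties lookup in this diff typically come together in a Kinesis source similar to the sketch below. This is a minimal sketch under stated assumptions: the FlinkAppProperties group name is taken from the code above, while the region and input_stream_name keys, their default values, and the class name are hypothetical and not necessarily the starter kit's actual property names.

```java
package com.amazonaws.kda.flink.starterkit;

import java.util.Map;
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;

import com.amazonaws.services.kinesisanalytics.runtime.KinesisAnalyticsRuntime;

public class KinesisSourceSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Property group name taken from the diff above; the individual keys are illustrative.
        Map<String, Properties> applicationProperties = KinesisAnalyticsRuntime.getApplicationProperties();
        Properties flinkProperties = applicationProperties.get("FlinkAppProperties");

        // Consumer configuration for the Kinesis source: region and starting position.
        Properties consumerConfig = new Properties();
        consumerConfig.setProperty(ConsumerConfigConstants.AWS_REGION,
                flinkProperties.getProperty("region", "us-east-1"));
        consumerConfig.setProperty(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "LATEST");

        DataStream<String> events = env.addSource(new FlinkKinesisConsumer<>(
                flinkProperties.getProperty("input_stream_name", "kda_flink_starter_kit_stream"),
                new SimpleStringSchema(), consumerConfig));

        events.print();
        env.execute("Kinesis source sketch");
    }
}
```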

src/main/java/com/amazonaws/kda/flink/starterkit/SessionUtil.java

Lines changed: 6 additions & 4 deletions
@@ -6,7 +6,8 @@
 
 import org.apache.flink.kinesis.shaded.com.amazonaws.services.s3.AmazonS3;
 import org.apache.flink.kinesis.shaded.com.amazonaws.services.s3.AmazonS3ClientBuilder;
-import org.apache.log4j.Logger;
+import org.apache.logging.log4j.LogManager;
+import org.apache.logging.log4j.Logger;
 
 import com.amazonaws.regions.Regions;
 import com.amazonaws.services.kinesis.AmazonKinesis;
@@ -24,7 +25,7 @@
  */
 public class SessionUtil {
 
-	private static final Logger log = Logger.getLogger(SessionUtil.class);
+	private static final Logger log = LogManager.getLogger(SessionUtil.class);
 
 	/**
 	 * Method checks if a Kinesis Stream exist
@@ -100,8 +101,9 @@ public static boolean checkIfRegionExist(String region) {
 	}
 
 	/**
-	 * This method validates a data format
-	 * Reference: https://docs.oracle.com/javase/7/docs/api/java/text/SimpleDateFormat.html
+	 * This method validates a data format Reference:
+	 * https://docs.oracle.com/javase/7/docs/api/java/text/SimpleDateFormat.html
+	 *
 	 * @param streamInitialTimestamp
 	 * @return
 	 */
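
The updated Javadoc above points to SimpleDateFormat for validating the stream initial timestamp. Below is a hypothetical sketch of such a check; the method name, pattern, and sample values are assumptions for illustration, not the starter kit's actual implementation. The ISO-8601-style pattern is only an example of what a stream_initial_timestamp value might look like.

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class TimestampValidationSketch {

    /**
     * Returns true when the supplied value parses strictly against the given pattern.
     */
    static boolean isValidTimestamp(String streamInitialTimestamp, String pattern) {
        SimpleDateFormat format = new SimpleDateFormat(pattern);
        format.setLenient(false); // reject impossible dates such as "2020-13-40T99:00:00.000Z"
        try {
            format.parse(streamInitialTimestamp);
            return true;
        } catch (ParseException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSXXX";
        System.out.println(isValidTimestamp("2020-10-01T05:00:00.000Z", pattern)); // true
        System.out.println(isValidTimestamp("not-a-timestamp", pattern));          // false
    }
}
```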
