
Commit 75504fd

release v0.0.2
1 parent 2319f70 commit 75504fd

9 files changed: +113 −7 lines changed

.gitignore

Lines changed: 1 addition & 1 deletion

@@ -1,5 +1,5 @@
 # Compiled class file
-target/*
+.target/
 *.class

 # Log file
Lines changed: 21 additions & 0 deletions

@@ -0,0 +1,21 @@
+databend.properties buffers
+name=databend
+connector.class=com.databend.kafka.connect.DatabendSinkConnector
+
+connection.url=jdbc:databend://localhost:8000
+connection.user=databend
+connection.password=databend
+connection.attempts=5
+connection.backoff.ms=10000
+connection.database=default
+
+table.name.format=default.${topic}
+max.retries=10
+batch.size=1
+auto.create=true
+auto.evolve=true
+insert.mode=upsert
+pk.mode=record_value
+pk.fields=id
+topics=products
+errors.tolerance=all
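
These are the sink settings that the README below asks you to put into `config/databend.properties`. As a quick orientation, here is a minimal sketch of how the packaged connector and this file could be wired into a standalone Connect worker, following the README steps (the jar name and paths are illustrative placeholders, not taken from this commit):

```
# Copy the packaged connector jar into Kafka's libs dir (jar name is a placeholder).
cp target/databend-kafka-connect-0.0.2.jar $KAFKA_HOME/libs/

# Start a standalone Connect worker with the Databend sink configuration above.
$KAFKA_HOME/bin/connect-standalone.sh config/connect-standalone.properties config/databend.properties
```

With `auto.create=true` and `insert.mode=upsert` as above, the sink is configured to create a missing target table automatically and to upsert rows keyed on the `id` field taken from the record value.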

target/components/packages/databendCloud-databend-kafka-connect-0.0.1-SNAPSHOT/databendCloud-databend-kafka-connect-0.0.1-SNAPSHOT/doc/README.md

Lines changed: 11 additions & 2 deletions

@@ -15,12 +15,21 @@ The detail introduction docs available in [docs page](./docs/docs.md)
 - Clone from repo: `git clone https://github.com/databendcloud/databend-kafka-connect.git`
 - From the root of the project:
 - Build and package debezium server: `mvn -Passembly -Dmaven.test.skip package`
-- After building, you will get the jar file in `target` dir. Just put it into kafka libs dir.
+- After building, you will get the jar files in the `target` dir, or you can download the packaged jar files from the [releases](https://github.com/databendcloud/databend-kafka-connect/releases). Just put them into the Kafka `libs` dir.
 - Create `databend.properties` file in kafka config dir and config it: `nano config/databend.properties`, you can check the example
 configuration
 in [application.properties.example](src/main/resources/conf/application.properties.example)
 - config your source connect such as kafka mysql connector
-- Start or restart the Kafka Connect workers.: `bin/connect-standalone.sh config/connect-standalone.properties config/databend.properties config/mysql.properties`
+- If you sync data from a Kafka topic directly, confirm the data format of that topic first. For example, if your data is in AVRO format, add the following config to the properties file:
+
+```
+key.converter=org.apache.kafka.connect.storage.StringConverter
+value.converter=io.confluent.connect.avro.AvroConverter
+value.converter.basic.auth.credentials.source=USER_INFO
+value.converter.schema.registry.basic.auth.user.info=xxxx
+value.converter.schema.registry.url=https://your-registry-url.us-east-2.aws.confluent.cloud
+```
+- Start or restart the Kafka Connect workers: `bin/connect-standalone.sh config/connect-standalone.properties config/databend.properties config/mysql.properties`

 More details about configure kafka connector, please read [Configure Self-Managed Connectors](https://docs.confluent.io/platform/current/connect/configuring.html).
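
The `config/mysql.properties` source config referenced in the start command above is not part of this commit. As a hedged sketch only, a MySQL source connector file might look like the following, assuming the Debezium MySQL connector is installed alongside this sink (every hostname, credential, and table name here is a placeholder):

```
name=mysql-source
connector.class=io.debezium.connector.mysql.MySqlConnector
database.hostname=localhost
database.port=3306
database.user=replicator
database.password=replicator-secret
database.server.id=184054
topic.prefix=mysql
table.include.list=inventory.products
schema.history.internal.kafka.bootstrap.servers=localhost:9092
schema.history.internal.kafka.topic=schema-changes.inventory
```

Whichever source you use, the topic it produces has to match the `topics` setting in `databend.properties` for the sink to pick the records up.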

target/components/packages/databendCloud-databend-kafka-connect-0.0.1-SNAPSHOT/databendCloud-databend-kafka-connect-0.0.1-SNAPSHOT/manifest.json

Lines changed: 1 addition & 1 deletion

@@ -31,5 +31,5 @@
 "url" : "https://www.apache.org/licenses/LICENSE-2.0"
 } ],
 "component_types" : [ "sink" ],
-"release_date" : "2024-07-13"
+"release_date" : "2024-07-14"
 }

target/databend-kafka-connect-0.0.1-SNAPSHOT-package/share/doc/databend-kafka-connect/README.md

Lines changed: 11 additions & 2 deletions

@@ -15,12 +15,21 @@ The detail introduction docs available in [docs page](./docs/docs.md)
 - Clone from repo: `git clone https://github.com/databendcloud/databend-kafka-connect.git`
 - From the root of the project:
 - Build and package debezium server: `mvn -Passembly -Dmaven.test.skip package`
-- After building, you will get the jar file in `target` dir. Just put it into kafka libs dir.
+- After building, you will get the jar files in the `target` dir, or you can download the packaged jar files from the [releases](https://github.com/databendcloud/databend-kafka-connect/releases). Just put them into the Kafka `libs` dir.
 - Create `databend.properties` file in kafka config dir and config it: `nano config/databend.properties`, you can check the example
 configuration
 in [application.properties.example](src/main/resources/conf/application.properties.example)
 - config your source connect such as kafka mysql connector
-- Start or restart the Kafka Connect workers.: `bin/connect-standalone.sh config/connect-standalone.properties config/databend.properties config/mysql.properties`
+- If you sync data from a Kafka topic directly, confirm the data format of that topic first. For example, if your data is in AVRO format, add the following config to the properties file:
+
+```
+key.converter=org.apache.kafka.connect.storage.StringConverter
+value.converter=io.confluent.connect.avro.AvroConverter
+value.converter.basic.auth.credentials.source=USER_INFO
+value.converter.schema.registry.basic.auth.user.info=xxxx
+value.converter.schema.registry.url=https://your-registry-url.us-east-2.aws.confluent.cloud
+```
+- Start or restart the Kafka Connect workers: `bin/connect-standalone.sh config/connect-standalone.properties config/databend.properties config/mysql.properties`

 More details about configure kafka connector, please read [Configure Self-Managed Connectors](https://docs.confluent.io/platform/current/connect/configuring.html).

target/manifest.json

Lines changed: 1 addition & 1 deletion

@@ -31,5 +31,5 @@
 "url" : "https://www.apache.org/licenses/LICENSE-2.0"
 } ],
 "component_types" : [ "sink" ],
-"release_date" : "2024-07-13"
+"release_date" : "2024-07-14"
 }
Lines changed: 26 additions & 0 deletions

@@ -0,0 +1,26 @@
+com/databend/kafka/connect/sink/DatabendWriterTest$5.class
+com/databend/kafka/connect/databendclient/DatabendClientTest.class
+com/databend/kafka/connect/sink/DatabendHelperTest.class
+com/databend/kafka/connect/sink/DatabendHelper.class
+com/databend/kafka/connect/sink/integration/DatabendSinkIT.class
+com/databend/kafka/connect/sink/DatabendWriterTest$MockRollbackException.class
+com/databend/kafka/connect/sink/DatabendSinkTaskTest$5.class
+com/databend/kafka/connect/sink/metadata/FieldsMetadataTest.class
+com/databend/kafka/connect/sink/PreparedStatementBinderTest.class
+com/databend/kafka/connect/sink/DatabendWriterTest$3.class
+com/databend/kafka/connect/sink/DatabendSinkConfigTest.class
+com/databend/kafka/connect/sink/DatabendWriterTest.class
+com/databend/kafka/connect/sink/DatabendSinkTaskTest$3.class
+com/databend/kafka/connect/sink/DatabendHelper$ResultSetReadCallback.class
+com/databend/kafka/connect/DatabendSinkConnectorTest.class
+com/databend/kafka/connect/sink/BufferedRecordsTest.class
+com/databend/kafka/connect/sink/DatabendSinkTaskTest.class
+com/databend/kafka/connect/sink/DatabendSinkTaskTest$4.class
+com/databend/kafka/connect/sink/DatabendSinkTaskTest$1.class
+com/databend/kafka/connect/sink/DatabendWriterTest$6.class
+com/databend/kafka/connect/sink/DatabendWriterTest$2.class
+com/databend/kafka/connect/sink/DatabendSinkTaskTest$2.class
+com/databend/kafka/connect/sink/DatabendWriterTest$1.class
+com/databend/kafka/connect/sink/DbStructureTest.class
+com/databend/kafka/connect/sink/integration/BaseConnectorIT.class
+com/databend/kafka/connect/sink/DatabendWriterTest$4.class
Lines changed: 13 additions & 0 deletions

@@ -0,0 +1,13 @@
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/DatabendSinkConnectorTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/integration/BaseConnectorIT.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/DbStructureTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/databendclient/DatabendClientTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/integration/DatabendSinkIT.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/metadata/FieldsMetadataTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/DatabendHelper.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/DatabendWriterTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/DatabendSinkTaskTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/DatabendHelperTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/BufferedRecordsTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/DatabendSinkConfigTest.java
+/Users/hanshanjie/git-works/databend-kafka-connect/src/test/java/com/databend/kafka/connect/sink/PreparedStatementBinderTest.java
Lines changed: 28 additions & 0 deletions

@@ -0,0 +1,28 @@
+#
+# Copyright 2018 Confluent Inc.
+#
+# Licensed under the Confluent Community License (the "License"); you may not use
+# this file except in compliance with the License. You may obtain a copy of the
+# License at
+#
+# http://www.confluent.io/confluent-community-license
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+# WARRANTIES OF ANY KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations under the License.
+#
+
+log4j.rootLogger=INFO, stdout
+
+log4j.appender.stdout=org.apache.log4j.ConsoleAppender
+log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
+log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n
+
+log4j.logger.org.apache.kafka=ERROR
+log4j.logger.io.confluent.connect=ERROR
+
+#IT test log levels
+log4j.logger.kafka=WARN
+log4j.logger.org.apache.zookeeper=ERROR
+log4j.logger.org.reflections.Reflections=ERROR
