
Commit 8d263dc

docs: Fixed quick start and formatting issues
1 parent 7eb6d57 commit 8d263dc

9 files changed: +226 additions, -192 deletions

9 files changed

+226
-192
lines changed

docs/guide/src/docs/asciidoc/_links.adoc

Lines changed: 14 additions & 1 deletion

@@ -2,4 +2,17 @@
 :link_lettuce_uri: link:https://github.com/lettuce-io/lettuce-core/wiki/Redis-URI-and-connection-details#uri-syntax[Redis URI Syntax]
 :link_redis_notif: link:https://redis.io/docs/manual/keyspace-notifications[Redis Keyspace Notifications]
 :link_manual_install: link:https://docs.confluent.io/home/connect/community.html#manually-installing-community-connectors/[Manually Installing Community Connectors]
-:link_redis_keys: https://redis.io/commands/keys/[Redis KEYS]
+:link_redis_keys: link:https://redis.io/commands/keys/[Redis KEYS]
+:link_custom_connector: link:https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html[Custom Connector]
+:link_docker: link:https://docs.docker.com/get-docker/[Docker]
+:link_git: link:https://git-scm.com/book/en/v2/Getting-Started-Installing-Git[Git]
+:link_datagen: link:https://github.com/confluentinc/kafka-connect-datagen/[Kafka Connect Datagen]
+:link_smt: link:https://docs.confluent.io/platform/current/connect/transforms/overview.html[Single Message Transforms]
+:link_egress_endpoints: link:https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#cc-byoc-endpoints[egress endpoints]
+:link_sensitive_props: link:https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#sensitive[sensitive properties]
+:link_pageviews_avro: link:https://github.com/confluentinc/kafka-connect-datagen/blob/master/src/main/resources/pageviews_schema.avro[pageviews_schema.avro]
+:link_releases: link:https://github.com/{github-owner}/{github-repo}/releases[Releases]
+:link_cflt_hub_client: link:https://docs.confluent.io/current/connect/managing/confluent-hub/client.html[Confluent Hub Client]
+:link_stream_msg_id: link:https://redis.io/commands/xread#incomplete-ids[Message ID]
+:link_xread: link:https://redis.io/commands/xread[XREAD]
+
Lines changed: 14 additions & 8 deletions

@@ -1,30 +1,36 @@
-[[_connect]]
-= Connect to Redis
+= Redis Client
 
-This section provides information on configuring the Redis Kafka Source or Sink Connector.
+This section provides information on Redis client configuration.
+
+== Redis URI
 
 Specify the Redis URI in the `redis.uri` property, for example:
 
 [source,properties]
 ----
-redis.uri=redis://redis-12000.redis.com:12000
+redis.uri = redis://redis-12000.redis.com:12000
 ----
 
-For Redis URI syntax see {link_lettuce_uri}.
+For complete Redis URI syntax see {link_lettuce_uri}.
 
 TLS connection URIs start with `rediss://`.
+
+== Certificate Verification
+
 To disable certificate verification for TLS connections use the following property:
 
 [source,properties]
 ----
-redis.insecure=true
+redis.insecure = true
 ----
 
+== Credentials
+
 Username and password can be specified in the URI or separately with the following properties:
 
 [source,properties]
 ----
-redis.username=user1
-redis.password=pass
+redis.username = user1
+redis.password = pass
 ----
 

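The Redis URI and credential properties in the hunk above follow standard URI syntax. As an illustration only (not part of this commit), a short Python sketch shows how credentials embedded in such a URI map onto the separate `redis.username` and `redis.password` properties; the host, port, and credentials are the placeholder values from the diff:

```python
from urllib.parse import urlparse

# Placeholder values taken from the documentation diff above.
uri = "redis://user1:pass@redis-12000.redis.com:12000"

parsed = urlparse(uri)
print(parsed.scheme)    # "redis"; a TLS URI would use the "rediss" scheme
print(parsed.hostname)  # host portion of redis.uri
print(parsed.port)      # port portion of redis.uri
print(parsed.username)  # corresponds to redis.username
print(parsed.password)  # corresponds to redis.password
```
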
Lines changed: 47 additions & 0 deletions

@@ -0,0 +1,47 @@
+[[_docker]]
+= Docker Example
+
+The project {project-scm}[repository] contains a script that runs all the steps shown in the <<_quickstart,Quick Start>>.
+
+Clone the {project-scm}[{project-name}] repository and execute `run.sh` in the `docker` directory:
+
+[source,console,subs="attributes"]
+----
+git clone {project-scm}
+cd {project-name}
+./run.sh
+----
+
+This will:
+
+* Run `docker compose up`
+* Wait for Redis, Kafka, and Kafka Connect to be ready
+* Register the Confluent Datagen Connector
+* Register the Redis Kafka Sink Connector
+* Register the Redis Kafka Source Connector
+* Publish some events to Kafka via the Datagen connector
+* Write the events to Redis
+* Send messages to a Redis stream
+* Write the Redis stream messages back into Kafka
+
+Once running, examine the topics in the Kafka http://localhost:9021/[Control Center]:
+
+The `pageviews` topic should contain the 10 simple documents added, each similar to:
+
+[source,json]
+----
+include::{includedir}/../resources/pageviews.json[]
+----
+
+* The `pageviews` stream should contain the 10 change events.
+
+Examine the stream in Redis:
+
+[source,console]
+----
+docker compose exec redis /usr/local/bin/redis-cli
+xread COUNT 10 STREAMS pageviews 0
+----
+
+Messages added to the `mystream` stream will show up in the `mystream` topic.
+
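The `xread COUNT 10 STREAMS pageviews 0` command in the hunk above starts reading from message ID `0`. As an aside (not part of this commit), Redis stream message IDs are `<milliseconds>-<sequence>` pairs, compared numerically part by part; a minimal pure-Python sketch of that ordering:

```python
# Redis stream message IDs have the form "<ms>-<seq>"; XREAD returns entries
# with IDs strictly greater than the ID supplied ("0" means from the start).
def parse_id(message_id: str) -> tuple[int, int]:
    ms, _, seq = message_id.partition("-")
    return (int(ms), int(seq or 0))  # a bare "0" implies sequence 0

ids = ["1526919030474-55", "1526919030474-56", "1526919030475-0"]
assert sorted(ids, key=parse_id) == ids   # already in stream order
assert parse_id("0") < parse_id(ids[0])   # "0" precedes every entry
print("stream IDs ordered correctly")
```
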
Lines changed: 1 addition & 2 deletions

@@ -1,7 +1,7 @@
 = {project-title}
 :author: {project-author}
 :revnumber: {project-version}
-:toclevels: 3
+:toclevels: 2
 :docinfo1:
 
 include::{includedir}/_links.adoc[]
@@ -10,7 +10,6 @@ include::{includedir}/_links.adoc[]
 include::{includedir}/overview.adoc[]
 include::{includedir}/quickstart.adoc[]
 include::{includedir}/install.adoc[]
-include::{includedir}/connect.adoc[]
 include::{includedir}/sink.adoc[]
 include::{includedir}/source.adoc[]
 include::{includedir}/resources.adoc[]

docs/guide/src/docs/asciidoc/install.adoc

Lines changed: 2 additions & 2 deletions

@@ -5,11 +5,11 @@ Select one of the methods below to install {project-title}.
 
 == Download
 
-Download the latest release archive from https://github.com/{github-owner}/{github-repo}/releases[here].
+Download the latest release archive: {link_releases}.
 
 == Confluent Hub
 
-1. Install the https://docs.confluent.io/current/connect/managing/confluent-hub/client.html[Confluent Hub Client]
+1. Install the {link_cflt_hub_client}
 2. Install the {project-name} using the Confluent Hub Client
 
 == Manually

docs/guide/src/docs/asciidoc/overview.adoc

Lines changed: 2 additions & 2 deletions

@@ -7,10 +7,10 @@ image:redis-kafka-connector.svg[]
 
 This guide provides documentation and usage information across the following topics:
 
+* <<_quickstart,Quick Start>>
+* <<_docker,Docker Example>>
 * <<_install,Install>>
-* <<_connect,Connect to Redis>>
 * <<_sink,Sink Connector>>
 * <<_source,Source Connector>>
-* <<_docker,Docker Example>>
 * <<_resources,Resources>>
 
docs/guide/src/docs/asciidoc/quickstart.adoc

Lines changed: 17 additions & 58 deletions

@@ -1,4 +1,4 @@
-[[_quick_start]]
+[[_quickstart]]
 = Quick Start
 
 This section shows how to configure the {project-title} to import/export data between Redis and Apache Kafka and provides a hands-on look at the functionality of the source and sink connectors.
@@ -7,8 +7,8 @@ This section shows how to configure the {project-title} to import/export data be
 
 Download and install the following software:
 
-* https://docs.docker.com/get-docker/[Docker]
-* https://git-scm.com/book/en/v2/Getting-Started-Installing-Git[Git]
+* {link_docker}
+* {link_git}
 
 == Start the Sandbox
 
@@ -20,7 +20,10 @@ The sandbox starts the following Docker services:
 
 To start the sandbox run the following command:
 
-`docker compose up`
+[source,console]
+-----
+docker compose up
+-----
 
 After Docker downloads and starts the services you should see the following output:
 
@@ -46,7 +49,7 @@ Now that the required services are up and running, we can add connectors to Kafk
 
 === Add a Datagen
 
-https://github.com/confluentinc/kafka-connect-datagen/[Kafka Connect Datagen] is a Kafka Connect source connector for generating mock data.
+{link_datagen} is a Kafka Connect source connector for generating mock data.
 
 Create the Datagen connector with the following command:
 
@@ -68,7 +71,7 @@ curl -X POST -H "Content-Type: application/json" --data '
 }}' http://localhost:8083/connectors -w "\n"
 -----
 
-This automatically creates the Kafka topic `pageviews` and produces data with a schema configuration from https://github.com/confluentinc/kafka-connect-datagen/blob/master/src/main/resources/pageviews_schema.avro
+This automatically creates the Kafka topic `pageviews` and produces data with a schema configuration from {link_pageviews_avro}
 
 [NOTE]
 ====
@@ -94,7 +97,7 @@ The command below adds a {project-title} sink connector configured with these pr
 * The connection URI of the Redis database to which the connector writes data
 * The Redis command to use for writing data (`JSONSET`)
 * Key and value converters to correctly handle incoming `pageviews` data
-* A https://docs.confluent.io/platform/current/connect/transforms/overview.html[Single Message Transform] to extract a key from `pageviews` messages.
+* A {link_smt} to extract a key from `pageviews` messages.
 
 [source,console]
 -----
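The hunk above describes a sink connector that is registered by POSTing JSON to the Kafka Connect REST endpoint at `http://localhost:8083/connectors` (the full curl body is elided from this excerpt). As an illustration only, a hedged Python sketch of building such a request; the connector name, class, and config values here are assumptions, not the commit's actual payload:

```python
import json
from urllib import request

# Hypothetical registration payload; the exact connector class, converters,
# and SMT settings are elided from the diff above, so these are illustrative.
payload = {
    "name": "redis-sink",  # assumed connector name
    "config": {
        "connector.class": "com.redis.kafka.connect.RedisSinkConnector",  # assumed class
        "topics": "pageviews",
        "redis.uri": "redis://redis:6379",  # assumed sandbox address
        "redis.command": "JSONSET",         # write command named in the doc
    },
}

body = json.dumps(payload).encode()
req = request.Request(
    "http://localhost:8083/connectors",  # Connect REST endpoint from the Quick Start
    data=body,
    headers={"Content-Type": "application/json"},
)
# request.urlopen(req)  # uncomment only with a running Connect worker
print(body.decode())
```
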
@@ -165,66 +168,22 @@ Now add a message to the `mystream` Redis stream:
 Examine the topics in the Kafka UI: http://localhost:9021 or http://localhost:8000/.
 The `mystream` topic should have the previously sent stream message.
 
+== Custom Connector
 
-== End-to-end Example
-
-The project {project-scm}[repository] contains a script that runs all the steps shown previously.
-
-Clone the {project-scm}[{project-name}] repository and execute `run.sh` in `docker` directory:
-
-[source,console,subs="attributes"]
-----
-git clone {project-scm}
-cd {project-name}
-./run.sh
-----
-
-This will:
-
-* Run `docker compose up`
-* Wait for Redis, Kafka, and Kafka Connect to be ready
-* Register the Confluent Datagen Connector
-* Register the Redis Kafka Sink Connector
-* Register the Redis Kafka Source Connector
-* Publish some events to Kafka via the Datagen connector
-* Write the events to Redis
-* Send messages to a Redis stream
-* Write the Redis stream messages back into Kafka
-
-Once running, examine the topics in the Kafka http://localhost:9021/[control center]:
-
-The `pageviews` topic should contain the 10 simple documents added, each similar to:
-
-[source,json]
-----
-include::{includedir}/../resources/pageviews.json[]
-----
-
-* The `pageviews` stream should contain the 10 change events.
-
-Examine the stream in Redis:
-[source,console]
-----
-docker compose exec redis /usr/local/bin/redis-cli
-xread COUNT 10 STREAMS pageviews 0
-----
-
-Messages added to the `mystream` stream will show up in the `mystream` topic.
-
-
-== Confluent Cloud
-
-This section describes configuration aspects that are specific to using {project-title} in Confluent Cloud.
+This section describes configuration aspects that are specific to using {project-title} as a {link_custom_connector} in Confluent Cloud.
 
 === Egress Endpoints
 
-It is required to specify https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#cc-byoc-endpoints[egress endpoints] in order for the connector to reach the Redis database.
+It is required to specify {link_egress_endpoints} in order for the connector to reach the Redis database.
 
 === Sensitive Properties
 
-The following are https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#sensitive[sensitive properties] that must be marked as such in Confluent Cloud UI.
+The following are {link_sensitive_props} that must be marked as such in Confluent Cloud UI.
 
 * `redis.uri`: URI of the Redis database to connect to, e.g. `redis://redis-12000.redis.com:12000`
 * `redis.username`: Username to use to connect to Redis
 * `redis.password`: Password to use to connect to Redis
 * `redis.key.password`: Password of the private key file
+
+include::{includedir}/docker.adoc[leveloffset=+1]
+
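The hunk above lists the properties that must be marked sensitive so the Confluent Cloud UI masks their values. As an illustration only (not part of this commit), a small Python sketch of redacting exactly those keys before a connector config is logged or displayed:

```python
# Illustrative helper (not part of the connector): mask the properties the
# documentation lists as sensitive before a config is logged.
SENSITIVE = {"redis.uri", "redis.username", "redis.password", "redis.key.password"}

def redact(config: dict) -> dict:
    """Return a copy of config with sensitive values replaced by '***'."""
    return {k: ("***" if k in SENSITIVE else v) for k, v in config.items()}

config = {
    "redis.uri": "redis://redis-12000.redis.com:12000",  # example value from the doc
    "redis.command": "JSONSET",
    "redis.password": "pass",
}
print(redact(config))  # URI and password masked, redis.command left intact
```
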
