Status: Open
Labels: enhancement (New feature or request)

Description
I am using the inserter to periodically write batches of messages from a Kafka topic into ClickHouse, and I wonder what happens to the data when an error occurs on write/commit. It looks like the internal buffer is dropped and the unfinished binary data stream is aborted.
This essentially means that any buffered data is lost on error. Is that correct? What are the best practices for guaranteeing delivery, or how would I best implement a retry mechanism?
Metadata
Assignees: bocharov
Labels: enhancement (New feature or request)