in_kafka: document enable_auto_commit #1520

Open. Wants to merge 1 commit into base: master.
Conversation

coreidcc

Documentation related to the pull request 'Boost in_kafka throughput #9625' submitted to fluent-bit.

@lockewritesdocs (Contributor) left a comment:

Thanks @coreidcc! I suggested some minor changes to align with style.

@@ -16,6 +16,8 @@ This plugin uses the official [librdkafka C library](https://github.com/edenhill
| Buffer\_Max\_Size | Specify the maximum size of buffer per cycle to poll kafka messages from subscribed topics. To increase throughput, specify larger size. | 4M |
| rdkafka.{property} | `{property}` can be any [librdkafka properties](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md) | |
| threaded | Indicates whether to run this input in its own [thread](../../administration/multithreading.md#inputs). | `false` |
| enable_auto_commit | Use kafa auto-commit instead of commiting easch individual message. | `false` |
Suggested change:
| enable_auto_commit | Use kafa auto-commit instead of commiting easch individual message. | `false` |
| enable_auto_commit | Use Kafka auto-commit instead of committing each individual message. | `false` |
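
For context, here is a minimal configuration sketch showing how the documented `enable_auto_commit` option could be combined with a pass-through `rdkafka.{property}` setting. The broker address, topic name, and commit interval are placeholder values for illustration and are not part of this pull request; `auto.commit.interval.ms` is a standard librdkafka property.

```
[INPUT]
    # Placeholder broker and topic values for illustration only
    name                kafka
    brokers             kafka-broker:9092
    topics              my-topic
    buffer_max_size     4M
    # Let Kafka auto-commit offsets instead of committing each message
    enable_auto_commit  true
    # Any librdkafka property can be passed through with the rdkafka. prefix
    rdkafka.auto.commit.interval.ms  5000

[OUTPUT]
    name   stdout
    match  *
```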
