KAFKA-19721: Update streams documentation with KIP-1147 changes (#20606)

Apply the KIP-1147 rename (--property to --formatter-property) in
the ops and streams documentation.

Reviewers: Andrew Schofield <aschofield@confluent.io>
Jhen-Yung Hsu 2025-10-02 03:17:47 +08:00 committed by GitHub
parent 7426629ba4
commit 0ddc69da70
3 changed files with 24 additions and 24 deletions


@@ -4315,7 +4315,7 @@ $ bin/kafka-topics.sh --create --topic tieredTopic --bootstrap-server localhost:
 <p>Lastly, we can try to consume some data from the beginning and print offset number, to make sure it will successfully fetch offset 0 from the remote storage.</p>
-<pre><code class="language-bash">$ bin/kafka-console-consumer.sh --topic tieredTopic --from-beginning --max-messages 1 --bootstrap-server localhost:9092 --property print.offset=true</code></pre>
+<pre><code class="language-bash">$ bin/kafka-console-consumer.sh --topic tieredTopic --from-beginning --max-messages 1 --bootstrap-server localhost:9092 --formatter-property print.offset=true</code></pre>
 <p>In KRaft mode, you can disable tiered storage at the topic level, to make the remote logs as read-only logs, or completely delete all remote logs.</p>
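Since KIP-1147 is a pure flag rename, existing scripts that pass formatter options to the console consumer can be migrated mechanically. A minimal sketch, assuming GNU sed and that every `--property` occurrence in the script really is a console-consumer formatter option (the file name `consume.sh` is hypothetical, for illustration only):

```shell
# Hypothetical script still using the pre-KIP-1147 flag name.
cat > consume.sh <<'EOF'
bin/kafka-console-consumer.sh --topic tieredTopic --from-beginning \
  --max-messages 1 --bootstrap-server localhost:9092 --property print.offset=true
EOF

# Rewrite the old flag to the renamed one in place (GNU sed -i).
sed -i 's/--property /--formatter-property /g' consume.sh

cat consume.sh
```

Review each occurrence before running such a rewrite across a repository: other Kafka tools and non-formatter uses of a `--property` flag should not be blanket-renamed.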


@@ -217,15 +217,15 @@ TimeWindowedDeserializer&lt;String&gt; deserializer = new TimeWindowedDeserializ
 <h4>Usage in Command Line</h4>
 <p>When using command-line tools (like <code>bin/kafka-console-consumer.sh</code>), you can configure windowed deserializers by passing the inner class and window size via configuration properties. The property names use a prefix pattern:</p>
 <pre class="line-numbers"><code class="language-bash"># Time windowed deserializer configuration
---property print.key=true \
---property key.deserializer=org.apache.kafka.streams.kstream.TimeWindowedDeserializer \
---property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer \
---property key.deserializer.window.size.ms=500
+--formatter-property print.key=true \
+--formatter-property key.deserializer=org.apache.kafka.streams.kstream.TimeWindowedDeserializer \
+--formatter-property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer \
+--formatter-property key.deserializer.window.size.ms=500
 # Session windowed deserializer configuration
---property print.key=true \
---property key.deserializer=org.apache.kafka.streams.kstream.SessionWindowedDeserializer \
---property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer</code></pre>
+--formatter-property print.key=true \
+--formatter-property key.deserializer=org.apache.kafka.streams.kstream.SessionWindowedDeserializer \
+--formatter-property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer</code></pre>
 <h4>Deprecated Configs</h4>
 <p>The following <code>StreamsConfig</code> parameters are deprecated in favor of passing parameters directly to serializer/deserializer constructors:</p>


@@ -175,10 +175,10 @@ and inspect the output of the WordCount demo application by reading from its out
 <pre><code class="language-bash">$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
 --topic streams-wordcount-output \
 --from-beginning \
---property print.key=true \
---property print.value=true \
---property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
---property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer</code></pre>
+--formatter-property print.key=true \
+--formatter-property print.value=true \
+--formatter-property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
+--formatter-property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer</code></pre>
 <h4><a id="quickstart_streams_process" href="#quickstart_streams_process">Step 5: Process some data</a></h4>
@@ -197,10 +197,10 @@ This message will be processed by the Wordcount application and the following ou
 <pre><code class="language-bash">$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
 --topic streams-wordcount-output \
 --from-beginning \
---property print.key=true \
---property print.value=true \
---property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
---property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
+--formatter-property print.key=true \
+--formatter-property print.value=true \
+--formatter-property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
+--formatter-property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
 all 1
 streams 1
@@ -225,10 +225,10 @@ In your other terminal in which the console consumer is running, you will observ
 <pre><code class="language-bash">$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
 --topic streams-wordcount-output \
 --from-beginning \
---property print.key=true \
---property print.value=true \
---property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
---property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
+--formatter-property print.key=true \
+--formatter-property print.value=true \
+--formatter-property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
+--formatter-property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
 all 1
 streams 1
@@ -255,10 +255,10 @@ The <b>streams-wordcount-output</b> topic will subsequently show the correspondi
 <pre><code class="language-bash">$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
 --topic streams-wordcount-output \
 --from-beginning \
---property print.key=true \
---property print.value=true \
---property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
---property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
+--formatter-property print.key=true \
+--formatter-property print.value=true \
+--formatter-property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
+--formatter-property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
 all 1
 streams 1