MINOR: Fix typo in docs (#7158)

Reviewer: Matthias J. Sax <matthias@confluent.io>
Victoria Bialas 2019-08-03 13:51:43 -07:00 committed by Matthias J. Sax
parent 96c575a2c7
commit c76f5651fa
1 changed file with 1 addition and 1 deletion


@@ -60,7 +60,7 @@ and much lower end-to-end latency.
 Many users of Kafka process data in processing pipelines consisting of multiple stages, where raw input data is consumed from Kafka topics and then
 aggregated, enriched, or otherwise transformed into new topics for further consumption or follow-up processing.
 For example, a processing pipeline for recommending news articles might crawl article content from RSS feeds and publish it to an "articles" topic;
-further processing might normalize or deduplicate this content and published the cleansed article content to a new topic;
+further processing might normalize or deduplicate this content and publish the cleansed article content to a new topic;
 a final processing stage might attempt to recommend this content to users.
 Such processing pipelines create graphs of real-time data flows based on the individual topics.
 Starting in 0.10.0.0, a light-weight but powerful stream processing library called <a href="/documentation/streams">Kafka Streams</a>