MINOR: Update the documentation's table of contents to add missing headings for Kafka Connect (#14337)

Reviewers: Chris Egerton <chrise@aiven.io>
Yash Mayya 2023-09-05 18:58:44 +01:00 committed by GitHub
parent 37a51e286d
commit 79598b49d6
2 changed files with 10 additions and 2 deletions


@@ -953,7 +953,7 @@ if (inputsChanged())
<p><code>SinkConnectors</code> usually only have to handle the addition of streams, which may translate to new entries in their outputs (e.g., a new database table). The framework manages any changes to the Kafka input, such as when the set of input topics changes because of a regex subscription. <code>SinkTasks</code> should expect new input streams, which may require creating new resources in the downstream system, such as a new table in a database. The trickiest situation to handle in these cases may be conflicts between multiple <code>SinkTasks</code> seeing a new input stream for the first time and simultaneously trying to create the new resource. <code>SinkConnectors</code>, on the other hand, will generally require no special code for handling a dynamic set of streams.</p>
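The creation-conflict scenario above is commonly defused by making resource creation idempotent. A minimal sketch for a JDBC-backed sink, assuming a hypothetical <code>ensureTable</code> helper (not part of the Connect API) and illustrative column names:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class TableCreation {
    // Hypothetical helper: if two SinkTasks race to create the same table,
    // "IF NOT EXISTS" lets both treat creation as a no-op when the table
    // already exists, so neither task fails on the conflict.
    static void ensureTable(Connection conn, String table) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            stmt.executeUpdate("CREATE TABLE IF NOT EXISTS " + table
                + " (record_key VARCHAR(255), record_value TEXT)");
        }
    }
}
```

Where the destination system lacks an idempotent create, catching its "already exists" error and treating it as success achieves the same effect.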
-<h4><a id="connect_configs" href="#connect_configs">Connect Configuration Validation</a></h4>
+<h4><a id="connect_configs" href="#connect_configs">Configuration Validation</a></h4>
<p>Kafka Connect allows you to validate connector configurations before submitting a connector to be executed and can provide feedback about errors and recommended values. To take advantage of this, connector developers need to provide an implementation of <code>config()</code> to expose the configuration definition to the framework.</p>
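As a sketch of what such a <code>config()</code> implementation might return — the connector class and option names here are illustrative, not from the original text:

```java
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.common.config.ConfigDef.Importance;
import org.apache.kafka.common.config.ConfigDef.Type;

public class ExampleConnectorConfig {
    // Expose the connector's configuration definition: two required
    // string settings, each with an importance level and documentation
    // string that the framework can surface to users.
    public static ConfigDef config() {
        return new ConfigDef()
            .define("file", Type.STRING, Importance.HIGH, "Source file to read from.")
            .define("topic", Type.STRING, Importance.HIGH, "Topic to publish records to.");
    }
}
```

The framework uses this definition to check submitted configurations and to report per-field errors and documentation back to the caller.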
@@ -999,7 +999,7 @@ Struct struct = new Struct(schema)
<p>Sink connectors are usually simpler because they are consuming data and therefore do not need to create schemas. However, they should take just as much care to validate that the schemas they receive have the expected format. When the schema does not match -- usually indicating the upstream producer is generating invalid data that cannot be correctly translated to the destination system -- sink connectors should throw an exception to indicate this error to the system.</p>
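A sink task following this advice might reject records whose value schema is not what it expects. A hedged sketch — <code>checkValueSchema</code> is a hypothetical helper, not a Connect API method:

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.errors.DataException;
import org.apache.kafka.connect.sink.SinkRecord;

public class SchemaChecks {
    // Throw if the record's value schema is missing or not a struct,
    // signaling upstream data that cannot be translated downstream.
    static void checkValueSchema(SinkRecord record) {
        Schema schema = record.valueSchema();
        if (schema == null || schema.type() != Schema.Type.STRUCT) {
            throw new DataException("Unexpected value schema: "
                + (schema == null ? "null" : schema.type()));
        }
    }
}
```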
-<h4><a id="connect_administration" href="#connect_administration">Kafka Connect Administration</a></h4>
+<h3><a id="connect_administration" href="#connect_administration">8.4 Administration</a></h3>
<p>
Kafka Connect's <a href="#connect_rest">REST layer</a> provides a set of APIs to enable administration of the cluster. This includes APIs to view the configuration of connectors and the status of their tasks, as well as to alter their current behavior (e.g. changing configuration and restarting tasks).
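As a sketch of driving these APIs from a client (the worker address and connector name below are assumptions, not from the original text), using the JDK's built-in HTTP client to fetch a connector's status and restart one of its tasks:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ConnectAdmin {
    public static void main(String[] args) throws Exception {
        String worker = "http://localhost:8083"; // assumed worker REST address
        String name = "local-file-sink";         // hypothetical connector name
        HttpClient client = HttpClient.newHttpClient();

        // View the connector's status, including the state of each task
        HttpRequest status = HttpRequest.newBuilder()
            .uri(URI.create(worker + "/connectors/" + name + "/status"))
            .GET().build();
        System.out.println(client.send(status, HttpResponse.BodyHandlers.ofString()).body());

        // Restart task 0 of the connector
        HttpRequest restart = HttpRequest.newBuilder()
            .uri(URI.create(worker + "/connectors/" + name + "/tasks/0/restart"))
            .POST(HttpRequest.BodyPublishers.noBody()).build();
        client.send(restart, HttpResponse.BodyHandlers.ofString());
    }
}
```

Running this requires a live Connect worker at the assumed address.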


@@ -207,6 +207,14 @@
<li><a href="#connect_plugindiscovery">Plugin Discovery</a></li>
</ul>
<li><a href="#connect_development">8.3 Connector Development Guide</a></li>
+<ul>
+    <li><a href="#connect_concepts">Core Concepts and APIs</a></li>
+    <li><a href="#connect_developing">Developing a Simple Connector</a></li>
+    <li><a href="#connect_dynamicio">Dynamic Input/Output Streams</a></li>
+    <li><a href="#connect_configs">Configuration Validation</a></li>
+    <li><a href="#connect_schemas">Working with Schemas</a></li>
+</ul>
+<li><a href="#connect_administration">8.4 Administration</a></li>
</ul>
</li>
<li><a href="/{{version}}/documentation/streams">9. Kafka Streams</a>