From 95ef40dd31b6b0ee403413184ef974c6c051ddeb Mon Sep 17 00:00:00 2001
From: Gwen Shapira
Date: Wed, 22 Mar 2017 23:06:26 -0700
Subject: [PATCH] MINOR: Adding example to SMT documentation

Author: Gwen Shapira

Reviewers: Ewen Cheslack-Postava

Closes #2721 from gwenshap/improve_smt_docs
---
 docs/connect.html | 68 ++++++++++++++++++++++++++++++++++++++++++++++-
 docs/toc.html     |  6 +++++
 2 files changed, 73 insertions(+), 1 deletion(-)

diff --git a/docs/connect.html b/docs/connect.html
index d6b6f006587..48c51395409 100644
--- a/docs/connect.html
+++ b/docs/connect.html
@@ -104,7 +104,7 @@

Transformations

-    Connectors can be configured with transformations to make lightweight message-at-a-time modifications. They can be convenient for minor data massaging and routing changes.
+    Connectors can be configured with transformations to make lightweight message-at-a-time modifications. They can be convenient for data massaging and event routing.

     A transformation chain can be specified in the connector configuration.

@@ -114,8 +114,74 @@
  • transforms.$alias.$transformationSpecificConfig Configuration properties for the transformation
  • +
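    Putting those properties together, a transformation chain in a connector configuration has the following general shape. The aliases and the transformation classes here are placeholders; a concrete example follows below.

        # Alias1, Alias2 and the com.example classes are placeholders, not real transformations
        transforms=Alias1,Alias2
        transforms.Alias1.type=com.example.SomeTransformation
        transforms.Alias1.someConfig=someValue
        transforms.Alias2.type=com.example.AnotherTransformation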

    For example, let's take the built-in file source connector and use a transformation to add a static field.

    + +

    Throughout the example we'll use the schemaless JSON data format. To use the schemaless format, we changed the following two lines in connect-standalone.properties from true to false:

    + +
    +        key.converter.schemas.enable
    +        value.converter.schemas.enable
    +    
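    After that change, the two converter lines in connect-standalone.properties would read as below (this assumes the JsonConverter configured in the default worker properties):

        key.converter.schemas.enable=false
        value.converter.schemas.enable=false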
    + + The file source connector reads each line as a String. We will wrap each line in a Map and then add a second field to identify the origin of the event. To do this, we use two transformations: +
      +
    • HoistField to place the input line inside a Map
    • +
    • InsertField to add the static field. In this example we'll indicate that the record came from a file connector
    • +
    + + After adding the transformations, the connect-file-source.properties file looks as follows: + +
    +        name=local-file-source
    +        connector.class=FileStreamSource
    +        tasks.max=1
    +        file=test.txt
    +        topic=connect-test
    +        transforms=MakeMap, InsertSource
    +        transforms.MakeMap.type=org.apache.kafka.connect.transforms.HoistField$Value
    +        transforms.MakeMap.field=line
    +        transforms.InsertSource.type=org.apache.kafka.connect.transforms.InsertField$Value
    +        transforms.InsertSource.static.field=data_source
    +        transforms.InsertSource.static.value=test-file-source
    +    
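    With the worker and connector configuration in place, the connector can be run with the standalone worker. A typical invocation, assuming the default file layout of a Kafka distribution, looks like this:

        # worker properties first, then one or more connector properties files
        bin/connect-standalone.sh config/connect-standalone.properties config/connect-file-source.properties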
    + +

    All the lines starting with transforms were added for the transformations. "MakeMap" and "InsertSource" are the aliases we chose for the two transformations. The transformation types come from the list of built-in transformations shown below. Each transformation type has additional configuration: HoistField requires a configuration called "field", which names the field in the map that will hold the original String from the file; the InsertField transformation lets us specify the field name and the value that we are adding.
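    To make the effect of this chain concrete, here is a small sketch in plain Java of what the two configured transformations do to a schemaless value. It is only an illustration of the behaviour configured above, not Kafka's actual implementation, and the class and method names (SmtChainSketch, makeMap, insertSource) are invented for the example:

        import java.util.HashMap;
        import java.util.Map;

        public class SmtChainSketch {
            // HoistField$Value with field=line: wrap the whole value in a new Map under the key "line".
            static Map<String, Object> makeMap(Object value) {
                Map<String, Object> wrapped = new HashMap<>();
                wrapped.put("line", value);
                return wrapped;
            }

            // InsertField$Value with static.field=data_source and static.value=test-file-source:
            // add a constant field to the (now Map-shaped) value.
            static Map<String, Object> insertSource(Map<String, Object> value) {
                value.put("data_source", "test-file-source");
                return value;
            }

            public static void main(String[] args) {
                // Each line read by the file source connector arrives as a plain String.
                Object original = "hello world";
                Map<String, Object> transformed = insertSource(makeMap(original));
                // Prints something like {line=hello world, data_source=test-file-source}
                System.out.println(transformed);
            }
        }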

    + + When we ran the file source connector on a sample file without the transformations, and then read the resulting records using kafka-console-consumer.sh, the results were: + +
    +        "foo"
    +        "bar"
    +        "hello world"
    +   
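    For reference, output like the above can be read with the console consumer. A typical invocation, assuming a broker listening on localhost:9092, is:

        bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic connect-test --from-beginning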
    + + We then created a new file connector, this time with the transformations added to the configuration file. Now the results look like this: + +
    +        {"line":"foo","data_source":"test-file-source"}
    +        {"line":"bar","data_source":"test-file-source"}
    +        {"line":"hello world","data_source":"test-file-source"}
    +    
    + + You can see that the lines we've read are now part of a JSON map, and there is an extra field with the static value we specified. This is just one example of what you can do with transformations. + Several widely-applicable data and routing transformations are included with Kafka Connect: +
      +
    • InsertField - Add a field using either static data or record metadata
    • +
    • ReplaceField - Filter or rename fields
    • +
    • MaskField - Replace a field with a valid null value for its type (0, empty string, etc.)
    • +
    • ValueToKey - Replace the record key with a new key formed from a subset of fields in the record value
    • +
    • HoistField - Wrap the entire event as a single field inside a Struct or a Map
    • +
    • ExtractField - Extract a specific field from Struct and Map and include only this field in results
    • +
    • SetSchemaMetadata - Modify the schema name or version
    • +
    • TimestampRouter - Modify the topic of a record based on original topic and timestamp. Useful when using a sink that needs to write to different tables or indexes based on timestamps
    • +
    • RegexRouter - Modify the topic of a record based on the original topic, a replacement string and a regular expression
    • +
    + + Details on how to configure each transformation are listed below: + +
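    Before that reference, here is one more sketch: a topic-routing transformation such as TimestampRouter could be added to a sink connector configuration roughly as follows. The alias "Route" and the format values are illustrative:

        transforms=Route
        transforms.Route.type=org.apache.kafka.connect.transforms.TimestampRouter
        transforms.Route.topic.format=${topic}-${timestamp}
        transforms.Route.timestamp.format=yyyyMMdd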

    REST API

diff --git a/docs/toc.html b/docs/toc.html
index 787153d8cd7..935703bc907 100644
--- a/docs/toc.html
+++ b/docs/toc.html
@@ -130,6 +130,12 @@