Avro is very popular in streaming data pipelines. SeaTunnel now supports the Avro format in the Kafka connector.
## How To Use
### Kafka usage example
- This example generates data with a FakeSource and writes it to Kafka in Avro format.
```hocon
env {
  parallelism = 1
  job.mode = "BATCH"
}

source {
  FakeSource {
    row.num = 90
    schema = {
      fields {
        c_map = "map<string, string>"
        c_array = "array<int>"
        c_string = string
        c_boolean = boolean
        c_tinyint = tinyint
        c_smallint = smallint
        c_int = int
        c_bigint = bigint
        c_float = float
        c_double = double
        c_bytes = bytes
        c_date = date
        c_decimal = "decimal(38, 18)"
        c_timestamp = timestamp
        c_row = {
          c_map = "map<string, string>"
          c_array = "array<int>"
          c_string = string
          c_boolean = boolean
          c_tinyint = tinyint
          c_smallint = smallint
          c_int = int
          c_bigint = bigint
          c_float = float
          c_double = double
          c_bytes = bytes
          c_date = date
          c_decimal = "decimal(38, 18)"
          c_timestamp = timestamp
        }
      }
    }
    result_table_name = "fake"
  }
}

sink {
  Kafka {
    bootstrap.servers = "kafkaCluster:9092"
    topic = "test_avro_topic_fake_source"
    format = avro
  }
}
```
- This example reads data from Kafka in Avro format and prints it to the console.
```hocon
env {
  parallelism = 1
  job.mode = "BATCH"
}

source {
  Kafka {
    bootstrap.servers = "kafkaCluster:9092"
    topic = "test_avro_topic"
    result_table_name = "kafka_table"
    kafka.auto.offset.reset = "earliest"
    format = avro
    format_error_handle_way = skip
    schema = {
      fields {
        id = bigint
        c_map = "map<string, smallint>"
        c_array = "array<tinyint>"
        c_string = string
        c_boolean = boolean
        c_tinyint = tinyint
        c_smallint = smallint
        c_int = int
        c_bigint = bigint
        c_float = float
        c_double = double
        c_decimal = "decimal(2, 1)"
        c_bytes = bytes
        c_date = date
        c_timestamp = timestamp
      }
    }
  }
}

sink {
  Console {
    source_table_name = "kafka_table"
  }
}
```
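Since Avro is most often used in continuously running pipelines, the read example above can also be run unbounded by switching `job.mode` to `STREAMING`. The sketch below is a minimal illustration of that, assuming the same cluster, topic, and Avro settings as the batch read example; the schema is trimmed for brevity and must in practice match the Avro records actually written to the topic.

```hocon
env {
  parallelism = 1
  # STREAMING keeps the Kafka source consuming new Avro records instead of
  # stopping at the end offsets the way a BATCH job does
  job.mode = "STREAMING"
}

source {
  Kafka {
    # Same connection and format options as the batch read example above
    bootstrap.servers = "kafkaCluster:9092"
    topic = "test_avro_topic"
    result_table_name = "kafka_table"
    kafka.auto.offset.reset = "earliest"
    format = avro
    format_error_handle_way = skip
    schema = {
      fields {
        # Trimmed for brevity; use the full field list that matches your topic
        id = bigint
        c_string = string
        c_int = int
      }
    }
  }
}

sink {
  Console {
    source_table_name = "kafka_table"
  }
}
```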
