Devlive Open Source Community. Search completed in 0.574 seconds, returning 355 relevant results.
  • Redis Connector

    Redis Write Dependency Writing Redis the Regular Way 1. Access to source 2. Write to Redis Apache StreamPark™ Writes to Redis 1. Configure policy and connection information 2. ...
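    As a companion to the Redis Connector result above, here is a minimal sketch of the "regular way" it outlines: read records from some source and write them to Redis with the plain Jedis client. The host, port, key layout, and the Order record are placeholder assumptions, and this is generic Jedis usage, not the StreamPark connector API described on that page.

    ```java
    import redis.clients.jedis.Jedis;
    import java.util.List;

    public class PlainRedisWrite {

        // Hypothetical record produced by the source stage.
        record Order(String id, String user, double amount) {}

        public static void main(String[] args) {
            // 1. Access the source (stubbed here with an in-memory list).
            List<Order> source = List.of(
                    new Order("o-1", "alice", 12.5),
                    new Order("o-2", "bob", 30.0));

            // 2. Write to Redis with the plain Jedis client (host/port are placeholders).
            try (Jedis jedis = new Jedis("localhost", 6379)) {
                for (Order o : source) {
                    // One hash per order, keyed by order id.
                    jedis.hset("order:" + o.id(), "user", o.user());
                    jedis.hset("order:" + o.id(), "amount", String.valueOf(o.amount()));
                }
            }
        }
    }
    ```

    The StreamPark connector covered by that page wraps the write policy and connection configuration instead of hand-managing a client like this; the sketch only shows the baseline it compares against.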
  • DataX

    DATAX Node Overview Create Task Task Parameters Task Example Configure the DataX Environment in DolphinScheduler Configure the DataX Task Node View Run Results Notes: DATAX Node Overview The DataX task type is used to run DataX programs. For a DataX node, the worker executes ${DATAX_LAUNCHER} to...
  • Concurrency Control

    5850 2024-06-28 《Apache Hudi 0.15.0》
    Deployment models with supported concurrency controls Model A: Single writer with inline table services Single Writer Guarantees Model B: Single writer with async table services ...
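    The Concurrency Control result above lists Hudi's deployment models. As a hedged illustration of the underlying setting, the sketch below enables optimistic concurrency control on a Spark write with a ZooKeeper lock provider. The table name, record key and precombine fields, ZooKeeper endpoint, and paths are placeholder assumptions; the exact option set should be checked against the Apache Hudi 0.15.0 documentation the entry points to.

    ```java
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    public class HudiOccWriteSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("hudi-occ-sketch")
                    .master("local[*]")
                    .getOrCreate();

            // Placeholder input; any Dataset<Row> with `id` and `ts` columns would do.
            Dataset<Row> df = spark.read().json("/tmp/input/orders");

            df.write().format("hudi")
                    .option("hoodie.table.name", "orders")
                    .option("hoodie.datasource.write.recordkey.field", "id")
                    .option("hoodie.datasource.write.precombine.field", "ts")
                    // Optimistic concurrency control with a ZooKeeper-based lock provider.
                    .option("hoodie.write.concurrency.mode", "optimistic_concurrency_control")
                    .option("hoodie.cleaner.policy.failed.writes", "LAZY")
                    .option("hoodie.write.lock.provider",
                            "org.apache.hudi.client.transaction.lock.ZookeeperBasedLockProvider")
                    .option("hoodie.write.lock.zookeeper.url", "zk-host")   // placeholder endpoint
                    .option("hoodie.write.lock.zookeeper.port", "2181")
                    .option("hoodie.write.lock.zookeeper.lock_key", "orders")
                    .option("hoodie.write.lock.zookeeper.base_path", "/hudi/locks")
                    .mode(SaveMode.Append)
                    .save("/tmp/hudi/orders");

            spark.stop();
        }
    }
    ```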
  • SQL Server CDC

    Support SQL Server Version Support Those Engines Key Features Description Supported DataSource Info Using Dependency Install Jdbc Driver For Spark/Flink Engine For SeaTunnel ...
  • S3File

    Support Those Engines Key Features Description Supported DataSource Info Database Dependency Data Type Mapping Orc File Type Parquet File Type Sink Options path [string] h...
  • General

    5650 2024-07-01 《Apache Hudi 0.15.0》
    When is Hudi useful for me or my organization? What are some non-goals for Hudi? What is incremental processing? Why does Hudi docs/talks keep talking about it? How is Hudi opti...
  • Categraf

    Introduction Collection Plugins Introduction Categraf is an all-in-one collector, open-sourced by the Flashcat (快猫) team, with its code hosted on GitHub: https://github.com/flashcatcloud/categraf Categraf can not only collect common monitoring targets such as the OS, MySQL, Redis, and Oracle, but also plans to provide log collection and ...
  • JDBC Connector

    JDBC connection configuration semantic configuration EXACTLY_ONCE AT_LEAST_ONCE && NONE Other configuration JDBC data reading queryFunc to obtain a SQL statement resultFunc to process the queried data JDBC reading and writing Generating the target SQL from the data stream Setting the write batch size Multi-instance JDBC support Manually specifying the JDBC connection info...
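    The JDBC Connector entry above mentions a queryFunc that supplies the SQL and a resultFunc that maps the returned rows. The sketch below reproduces that split with plain java.sql types rather than the StreamPark API, so the connection URL, table, and the RowMapper interface are illustrative assumptions only.

    ```java
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Supplier;

    public class JdbcReadSketch {

        // Maps one ResultSet row to a value; plays the role of a "resultFunc".
        @FunctionalInterface
        interface RowMapper<T> {
            T map(ResultSet rs) throws SQLException;
        }

        // Runs the SQL produced by queryFunc and applies rowMapper to every row.
        static <T> List<T> query(String url, Supplier<String> queryFunc, RowMapper<T> rowMapper)
                throws SQLException {
            List<T> out = new ArrayList<>();
            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(queryFunc.get())) {
                while (rs.next()) {
                    out.add(rowMapper.map(rs));
                }
            }
            return out;
        }

        public static void main(String[] args) throws SQLException {
            // Placeholder JDBC URL; swap in real connection info.
            String url = "jdbc:postgresql://localhost:5432/demo?user=demo&password=demo";

            List<String> names = query(
                    url,
                    () -> "SELECT name FROM users",   // queryFunc: builds one SQL statement
                    rs -> rs.getString("name"));      // resultFunc: reads the current row
            names.forEach(System.out::println);
        }
    }
    ```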
  • PostgreSQL

    Support Those Engines Using Dependency For Spark/Flink Engine For SeaTunnel Zeta Engine Key Features Description Supported DataSource Info Database Dependency Data Type Map...
  • Troubleshooting

    5489 2024-07-01 《Apache Hudi 0.15.0》
    Writing Tables org.apache.parquet.io.InvalidRecordException: Parquet/Avro schema mismatch: Avro field ‘col1’ not found java.lang.UnsupportedOperationException: org.apache.parquet....