Devlive Open Source Community: search results
  • What is Apache StreamPark™

    Apache StreamPark™ 🚀 What is Apache StreamPark™ Why Apache StreamPark™? 🎉 Features 🏳‍🌈 Architecture of Apache StreamPark™ 1️⃣ streampark-core 2️⃣ streampark-console Apache...
  • Glossary of Terms

    Glossary of Terms Module Introduction Before getting to know Apache DolphinScheduler, let's first look at the terms commonly used in scheduling systems Glossary of Terms DAG: short for Directed Acyclic Graph. The Task nodes of a workflow are assembled into a directed acyclic graph and traversed topologically, starting from the nodes with zero in-degree and continuing until no successor nodes remain; an example is shown in the figure below. Process Definition: by drag... (see the DAG traversal sketch after this results list)
  • Using Spark

    2024-06-28 《Apache Hudi 0.15.0》
    Hudi Streamer Options Using hudi-utilities bundle jars Concurrency Control Checkpointing Transformers SQL Query Transformer SQL File Transformer Flattening Transformer Chai...
  • 1. Getting Started with Kyuubi

    Getting Started with Kyuubi Getting Kyuubi Requirements Installation Running Kyuubi Setup JAVA Starting Kyuubi Using Hive Beeline Opening a Connection Execute Statements Cl...
  • About SeaTunnel

    About SeaTunnel Why we need SeaTunnel Features of SeaTunnel SeaTunnel work flowchart Connector Who uses SeaTunnel Landscapes Learn more About SeaTunnel SeaTunnel i...
  • Storage

    2024-07-01 《Apache Hudi 0.15.0》
    Does Hudi support cloud storage/object stores? What is the difference between copy-on-write (COW) vs merge-on-read (MOR) table types? How do I migrate my data to Hudi? How to co...
  • FAQs

    Why should I install a computing engine like Spark or Flink? I have a question, and I cannot solve it by myself How do I declare a variable? How do I write a configuration item ...
  • Docker Demo

    2024-06-28 《Apache Hudi 0.15.0》
    A Demo using Docker containers Prerequisites Setting up Docker Cluster Build Hudi Bringing up Demo Cluster Demo Step 1 : Publish the first batch to Kafka Step 2: Incrementall...
  • Configurations

    2024-07-05 《Apache Kyuubi 1.9.1》
    Environments Kyuubi Configurations Authentication Backend Batch Credentials Ctl Delegation Engine Event Frontend Ha Kinit Kubernetes Lineage Metadata Metrics Operat...
  • Writing Tables

    2024-07-01 《Apache Hudi 0.15.0》
    What are some ways to write a Hudi table? How is a Hudi writer job deployed? Can I implement my own logic for how input records are merged with record on storage? How do I delet...
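
The Glossary entry above describes how a workflow DAG is traversed: start from the tasks with zero in-degree and continue until no successor nodes remain. Below is a minimal Python sketch of that traversal (Kahn's algorithm). It is a generic illustration under assumptions of this edit, not DolphinScheduler's scheduler code; the successors map and the topological_order name are hypothetical.

    from collections import deque

    def topological_order(successors):
        """Walk a workflow DAG from its zero in-degree tasks (Kahn's algorithm).

        `successors` maps each task name to the list of tasks that depend on it.
        Illustrative sketch only, not DolphinScheduler's actual scheduler code.
        """
        # Count incoming edges for every task mentioned in the graph.
        in_degree = {task: 0 for task in successors}
        for nexts in successors.values():
            for task in nexts:
                in_degree[task] = in_degree.get(task, 0) + 1

        # Tasks with no predecessors (in-degree zero) are runnable immediately.
        ready = deque(task for task, degree in in_degree.items() if degree == 0)
        order = []
        while ready:
            task = ready.popleft()
            order.append(task)
            # Finishing a task may make its successors runnable.
            for nxt in successors.get(task, ()):
                in_degree[nxt] -= 1
                if in_degree[nxt] == 0:
                    ready.append(nxt)

        if len(order) != len(in_degree):
            raise ValueError("cycle detected: the workflow is not a valid DAG")
        return order

    # Example workflow: extract -> transform -> load, plus an independent audit task.
    print(topological_order({
        "extract": ["transform"],
        "transform": ["load"],
        "load": [],
        "audit": [],
    }))

With CPython's insertion-ordered dicts this prints ['extract', 'audit', 'transform', 'load']: the two tasks with no predecessors are released first, and each remaining task is released once its in-degree drops to zero.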