Devlive 开源社区. This search took 0.666 seconds and found 571 matching results.
  • Spark

    SPARK Node Overview Create Task Task Parameters Task Example Running the WordCount program with spark submit Configuring the Spark environment in DolphinScheduler Uploading the main program package Configuring the Spark node Running DDL and DML statements with spark sql Notes: SPARK Node Overview The Spark task type is used to execute...
  • 6. Distributed objects

    5165 2024-06-19 《Redisson 3.31.0》
    6.1. Object holder 6.1.1. Object holder listeners 6.2. Binary stream holder 6.2.1. Binary stream holder listeners 6.3. Geospatial holder 6.4. BitSet 6.4.1. BitSet data partiti...
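    The object holder in section 6.1 above is Redisson's RBucket. A minimal sketch, assuming a single Redis node reachable at redis://127.0.0.1:6379 and a placeholder bucket name:

    ```java
    import org.redisson.Redisson;
    import org.redisson.api.RBucket;
    import org.redisson.api.RedissonClient;
    import org.redisson.config.Config;

    public class ObjectHolderExample {
        public static void main(String[] args) {
            // Assumed: a single Redis node at 127.0.0.1:6379.
            Config config = new Config();
            config.useSingleServer().setAddress("redis://127.0.0.1:6379");
            RedissonClient redisson = Redisson.create(config);

            // RBucket is the object holder: it stores one object under a key.
            RBucket<String> bucket = redisson.getBucket("anyObject");
            bucket.set("hello");
            System.out.println(bucket.get()); // prints "hello"

            redisson.shutdown();
        }
    }
    ```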
  • Scale Out/In

    DolphinScheduler Scale-Out/Scale-In Documentation 1. DolphinScheduler Scale-Out Documentation 1.1. Install base software (required items must be installed yourself) 1.2. Get the installation package 1.3. Create a deployment user 1.4. Modify the configuration 1.4. Restart the cluster & verify 2. Scale-In 2.1 Stop the services on the nodes being removed 2.2 Modify the configuration files DolphinScheduler Scale-Out/...
  • Replication

    5072 2024-06-22 《Apache Accumulo 2.x》
    Overview Configuration Site Configuration Instance Configuration Table Configuration Monitoring Work Assignment ReplicaSystems AccumuloReplicaSystem Other Configuration E...
  • Kyuubi Hive JDBC Driver

    5070 2024-07-05 《Apache Kyuubi 1.9.1》
    Referencing the JDBC Driver Libraries Using the Driver in Java Code Maven sbt Gradle Using the Driver in a JDBC Application Registering the Driver Class Building the Connect...
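    The driver usage outlined above (registering the driver class, then building a connection) looks roughly as follows in Java. A minimal sketch, assuming the shaded kyuubi-hive-jdbc driver is on the classpath and a Kyuubi server listens on localhost at the default port 10009; the database name and credentials are placeholders:

    ```java
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class KyuubiJdbcExample {
        public static void main(String[] args) throws Exception {
            // Register the driver class (optional with JDBC 4+ service loading).
            Class.forName("org.apache.kyuubi.jdbc.KyuubiHiveDriver");

            // Assumed endpoint: a Kyuubi server on localhost:10009, default database.
            String url = "jdbc:kyuubi://localhost:10009/default";
            try (Connection conn = DriverManager.getConnection(url, "anonymous", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SHOW DATABASES")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
    ```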
  • 10. Additional features

    5045 2024-06-19 《Redisson 3.31.0》
    10.1. Operations with Redis nodes 10.2. References to Redisson objects 10.3. Execution batches of commands 10.4. Transactions 10.5. XA Transactions 10.6. Scripting 10.7. Func...
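    For section 10.3 (execution of command batches), a minimal sketch using RBatch, assuming an existing RedissonClient named redisson; the key and map names are placeholders:

    ```java
    import org.redisson.api.BatchOptions;
    import org.redisson.api.BatchResult;
    import org.redisson.api.RBatch;
    import org.redisson.api.RedissonClient;

    public class BatchExample {
        // Queue several commands and send them to Redis in a single round trip.
        static void runBatch(RedissonClient redisson) {
            RBatch batch = redisson.createBatch(BatchOptions.defaults());
            batch.getBucket("key1").setAsync("value1");
            batch.getMap("myMap").fastPutAsync("field", "value");
            BatchResult<?> result = batch.execute(); // commands are pipelined and sent together
            System.out.println(result.getResponses());
        }
    }
    ```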
  • Writing Tables

    4997 2024-07-01 《Apache Hudi 0.15.0》
    What are some ways to write a Hudi table? How is a Hudi writer job deployed? Can I implement my own logic for how input records are merged with record on storage? How do I delet...
  • 3. Integration with Hive Metastore

    Integration with Hive Metastore Requirements Default Behavior Related Configurations Remote Metastore Database Remote Metastore Server Activate Configurations Via kyuubi-defau...
  • Accumulo Clients

    4869 2024-06-21 《Apache Accumulo 2.x》
    Creating Client Code Creating an Accumulo Client Authentication Writing Data BatchWriter ConditionalWriter Durability Reading Data Scanner Isolated Scanner BatchScanner ...
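    The client workflow listed above (creating an AccumuloClient, then writing with a BatchWriter) looks roughly like this with the Accumulo 2.x API. A minimal sketch; the instance name, ZooKeeper address, credentials, and table name are placeholders:

    ```java
    import org.apache.accumulo.core.client.Accumulo;
    import org.apache.accumulo.core.client.AccumuloClient;
    import org.apache.accumulo.core.client.BatchWriter;
    import org.apache.accumulo.core.data.Mutation;

    public class AccumuloClientExample {
        public static void main(String[] args) throws Exception {
            // Assumed: instance "myinstance" with ZooKeeper at zkhost:2181, password authentication.
            try (AccumuloClient client = Accumulo.newClient()
                    .to("myinstance", "zkhost:2181")
                    .as("myuser", "mypassword")
                    .build()) {

                // Create the placeholder table if it does not exist yet.
                if (!client.tableOperations().exists("mytable")) {
                    client.tableOperations().create("mytable");
                }

                // BatchWriter buffers mutations and flushes them to tablet servers in batches.
                try (BatchWriter writer = client.createBatchWriter("mytable")) {
                    Mutation m = new Mutation("row1");
                    m.put("colFamily", "colQualifier", "value1");
                    writer.addMutation(m);
                }
            }
        }
    }
    ```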
  • Flink

    Flink Node Overview The Flink task type is used to execute Flink programs. For the Flink node: When the program type is Java, Scala, or Python, the worker submits the task with the Flink command flink run; see the flink cli for more details. When the program type is SQL, the worker submits the task with sql-client.sh. ...