Devlive Open Source Community — this search took 0.275 seconds and found 1369 results.
  • S3Redshift

    Description Key features Options jdbc_url jdbc_user jdbc_password execute_sql path [string] bucket [string] access_key [string] access_secret [string] hadoop_s3_propertie...
  • Configure Kerberos for clients to Access Kerberized Kyuubi

    4804 2024-07-05 《Apache Kyuubi 1.9.1》
    Instructions Install Kerberos Client Configure Kerberos Client Get Kerberos TGT Add Kerberos Client Configuration File to JVM Search Path Add Kerberos Ticket Cache to JVM Sear...
  • Function

    4803 2024-06-02 《Ramda 0.16.0》
    always comparator compose construct curry useWith flip groupBy identity invoker nAry once pipe tap binary unary ap empty of constructN converge curryN __ bin...
  • Erasure Coding

    4796 2024-06-22 《Apache Accumulo 2.x》
    Important Warning EC and Threads HDFS ec Command Configuring EC for a New Instance Configuring EC for an Existing Instance Defining Custom EC Policies With the release of ve...
  • deployment

    Deployment SeaTunnel Engine 1. Download 2. Config SEATUNNEL_HOME 3. Config SeaTunnel Engine JVM options 4. Config SeaTunnel Engine 4.1 Backup count 4.2 Slot service 4.3 Checkpo...
  • GitHub Trending Daily Report (May 2, 2025)

    📈 Today's Overall Trends Top 10 📊 Per-Language Trends Top 5 Go C++ Rust Ruby Lua C Swift TypeScript Java PHP C MDX Dart Vim Script Python JavaScript Markdown PowerShell Shell Kotlin HTML J...
  • IoTDB

    Support Those Engines Description Using Dependency For Spark/Flink Engine For SeaTunnel Zeta Engine Key Features Supported DataSource Info Data Type Mapping Sink Options E...
  • Partitioning

    4767 2024-06-29 《Apache Iceberg 1.5.2》
    What is partitioning? What does Iceberg do differently? Partitioning in Hive Problems with Hive partitioning Iceberg’s hidden partitioning What is partitioning? Partitioning...
  • My Hours

    Support Those Engines Key Features Description Key features Supported DataSource Info Source Options How to Create a My Hours Data Synchronization Job Parameter Interpretat...
  • Batch Writes

    4749 2024-06-30 《Apache Hudi 0.15.0》
    Spark DataSource API The hudi-spark module offers the DataSource API to write a Spark DataFrame into a Hudi table. There are a number of options available: HoodieWriteConfig :...
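    The snippet above mentions writing a Spark DataFrame into a Hudi table through the DataSource API. A minimal PySpark sketch of such a batch upsert is shown below; it is not taken from the linked Hudi 0.15.0 page, and the table name, schema, and base path are illustrative assumptions (the Hudi Spark bundle jar must be on the classpath).

        # Minimal sketch of a Hudi batch upsert via the Spark DataSource API.
        # Assumptions: table name "demo_table", base path /tmp/demo_table, and a
        # toy schema with a record key, a precombine timestamp, and a partition field.
        from pyspark.sql import SparkSession

        spark = (
            SparkSession.builder
            .appName("hudi-batch-write-sketch")
            # Kryo serialization is a commonly required setting for Hudi writes.
            .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .getOrCreate()
        )

        # Toy input DataFrame standing in for the batch to be written.
        df = spark.createDataFrame(
            [("id-1", "2024-06-30 00:00:00", "partition-a", 42.0)],
            ["uuid", "ts", "partitionpath", "value"],
        )

        hudi_options = {
            "hoodie.table.name": "demo_table",
            "hoodie.datasource.write.recordkey.field": "uuid",
            "hoodie.datasource.write.precombine.field": "ts",
            "hoodie.datasource.write.partitionpath.field": "partitionpath",
            "hoodie.datasource.write.operation": "upsert",
        }

        (
            df.write.format("hudi")
            .options(**hudi_options)
            .mode("append")          # append mode upserts into the existing table
            .save("/tmp/demo_table") # assumed base path of the Hudi table
        )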