Devlive Open Source Community — this search took 0.253 seconds and found 1,568 results.
  • SQL DML

    4130 2024-06-30 《Apache Hudi 0.15.0》
    Spark SQL Insert Into Insert Overwrite Update Merge Into Delete From Data Skipping and Indexing Flink SQL Insert Into Update Delete From Setting Writer/Reader Configs F...
  • 2. Deploy Kyuubi engines on Kubernetes

    Deploy Kyuubi engines on Kubernetes Requirements Configurations Master Docker Image Test Cluster ServiceAccount Volumes PodTemplateFile Other Deploy Kyuubi engin...
  • GitHub Trending Daily Report (April 22, 2025)

    📈 Today's Overall Trends Top 10 📊 Per-Language Trends Top 5 C++ Dart Java Lua Go PHP C MDX Rust Python C Ruby Kotlin JavaScript Vim Script Jupyter Notebook Dockerfile TypeScript HTML Shell ...
  • AWS Datasync

    DataSync Node Overview Creating a Task Task Example Unique Parameters Environment Configuration DataSync Node Overview AWS DataSync is an online data transfer service that simplifies, automates, and accelerates data movement between on-premises storage systems and AWS Storage services, as well as between different AWS Storage services. Components supported by DataSync: Network File ...
  • Hive

    Description Key features Options table_name [string] metastore_uri [string] hdfs_site_path [string] hive_site_path [string] hive.hadoop.conf [map] hive.hadoop.conf-path [str...
  • Function

    4122 2024-06-02 《Ramda 0.24.0》
    always comparator compose construct curry useWith flip groupBy identity invoker nAry once pipe tap binary unary ap empty of constructN converge curryN __ bin...
  • 1. Deploy Kyuubi engines on Yarn

    Deploy Kyuubi engines on Yarn Requirements Configurations Environment Spark Properties Master Queue Sizing Tuning Others Kerberos Deploy Kyuubi engines on Yarn ...
  • Scan Executors

    4115 2024-06-22 《Apache Accumulo 2.x》
    Configuring and using Scan Executors Configuring and using Scan Prioritizers. Providing hints from the client side. Accumulo scans operate by repeatedly fetching batches of dat...
  • Quick Start

    How to Use Deploying a DataStream Job Deploying a FlinkSql Job Job Startup Process How to Use The previous chapter covered the installation of the one-stop platform streampark-console in detail; this chapter shows how to use streampark-console to quickly deploy and run a job. streampark-console treats a standard Flink program (following the Fl...
  • S3Redshift

    Description Key features Options jdbc_url jdbc_user jdbc_password execute_sql path [string] bucket [string] access_key [string] access_secret [string] hadoop_s3_propertie...