Devlive Open Source Community. This search took 0.443 seconds and found 906 related results.
  • Python-JayDeBeApi

    3770 2024-07-05 《Apache Kyuubi 1.9.1》
    Requirements Preparation Usage The JayDeBeApi module allows you to connect from Python code to databases using Java JDBC. It provides a Python DB-API v2.0 to that database. ...
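    For illustration, a minimal sketch of the kind of connection JayDeBeApi enables; the driver class, JDBC URL, credentials, and jar path below are placeholder assumptions rather than values taken from the page above.

        # Connect from Python to a JDBC endpoint through JayDeBeApi (Python DB-API v2.0 style).
        import jaydebeapi

        conn = jaydebeapi.connect(
            "org.apache.hive.jdbc.HiveDriver",        # assumed JDBC driver class
            "jdbc:hive2://localhost:10009/default",   # assumed server URL
            ["user", "password"],                     # assumed credentials
            "/path/to/hive-jdbc-standalone.jar",      # assumed path to the driver jar
        )
        curs = conn.cursor()
        curs.execute("SELECT 1")                      # any SQL the target engine understands
        print(curs.fetchall())
        curs.close()
        conn.close()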
  • Hive

    Description Key features Options table_name [string] metastore_uri [string] hdfs_site_path [string] hive_site_path [string] hive.hadoop.conf [map] hive.hadoop.conf-path [str...
  • 2. Deploy Kyuubi engines on Kubernetes

    Deploy Kyuubi engines on Kubernetes Requirements Configurations Master Docker Image Test Cluster ServiceAccount Volumes PodTemplateFile Other Deploy Kyuubi engin...
  • SNMP

    switch_legacy snmp Network devices are monitored mainly over the SNMP protocol; Categraf, Telegraf, Datadog-Agent, and snmp_exporter all provide this capability. switch_legacy Categraf provides a collection plugin for network devices, switch_legacy, configured in conf/input.switch_legacy ...
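    As background on what a single SNMP poll involves (independent of any of the agents named above), a minimal sketch using the pysnmp library; the device address, community string, and OID are placeholder assumptions.

        # Query sysDescr.0 from a network device over SNMP v2c.
        from pysnmp.hlapi import (
            getCmd, SnmpEngine, CommunityData, UdpTransportTarget, ContextData,
            ObjectType, ObjectIdentity,
        )

        error_indication, error_status, error_index, var_binds = next(
            getCmd(
                SnmpEngine(),
                CommunityData("public", mpModel=1),               # assumed v2c community string
                UdpTransportTarget(("192.0.2.1", 161)),           # assumed device address
                ContextData(),
                ObjectType(ObjectIdentity("1.3.6.1.2.1.1.1.0")),  # sysDescr.0
            )
        )
        if error_indication:
            print(error_indication)
        else:
            for name, value in var_binds:
                print(name, "=", value)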
  • CosFile

    Support Those Engines Key features Description Options path [string] file_format_type [string] bucket [string] secret_id [string] secret_key [string] region [string] read_...
  • Http

    Support Those Engines Key Features Description Key features Supported DataSource Info Source Options How to Create a Http Data Synchronization Jobs Parameter Interpretation ...
  • RHEL/CentOS/Oracle Linux 7

    Steps Next Steps More Information On a server host with Internet access, use a command-line editor to perform the following operations. Steps Log in to your host as root. Download the Ambari repository file to a directory on the installation host. wget -nv https://username:password@archive.cloudera.com/p/...
  • 1. Deploy Kyuubi engines on Yarn

    Deploy Kyuubi engines on Yarn Requirements Configurations Environment Spark Properties Master Queue Sizing Tuning Others Kerberos Deploy Kyuubi engines on Yarn ...
  • Setup

    3717 2024-06-24 《Apache AGE 0.6.0》
    Getting Apache AGE Releases Source Code Installing From Source Code Pre-Installation CentOS Fedora Ubuntu Install PostgreSQL Install From Source Code Install From a Package...
  • Batch Writes

    3716 2024-06-30 《Apache Hudi 0.15.0》
    Spark DataSource API The hudi-spark module offers the DataSource API to write a Spark DataFrame into a Hudi table. There are a number of options available: HoodieWriteConfig :...
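    For illustration, a minimal sketch of a Hudi batch write through the Spark DataSource API, assuming a SparkSession that already has the hudi-spark bundle on its classpath; the table name, key fields, and output path are placeholder assumptions.

        # Upsert a small DataFrame into a Hudi table with the Spark DataSource API.
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("hudi-batch-write").getOrCreate()

        df = spark.createDataFrame(
            [(1, "a", 1000), (2, "b", 1001)],
            ["uuid", "name", "ts"],
        )

        hudi_options = {
            "hoodie.table.name": "demo_table",                   # assumed table name
            "hoodie.datasource.write.recordkey.field": "uuid",   # record key column
            "hoodie.datasource.write.precombine.field": "ts",    # precombine (ordering) column
            "hoodie.datasource.write.operation": "upsert",       # batch write operation
        }

        (df.write.format("hudi")
            .options(**hudi_options)
            .mode("append")
            .save("/tmp/hudi/demo_table"))                       # assumed base path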