Devlive 开源社区 — this search took 1.035 seconds and found 351 relevant results.
  • SQL Server

    Support SQL Server Version Support Those engines Description Using Dependency For Spark/Flink Engine For SeaTunnel Zeta Engine Key Features Supported DataSource Info Databa...
  • SpotBugs FAQ

    3773 2024-06-13 《SpotBugs 4.8.5》
    Q1: I’m getting java.lang.UnsupportedClassVersionError when I try to run SpotBugs Q2: SpotBugs is running out of memory, or is taking a long time to finish Q3: What is the “auxil...
  • Telegraf

    Telegraf is an open-source collector from InfluxData with a large number of built-in collection plugins. However, Telegraf targets the InfluxDB ecosystem: pushing the collected monitoring data to InfluxDB works well, while pushing it to time-series stores such as Prometheus, VictoriaMetrics, or Thanos can cause problems, mainly on two points: some data is of string type, Prome...
  • S3Redshift

    Description Key features Options jdbc_url jdbc_user jdbc_password execute_sql path [string] bucket [string] access_key [string] access_secret [string] hadoop_s3_propertie...
  • Python-JayDeBeApi

    3758 2024-07-05 《Apache Kyuubi 1.9.1》
    Requirements Preparation Usage The JayDeBeApi module allows you to connect from Python code to databases using Java JDBC. It provides a Python DB-API v2.0 to that database. ...
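    Since this entry describes the JayDeBeApi module's Python DB-API v2.0 interface over JDBC, a minimal sketch of such a connection follows. The driver class, JDBC URL, port, credentials, and jar path are placeholder assumptions for a Kyuubi/HiveServer2-style endpoint, not values taken from the linked page.

    ```python
    import jaydebeapi

    # Minimal sketch: open a DB-API v2.0 connection through a Java JDBC driver.
    # All connection details below are illustrative assumptions.
    conn = jaydebeapi.connect(
        "org.apache.hive.jdbc.HiveDriver",        # JDBC driver class (assumed)
        "jdbc:hive2://localhost:10009/default",   # assumed Kyuubi/HiveServer2 JDBC URL
        ["user", "password"],                      # driver_args: credentials (placeholders)
        "/path/to/hive-jdbc-standalone.jar",       # jar that provides the driver (placeholder path)
    )
    cursor = conn.cursor()
    cursor.execute("SELECT 1")                     # standard DB-API cursor usage
    print(cursor.fetchall())
    cursor.close()
    conn.close()
    ```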
  • 2. Deploy Kyuubi engines on Kubernetes

    Deploy Kyuubi engines on Kubernetes Requirements Configurations Master Docker Image Test Cluster ServiceAccount Volumes PodTemplateFile Other Deploy Kyuubi engin...
  • Functions

    3743 2024-06-07 《Underscore.js 1.13.6》
    bind bindAll partial memoize delay defer throttle debounce once after before wrap negate compose restArguments bind _.bind(function, object, *arguments) source ...
  • CosFile

    Support Those Engines Key features Description Options path [string] file_format_type [string] bucket [string] secret_id [string] secret_key [string] region [string] read_...
  • 1. Deploy Kyuubi engines on Yarn

    Deploy Kyuubi engines on Yarn Requirements Configurations Environment Spark Properties Master Queue Sizing Tuning Others Kerberos Deploy Kyuubi engines on Yarn ...
  • Batch Writes

    3711 2024-06-30 《Apache Hudi 0.15.0》
    Spark DataSource API The hudi-spark module offers the DataSource API to write a Spark DataFrame into a Hudi table. There are a number of options available: HoodieWriteConfig :...
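    Because this last entry explains that the hudi-spark DataSource API writes a Spark DataFrame into a Hudi table, a short PySpark sketch of that write path follows. It assumes a Spark session with the Hudi bundle on the classpath; the table name, field names, and base path are illustrative, and the option keys shown are common Hudi write configs rather than the full HoodieWriteConfig list from the linked page.

    ```python
    from pyspark.sql import SparkSession

    # Minimal sketch of a Hudi batch write via the Spark DataSource API.
    spark = (SparkSession.builder
             .appName("hudi-batch-write-sketch")
             .getOrCreate())

    # Toy DataFrame; column names are illustrative assumptions.
    df = spark.createDataFrame(
        [("id-1", 1718000000, "2024/06/10"), ("id-2", 1718000100, "2024/06/10")],
        ["uuid", "ts", "partitionpath"],
    )

    hudi_options = {
        "hoodie.table.name": "demo_table",                          # assumed table name
        "hoodie.datasource.write.recordkey.field": "uuid",          # record key column
        "hoodie.datasource.write.partitionpath.field": "partitionpath",
        "hoodie.datasource.write.precombine.field": "ts",           # used to pick the latest record
        "hoodie.datasource.write.operation": "upsert",              # batch upsert write
    }

    (df.write.format("hudi")
       .options(**hudi_options)
       .mode("append")
       .save("/tmp/hudi/demo_table"))                               # assumed base path
    ```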