Devlive Open Source Community — this search took 2.672 seconds and found 789 relevant results.
  • Python-JayDeBeApi

    3765 2024-07-05 《Apache Kyuubi 1.9.1》
    Requirements Preparation Usage The JayDeBeApi module allows you to connect from Python code to databases using Java JDBC. It provides a Python DB-API v2.0 to that database. ...
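    A minimal sketch of such a connection, assuming a Kyuubi/HiveServer2 endpoint on localhost:10009 and a locally available Hive JDBC driver jar; the URL, credentials, and jar path are illustrative placeholders, not values from the excerpt:

    ```python
    # Sketch: open a JDBC connection from Python via JayDeBeApi (DB-API v2.0 style).
    # Host, port, credentials, and the driver jar path are assumptions for illustration.
    import jaydebeapi

    conn = jaydebeapi.connect(
        "org.apache.hive.jdbc.HiveDriver",           # JDBC driver class
        "jdbc:hive2://localhost:10009/default",      # placeholder JDBC URL
        ["user", "password"],                        # placeholder credentials
        "/path/to/hive-jdbc-standalone.jar",         # driver jar on the local filesystem
    )
    cursor = conn.cursor()
    cursor.execute("SELECT 1")
    print(cursor.fetchall())
    cursor.close()
    conn.close()
    ```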
  • Functions

    3748 2024-06-07 《Underscore.js 1.13.6》
    bind bindAll partial memoize delay defer throttle debounce once after before wrap negate compose restArguments bind _.bind(function, object, *arguments) source ...
  • CosFile

    Support Those Engines Key features Description Options path [string] file_format_type [string] bucket [string] secret_id [string] secret_key [string] region [string] read_...
  • Http

    Support Those Engines Key Features Description Key features Supported DataSource Info Source Options How to Create a Http Data Synchronization Jobs Parameter Interpretation ...
  • Function

    3736 2024-06-02 《Ramda 0.15.0》
    always comparator compose construct curry useWith flip groupBy identity invoker nAry once pipe tap binary unary ap empty of constructN converge curryN __ bin...
  • deployment

    Deployment SeaTunnel Engine 1. Download 2 Config SEATUNNEL_HOME 3. Config SeaTunnel Engine JVM options 4. Config SeaTunnel Engine 4.1 Backup count 4.2 Slot service 4.3 Checkpo...
  • Alert Format

    Requirement: how to customize displayed labels Using template syntax to customize rule notes Note The message notification format for alert events is controlled by templates; the template files live under etc/template: dingtalk.tpl the DingTalk message template feishu.tpl the Feishu message template wecom.tpl the WeCom (Enterprise WeChat) message template subject.tpl the email subject template mailbody.tpl the email body template ...
  • 1. Deploy Kyuubi engines on Yarn

    Deploy Kyuubi engines on Yarn Requirements Configurations Environment Spark Properties Master Queue Sizing Tuning Others Kerberos Deploy Kyuubi engines on Yarn ...
  • Function

    3720 2024-06-02 《Ramda 0.16.0》
    always comparator compose construct curry useWith flip groupBy identity invoker nAry once pipe tap binary unary ap empty of constructN converge curryN __ bin...
  • Batch Writes

    3713 2024-06-30 《Apache Hudi 0.15.0》
    Spark DataSource API The hudi-spark module offers the DataSource API to write a Spark DataFrame into a Hudi table. There are a number of options available: HoodieWriteConfig :...
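    A hedged sketch of that write path in PySpark, assuming an existing Spark environment with the Hudi bundle on the classpath; the table name, key fields, and base path are illustrative placeholders, not values from the excerpt:

    ```python
    # Sketch: write a Spark DataFrame into a Hudi table via the DataSource API.
    # Table name, key fields, and the target path are assumptions for illustration.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("hudi-batch-write").getOrCreate()

    df = spark.createDataFrame(
        [(1, "alice", 1720000000), (2, "bob", 1720000100)],
        ["id", "name", "ts"],
    )

    hudi_options = {
        "hoodie.table.name": "demo_table",                    # table name (HoodieWriteConfig)
        "hoodie.datasource.write.recordkey.field": "id",      # record key field
        "hoodie.datasource.write.precombine.field": "ts",     # precombine field for upserts
        "hoodie.datasource.write.operation": "upsert",        # write operation
    }

    (
        df.write.format("hudi")
        .options(**hudi_options)
        .mode("append")
        .save("/tmp/hudi/demo_table")                         # placeholder base path
    )
    ```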