Devlive Open Source Community. This search took 0.553 seconds and found 355 matching results.
  • Function

    5278 2024-06-02 "Ramda 0.29.2"
    always comparator compose construct curry useWith flip groupBy identity invoker nAry once pipe tap binary unary ap empty of constructN converge curryN __ bin...
  • RHEL/CentOS/Oracle Linux 7

    Steps Next Step More Information On a server host with Internet access, perform the following using a command-line editor. Steps Before installing Ambari, you must update the username and password in the ambari.repo file. Run the following command: vi /etc/yum.repos.d/ambari.repo For example, the output shows the following: ...
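The credential update described in this snippet can also be done non-interactively instead of with `vi`. A minimal sketch, assuming a repo file whose `baseurl` still contains the stock `username:password` placeholders (the repo id and URL below are illustrative, not taken from the snippet):

```shell
#!/bin/sh
# Sketch: create a sample ambari.repo with placeholder credentials,
# then substitute real ones, mirroring the manual vi edit.
cat > ambari.repo <<'EOF'
[ambari]
name=ambari Version - ambari
baseurl=https://username:password@example.com/ambari/centos7/2.x/updates/
gpgcheck=1
enabled=1
EOF

# Replace the placeholder credentials in place (GNU sed).
sed -i 's|username:password|myuser:mypass|' ambari.repo

# Confirm the baseurl now carries the updated credentials.
grep baseurl ambari.repo
```

On a real system the file lives at /etc/yum.repos.d/ambari.repo, as the snippet shows, and the edit would be followed by refreshing the yum cache.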
  • Importing and Exporting Datasources

    5251 2024-05-25 "Apache Superset 4.0.1"
    Exporting Datasources to YAML Importing Datasources Legacy Importing Datasources From older versions of Superset to current version From older versions of Superset to older vers...
  • Function

    5212 2024-06-02 "Ramda 0.24.0"
    always comparator compose construct curry useWith flip groupBy identity invoker nAry once pipe tap binary unary ap empty of constructN converge curryN __ bin...
  • Hive

    Description Key features Options table_name [string] metastore_uri [string] hdfs_site_path [string] hive_site_path [string] hive.hadoop.conf [map] hive.hadoop.conf-path [str...
  • Snowflake

    Support those engines Key features Description Supported DataSource list Database dependency Data Type Mapping Options tips Task Example simple: parallel: parallel bounda...
  • GitHub Trending Daily Report (April 3, 2025)

    GitHub Trending Daily Report (April 3, 2025) 📈 Today's Overall Trends Top 10 📊 Per-Language Trends Top 5 Rust PHP Go C TypeScript Ruby C++ Java JavaScript MDX GitHub Trending Daily Report (April 3, 2025) This report was generated by the TrendForge system https:/...
  • Function

    5139 2024-06-02 "Ramda 0.16.0"
    always comparator compose construct curry useWith flip groupBy identity invoker nAry once pipe tap binary unary ap empty of constructN converge curryN __ bin...
  • S3Redshift

    Description Key features Options jdbc_url jdbc_user jdbc_password execute_sql path [string] bucket [string] access_key [string] access_secret [string] hadoop_s3_propertie...
  • Batch Writes

    5110 2024-06-30 "Apache Hudi 0.15.0"
    Spark DataSource API The hudi-spark module offers the DataSource API to write a Spark DataFrame into a Hudi table. There are a number of options available: HoodieWriteConfig :...