Devlive Open Source Community: this search took 0.766 seconds and found 451 relevant results.
  • User defined functions

    972 2024-06-24 《Apache AGE 0.6.0》
    Users may add custom functions to AGE. When using Cypher functions, all function calls within a Cypher query use the default namespace of ag_catalog . However, if a user wants to us...
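    As a quick illustration of the rule quoted above, the sketch below assumes an existing graph named graph_name; unqualified function calls inside cypher() resolve against ag_catalog, so a function living in another schema must be schema-qualified (pg_catalog.random() here is just a stand-in):

      -- Typical session setup for AGE.
      LOAD 'age';
      SET search_path = ag_catalog, "$user", public;

      -- Unqualified calls resolve against ag_catalog; a function from another
      -- schema is called with an explicit prefix.
      SELECT *
      FROM cypher('graph_name', $$
          RETURN pg_catalog.random()
      $$) AS (result agtype);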
  • SKIP

    959 2024-06-24 《Apache AGE 0.6.0》
    Introduction Skip first three rows Return middle two rows Using an expression with SKIP to return a subset of the rows SKIP defines from which record to start including the r...
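    For instance, skipping the first three rows of an ordered result looks roughly like this (a minimal sketch; the graph graph_name and the Person label are assumptions):

      SELECT *
      FROM cypher('graph_name', $$
          MATCH (n:Person)
          RETURN n.name
          ORDER BY n.name
          SKIP 3            -- start including records after the first three
      $$) AS (name agtype);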
  • Connectors for Spark SQL Query Engine

    953 2024-07-05 《Apache Kyuubi 1.9.1》
    Delta Lake Delta Lake with Microsoft Azure Blob Storage Hudi Iceberg Kudu Hive Apache Paimon (Incubating) TiDB TPC-DS TPC-H
  • Creating your first interoperable table

    Creating your first interoperable table Using Apache XTable™ (Incubating) to sync your source tables in different target formats involves running sync on your current dataset usi...
  • Configuration Glossary

    Table of Contents Properties File Format Creating a Basic Properties File Job Launcher Properties Common Job Launcher Properties SchedulerDaemon Properties CliMRJobLauncher Pr...
  • Gobblin CLI

    Gobblin Commands & Execution Modes Gobblin Commands The Distcp Quick App The OneShot Quick App Developing quick apps for the CLI Implementing new Gobblin commands Gobblin Ser...
  • Tailwind CSS Mobile Adaptation in Practice

    This section explains in detail how to use Tailwind CSS for mobile adaptation, covering responsive design, touch-interaction optimization, and performance optimization. Basic Configuration Viewport Configuration <!-- public/index.html --> <meta name = "viewport" content = "width=device-width, initial-scale=1.0, max...
  • Object

    926 2024-06-05 《Ramda 0.1.0》
    clone values eqProps keys omit pick pickAll project prop props clone {*} → {*} Parameters value: The object or array to clone Returns * A deeply cloned copy of val...
  • Flink Writes

    Flink Writes Iceberg supports batch and streaming writes with Apache Flink's DataStream API and Table API. Writing with SQL Iceberg supports both INSERT INTO and INSERT OVERWRIT...
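    The two SQL write paths named in that excerpt look roughly as follows; the catalog and table names are placeholders, and INSERT OVERWRITE is available only in Flink's batch execution mode:

      -- Append rows to an Iceberg table from Flink SQL.
      INSERT INTO hive_catalog.db.sample VALUES (1, 'a'), (2, 'b');

      -- Replace data; on a partitioned table only the partitions receiving
      -- new rows are overwritten (dynamic overwrite).
      INSERT OVERWRITE hive_catalog.db.sample VALUES (1, 'a');

      -- Overwrite a single partition explicitly.
      INSERT OVERWRITE hive_catalog.db.sample PARTITION (data = 'a') SELECT 6;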
  • Procedures

    Spark Procedures To use Iceberg in Spark, first configure Spark catalogs . Stored procedures are only available when using Iceberg SQL extensions in Spark 3. Usage Procedures c...
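    Procedures are invoked with CALL against a configured catalog's system namespace; in the sketch below the catalog name, table name, snapshot id, and timestamp are placeholders:

      -- Positional arguments.
      CALL my_catalog.system.rollback_to_snapshot('db.sample', 123456789);

      -- Named arguments.
      CALL my_catalog.system.expire_snapshots(
        table => 'db.sample',
        older_than => TIMESTAMP '2024-01-01 00:00:00'
      );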