Devlive 开源社区: search completed in 0.599 seconds, 700 relevant results found.
  • Logic

    595 2024-06-05 《Ramda 0.9.0》
    and, isEmpty, not, or, cond, ifElse, allPass, anyPass. and: a → b → a | b. Added in v0.1.0. Returns the first argument if it is falsy, otherwise the second argument. Acts as the...
  • Distributed collections

    567 2025-03-14 《Redisson 3.45.0》
    Map: a Redis or Valkey based distributed Map object for Java that implements the ConcurrentMap interface. This object is thread-safe. Consider using the Live Object service to store POJO obje...
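    A minimal sketch of the distributed Map API described in this entry, assuming Redisson 3.x is on the classpath; the server address, map name, and keys are placeholders, not values from the page above:

    ```java
    import org.redisson.Redisson;
    import org.redisson.api.RMap;
    import org.redisson.api.RedissonClient;
    import org.redisson.config.Config;

    public class DistributedMapExample {
        public static void main(String[] args) {
            // Hypothetical single-node Redis or Valkey address; adjust for your deployment.
            Config config = new Config();
            config.useSingleServer().setAddress("redis://127.0.0.1:6379");
            RedissonClient redisson = Redisson.create(config);

            // RMap implements java.util.concurrent.ConcurrentMap and is thread-safe.
            RMap<String, Integer> map = redisson.getMap("myMap");
            map.put("a", 1);
            map.putIfAbsent("b", 2);
            System.out.println(map.get("a"));

            redisson.shutdown();
        }
    }
    ```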
  • Nessie

    Iceberg Nessie Integration Iceberg provides integration with Nessie through the iceberg-nessie module. This section describes how to use Iceberg with Nessie. Nessie provides seve...
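    As a rough illustration of the iceberg-nessie integration, below is a sketch of one common way to use it: registering a Nessie-backed Iceberg catalog from Spark. The catalog name, Nessie URI, branch, and warehouse path are all assumptions for illustration, not values from the page above:

    ```java
    import org.apache.spark.sql.SparkSession;

    public class NessieCatalogSketch {
        public static void main(String[] args) {
            // All names, URIs, and paths below are placeholders.
            SparkSession spark = SparkSession.builder()
                    .appName("iceberg-nessie-sketch")
                    .config("spark.sql.catalog.nessie", "org.apache.iceberg.spark.SparkCatalog")
                    .config("spark.sql.catalog.nessie.catalog-impl", "org.apache.iceberg.nessie.NessieCatalog")
                    .config("spark.sql.catalog.nessie.uri", "http://localhost:19120/api/v1")
                    .config("spark.sql.catalog.nessie.ref", "main")
                    .config("spark.sql.catalog.nessie.warehouse", "file:///tmp/nessie/warehouse")
                    .getOrCreate();

            // Tables created through this catalog are versioned on the configured Nessie branch.
            spark.sql("CREATE NAMESPACE IF NOT EXISTS nessie.db");
            spark.sql("CREATE TABLE IF NOT EXISTS nessie.db.sample (id BIGINT, data STRING) USING iceberg");
            spark.sql("INSERT INTO nessie.db.sample VALUES (1, 'a')");
            spark.stop();
        }
    }
    ```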
  • Integration with Spring

    546 2025-03-15 《Redisson 3.45.0》
    Spring Boot Starter: integrates Redisson with the Spring Boot library. Depends on the Spring Data Redis module and supports Spring Boot 1.3.x - 3.4.x. Usage: 1. Add redisson-spring-boot-sta...
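    A minimal sketch of what using the starter looks like once the redisson-spring-boot-starter dependency is on the classpath and the usual Spring Redis connection properties are set; the component, bucket, and key names are made up for illustration:

    ```java
    import org.redisson.api.RBucket;
    import org.redisson.api.RedissonClient;
    import org.springframework.boot.CommandLineRunner;
    import org.springframework.stereotype.Component;

    // With the starter on the classpath, Spring Boot auto-configures a RedissonClient
    // bean from the standard Redis connection properties, so it can be injected directly.
    @Component
    public class RedissonGreetingRunner implements CommandLineRunner {

        private final RedissonClient redisson;

        public RedissonGreetingRunner(RedissonClient redisson) {
            this.redisson = redisson;
        }

        @Override
        public void run(String... args) {
            RBucket<String> bucket = redisson.getBucket("greeting");
            bucket.set("hello from redisson");
            System.out.println(bucket.get());
        }
    }
    ```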
  • Flink Getting Started

    Flink: Apache Iceberg supports both Apache Flink's DataStream API and Table API. See the Multi-Engine Support page for the integration of Apache Flink. Feature support: Flink...
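    A small Table API sketch of the Flink integration, assuming an iceberg-flink-runtime jar matching your Flink version is on the classpath; the catalog name, database, table, and warehouse path are placeholders:

    ```java
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class FlinkIcebergSketch {
        public static void main(String[] args) {
            TableEnvironment tEnv = TableEnvironment.create(
                    EnvironmentSettings.newInstance().inBatchMode().build());

            // Register a Hadoop-type Iceberg catalog; the warehouse path is a placeholder.
            tEnv.executeSql(
                    "CREATE CATALOG iceberg_catalog WITH ("
                            + " 'type'='iceberg',"
                            + " 'catalog-type'='hadoop',"
                            + " 'warehouse'='file:///tmp/iceberg/warehouse')");

            tEnv.executeSql("CREATE DATABASE IF NOT EXISTS iceberg_catalog.db");
            tEnv.executeSql(
                    "CREATE TABLE IF NOT EXISTS iceberg_catalog.db.sample (id BIGINT, data STRING)");
            tEnv.executeSql("INSERT INTO iceberg_catalog.db.sample VALUES (1, 'a'), (2, 'b')");
        }
    }
    ```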
  • Hive

    Hive: Iceberg supports reading and writing Iceberg tables through Hive by using a StorageHandler. Feature support: the following feature matrix illustrates the support for diff...
  • Creating your first interoperable table

    Creating your first interoperable table: using Apache XTable™ (Incubating) to sync your source tables to different target formats involves running sync on your current dataset usi...
  • Getting Started

    Introduction, Getting a Gobblin Release, Building a Distribution, Run Your First Job, Steps, Running Gobblin as a Daemon, Preliminary Steps, Other Example Jobs. Introduction: Thi...
  • Architecture

    Gobblin Architecture Overview, Gobblin Job Flow, Gobblin Constructs, Source and Extractor, Converter, Quality Checker, Fork Operator, Data Writer, Data Publisher, Gobblin Task Flow...
  • Procedures

    Spark Procedures: to use Iceberg in Spark, first configure Spark catalogs. Stored procedures are only available when using Iceberg SQL extensions in Spark 3. Usage: Procedures c...
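    A sketch of calling a stored procedure, assuming the Iceberg Spark runtime and SQL extensions are configured as the entry describes; the catalog name, warehouse path, table, and snapshot id are placeholders:

    ```java
    import org.apache.spark.sql.SparkSession;

    public class IcebergProcedureSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("iceberg-procedures")
                    // Enable Iceberg's SQL extensions so CALL statements are parsed.
                    .config("spark.sql.extensions",
                            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
                    .config("spark.sql.catalog.my_catalog", "org.apache.iceberg.spark.SparkCatalog")
                    .config("spark.sql.catalog.my_catalog.type", "hadoop")
                    .config("spark.sql.catalog.my_catalog.warehouse", "file:///tmp/iceberg/warehouse")
                    .getOrCreate();

            // Procedures live in the catalog's system namespace; table and snapshot id are placeholders.
            spark.sql("CALL my_catalog.system.rollback_to_snapshot('db.sample', 1234567890)").show();

            spark.stop();
        }
    }
    ```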