Devlive Open Source Community. This search took 1.770 seconds and found 377 relevant results.
  • Troubleshooting

    5501 2024-07-01 《Apache Hudi 0.15.0》
    Writing Tables org.apache.parquet.io.InvalidRecordException: Parquet/Avro schema mismatch: Avro field ‘col1’ not found java.lang.UnsupportedOperationException: org.apache.parquet....
  • SQL DDL

    5491 2024-06-30 《Apache Hudi 0.15.0》
    Spark SQL Create table Create non-partitioned table Create partitioned table Create table with record keys and ordering fields Create table from an external location Create Ta...
  • 2.2. How To Use Spark Adaptive Query Execution (AQE) in Kyuubi

    How To Use Spark Adaptive Query Execution (AQE) in Kyuubi The Basics of AQE Dynamically Switch Join Strategies Dynamically Coalesce Shuffle Partitions Other Tips for Best Practise...
  • Clustering

    5481 2024-06-30 《Apache Hudi 0.15.0》
    Background How is compaction different from clustering? Clustering Architecture Overall, there are 2 steps to clustering Schedule clustering Execute clustering Clustering Use... (a clustering configuration sketch follows this result list)
  • Java API

    5462 2024-06-29 《Apache Iceberg 1.5.2》
    Tables Table metadata Scanning File level Transactions Types Primitives Nested types Expressions Expression binding Expression example Modules Tables The main purpose...
  • Design & Concepts

    5440 2024-07-01 《Apache Hudi 0.15.0》
    How does Hudi ensure atomicity? Does Hudi extend the Hive table layout? What concurrency control approaches does Hudi adopt? Hudi’s commits are based on transaction start time i...
  • HTTP Connector

    HTTP asynchronous writes Writing via Apache StreamPark™ Supported types for HTTP asynchronous writes HTTP asynchronous write configuration parameters Writing data asynchronously over HTTP Other configuration Some backend services receive data through HTTP requests; in this scenario Apache Flink can write its result data via HTTP requests. Apache Flink currently does not officially provide a way to write data via HTTP... (a minimal sink sketch follows this result list)
  • Flink Getting Started

    5415 2024-06-29 《Apache Iceberg 1.5.2》
    Preparation when using Flink SQL Client Flink’s Python API Adding catalogs. Catalog Configuration Hive catalog Creating a table Writing Branch Writes Reading Type conversi...
  • Flink Quick Start

    5356 2024-06-28 《Apache Hudi 0.15.0》
    Setup Flink Support Matrix Download Flink and Start Flink cluster Start Flink SQL client Create Table Insert Data Query Data Update Data Delete Data Row-level Delete Batch...
  • MongoDB

    Support Those Engines Key Features Description Supported DataSource Info Data Type Mapping Source Options Tips How to Create a MongoDB Data Synchronization Jobs Parameter I...
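
The Clustering entry above mentions that clustering has two steps, schedule and execute. As a rough illustration only, and not taken from the linked page, the sketch below shows one common way to let Hudi schedule and execute clustering inline during Spark writes. The table path, record key, and ordering field names are hypothetical, and the `hoodie.clustering.*` values are illustrative, not recommendations.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class HudiInlineClusteringSketch {
    public static void main(String[] args) {
        // Hudi requires Kryo serialization; the Hudi Spark bundle must be on the classpath.
        SparkSession spark = SparkSession.builder()
                .appName("hudi-inline-clustering-sketch")
                .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                .getOrCreate();

        // Hypothetical input with columns `uuid` (record key) and `ts` (ordering field).
        Dataset<Row> df = spark.read().parquet("/tmp/source_data");

        df.write().format("hudi")
                .option("hoodie.table.name", "demo_table")
                .option("hoodie.datasource.write.recordkey.field", "uuid")
                .option("hoodie.datasource.write.precombine.field", "ts")
                // Schedule and execute clustering inline after every 4 commits.
                .option("hoodie.clustering.inline", "true")
                .option("hoodie.clustering.inline.max.commits", "4")
                // Sort records by key while rewriting file groups.
                .option("hoodie.clustering.plan.strategy.sort.columns", "uuid")
                .mode(SaveMode.Append)
                .save("/tmp/hudi/demo_table");

        spark.stop();
    }
}
```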
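
The HTTP Connector entry above notes that Flink has no official HTTP sink, which is why StreamPark ships one that writes results via HTTP requests. The sketch below is not the StreamPark connector's API; it is a minimal, assumption-laden RichSinkFunction that POSTs each record (assumed to already be a JSON string) to a hypothetical endpoint, and it sends synchronously for brevity where the real connector batches and writes asynchronously.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

/** Minimal sketch of an HTTP sink: each record is POSTed as a JSON body. */
public class HttpPostSink extends RichSinkFunction<String> {
    private final String endpoint;          // hypothetical target URL
    private transient HttpClient client;

    public HttpPostSink(String endpoint) {
        this.endpoint = endpoint;
    }

    @Override
    public void open(Configuration parameters) {
        client = HttpClient.newHttpClient();
    }

    @Override
    public void invoke(String value, Context context) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(value))
                .build();
        // Synchronous send keeps the sketch short; a production sink would batch
        // records and send asynchronously, as the StreamPark connector does.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        if (response.statusCode() >= 400) {
            throw new RuntimeException("HTTP sink request failed: " + response.statusCode());
        }
    }
}
```

A job would attach it with something like `stream.addSink(new HttpPostSink("http://example.com/ingest"))`, whereas the actual StreamPark connector takes its endpoint and batching settings from configuration.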