Devlive Open Source Community — this search took 0.304 seconds and found 1179 relevant results.
  • JDO 3.0 Overview

    4856 2024-05-25 《Apache JDO 3.2.1》
    Background Metadata API Enhancer API Query Cancel/Timeout API Control of read objects locking Background Java Data Objects (JDO) is a specification begun in 2000, with 2 maj...
  • JDBC

    Description Using Dependency For Spark/Flink Engine For SeaTunnel Zeta Engine Key Features Options driver [string] user [string] password [string] url [string] query [stri...
  • Kafka

    Support Those Engines Key Features Description Supported DataSource Info Sink Options Parameter Interpretation Topic Formats Semantics Partition Key Fields Assign Partition...
  • REST API v1

    4827 2024-07-05 《Apache Kyuubi 1.9.1》
    REST API v1 Session Resource GET /sessions Response Body GET /sessions/${sessionHandle} Response Body GET /sessions/${sessionHandle}/info/${infoType} Request Parameters Respon...
  • String

    4796 2024-06-14 《Lodash 3.10.1》
    _.camelCase([string='']) Arguments Returns Example _.capitalize([string='']) Arguments Returns Example _.deburr([string='']) Arguments Returns Example _.endsWith([strin...
  • High-Speed Ingest

    4787 2024-06-22 《Apache Accumulo 2.x》
    Pre-Splitting New Tables Multiple Ingest Clients Bulk Ingest Logical Time for Bulk Ingest MapReduce Ingest Accumulo is often used as part of a larger data processing and stor...
  • Caching

    4763 2024-05-25 《Apache Superset 4.0.1》
    Dependencies Fallback Metastore Cache Chart Cache Timeout SQL Lab Query Results Caching Thumbnails Superset uses Flask-Caching for caching purposes. Flask-Caching supports v...
  • RHEL/CentOS/Oracle Linux 7

    Steps Next Step More Information On a server host that has Internet access, use a command line editor to perform the following Steps Before installing Ambari, you must upda...
  • 3. Trouble Shooting

    Trouble Shooting Common Issues java.lang.UnsupportedClassVersionError .. Unsupported major.minor version 52.0 org.apache.spark.SparkException: When running with master 'yarn' eith...
  • Creating your first interoperable table

    Pre-requisites Steps Initialize a pyspark shell Create dataset Running sync Conclusion Next steps Using OneTable to sync your source tables in different target format invo...