Devlive Open Source Community — This search took 0.645 seconds and found 116 relevant results.
  • Programming Paradigm

    Architecture Programming paradigm DataStream Flink Sql TableEnvironment StreamTableEnvironment RunTime Context StreamingContext TableContext StreamTableContext Life Cycle...
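
    The entry above indexes Apache StreamPark's programming-paradigm page, which builds on Flink's two APIs. As a minimal sketch of the concepts it lists (DataStream, Flink SQL, TableEnvironment, StreamTableEnvironment), here is plain Flink Java code; StreamPark's StreamingContext and StreamTableContext wrap these entry points, and nothing below is StreamPark-specific:

    ```java
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.Table;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

    public class ParadigmSketch {
        public static void main(String[] args) throws Exception {
            // DataStream side: the low-level streaming entry point
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Table/SQL side: StreamTableEnvironment bridges both paradigms
            StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

            // Run a Flink SQL query and convert the result back to a DataStream
            Table t = tEnv.sqlQuery("SELECT 'hello' AS greeting");
            tEnv.toDataStream(t).print();

            env.execute("paradigm-sketch");
        }
    }
    ```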
  • Kerberos

    2358 2024-06-22 《Apache Accumulo 2.x》
    Overview Within Hadoop Delegation Tokens Configuring Accumulo Servers Generate Principal and Keytab Server Configuration KerberosAuthenticator Administrative User Verifying ...
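
    As a hedged illustration of the KerberosAuthenticator topic in this entry, the sketch below connects an Accumulo 2.x client with a KerberosToken. The instance name, ZooKeeper quorum, principal, and keytab path are all hypothetical placeholders, not values from the Accumulo documentation:

    ```java
    import java.io.File;

    import org.apache.accumulo.core.client.Accumulo;
    import org.apache.accumulo.core.client.AccumuloClient;
    import org.apache.accumulo.core.client.security.tokens.KerberosToken;

    public class KerberosClientSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical principal and keytab, produced as in "Generate Principal and Keytab"
            String principal = "accumulo-user@EXAMPLE.COM";
            File keytab = new File("/etc/security/keytabs/accumulo-user.keytab");

            // KerberosToken logs in from the keytab; the client then authenticates via SASL
            try (AccumuloClient client = Accumulo.newClient()
                    .to("myinstance", "zk1:2181,zk2:2181")   // instance name + ZooKeepers (assumed)
                    .as(principal, new KerberosToken(principal, keytab))
                    .build()) {
                System.out.println(client.whoami());
            }
        }
    }
    ```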
  • Storage

    2327 2024-07-01 《Apache Hudi 0.15.0》
    Does Hudi support cloud storage/object stores? What is the difference between copy-on-write (COW) vs merge-on-read (MOR) table types? How do I migrate my data to Hudi? How to co...
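
    For the COW-vs-MOR question this entry covers, the table type is chosen at write time via the hoodie.datasource.write.table.type option. Below is a minimal Spark-in-Java sketch assuming Hudi's Spark bundle is on the classpath; the table name, fields, and output path are hypothetical:

    ```java
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    public class HudiTableTypeSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("hudi-table-type-sketch")
                    .master("local[*]")
                    .getOrCreate();

            Dataset<Row> df = spark.sql("SELECT 1 AS id, 'a' AS name, 100L AS ts, '2024' AS dt");

            // The table type decides the trade-off: COPY_ON_WRITE rewrites file slices
            // on every write; MERGE_ON_READ logs deltas and compacts them later.
            df.write().format("hudi")
                    .option("hoodie.table.name", "demo_table")
                    .option("hoodie.datasource.write.recordkey.field", "id")
                    .option("hoodie.datasource.write.precombine.field", "ts")
                    .option("hoodie.datasource.write.partitionpath.field", "dt")
                    .option("hoodie.datasource.write.table.type", "MERGE_ON_READ")
                    .mode(SaveMode.Overwrite)
                    .save("file:///tmp/hudi/demo_table");

            spark.stop();
        }
    }
    ```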
  • JDBC Connector

    JDBC configuration Semantic configuration EXACTLY_ONCE AT_LEAST_ONCE && NONE Other configuration Reading data over JDBC queryFunc builds a SQL query resultFunc processes the query results JDBC read and write Generating target SQL from the data stream Setting the write batch size Multi-instance JDBC support Manually specifying the JDBC connection inf...
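
    The entry above is StreamPark's JDBC connector documentation. As a rough stand-in for the ideas it lists (target SQL, write batch size, connection information), here is a sketch using Flink's own JDBC connector, which provides at-least-once semantics by default; all connection details are hypothetical:

    ```java
    import org.apache.flink.connector.jdbc.JdbcConnectionOptions;
    import org.apache.flink.connector.jdbc.JdbcExecutionOptions;
    import org.apache.flink.connector.jdbc.JdbcSink;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class JdbcSinkSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            env.fromElements("alice", "bob")
               .addSink(JdbcSink.sink(
                    "INSERT INTO users (name) VALUES (?)",       // target SQL for each record
                    (stmt, name) -> stmt.setString(1, name),     // bind the record to the statement
                    JdbcExecutionOptions.builder()
                            .withBatchSize(200)                  // the "write batch size" knob
                            .build(),
                    new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
                            .withUrl("jdbc:mysql://localhost:3306/test")  // hypothetical connection info
                            .withDriverName("com.mysql.cj.jdbc.Driver")
                            .withUsername("root")
                            .withPassword("secret")
                            .build()));

            env.execute("jdbc-sink-sketch");
        }
    }
    ```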
  • Detectors

    2247 2024-06-13 《SpotBugs 4.8.5》
    Standard detectors OverridingMethodsMustInvokeSuperDetector FindRoughConstants SynchronizeAndNullCheckField InitializeNonnullFieldsInConstructor BooleanReturnNull OptionalRetu...
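
    Of the detectors listed, BooleanReturnNull is easy to demonstrate: a method declared to return Boolean that returns an explicit null invites NullPointerExceptions at unboxing sites. SpotBugs reports this pattern as NP_BOOLEAN_RETURN_NULL:

    ```java
    public class BooleanReturnNullExample {
        // SpotBugs' BooleanReturnNull detector would warn here (NP_BOOLEAN_RETURN_NULL)
        static Boolean isEnabled(String flag) {
            if (flag == null) {
                return null;    // the smell: prefer Optional<Boolean> or a default value
            }
            return Boolean.parseBoolean(flag);
        }

        public static void main(String[] args) {
            // Unboxing a null Boolean would throw NullPointerException at runtime
            if (Boolean.TRUE.equals(isEnabled(null))) {
                System.out.println("enabled");
            }
        }
    }
    ```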
  • 14. Integration with frameworks

    2246 2024-06-19 《Redisson》
    14.1. Spring Framework 14.2. Spring Cache 14.2.1 Spring Cache. Local cache and data partitioning 14.2.2 Spring Cache. YAML config format 14.3. Hibernate Cache 14.3.1. Hibernate...
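
    For the Spring integration in section 14.1, a minimal sketch of exposing a RedissonClient as a Spring bean follows; the Redis address is hypothetical. A bean like this can then back the Spring Cache support of section 14.2 (Redisson ships a RedissonSpringCacheManager for that purpose):

    ```java
    import org.redisson.Redisson;
    import org.redisson.api.RedissonClient;
    import org.redisson.config.Config;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class RedissonSpringConfig {
        // Spring shuts the client down cleanly when the context closes
        @Bean(destroyMethod = "shutdown")
        public RedissonClient redisson() {
            Config config = new Config();
            config.useSingleServer()
                  .setAddress("redis://127.0.0.1:6379");  // hypothetical address
            return Redisson.create(config);
        }
    }
    ```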
  • Table Configuration

    2195 2024-06-21 《Apache Accumulo 2.x》
    Locality Groups Managing Locality Groups via the Shell Managing Locality Groups via the Client API Constraints Bloom Filters Iterators Setting Iterators via the Shell Setting...
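
    The "Managing Locality Groups via the Client API" topic above maps to TableOperations.setLocalityGroups in the Accumulo 2.x client. A hedged sketch, with a hypothetical client-properties path, table name, and column families:

    ```java
    import java.util.Map;
    import java.util.Set;

    import org.apache.accumulo.core.client.Accumulo;
    import org.apache.accumulo.core.client.AccumuloClient;
    import org.apache.hadoop.io.Text;

    public class LocalityGroupsSketch {
        public static void main(String[] args) throws Exception {
            try (AccumuloClient client = Accumulo.newClient()
                    .from("/path/to/accumulo-client.properties")   // hypothetical client config
                    .build()) {
                // Group column families so a scan over one group never reads the other
                Map<String, Set<Text>> groups = Map.of(
                        "meta", Set.of(new Text("owner"), new Text("perms")),
                        "data", Set.of(new Text("content")));
                client.tableOperations().setLocalityGroups("mytable", groups);
            }
        }
    }
    ```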
  • Redis Connector

    Redis Write Dependency Writing Redis the Regular Way 1. Access to source 2. Write to Redis Apache StreamPark™ Writes to Redis 1. Configure policy and connection information 2. ...
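
    As a bare-bones illustration of the "regular way" this entry contrasts with StreamPark's connector, the sketch below issues plain Jedis commands against a hypothetical local Redis; in a Flink job, such code typically lives in a custom sink that manages its own connections:

    ```java
    import redis.clients.jedis.Jedis;

    public class RedisWriteSketch {
        public static void main(String[] args) {
            try (Jedis jedis = new Jedis("localhost", 6379)) {  // hypothetical host/port
                jedis.set("user:1:name", "alice");        // plain string SET
                jedis.hset("user:1", "email", "a@b.c");   // hash field write
                jedis.expire("user:1", 3600);             // TTL, as a write policy might require
            }
        }
    }
    ```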
  • Docker Demo

    2155 2024-06-28 《Apache Hudi 0.15.0》
    A Demo using Docker containers Prerequisites Setting up Docker Cluster Build Hudi Bringing up Demo Cluster Demo Step 1 : Publish the first batch to Kafka Step 2: Incrementall...
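
    Step 1 of the demo publishes a batch of stock ticks to Kafka. A minimal producer sketch follows; the broker address is assumed from a local demo cluster, and the topic name and record payload are illustrative stand-ins for the demo's data:

    ```java
    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class PublishBatchSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // demo-cluster broker (assumed)
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // One record standing in for the demo's "first batch" of stock ticks
                producer.send(new ProducerRecord<>("stock_ticks", "GOOG",
                        "{\"symbol\":\"GOOG\",\"ts\":\"2018-08-31 10:29:00\",\"close\":1230.5}"));
            }
        }
    }
    ```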
  • Hadoop Resource Integration

    Using Apache Hadoop resources with Flink on Kubernetes 1. Apache HDFS 1.1 Add the shaded jar 1.2 Add core-site.xml and hdfs-site.xml 2. Apache Hive 2.1 Add the Hive-related jars 2.2 Add the Hive configuration file (hive-site.x...
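
    Once the site files from step 1.2 are shipped with the Flink image, a Hadoop Configuration can pick them up explicitly. A hedged sketch, with hypothetical container paths:

    ```java
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsResourceSketch {
        public static void main(String[] args) throws Exception {
            // Load the same site files the entry says to add; paths are hypothetical
            Configuration conf = new Configuration();
            conf.addResource(new Path("file:///opt/flink/conf/core-site.xml"));
            conf.addResource(new Path("file:///opt/flink/conf/hdfs-site.xml"));

            // Resolve the default filesystem (HDFS if fs.defaultFS points there)
            try (FileSystem fs = FileSystem.get(conf)) {
                System.out.println("default FS: " + fs.getUri());
            }
        }
    }
    ```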