Devlive 开源社区 — search results: 906 matches found.
  • Authentication Methods

    Authentication Methods · Changing the Authentication Method · Developer LDAP Testing · SSO Login via Casdoor: Step 1. Deploy Casdoor; Step 2. Configure Casdoor; Step 3. Configure DolphinScheduler · Login via OAuth2 Authorization: Step 1. Obtain OAuth2 client credentials; Step 2. Enable OAuth2 login in the API configuration file; Step 3. Use OAuth2...
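The OAuth2 steps listed in the entry above (obtain client credentials, then enable OAuth2 login in the API configuration file) can be sketched as a config fragment. The key names and structure below are illustrative assumptions, not the verbatim DolphinScheduler schema; consult the linked page for the exact property names in your version.

```yaml
# Hypothetical sketch of enabling OAuth2 login in the DolphinScheduler API
# configuration (step 2 above). All key names here are assumptions for
# illustration; check the official docs for the real schema of your release.
security:
  oauth2:
    enable: true
    provider:
      github:                          # provider name is only an example
        authorization-uri: https://github.com/login/oauth/authorize
        redirect-uri: http://localhost:12345/api/oauth2/redirect
        client-id: <your-client-id>        # obtained in step 1
        client-secret: <your-client-secret>
```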
  • Apache Hudi Stack

    8163 2024-06-28 《Apache Hudi 0.15.0》
    Lake Storage File Formats Transactional Database Layer Table Format Indexes Table Services Clustering Compaction Cleaning Indexing Concurrency Control Lake Cache* Metase...
  • Exploring Data in Superset

    8145 2024-05-25 《Apache Superset 4.0.1》
    Enabling Data Upload Functionality · Loading CSV Data · Table Visualization · Dashboard Basics · Pivot Table · Line Chart · Markup · Publishing Your Dashboard · Ann...
  • siteConfig.js

    8143 2024-06-03 《Docusaurus 1.14.7》
    User Showcase siteConfig Fields Mandatory Fields baseUrl [string] colors [object] copyright [string] favicon [string] headerIcon [string] headerLinks [array] organizat...
  • Deployment Parameter Analysis

    DolphinScheduler directory and configuration file walkthrough: bin · conf · common directory · config directory · alert.properties · application-api.properties · application-dao.properties · master.properties · worker.properties · Zookeeper.p...
  • FAQs

    Why should I install a computing engine like Spark or Flink? I have a question, and I cannot solve it by myself How do I declare a variable? How do I write a configuration item ...
  • Platform Deployment

    Environmental requirements · Hadoop · Kubernetes · Build & Deploy · Environmental requirements · Install StreamPark · Initialize table structure · Modify the configurat...
  • Cluster Deployment (Cluster)

    1.1: Install base software (required components must be installed by you) · 1.2: Download the backend tar.gz package · 1.3: Create the deployment user and hosts mapping · 1.4: Configure the hosts mapping, set up passwordless SSH, and adjust directory permissions · 1.5: Initialize the database · 1.6: Modify the runtime parameters · 1.7: Run the install.sh deployment script · 1.8: Log in to the system · Before 1.2.1, DolphinSchedule...
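The runtime parameters edited in step 1.6 above (before running install.sh in step 1.7) can be sketched as a properties fragment. The property names below follow the general shape of older DolphinScheduler install configuration files but are illustrative assumptions; use the parameter file shipped with your release.

```properties
# Hypothetical sketch of the runtime parameters adjusted in step 1.6.
# Names and values are assumptions for illustration only.
dbtype=mysql
dbhost=192.168.0.10:3306          # database initialized in step 1.5
username=dolphinscheduler
installPath=/opt/dolphinscheduler
deployUser=dolphinscheduler       # deployment user created in step 1.3
ips=ds1,ds2,ds3                   # hosts mapped in step 1.4
masters=ds1
workers=ds2,ds3
```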
  • Using Spark

    7898 2024-06-28 《Apache Hudi 0.15.0》
    Hudi Streamer Options Using hudi-utilities bundle jars Concurrency Control Checkpointing Transformers SQL Query Transformer SQL File Transformer Flattening Transformer Chai...
  • Apache Kafka Connector

    Dependencies Kafka Source (Consumer) example Advanced configuration parameters Consume multiple Kafka instances Consume multiple topics Topic dynamic discovery Consume from t...