Registration And Configuration

Register An Account And Log In

For a Microsoft Azure account, please contact your organization, or register an account as an individual. For details, please refer to the Microsoft Azure official website.

Create Storage Container

After logging in with your Microsoft Azure account, please follow the steps below to create a data storage container:

Image 1: ../../_images/azure_create_new_container.png
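If you prefer the command line, the container can also be created with the Azure CLI. This is only a sketch, assuming the CLI is installed, you have run az login, and your account has the required role; the angle-bracket placeholders are hypothetical and should match your own account and container names:

  az storage container create \
    --account-name <YOUR_AZURE_ACCOUNT> \
    --name <YOUR_CONTAINER_NAME> \
    --auth-mode login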

Get Access Key

Image 2: ../../_images/azure_create_azure_access_key.png
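The access key can also be fetched with the Azure CLI instead of the portal. A sketch, assuming the same login as above; <YOUR_RESOURCE_GROUP> and <YOUR_AZURE_ACCOUNT> are placeholders for your own resource group and storage account:

  az storage account keys list \
    --resource-group <YOUR_RESOURCE_GROUP> \
    --account-name <YOUR_AZURE_ACCOUNT> \
    --query "[0].value" \
    --output tsv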

Deploy Spark

Download Spark Package

Download a Spark package that matches your environment from the Spark official website, and then unpack it:

  tar -xzvf spark-3.2.0-bin-hadoop3.2.tgz
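If you prefer to fetch the package from the command line, all released versions are kept in the Apache archive; a sketch for the 3.2.0 / Hadoop 3.2 build used above (adjust the version to your environment):

  wget https://archive.apache.org/dist/spark/spark-3.2.0/spark-3.2.0-bin-hadoop3.2.tgz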

Config Spark

Enter the $SPARK_HOME/conf directory and execute:

  cp spark-defaults.conf.template spark-defaults.conf

Add the following configuration to spark-defaults.conf, adjusting the values to your own environment:

  spark.master                     spark://<YOUR_HOST>:7077
  spark.sql.extensions             io.delta.sql.DeltaSparkSessionExtension
  spark.sql.catalog.spark_catalog  org.apache.spark.sql.delta.catalog.DeltaCatalog

Create a new file named core-site.xml under the $SPARK_HOME/conf directory, and add the following configuration:

  <?xml version="1.0" encoding="UTF-8"?>
  <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
  <configuration>
    <property>
      <name>fs.AbstractFileSystem.wasb.impl</name>
      <value>org.apache.hadoop.fs.azure.Wasb</value>
    </property>
    <property>
      <name>fs.azure.account.key.YOUR_AZURE_ACCOUNT.blob.core.windows.net</name>
      <value>YOUR_AZURE_ACCOUNT_ACCESS_KEY</value>
    </property>
    <property>
      <name>fs.azure.block.blob.with.compaction.dir</name>
      <value>/hbase/WALs,/tmp/myblobfiles</value>
    </property>
    <property>
      <name>fs.azure</name>
      <value>org.apache.hadoop.fs.azure.NativeAzureFileSystem</value>
    </property>
    <property>
      <name>fs.azure.enable.append.support</name>
      <value>true</value>
    </property>
  </configuration>

Copy Dependencies To Spark

Download the jar packages required by Delta Lake and Microsoft Azure into the ./spark/jars directory:

  wget https://repo1.maven.org/maven2/io/delta/delta-core_2.12/1.0.0/delta-core_2.12-1.0.0.jar -O ./spark/jars/delta-core_2.12-1.0.0.jar
  wget https://repo1.maven.org/maven2/com/microsoft/azure/azure-storage/8.6.6/azure-storage-8.6.6.jar -O ./spark/jars/azure-storage-8.6.6.jar
  wget https://repo1.maven.org/maven2/com/azure/azure-storage-blob/12.14.2/azure-storage-blob-12.14.2.jar -O ./spark/jars/azure-storage-blob-12.14.2.jar
  wget https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-azure/3.1.1/hadoop-azure-3.1.1.jar -O ./spark/jars/hadoop-azure-3.1.1.jar

Start Spark Standalone Cluster

  ./spark/sbin/start-master.sh -h <YOUR_HOST> -p 7077 --webui-port 9090
  ./spark/sbin/start-worker.sh spark://<YOUR_HOST>:7077
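As a quick sanity check before moving on, you can confirm that the master and worker daemons are up; a sketch assuming the JDK tools are on your PATH and the web UI port chosen above:

  # the Master and Worker JVMs should both be listed
  jps
  # the master web UI should answer with HTTP 200 on the chosen port
  curl -s -o /dev/null -w '%{http_code}\n' http://<YOUR_HOST>:9090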

Test The Connectivity Of Spark And Delta Lake

Start the Spark shell:

  ./bin/spark-shell

Generate some data and write it to Delta Lake:

  scala> val data = spark.range(1000, 2000)
  scala> data.write.format("delta").mode("overwrite").save("wasbs://<YOUR_CONTAINER_NAME>@<YOUR_AZURE_ACCOUNT>.blob.core.windows.net/<YOUR_TABLE_NAME>")

After this, you can check your data in the Azure web UI. For example, my container name is 1000 and the table name is alexDemo20211127:

Image 3: ../../_images/azure_spark_connection_test_storage.png

You can also check the data by reading it back from Delta Lake:

  scala> val df = spark.read.format("delta").load("wasbs://<YOUR_CONTAINER_NAME>@<YOUR_AZURE_ACCOUNT>.blob.core.windows.net/<YOUR_TABLE_NAME>")
  scala> df.show()
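Besides the web UI and the read-back above, you can also list the blobs written by Delta Lake with the Azure CLI (a sketch, using the access key obtained earlier and the same placeholders as above); a healthy table has a _delta_log/ directory next to its parquet data files:

  az storage blob list \
    --account-name <YOUR_AZURE_ACCOUNT> \
    --account-key <YOUR_AZURE_ACCOUNT_ACCESS_KEY> \
    --container-name <YOUR_CONTAINER_NAME> \
    --prefix <YOUR_TABLE_NAME>/ \
    --output table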

If the steps above work without problems, Spark has been successfully integrated with Delta Lake.

Deploy Kyuubi

Install Kyuubi

1. Download the latest version of Kyuubi from the Kyuubi download page.

2. Unpack it:

  tar -xzvf apache-kyuubi-1.9.1-bin.tgz

Config Kyuubi

Enter the ./kyuubi/conf directory:

  cp kyuubi-defaults.conf.template kyuubi-defaults.conf
  vim kyuubi-defaults.conf

Add the following content:

  spark.master               spark://<YOUR_HOST>:7077
  kyuubi.authentication      NONE
  kyuubi.frontend.bind.host  <YOUR_HOST>
  kyuubi.frontend.bind.port  10009
  # If you use your own ZooKeeper cluster, you need to configure your ZooKeeper host and port.
  # kyuubi.ha.addresses      <YOUR_HOST>:2181

Start Kyuubi

  bin/kyuubi start

Check the Kyuubi log to verify that Kyuubi started successfully and to find the JDBC connection URL:

  2021-11-26 17:49:50.235 INFO service.ThriftFrontendService: Starting and exposing JDBC connection at: jdbc:hive2://HOST:10009/
  2021-11-26 17:49:50.265 INFO client.ServiceDiscovery: Created a /kyuubi/serviceUri=host:10009;version=1.3.1-incubating;sequence=0000000037 on ZooKeeper for KyuubiServer uri: host:10009
  2021-11-26 17:49:50.267 INFO server.KyuubiServer: Service[KyuubiServer] is started.

You can get the JDBC connection URL from the log above.

Test The Connectivity Of Kyuubi And Delta Lake

Use the $KYUUBI_HOME/bin/beeline tool:

  ./bin/beeline -u 'jdbc:hive2://<YOUR_HOST>:10009/'
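Once connected, any trivial statement is enough to confirm that the Kyuubi server can launch a Spark engine and return results; for example, beeline's -e option runs a single statement and exits:

  ./bin/beeline -u 'jdbc:hive2://<YOUR_HOST>:10009/' -e 'SELECT 1'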

At the same time, you can also check whether the engine is running on the Spark UI:

Image 4: ../../_images/kyuubi_start_status_spark_UI.png

When the engine starts, it exposes a thrift endpoint and registers itself in ZooKeeper. The Kyuubi server gets the connection info from ZooKeeper and establishes the connection to the engine, so you can check the registration details under the ZooKeeper path '/kyuubi_USER/anonymous', as sketched below.
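For example, the registration can be inspected with the ZooKeeper CLI. A sketch, assuming a ZooKeeper installation is available and reachable at <YOUR_HOST>:2181; the exact znode name depends on your user and engine share level:

  # from a ZooKeeper installation
  ./bin/zkCli.sh -server <YOUR_HOST>:2181
  # then, inside the zkCli prompt:
  ls /kyuubi_USER/anonymous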

Dealing With Delta Lake Data Using Kyuubi

Operate on Delta Lake data through SQL:

Create Table

  -- Create or replace table with path
  CREATE OR REPLACE TABLE delta.`wasbs://1000@azure_account.blob.core.windows.net/alexDemo20211129` (
    date DATE,
    eventId STRING,
    eventType STRING,
    data STRING)
  USING DELTA
  PARTITIONED BY (date);

Insert Data

Append Mode

  INSERT INTO delta.`wasbs://1000@azure_account.blob.core.windows.net/alexDemo20211129` (
    date,
    eventId,
    eventType,
    data)
  VALUES
    (now(),'001','test','Hello World!'),
    (now(),'002','test','Hello World!'),
    (now(),'003','test','Hello World!');

Result:

  +-------------+----------+------------+---------------+
  |    date     | eventId  | eventType  |     data      |
  +-------------+----------+------------+---------------+
  | 2021-11-29  | 001      | test       | Hello World!  |
  | 2021-11-29  | 003      | test       | Hello World!  |
  | 2021-11-29  | 002      | test       | Hello World!  |
  +-------------+----------+------------+---------------+

Overwrite Mode

  INSERT OVERWRITE TABLE delta.`wasbs://1000@azure_account.blob.core.windows.net/alexDemo20211129` (
    date,
    eventId,
    eventType,
    data)
  VALUES
    (now(),'001','test','hello kyuubi'),
    (now(),'002','test','hello kyuubi');

Result:

  +-------------+----------+------------+---------------+
  |    date     | eventId  | eventType  |     data      |
  +-------------+----------+------------+---------------+
  | 2021-11-29  | 002      | test       | hello kyuubi  |
  | 2021-11-29  | 001      | test       | hello kyuubi  |
  +-------------+----------+------------+---------------+

Delete Table Data

  DELETE FROM
    delta.`wasbs://1000@azure_account.blob.core.windows.net/alexDemo20211129`
  WHERE eventId = 002;

Result:

  +-------------+----------+------------+---------------+
  |    date     | eventId  | eventType  |     data      |
  +-------------+----------+------------+---------------+
  | 2021-11-29  | 001      | test       | hello kyuubi  |
  +-------------+----------+------------+---------------+

Update Table Data

  UPDATE
    delta.`wasbs://1000@azure_account.blob.core.windows.net/alexDemo20211129`
  SET data = 'This is a test for update data.'
  WHERE eventId = 001;

Result:

  +-------------+----------+------------+----------------------------------+
  |    date     | eventId  | eventType  |               data               |
  +-------------+----------+------------+----------------------------------+
  | 2021-11-29  | 001      | test       | This is a test for update data.  |
  +-------------+----------+------------+----------------------------------+

Select Table Data

  SELECT *
  FROM
    delta.`wasbs://1000@azure_account.blob.core.windows.net/alexDemo20211129`;

Result:

  +-------------+----------+------------+----------------------------------+
  |    date     | eventId  | eventType  |               data               |
  +-------------+----------+------------+----------------------------------+
  | 2021-11-29  | 001      | test       | This is a test for update data.  |
  +-------------+----------+------------+----------------------------------+