The previous article covered creating a database, creating a table, and inserting data, but it did not cover delete and update, so this article fills that gap. The code is simple and shown below:
```java
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class DeltaLakeWithSparkSql2 {
    public static void main(String[] args) {
        // Create a local SparkSession with the Delta Lake SQL extension and Delta catalog enabled
        SparkSession spark = SparkSession.builder()
                .master("local[*]")
                .appName("delta_lake")
                .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
                .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
                .getOrCreate();

        // Read the YellowTaxi Parquet files and write them out as a Delta table
        String sourcePath = "D:\\bookcode\\delta-lake-up-and-running-main\\data\\YellowTaxi\\";
        var df = spark.read().format("parquet").load(sourcePath);
        System.out.println("Total row count: " + df.count());
        df.write().format("delta").mode(SaveMode.Overwrite).save("file:///D:\\\\bigdata\\\\detla-lake-with-java\\\\YellowTaxi");

        // Register the Delta table under the taxidb database and run a few sanity checks
        spark.sql("CREATE DATABASE IF NOT EXISTS taxidb");
        spark.sql("CREATE TABLE IF NOT EXISTS taxidb.YellowTaxi USING DELTA LOCATION 'file:///D:\\\\bigdata\\\\detla-lake-with-java\\\\YellowTaxi'");
        spark.sql("DESCRIBE TABLE taxidb.YellowTaxi").show(false);
        spark.sql("SELECT COUNT(*) from taxidb.YellowTaxi WHERE VendorID>0").show(false);

        // DELETE: remove the rows with this pickup time, then query again to confirm they are gone
        spark.sql("SELECT * from taxidb.YellowTaxi WHERE tpep_pickup_datetime='2021-01-01 00:30:10'").show(false);
        spark.sql("DELETE FROM taxidb.YellowTaxi WHERE tpep_pickup_datetime='2021-01-01 00:30:10'").show(false);
        spark.sql("SELECT * from taxidb.YellowTaxi WHERE tpep_pickup_datetime='2021-01-01 00:30:10'").show(false);
        spark.sql("DESCRIBE HISTORY taxidb.YellowTaxi").show(false);

        // UPDATE: set passenger_count=99 for the matching rows; INPUT_FILE_NAME() shows which
        // underlying Parquet file each row comes from before and after the rewrite
        spark.sql("SELECT INPUT_FILE_NAME(), * from taxidb.YellowTaxi WHERE tpep_pickup_datetime='2022-01-01 00:35:40'").show(false);
        spark.sql("UPDATE taxidb.YellowTaxi SET passenger_count=99 WHERE tpep_pickup_datetime='2022-01-01 00:35:40'").show(false);
        spark.sql("SELECT INPUT_FILE_NAME(), * from taxidb.YellowTaxi WHERE tpep_pickup_datetime='2022-01-01 00:35:40'").show(false);
        spark.sql("DESCRIBE HISTORY taxidb.YellowTaxi").show(false);
    }
}
```
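For reference, the same DELETE and UPDATE can also be issued through the programmatic DeltaTable API instead of Spark SQL. The sketch below is a minimal illustration, assuming the Delta Lake dependency is on the classpath and reusing the taxidb.YellowTaxi table registered above; it is not part of the original run.

```java
import io.delta.tables.DeltaTable;
import org.apache.spark.sql.SparkSession;

import java.util.HashMap;
import java.util.Map;

public class DeltaLakeDeleteUpdateApi {
    public static void main(String[] args) {
        // Same session settings as before: Delta SQL extension plus the Delta catalog
        SparkSession spark = SparkSession.builder()
                .master("local[*]")
                .appName("delta_lake_api")
                .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
                .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
                .getOrCreate();

        // Look up the table registered earlier by name
        DeltaTable taxi = DeltaTable.forName(spark, "taxidb.YellowTaxi");

        // DELETE with a SQL-style predicate string
        taxi.delete("tpep_pickup_datetime = '2021-01-01 00:30:10'");

        // UPDATE: set passenger_count = 99 for the matching rows
        Map<String, String> set = new HashMap<>();
        set.put("passenger_count", "99");
        taxi.updateExpr("tpep_pickup_datetime = '2022-01-01 00:35:40'", set);

        // Each call above produces a new commit, visible in the table history
        taxi.history().show(false);
    }
}
```

Functionally this is equivalent to the SQL statements above; which style to use is mostly a matter of preference.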
The YellowTaxi dataset used here can be downloaded from the address below, which is the companion repository for Delta Lake: Up and Running.
The run output is shown below. There is nothing special to watch out for; this is simply a matter of typing in the code and verifying the results.