How Spark Reads SFTP Files via the Hadoop SFTP FileSystem

Gradle Dependencies

gradle
        implementation 'org.apache.spark:spark-sql_2.13:3.5.3'
        implementation 'org.apache.hadoop:hadoop-common:3.3.4'

        testImplementation 'org.springframework.boot:spring-boot-starter-test'
        testImplementation 'org.apache.sshd:sshd-core:2.8.0'
        testImplementation 'org.apache.sshd:sshd-sftp:2.8.0'

Set Up a Fake SFTP Server

java
        // GIVEN
        // SETUP Fake SFTP Server
        String host = "127.0.0.1";
        String user = "username";
        String passwd = "password";
        int port = 9188;

        // the fake server serves files from a local temp directory (java.nio.file.Path, not Hadoop's Path)
        java.nio.file.Path rootPath = Files.createTempDirectory("fake-sftp-root");

        SshServer sshd = SshServer.setUpDefaultServer();
        sshd.setPort(port);
        sshd.setKeyPairProvider(new SimpleGeneratorHostKeyProvider());
        sshd.setPasswordAuthenticator((username, password, session) -> user.equals(username) && passwd.equals(password));
        sshd.setSubsystemFactories(Collections.singletonList(new SftpSubsystemFactory()));
        sshd.setFileSystemFactory(new VirtualFileSystemFactory(rootPath));

        sshd.start();
        System.out.println("Fake SFTP server started at port " + port);

Generate a Test CSV File with the Hadoop SFTP FileSystem API

java
        String sftpURL = String.format("sftp://%s:%s@%s:%d", user, passwd, host, port);
        String testedCsvFile = "test.csv";
        // WHEN
        // Create a CSV file via the Hadoop FileSystem API
        Configuration conf = new Configuration();
        conf.set("fs.sftp.impl", "org.apache.hadoop.fs.sftp.SFTPFileSystem");
        conf.set("fs.defaultFS", sftpURL);

        // get a FileSystem instance for the root path URI
        Path path = new Path("/");
        FileSystem sftpFileSystem = FileSystem.get(path.toUri(), conf);
        Assertions.assertTrue(sftpFileSystem instanceof SFTPFileSystem);

        // Create the test CSV file and write pipe-delimited rows to it
        try (BufferedWriter br = new BufferedWriter(new OutputStreamWriter(sftpFileSystem.create(new Path(testedCsvFile), true)))) {
            br.write("A|B|C|D");
            br.newLine();
            br.write("1|2|3|4");
        }

        // verify the test file is listed on the SFTP server
        FileStatus[] statuses = sftpFileSystem.listStatus(new Path("/"));
        Assertions.assertEquals(1, statuses.length);
        Assertions.assertTrue(statuses[0].isFile());
        Assertions.assertEquals(testedCsvFile, statuses[0].getPath().getName());
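
Optionally, the bytes can be read back through the same FileSystem handle to confirm what actually landed on the server; this sketch only uses standard Hadoop and JDK stream APIs:

java
        // read the file back over SFTP and check both lines
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(sftpFileSystem.open(new Path(testedCsvFile))))) {
            Assertions.assertEquals("A|B|C|D", reader.readLine());
            Assertions.assertEquals("1|2|3|4", reader.readLine());
        }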

Finally, Read the Test Data from the SFTP Server

java
    // THEN
    // Read the test CSV file with Spark
    SparkConf sparkConf = new SparkConf()
            .setAppName("spark-test")
            .setMaster("local[2]")
            .set("spark.ui.enabled","false")
            .set("spark.hadoop.fs.sftp.impl","org.apache.hadoop.fs.sftp.SFTPFileSystem")
            .set("spark.hadoop.fs.defaultFS",sftpURL)
            ;
    SparkSession sparkSession = SparkSession.builder().config(sparkConf).getOrCreate();

    // read the CSV file over the SFTP connection; the relative path is resolved against fs.defaultFS
    Dataset<Row> dataset = sparkSession.read()
            .option("header", "true")
            .option("delimiter", "|")
            .csv(testedCsvFile);
    dataset.printSchema();
    dataset.show();
        
text
root
 |-- A: string (nullable = true)
 |-- B: string (nullable = true)
 |-- C: string (nullable = true)
 |-- D: string (nullable = true)

+---+---+---+---+
|  A|  B|  C|  D|
+---+---+---+---+
|  1|  2|  3|  4|
+---+---+---+---+
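
Beyond eyeballing the show() output, the parsed content can be asserted programmatically, and the Spark session and the Hadoop FileSystem handle should be closed at the end of the test (the fake server itself is stopped in the @AfterEach teardown shown earlier). A short sketch reusing the variable names from the snippets above:

java
    // assert the parsed content instead of only printing it
    Assertions.assertEquals(1L, dataset.count());
    Row first = dataset.first();
    Assertions.assertEquals("1", first.getAs("A"));
    Assertions.assertEquals("4", first.getAs("D"));

    // tear down Spark and the Hadoop FileSystem handle
    sparkSession.stop();
    sftpFileSystem.close();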