Java HDFS API Client: Connecting to an HA Cluster


When Hadoop is running with NameNode HA enabled, a Java client connecting to HDFS cannot point at a single NameNode host. Instead it must address the nameservice and supply a few extra configuration parameters so the client can resolve the NameNode pair and fail over to the active one.

package cn.itacst.hadoop.hdfs;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HDFS_HA {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Address the nameservice, not an individual NameNode host
        conf.set("fs.defaultFS", "hdfs://ns1");
        conf.set("dfs.nameservices", "ns1");
        conf.set("dfs.ha.namenodes.ns1", "nn1,nn2");
        conf.set("dfs.namenode.rpc-address.ns1.nn1", "itcast01:9000");
        conf.set("dfs.namenode.rpc-address.ns1.nn2", "itcast02:9000");
        // Proxy provider that lets the client discover and fail over to the active NameNode
        conf.set("dfs.client.failover.proxy.provider.ns1",
                "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");

        FileSystem fs = FileSystem.get(new URI("hdfs://ns1"), conf, "hadoop");
        FileStatus[] list = fs.listStatus(new Path("/"));
        for (FileStatus fileStatus : list) {
            System.out.println(fileStatus.toString());
        }
        fs.close();
    }
}
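Equivalently, the same properties can live in an hdfs-site.xml on the client's classpath (with fs.defaultFS in core-site.xml), so the Java code can drop the conf.set(...) calls and just do FileSystem.get(conf). A sketch of that file, reusing the nameservice and hostnames from the example above:

```xml
<!-- hdfs-site.xml on the client classpath; ns1, itcast01/itcast02 match the example above -->
<configuration>
  <property>
    <name>dfs.nameservices</name>
    <value>ns1</value>
  </property>
  <property>
    <name>dfs.ha.namenodes.ns1</name>
    <value>nn1,nn2</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.ns1.nn1</name>
    <value>itcast01:9000</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.ns1.nn2</name>
    <value>itcast02:9000</value>
  </property>
  <property>
    <name>dfs.client.failover.proxy.provider.ns1</name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
  </property>
</configuration>
```

This keeps cluster topology out of the code, which is usually preferable when the same client runs against several environments.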

