Hadoop security: hftp
By default, hftp is enabled, allowing files to be accessed and downloaded from a browser. In this way any file can be read, which leaves a security risk.
The test is as follows:
The parent directory of /user/hive/warehouse/cdntest.db/selfreadonly/hosts, namely selfreadonly, is owned by zhouyang with permission 700. However, the bkjia user can still download the file by entering the following address in a browser:
http://localhost:50070/webhdfs/v1/user/hive/warehouse/cdntest.db/selfreadonly/hosts?op=OPEN&offset=0&length=1024
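For reference, the same WebHDFS OPEN request can be issued from the command line. This is a minimal sketch using curl, assuming the NameNode web port is 50070 as above; the -L flag follows the redirect that WebHDFS returns, pointing to a datanode:
[bkjia@localhost ~]$ curl -L "http://localhost:50070/webhdfs/v1/user/hive/warehouse/cdntest.db/selfreadonly/hosts?op=OPEN&offset=0&length=1024"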
Add the following configuration to hdfs-site.xml to disable webhdfs:
<property>
  <name>dfs.webhdfs.enabled</name>
  <value>false</value>
</property>
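This setting typically requires a NameNode restart to take effect. As a quick sanity check, the value picked up from the local configuration files can be read back with hdfs getconf; this is only an illustrative check, assuming a Hadoop 2.x client on the same node:
[bkjia@localhost ~]$ hdfs getconf -confKey dfs.webhdfs.enabled
false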
After webhdfs is disabled, the hftp protocol can still be used. The test is as follows:
[bkjia@localhost ~]$ hadoop fs -ls hftp://localhost:50070/user/hive/warehouse/cdntest.db/selfreadonly
ls: user=bkjia, access=READ_EXECUTE, inode="/user/hive/warehouse/cdntest.db/selfreadonly":zhouyang:cdn:drwx------
[bkjia@localhost ~]$ hadoop fs -ls hftp://localhost:50070/user/hive/warehouse/cdntest.db
Found 4 items
drwx------   - zhouyang cdn          0 hftp://localhost:50070/user/hive/warehouse/cdntest.db/selfreadonly
drwxrwxr-x   - wangjing cdn          0 hftp://localhost:50070/user/hive/warehouse/cdntest.db/testp1
drwxrwx---   - cdn                   0 hftp://localhost:50070/user/hive/warehouse/cdntest.db/testp2
drwxrwxr-x   - wangjing cdn          0 hftp://localhost:50070/user/hive/warehouse/cdntest.db/wangjing
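Because hftp is exposed as a read-only Hadoop FileSystem, any file that is readable to the caller can also be pulled down with the normal fs shell, not just listed. This is a minimal sketch of the command form; the file name part1 under the world-readable testp1 directory is hypothetical and only illustrates the syntax:
[bkjia@localhost ~]$ # part1 is a hypothetical example file under testp1
[bkjia@localhost ~]$ hadoop fs -get hftp://localhost:50070/user/hive/warehouse/cdntest.db/testp1/part1 /tmp/part1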