After the Hadoop cluster is set up, it can be accessed from a local machine through the Java API. The example below connects to HDFS and prints the hostname of every DataNode in the cluster:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

import java.io.IOException;
import java.net.URI;

public class AccessHdfs {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Connect to the NameNode; replace "your-ip" and the path with your own values.
        FileSystem fs = FileSystem.get(URI.create("hdfs://your-ip:9000/file-path"), conf);
        DistributedFileSystem dfs = (DistributedFileSystem) fs;
        // Ask the NameNode for a report on all DataNodes in the cluster.
        DatanodeInfo[] datanodeStats = dfs.getDataNodeStats();
        for (int i = 0; i < datanodeStats.length; i++) {
            System.out.println("DataNode_" + i + "_Node: " + datanodeStats[i].getHostName());
        }
    }
}
However, when the program runs as the local user Administrator, the following "Access denied for user Administrator. Superuser privilege is required" error message appears:
Exception in thread "main" org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Access denied for user Administrator. Superuser privilege is required
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkSuperuserPrivilege(FSPermissionChecker.java:)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkSuperuserPrivilege(FSNamesystem.java:4484)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.datanodeReport(FSNamesystem.java:4137)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDatanodeReport(NameNodeRpcServer.java:1151)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDatanodeReport(ClientNamenodeProtocolServerSideTranslatorPB.java:728)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:503)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:868)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:814)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1886)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2603)
    at org.apache.hadoop.ipc.Client.call(Client.java:1470)
    at org.apache.hadoop.ipc.Client.call(Client.java:1401)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy9.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDatanodeReport(ClientNamenodeProtocolTranslatorPB.java:607)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy10.getDatanodeReport(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.datanodeReport(DFSClient.java:2390)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getDataNodeStats(DistributedFileSystem.java:1009)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getDataNodeStats(DistributedFileSystem.java:1003)
    at com.fhpt.AccessHdfs.main(AccessHdfs.java:28)
The reason for this error is that the Administrator user has no superuser privilege on HDFS. You can either follow http://blog.sina.com.cn/s/blog_e699b42b0102xfnd.html or use the following simpler method.
Add the following line to your code before the FileSystem is obtained, where "root" is a user name with access rights on the Hadoop cluster:

System.setProperty("HADOOP_USER_NAME", "root");
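Putting it together, here is a minimal sketch of the fixed program (the hdfs://your-ip:9000 address is still a placeholder). The property must be set before the first FileSystem.get() call, because the Hadoop client resolves the current user when the connection is first established:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

import java.io.IOException;
import java.net.URI;

public class AccessHdfs {
    public static void main(String[] args) throws IOException {
        // Act as "root" on HDFS instead of the local OS user (e.g. Administrator).
        // This must run before FileSystem.get(), which fixes the client-side user.
        System.setProperty("HADOOP_USER_NAME", "root");

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://your-ip:9000/file-path"), conf);
        DistributedFileSystem dfs = (DistributedFileSystem) fs;
        for (DatanodeInfo node : dfs.getDataNodeStats()) {
            System.out.println(node.getHostName());
        }
    }
}

Alternatively, the same effect can be achieved without changing the code by setting HADOOP_USER_NAME as an environment variable (or passing -DHADOOP_USER_NAME=root as a JVM option), since the Hadoop client checks both the environment and the system properties when it resolves the login user. Note that this override only applies when Kerberos security is not enabled on the cluster.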