Servlets are used in many places in Hadoop, with Jetty serving as the servlet container that provides HTTP services. This is implemented mainly through the org.apache.hadoop.http.HttpServer class, which is a thin wrapper around Jetty; calling its addServlet method registers a servlet with Jetty:
public void addServlet(String name, String pathSpec,
    Class<? extends HttpServlet> clazz) {
  // name, URL access path, handler class
  addInternalServlet(name, pathSpec, clazz, false);
  addFilterPathMapping(pathSpec, webAppContext);
}
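The core idea here, mapping a URL path to a handler object on an embedded HTTP server, can be sketched with the JDK's built-in com.sun.net.httpserver package. This is a stand-in to illustrate the registration pattern, not Jetty and not Hadoop's HttpServer; the class and method names below are invented for the example.

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class PathMappingSketch {
  // Start a server on an ephemeral port and mount a fixed-body handler
  // at pathSpec -- loosely analogous to HttpServer.addServlet mapping
  // a path spec to a servlet class.
  public static HttpServer serve(String pathSpec, String body) throws Exception {
    HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
    server.createContext(pathSpec, (HttpExchange ex) -> {
      byte[] bytes = body.getBytes("UTF-8");
      ex.sendResponseHeaders(200, bytes.length);
      try (OutputStream os = ex.getResponseBody()) {
        os.write(bytes);
      }
    });
    server.start();
    return server;
  }

  public static void main(String[] args) throws Exception {
    // Mount a placeholder at /stacks and fetch it back over HTTP.
    HttpServer server = serve("/stacks", "stack dump placeholder");
    int port = server.getAddress().getPort();
    java.net.URL url = new java.net.URL("http://localhost:" + port + "/stacks");
    try (java.util.Scanner sc = new java.util.Scanner(url.openStream(), "UTF-8")) {
      System.out.println(sc.nextLine());
    }
    server.stop(0);
  }
}
```

The same shape appears in Hadoop: a component asks the shared server to bind a path to a handler, and the container takes care of dispatching requests.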
By default, the HttpServer constructor calls addDefaultServlets to register the standard set of servlets:
public HttpServer(String name, String bindAddress, int port,
    boolean findPort, Configuration conf, AccessControlList adminsAcl,
    Connector connector, String[] pathSpecs) throws IOException {
  ...
  webAppContext = new WebAppContext();
  webAppContext.setDisplayName(name);
  webAppContext.setContextPath("/");
  webAppContext.setWar(appDir + "/" + name);
  ...
  addDefaultServlets();
  ...
addDefaultServlets defines the servlets that are loaded by default:
protected void addDefaultServlets() {
  // set up default servlets
  addServlet("stacks", "/stacks", StackServlet.class);
  addServlet("logLevel", "/logLevel", LogLevel.Servlet.class);
  addServlet("metrics", "/metrics", MetricsServlet.class);
  addServlet("jmx", "/jmx", JMXJsonServlet.class);
  addServlet("conf", "/conf", ConfServlet.class);
}
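To make the /stacks endpoint concrete: its servlet writes out the stack traces of all live threads. A simplified sketch of that kind of dump, using only Thread.getAllStackTraces from the JDK (the class name is invented; this is not Hadoop's StackServlet implementation):

```java
import java.util.Map;

public class StackDumpSketch {
  // Build a text dump of every live thread's stack, similar in spirit
  // to what the /stacks servlet returns (simplified sketch).
  public static String dumpStacks() {
    StringBuilder sb = new StringBuilder();
    for (Map.Entry<Thread, StackTraceElement[]> e
        : Thread.getAllStackTraces().entrySet()) {
      sb.append("Thread: ").append(e.getKey().getName()).append('\n');
      for (StackTraceElement frame : e.getValue()) {
        sb.append("    at ").append(frame).append('\n');
      }
    }
    return sb.toString();
  }

  public static void main(String[] args) {
    System.out.println(dumpStacks());
  }
}
```

The other default endpoints follow the same pattern: each servlet renders one slice of internal state (log levels, metrics, JMX beans, effective configuration) as an HTTP response.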
Hadoop uses the HttpServer class in many places. For example, in the org.apache.hadoop.hdfs.server.datanode.DataNode class, the call chain is:
DataNode constructor ---> startDataNode ---> initDataXceiver + startInfoServer
private HttpServer infoServer = null;
...
private void startInfoServer(Configuration conf) throws IOException {
  // create a servlet to serve full-file content
  InetSocketAddress infoSocAddr = DataNode.getInfoAddr(conf);
  String infoHost = infoSocAddr.getHostName();
  int tmpInfoPort = infoSocAddr.getPort();
  this.infoServer = (secureResources == null)
      ? new HttpServer("datanode", infoHost, tmpInfoPort, tmpInfoPort == 0,
          conf, new AccessControlList(conf.get(DFS_ADMIN, " ")))
      : new HttpServer("datanode", infoHost, tmpInfoPort, tmpInfoPort == 0,
          conf, new AccessControlList(conf.get(DFS_ADMIN, " ")),
          secureResources.getListener());
  LOG.info("Opened info server at " + infoHost + ":" + tmpInfoPort);
  ...
  this.infoServer.addInternalServlet(null, "/streamFile/*", StreamFile.class);
  // add DataNode-specific servlets
  this.infoServer.addInternalServlet(null, "/getFileChecksum/*",
      FileChecksumServlets.GetServlet.class);
  this.infoServer.setAttribute("datanode", this);
  this.infoServer.setAttribute(JspHelper.CURRENT_CONF, conf);
  this.infoServer.addServlet(null, "/blockScannerReport",
      DataBlockScanner.Servlet.class);
  if (WebHdfsFileSystem.isEnabled(conf, LOG)) {
    infoServer.addJerseyResourcePackage(DatanodeWebHdfsMethods.class
        .getPackage().getName() + ";" + Param.class.getPackage().getName(),
        WebHdfsFileSystem.PATH_PREFIX + "/*");
  }
  this.infoServer.start();
}
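The getInfoAddr step above reads the DataNode's configured HTTP address (a "host:port" string) and turns it into an InetSocketAddress. A minimal sketch of that parsing, under the assumption that the value is a plain host:port pair (the class and method names are invented for illustration; Hadoop does this through its NetUtils helpers, which also handle defaults and schemes):

```java
import java.net.InetSocketAddress;

public class InfoAddrSketch {
  // Parse a "host:port" string into an unresolved InetSocketAddress --
  // a simplified version of how a configured HTTP address is interpreted
  // (no default-port or scheme handling).
  public static InetSocketAddress parse(String hostPort) {
    int colon = hostPort.lastIndexOf(':');
    if (colon < 0) {
      throw new IllegalArgumentException("expected host:port, got " + hostPort);
    }
    String host = hostPort.substring(0, colon);
    int port = Integer.parseInt(hostPort.substring(colon + 1));
    return InetSocketAddress.createUnresolved(host, port);
  }

  public static void main(String[] args) {
    // 0.0.0.0:50075 is the default dfs.datanode.http.address in this era of Hadoop.
    InetSocketAddress addr = parse("0.0.0.0:50075");
    System.out.println(addr.getHostName() + " " + addr.getPort());
  }
}
```

With tmpInfoPort == 0 the server asks for an ephemeral port, which is why the constructor is passed the findPort flag in that case.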
The summary is as follows:
1) HttpServer is a thin wrapper around Jetty.
2) Every major Hadoop component embeds an HttpServer: DataNode, NameNode, ResourceManager, and so on. Its main purpose is to display internal status and to support the operation and management of Hadoop.
3) HttpServer's addDefaultServlets method registers a number of common servlets (such as the servlet that changes the log level), and each component additionally registers servlets of its own.
This article is from the "Food and Light Blog" blog, please make sure to keep this source http://caiguangguang.blog.51cto.com/1652935/1592799
Application of Hadoop Jetty