Introduction: Jetty is a Java-based, open-source, standards-compliant, feature-rich HTTP server and web container that can be used free of charge, including in commercial products. The Jetty project was founded in 1995, and many successful products have since been built on it, such as Apache Geronimo, JBoss, IBM Tivoli, and Cisco SESM. Jetty can be used as a traditional web server or as a dynamic content server, and it can easily be embedded in Java applications.
Introduction to Features
Ease of Use
Ease of use is a basic principle of Jetty's design, and it shows mainly in the following aspects:
1. Jetty can be configured through XML or through its API;
2. The default configuration meets most requirements;
3. Embedding Jetty into an application requires very little code.
Scalability
In Ajax-based Web 2.0 applications, each connection tends to stay open longer, so the number of threads and the amount of memory consumed grow dramatically. This raises the worry that a single component hitting a bottleneck will drag down the performance of the whole application. With Jetty, however:
1. Even under a large number of service requests, system performance stays at an acceptable level.
2. A continuation mechanism is used to handle large numbers of user requests and long-lived connections.
In addition, Jetty defines a clean set of interfaces, so that when some part of Jetty's implementation does not meet a user's needs, that part can easily be modified or replaced to adapt Jetty to the needs of a particular application.
Ease of embedding
Jetty is designed as a well-behaved component, which means it can be embedded in an application without the application having to be restructured to use it. In that sense, Jetty can also be thought of as an embeddable web server.
--------------------------------------------------------------------------------
Deploying Applications
Deploying an application to Jetty is very simple: take the WAR package of the developed application and place it under Jetty's webapps directory, then start the Jetty server with the command java -jar start.jar. Once the server is up, we can access our application. Jetty's default port is 8080, and the WAR file's name becomes the root context of the application. A typical URL therefore looks like: http://127.0.0.1:8080/sample/index.jsp.
--------------------------------------------------------------------------------
How to embed Jetty in a program
Embedding Jetty into a program is very simple, as shown in Listing 1. First we create a Server object with the port set to 8080 and add a default handler to it. Then we configure the server with the configuration file jetty.xml, and finally we call server.start() to bring the server up. As this code shows, Jetty is well suited to being embedded in an application as a component, which is one of its most important features.
Listing 1. Code Snippets
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;

import org.mortbay.jetty.Server;
import org.mortbay.jetty.handler.DefaultHandler;
import org.mortbay.xml.XmlConfiguration;
import org.xml.sax.SAXException;

public class JettyServer {
    public static void main(String[] args) {
        // Create the server, listening on port 8080, with a default handler
        Server server = new Server(8080);
        server.setHandler(new DefaultHandler());

        // Read the XML configuration file
        XmlConfiguration configuration = null;
        try {
            configuration = new XmlConfiguration(
                    new FileInputStream("C:/development/jetty/jetty-6.1.6rc0/etc/jetty.xml"));
        } catch (FileNotFoundException e1) {
            e1.printStackTrace();
        } catch (SAXException e1) {
            e1.printStackTrace();
        } catch (IOException e1) {
            e1.printStackTrace();
        }

        // Apply the configuration and start the server
        try {
            configuration.configure(server);
            server.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Let us analyze how the Jetty server starts. First, note the Server class, which in fact inherits from HttpServer. When you start the Jetty server, that is, when you run java -jar start.jar etc/jetty.xml under the Jetty root directory, the configuration file jetty.xml is passed as a run parameter. This parameter can also be a different configuration file, or several XML configuration files. The configuration file plays much the same role as struts-config.xml does when using Struts: the components the server needs at run time are declared in it. For example, the component classes required by the HttpServer configuration described in the previous section can be written in this configuration file.

When the Jetty server is started as described above, the main method of the Server class is invoked. It first constructs an instance of the Server class (in effect, an HttpServer), and during construction creates an XmlConfiguration object that reads the configuration file passed as a parameter. The server is then configured through this XmlConfiguration object. The configuration process actually uses Java's reflection mechanism: it invokes methods on the Server, passing in the parameters written in the configuration file, to add HttpListener, HttpContext, and HttpHandler instances to the server, as well as web applications (corresponding to our web apps).
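To make the reflection-based configuration more concrete, the sketch below spells out in plain Java roughly what a typical jetty.xml stanza asks XmlConfiguration to do. It is only an illustration under the Jetty 6 API used in Listing 1; the class name, context path, and WAR file name are invented for this sketch and are not taken from the article.

import org.mortbay.jetty.Server;
import org.mortbay.jetty.webapp.WebAppContext;

public class ReflectionEquivalent {
    public static void main(String[] args) throws Exception {
        // Roughly what <Configure class="org.mortbay.jetty.Server"> configures
        Server server = new Server(8080);

        // Roughly what a <New class="org.mortbay.jetty.webapp.WebAppContext">
        // element with <Set name="contextPath"> and <Set name="war"> would do
        WebAppContext webapp = new WebAppContext();
        webapp.setContextPath("/sample");
        webapp.setWar("webapps/sample.war");
        server.setHandler(webapp);

        server.start();
        server.join();
    }
}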
--------------------------------------------------------------------------------
The continuation mechanism of Jetty
To discuss Jetty's continuation mechanism, we first need to mention Ajax. Ajax is currently a very popular technique for developing web applications and an important ingredient of Web 2.0. One of the core objects of Ajax is XMLHttpRequest, which supports asynchronous requests: a client that sends a request to the server does not have to wait for the response, so the whole page is not refreshed and the user gets a better experience. When the server's response arrives, the client uses a JavaScript function to process the return value and update the values of some elements on the page. Most of the time, however, the event the client is waiting for happens only rarely, so how do we make sure that the client learns of a response as soon as the server has one? There are two ways to solve this problem: one is to have the browser request the server every few seconds to pick up changes, which we call polling; the other is for the server to maintain a long-lived connection to the browser over which data is pushed, a long-connection technique called Comet.
The main drawback of polling is easy to see: a great deal of network traffic is wasted, because most of the requests sent to the server are not useful, that is, the event the client is waiting for has not occurred. If there are many clients, this waste becomes very serious, especially for applications whose server-side state changes only rarely, such as mail programs. It also raises the demands on the server's request-processing capacity, and if the polling interval is long, the client cannot get a prompt response when the event finally does occur.
With Comet, the client and the server must maintain a long-lived connection. In general, each such connection ties up one servlet thread on the server side, so many threads exist on the server simultaneously. When the number of clients is very large, this too poses a great challenge to the server's processing capacity.
Jetty uses the Java language's non-blocking I/O facilities to handle large numbers of concurrent connections, and it provides a mechanism for dealing with long-lived connections: a feature called continuations. With the continuation mechanism, a single Jetty thread can handle multiple asynchronous client requests at the same time. Using the server-side code of a simplified chat program, we demonstrate below the difference between using and not using continuations.
Listing 2. Continuation mechanism
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

import org.mortbay.util.ajax.Continuation;
import org.mortbay.util.ajax.ContinuationSupport;

public class ChatContinuation extends HttpServlet {
    public void doPost(HttpServletRequest request, HttpServletResponse response) {
        postMessage(request, response);
    }

    private void postMessage(HttpServletRequest request, HttpServletResponse response) {
        HttpSession session = request.getSession(true);
        People people = (People) session.getAttribute(session.getId());
        if (!people.hasEvent()) {
            // Suspend until an event arrives or the 1000 ms timeout expires;
            // the thread is released to serve other requests in the meantime.
            Continuation continuation =
                    ContinuationSupport.getContinuation(request, this);
            people.setContinuation(continuation);
            continuation.suspend(1000);
        }
        people.setContinuation(null);
        people.sendEvent(response);
    }
}
Notice that we first obtain a continuation object and then suspend it for up to one second, until either the timeout expires or a call to resume() wakes it up. What needs to be emphasized is that after suspend() is called, the thread is free to handle other requests. This greatly improves the concurrency of the program and lets long-lived connections scale very well.
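The listings rely on a People helper class that the article does not show. Purely for illustration, here is one plausible sketch of it, assuming Jetty 6's org.mortbay.util.ajax.Continuation API; the addEvent() method and the payload written by sendEvent() are invented for this sketch.

import java.io.IOException;
import javax.servlet.http.HttpServletResponse;
import org.mortbay.util.ajax.Continuation;

public class People {
    private Continuation continuation;
    private boolean hasEvent;

    public synchronized void setContinuation(Continuation continuation) {
        this.continuation = continuation;
    }

    public synchronized boolean hasEvent() {
        return hasEvent;
    }

    // Hypothetical: called by whatever produces a chat message for this user;
    // records the event and wakes up the request suspended in postMessage().
    public synchronized void addEvent() {
        hasEvent = true;
        if (continuation != null) {
            continuation.resume();
        }
    }

    public synchronized void sendEvent(HttpServletResponse response) {
        try {
            response.getWriter().println("event");  // placeholder payload
        } catch (IOException e) {
            e.printStackTrace();
        }
        hasEvent = false;
    }
}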
If we don't use the continuation mechanism, then the program looks like Listing 3:
Listing 3. Without the continuation mechanism
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class Chat extends HttpServlet {
    public void doPost(HttpServletRequest request, HttpServletResponse response) {
        postMessage(request, response);
    }

    private void postMessage(HttpServletRequest request, HttpServletResponse response) {
        HttpSession session = request.getSession(true);
        People people = (People) session.getAttribute(session.getId());
        // Busy-wait for the event: the thread stays blocked here and cannot
        // serve any other request until the event arrives.
        while (!people.hasEvent()) {
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        people.setContinuation(null);
        people.sendEvent(response);
    }
}
Notice that while waiting for the event to occur, the thread is blocked until the awaited event arrives, and during that wait it cannot handle any other request. With a very large number of clients, this makes the server unable to keep up. Now let us explain how Jetty's continuation mechanism is implemented.
In order to use continuations, Jetty must be configured to handle requests with its SelectChannelConnector. This connector is built on top of the java.nio API, which allows it to keep each connection open without dedicating a thread to it. When SelectChannelConnector is used, ContinuationSupport.getContinuation() provides an instance of SelectChannelConnector.RetryContinuation (however, you must program against the Continuation interface). When suspend() is called on a RetryContinuation, it throws a special runtime exception, RetryRequest, which propagates out of the servlet and back up the filter chain, where it is finally caught by SelectChannelConnector. Instead of sending an exception response to the client, the request is held in a queue of pending continuations and the HTTP connection stays open. The thread that was serving the request is returned to the thread pool, where it can serve other requests. The suspended request stays in the pending-continuations queue until either the specified timeout expires or the resume() method is called on its continuation. When either condition is triggered, the request is submitted back to the servlet (through the filter chain). In this way the whole request is "replayed" until the RetryRequest exception is no longer thrown, after which it proceeds as normal.
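To make the first requirement concrete, an embedded server along the lines of Listing 1 can be given a SelectChannelConnector explicitly. The sketch below shows one way to do that with the Jetty 6 API; the class name and port number are illustrative.

import org.mortbay.jetty.Server;
import org.mortbay.jetty.nio.SelectChannelConnector;

public class NioServer {
    public static void main(String[] args) throws Exception {
        Server server = new Server();

        // The NIO-based connector that the continuation mechanism depends on
        SelectChannelConnector connector = new SelectChannelConnector();
        connector.setPort(8080);
        server.addConnector(connector);

        // ... register servlets / web applications here ...

        server.start();
        server.join();
    }
}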
--------------------------------------------------------------------------------
Security of Jetty
To prevent just anyone from shutting down a running Jetty server, we can specify parameters when starting it so that a password must be supplied in order to stop it. The command to start the Jetty server then looks as follows:
java -DSTOP.PORT=8079 -DSTOP.KEY=mypassword -jar start.jar
In this way, users must provide the password "mypassword" in order to stop the Jetty server.
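Correspondingly, the running server can then typically be stopped with a matching command along the following lines (start.jar's --stop option; the port and key must match those given at startup):

java -DSTOP.PORT=8079 -DSTOP.KEY=mypassword -jar start.jar --stop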
--------------------------------------------------------------------------------
Summary
Jetty is a very handy web server. It is characterized by being very small and easy to embed in our applications, and it is specifically optimized for the Ajax techniques of Web 2.0, which allows our Ajax-enabled applications to perform better.