According to *Unix Network Programming*, IO models fall into five categories: blocking IO, non-blocking IO, IO multiplexing, signal-driven IO, and asynchronous IO. Under the POSIX definition there are only two: synchronous IO and asynchronous IO. An IO operation consists of two steps: initiating the IO request, and performing the actual IO. The difference between synchronous and asynchronous IO lies in the second step: if the actual read or write blocks the requesting process, it is synchronous IO, so blocking IO, non-blocking IO, IO multiplexing, and signal-driven IO are all synchronous IO. If the process is not blocked and the operating system completes the IO operation for you and then hands back the result, that is asynchronous IO. The difference between blocking and non-blocking IO lies in the first step: whether initiating the IO request blocks. If it blocks until completion, it is traditional blocking IO; if it does not block, it is non-blocking IO.
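The IO multiplexing model is what Netty's NIO transport builds on: one selector watches many channels, and only the wait itself blocks. A minimal sketch with plain java.nio (the class name MultiplexSketch is made up for illustration):

```java
import java.net.InetSocketAddress;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;

public class MultiplexSketch {
    public static void main(String[] args) throws Exception {
        // One selector can watch many channels; select() blocks, but the
        // reads/writes on ready channels are non-blocking.
        Selector selector = Selector.open();
        ServerSocketChannel server = ServerSocketChannel.open();
        server.socket().bind(new InetSocketAddress(0)); // ephemeral port for the sketch
        server.configureBlocking(false);                // required before registering
        server.register(selector, SelectionKey.OP_ACCEPT);

        // Poll once without blocking; with no pending connections, 0 keys are ready
        int ready = selector.selectNow();
        System.out.println("ready channels: " + ready);

        server.close();
        selector.close();
    }
}
```

Netty hides this selector loop behind its ChannelFactory thread pools, so the handlers below never touch it directly.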
Netty is an asynchronous, event-driven network programming framework and toolkit. It can be used to rapidly develop highly scalable, high-performance protocol servers and clients. Netty simplifies and streamlines the development of network applications, such as TCP and UDP socket servers.
Take the time service from the Netty documentation as an example: when a client connects, the server sends it the current time and ignores any messages the client sends.
Server implementation
First, define a UnixTime class representing a point in time in Unix format.
The code is as follows:

package com.netty.timeserver;

import java.util.Date;

public class UnixTime {

    private final int value;

    public UnixTime(int value) {
        this.value = value;
    }

    public int getValue() {
        return value;
    }

    @Override
    public String toString() {
        // value holds seconds since the Unix epoch; Date expects milliseconds
        return new Date(value * 1000L).toString();
    }
}
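One detail in toString() is easy to get wrong: the multiplication must use the long literal 1000L, because a current Unix timestamp times 1000 overflows a 32-bit int. A quick stdlib check (the class name is made up for illustration):

```java
public class EpochOverflowSketch {
    public static void main(String[] args) {
        int seconds = 1_500_000_000; // a plausible Unix timestamp (mid-2017)

        long correct = seconds * 1000L; // widened to long before multiplying
        long wrong = seconds * 1000;    // 32-bit multiply wraps, then widens

        System.out.println("correct: " + correct); // 1500000000000
        System.out.println("wrong:   " + wrong);   // a wrapped, incorrect value
    }
}
```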
The server's message handling class extends SimpleChannelHandler and can override several different event callbacks.
The code is as follows:

package com.netty.timeserver;

import org.jboss.netty.channel.ChannelFuture;
import org.jboss.netty.channel.ChannelFutureListener;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ChannelStateEvent;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.SimpleChannelHandler;

public class TimeServerHandler extends SimpleChannelHandler {

    // When a connection is established, send the current time
    @Override
    public void channelConnected(ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
        System.out.println("Client from: " + e.getChannel().getRemoteAddress());
        UnixTime time = new UnixTime((int) (System.currentTimeMillis() / 1000));
        ChannelFuture f = e.getChannel().write(time);
        // Close the connection once the write has completed
        f.addListener(ChannelFutureListener.CLOSE);
    }

    // Track all open channels so they can be closed when the server shuts down
    @Override
    public void channelOpen(ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
        TimeServer.allChannels.add(e.getChannel());
    }

    // Triggered when an exception occurs
    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, ExceptionEvent e) throws Exception {
        e.getCause().printStackTrace();
        e.getChannel().close();
    }
}
A TimeEncoder encodes the UnixTime POJO into bytes for sending.
The code is as follows:

package com.netty.timeserver;

import org.jboss.netty.buffer.ChannelBuffer;
import org.jboss.netty.buffer.ChannelBuffers;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.MessageEvent;
import org.jboss.netty.channel.SimpleChannelHandler;

public class TimeEncoder extends SimpleChannelHandler {

    @Override
    public void writeRequested(ChannelHandlerContext ctx, MessageEvent e) {
        UnixTime time = (UnixTime) e.getMessage();
        // The wire format is a single big-endian 32-bit integer
        ChannelBuffer buf = ChannelBuffers.buffer(4);
        buf.writeInt(time.getValue());
        Channels.write(ctx, e.getFuture(), buf);
    }
}
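writeInt emits the value big-endian (most significant byte first), so the four bytes that go on the wire can be reproduced with a plain java.nio.ByteBuffer. A sketch to illustrate the wire format, not part of the example itself:

```java
import java.nio.ByteBuffer;

public class WireFormatSketch {
    public static void main(String[] args) {
        int value = 0x12345678;

        // ByteBuffer defaults to big-endian, matching ChannelBuffer.writeInt
        ByteBuffer buf = ByteBuffer.allocate(4);
        buf.putInt(value);
        byte[] wire = buf.array();

        // Most significant byte first: 12 34 56 78
        System.out.printf("%02x %02x %02x %02x%n", wire[0], wire[1], wire[2], wire[3]);

        // Decoding reverses the process, as the client's TimeDecoder does with readInt()
        int decoded = ByteBuffer.wrap(wire).getInt();
        System.out.println(decoded == value); // true: the value round-trips
    }
}
```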
Start the server program
The code is as follows:

package com.netty.timeserver;

import java.net.InetSocketAddress;
import java.util.concurrent.Executors;

import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelFactory;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.group.ChannelGroup;
import org.jboss.netty.channel.group.ChannelGroupFuture;
import org.jboss.netty.channel.group.DefaultChannelGroup;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;

public class TimeServer {

    static final ChannelGroup allChannels = new DefaultChannelGroup("time-server");

    public static void main(String[] args) {
        ChannelFactory factory = new NioServerSocketChannelFactory(
                Executors.newCachedThreadPool(),
                Executors.newCachedThreadPool());

        ServerBootstrap bootstrap = new ServerBootstrap(factory);
        bootstrap.setPipelineFactory(new ChannelPipelineFactory() {
            public ChannelPipeline getPipeline() {
                return Channels.pipeline(new TimeEncoder(), new TimeServerHandler());
            }
        });
        bootstrap.setOption("child.tcpNoDelay", true);
        bootstrap.setOption("child.keepAlive", true);
        bootstrap.setOption("reuseAddress", true);

        Channel channel = bootstrap.bind(new InetSocketAddress(9000));
        allChannels.add(channel);

        // Block here until a shutdown command arrives, e.g.
        // waitForShutdownCommand();

        // Graceful shutdown: close every tracked channel, then release resources
        ChannelGroupFuture future = allChannels.close();
        future.awaitUninterruptibly();
        factory.releaseExternalResources();
    }
}
Client implementation
The client's message handler, also extending SimpleChannelHandler.
The code is as follows:

package com.netty.timeserver;

import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.MessageEvent;
import org.jboss.netty.channel.SimpleChannelHandler;

public class TimeClientHandler extends SimpleChannelHandler {

    @Override
    public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) {
        UnixTime m = (UnixTime) e.getMessage();
        System.out.println("RECEIVED: " + m);
        e.getChannel().close();
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, ExceptionEvent e) {
        e.getCause().printStackTrace();
        e.getChannel().close();
    }
}
A TimeDecoder parses the time sent by the server into a UnixTime object.
The code is as follows:

package com.netty.timeserver;

import org.jboss.netty.buffer.ChannelBuffer;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.handler.codec.frame.FrameDecoder;

public class TimeDecoder extends FrameDecoder {

    @Override
    protected Object decode(ChannelHandlerContext ctx, Channel channel, ChannelBuffer buffer) throws Exception {
        // TCP is a byte stream: the 4 bytes may arrive in fragments.
        // Returning null tells FrameDecoder to wait for more data.
        if (buffer.readableBytes() < 4) {
            return null;
        }
        return new UnixTime(buffer.readInt());
    }
}
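FrameDecoder's contract (return null until a full frame has been buffered) exists because TCP delivers a byte stream, not discrete messages. The same accumulate-then-decode logic can be sketched with a plain ByteBuffer; the class and method names here are made up for illustration:

```java
import java.nio.ByteBuffer;

public class FrameAccumulatorSketch {
    // Accumulates incoming fragments until a whole 4-byte frame is available
    private final ByteBuffer buffer = ByteBuffer.allocate(64);

    /** Returns the decoded int, or null if more bytes are needed. */
    public Integer feed(byte[] fragment) {
        buffer.put(fragment);
        if (buffer.position() < 4) {
            return null; // incomplete frame: wait, like FrameDecoder returning null
        }
        buffer.flip();
        int value = buffer.getInt();
        buffer.compact(); // keep any bytes beyond the first frame
        return value;
    }

    public static void main(String[] args) {
        FrameAccumulatorSketch decoder = new FrameAccumulatorSketch();
        // The 4-byte big-endian value 1000000 (0x000F4240) arrives split in two
        byte[] part1 = {0x00, 0x0F};
        byte[] part2 = {0x42, 0x40};

        System.out.println(decoder.feed(part1)); // null: only 2 of 4 bytes so far
        System.out.println(decoder.feed(part2)); // 1000000: frame complete
    }
}
```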
The client's main program.
The code is as follows:

package com.netty.timeserver;

import java.net.InetSocketAddress;
import java.util.concurrent.Executors;

import org.jboss.netty.bootstrap.ClientBootstrap;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelFactory;
import org.jboss.netty.channel.ChannelFuture;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory;

public class TimeClient {

    public static void main(String[] args) {
        String host = args[0];
        int port = Integer.parseInt(args[1]);

        ChannelFactory factory = new NioClientSocketChannelFactory(
                Executors.newCachedThreadPool(),
                Executors.newCachedThreadPool());

        ClientBootstrap bootstrap = new ClientBootstrap(factory);
        bootstrap.setPipelineFactory(new ChannelPipelineFactory() {
            public ChannelPipeline getPipeline() {
                return Channels.pipeline(new TimeDecoder(), new TimeClientHandler());
            }
        });
        bootstrap.setOption("tcpNoDelay", true);
        bootstrap.setOption("keepAlive", true);

        // Connect and wait until the connection attempt completes
        ChannelFuture future = bootstrap.connect(new InetSocketAddress(host, port));
        future.awaitUninterruptibly();
        if (!future.isSuccess()) {
            future.getCause().printStackTrace();
        }

        // Wait until the connection is closed (TimeClientHandler closes it
        // after printing the received time), then release resources
        Channel channel = future.getChannel();
        channel.getCloseFuture().awaitUninterruptibly();
        factory.releaseExternalResources();
    }
}
Mina comes from the Apache open-source foundation, Netty from the commercial open-source vendor JBoss, and Grizzly from Sun. In terms of design philosophy, Mina's is the most elegant; and since Netty's lead author is the same person behind Mina, Netty is essentially consistent with Mina in design. Grizzly's design is weaker, being little more than a thin wrapper over Java NIO. However, the Mina project homepage was slow to access, so I chose Netty.