Part 1: Hadoop RPC Basics
RPC (Remote Procedure Call) is an application of the client/server (C/S) model in distributed computing.
Like other RPC frameworks, Hadoop RPC is divided into four layers:
- Serialization layer: supports multiple serialization/deserialization frameworks.
- Function call layer: implemented with Java reflection and dynamic proxies.
- Network transport layer: socket mechanism based on TCP/IP.
- Server-side processing framework: event-driven I/O model based on the Reactor pattern.
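The function call layer above can be illustrated with plain JDK dynamic proxies, independent of Hadoop. The following is a minimal sketch; `EchoProtocol` and the hard-coded reply are hypothetical stand-ins, not Hadoop classes:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class ProxyDemo {

    // Hypothetical stand-in for an RPC protocol interface.
    interface EchoProtocol {
        String echo(String value);
    }

    // Builds a client-side stub. In a real RPC framework the
    // InvocationHandler would serialize the method name and arguments
    // into a request and send it over a socket; here we fake the
    // server's reply locally.
    static EchoProtocol makeProxy() {
        InvocationHandler handler = (proxy, method, args) ->
                // A real stub would marshal method.getName() and args here.
                "hello " + args[0];
        return (EchoProtocol) Proxy.newProxyInstance(
                EchoProtocol.class.getClassLoader(),
                new Class<?>[] { EchoProtocol.class },
                handler);
    }

    public static void main(String[] args) {
        // The caller sees an ordinary interface; the handler intercepts the call.
        System.out.println(makeProxy().echo("123"));
    }
}
```

This interception point is where an RPC client turns a local method call into a network request, which is exactly the role the proxy returned by Hadoop's `RPC.getProxy` plays.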
Hadoop RPC mainly exposes two public interfaces:
- public static ProtocolProxy getProxy / waitForProxy(...): constructs a client-side proxy object, which is used to send RPC requests to the server.
- new RPC.Builder(Configuration)....build(): constructs a server object for a protocol instance, which processes requests sent by clients.
How to Use Hadoop RPC?
In fact, building a high-performance client/server network model takes only the following four steps:
1. Define the RPC protocol
2. Implement the RPC protocol
3. Construct and start the RPC server
4. Construct the RPC client and send the request
The following code example depends on hadoop-common-<version>.jar.
1. Define the RPC protocol
All custom RPC interfaces in Hadoop must extend the VersionedProtocol interface.
package myrpc;

import org.apache.hadoop.ipc.VersionedProtocol;

/**
 * Created by ywszjut on 14-8-22.
 */
public interface ClientProtocol extends VersionedProtocol {
    public static final long versionID = 1L;

    String echo(String value);
}
2. Implement the RPC protocol
package myrpc;

import java.io.IOException;

import org.apache.hadoop.ipc.ProtocolSignature;

/**
 * Created by ywszjut on 14-8-22.
 */
public class ClientProtocolImpl implements ClientProtocol {

    @Override
    public String echo(String value) {
        return "hello " + value;
    }

    @Override
    public long getProtocolVersion(String protocol, long clientVersion) throws IOException {
        return ClientProtocol.versionID;
    }

    @Override
    public ProtocolSignature getProtocolSignature(String protocol, long clientVersion, int clientMethodsHash) throws IOException {
        return new ProtocolSignature(ClientProtocol.versionID, null);
    }
}
3. Construct and start the RPC server
package myrpc;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.RPC;
import org.apache.hadoop.ipc.RPC.Server;

/**
 * Created by ywszjut on 14-8-22.
 */
public class MyRpcServer {

    public static void main(String[] args) throws IOException {
        Server server = new RPC.Builder(new Configuration())
                .setProtocol(ClientProtocol.class)
                .setInstance(new ClientProtocolImpl())
                .setBindAddress("127.0.0.1")
                .setPort(8787)
                .setNumHandlers(5)
                .build();
        server.start();
    }
}
4. Construct a client and send a request
package myrpc;

import java.io.IOException;
import java.net.InetSocketAddress;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.RPC;

/**
 * Hello world!
 */
public class Client {

    public static void main(String[] args) throws IOException {
        ClientProtocol proxy = (ClientProtocol) RPC.getProxy(
                ClientProtocol.class,
                ClientProtocol.versionID,
                new InetSocketAddress("127.0.0.1", 8787),
                new Configuration());
        String result = proxy.echo("123");
        System.out.println(result);
    }
}
To test, start the server from step 3 first, then run the client from step 4. You will see the console print hello 123.
With this "hello world" working, we can move on to the concrete implementation of the RPC class.