Java Internet Interview: Message Queue Application Scenarios

Source: Internet
Author: User
Tags: message queue
1 Asynchronous Processing

Scenario description: after a user registers, the system needs to send a registration email and a registration SMS. The traditional approach comes in two flavors: (1) serial mode; (2) parallel mode.

(1) Serial mode: after the registration information is successfully written to the database, send the registration email, then send the registration SMS. Only after all three tasks are complete does the system return to the client.

(2) Parallel mode: after the registration information is successfully written to the database, send the registration email and the registration SMS at the same time. Once all three tasks are complete, return to the client. The difference from serial mode is that parallel mode reduces the overall processing time.

Assuming each of the three business steps takes 50 milliseconds and ignoring other costs such as the network, serial mode takes 150 milliseconds and parallel mode takes about 100 milliseconds.

Because the number of requests a CPU can process per unit time is fixed, assume the CPU can handle 100 requests per second. In serial mode it then handles about 7 requests per second (1000/150); in parallel mode, about 10 requests per second (1000/100).

Summary: as described above, the traditional approaches hit bottlenecks in system performance (concurrency, throughput, response time). How can this be solved?

By introducing a message queue, the non-essential business logic (sending the email and the SMS) is handled asynchronously: write the registration record to the database, push the email and SMS messages onto the queue, and return to the client immediately.
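As a rough illustration of this split (not from the original article), the sketch below uses an in-process BlockingQueue to stand in for a real broker such as RabbitMQ or Kafka; all class and task names are made up:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class RegistrationService {

    // In-process stand-in for a real message broker (RabbitMQ, Kafka, ...).
    private static final BlockingQueue<String> MESSAGE_QUEUE = new LinkedBlockingQueue<>();

    // Producer side: only the database write stays on the request path (~50 ms);
    // email and SMS are enqueued and handled later by a consumer.
    public static void register(String userId) throws InterruptedException {
        writeToDatabase(userId);                    // ~50 ms, synchronous
        MESSAGE_QUEUE.put("send-email:" + userId);  // fast, fire-and-forget
        MESSAGE_QUEUE.put("send-sms:" + userId);    // fast, fire-and-forget
        // return to the client here: response time is roughly the DB write time only
    }

    private static void writeToDatabase(String userId) { /* omitted */ }

    public static void main(String[] args) throws InterruptedException {
        // Consumer side: a background worker drains the queue asynchronously.
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    String task = MESSAGE_QUEUE.take();
                    System.out.println("processing " + task); // send email / SMS here
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.setDaemon(true);
        worker.start();

        register("user-42");
        Thread.sleep(200); // give the worker a moment before the demo exits
    }
}
```

In a real deployment the worker would be a separate consumer service, so a slow mail or SMS provider never delays the registration response.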

Following the assumptions above, the user's response time is effectively the time to write the registration record to the database, about 50 milliseconds. The email and SMS messages are written to the message queue and the call returns immediately; since writing to the queue is fast, its cost can largely be ignored, so the response time stays around 50 milliseconds. As a result, the system throughput rises to about 20 QPS (1000/50) after the change: roughly 3 times the serial mode and twice the parallel mode.
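Written out, the back-of-the-envelope arithmetic behind these figures (all numbers follow from the assumed 50 ms per step) is:

```latex
\text{QPS} \approx \frac{1000\,\text{ms}}{\text{response time (ms)}}:\qquad
\text{serial } \tfrac{1000}{150} \approx 7,\qquad
\text{parallel } \tfrac{1000}{100} = 10,\qquad
\text{with a queue } \tfrac{1000}{50} = 20
```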

2 Application Decoupling

Scenario description: after a user places an order, the order system needs to notify the inventory system. The traditional approach is for the order system to call the inventory system's interface directly.

Disadvantages of the traditional approach:

1. If the inventory system cannot be accessed, the stock deduction fails and the order fails with it;

2. The order system is tightly coupled to the inventory system.

How can these problems be solved? Introduce a message queue between the two systems:

Order system: after the user places an order, the order system completes its persistence, writes the order message to the message queue, and returns success to the user. Inventory system: it subscribes to the order messages, obtains the order information via pull or push, and performs the inventory operations accordingly. Even if the inventory system is temporarily unavailable when the order is placed, ordering is not affected, because the order system only writes to the message queue and no longer cares about the subsequent steps. This decouples the order system from the inventory system.
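A minimal sketch of the decoupling idea, under the assumption of a generic broker abstraction (MessageBroker, InMemoryBroker, and the topic name "order-created" are all illustrative, not from the article):

```java
import java.util.*;
import java.util.function.Consumer;

// The order system depends only on this broker abstraction, never on the inventory system.
interface MessageBroker {
    void publish(String topic, String payload);
    void subscribe(String topic, Consumer<String> handler);
}

// Trivial in-memory broker just to make the sketch runnable; a real deployment
// would use RabbitMQ, Kafka, RocketMQ, etc.
class InMemoryBroker implements MessageBroker {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();
    public void publish(String topic, String payload) {
        subscribers.getOrDefault(topic, List.of()).forEach(h -> h.accept(payload));
    }
    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }
}

class OrderService {
    private final MessageBroker broker;
    OrderService(MessageBroker broker) { this.broker = broker; }
    void placeOrder(String orderId) {
        // 1. persist the order (omitted); 2. publish the event; 3. return success.
        broker.publish("order-created", orderId);
    }
}

class InventoryService {
    InventoryService(MessageBroker broker) {
        // The inventory system consumes order messages on its own schedule;
        // if it is temporarily down, messages simply wait in the queue.
        broker.subscribe("order-created", this::deductStock);
    }
    private void deductStock(String orderId) {
        System.out.println("deduct stock for order " + orderId);
    }
}

public class DecouplingDemo {
    public static void main(String[] args) {
        MessageBroker broker = new InMemoryBroker();
        new InventoryService(broker);
        new OrderService(broker).placeOrder("order-1001");
    }
}
```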

3 Traffic Peak Shaving

Traffic peak shaving (flow clipping) is also a common message queue scenario, generally used in flash-sale ("seckill") or group-buying activities.

Application scenario: in a flash-sale activity, traffic usually spikes far above normal levels and the application can hang or crash. To solve this problem, a message queue is generally added at the front end of the application: it caps the number of requests allowed into the activity and cushions the application against the short burst of high traffic.

After the server receives a user's request, it first writes the request to the message queue. If the queue length exceeds the configured maximum, the request is discarded directly or the user is redirected to an error page; the flash-sale business then processes the requests from the message queue at its own pace.
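A minimal sketch of the capacity check, assuming an in-process bounded queue stands in for the broker; the capacity of 1000 and all names are illustrative:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class SeckillGateway {

    // Bounded queue: its capacity is the maximum number of requests the
    // flash-sale activity will accept; 1000 is an arbitrary example value.
    private static final BlockingQueue<String> REQUESTS = new ArrayBlockingQueue<>(1000);

    // Called when a user request arrives at the front end.
    static String handleRequest(String userId) {
        // offer() fails immediately when the queue is full, so excess traffic
        // is rejected (error page) instead of overwhelming the flash-sale service.
        boolean accepted = REQUESTS.offer(userId);
        return accepted ? "queued, please wait for the result" : "sold out / error page";
    }

    public static void main(String[] args) {
        System.out.println(handleRequest("user-1"));
        // The flash-sale business consumes REQUESTS at its own pace, e.g.:
        String next = REQUESTS.poll();               // or take() in a worker thread
        System.out.println("processing " + next);
    }
}
```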

4 Log Processing

Log processing refers to using a message queue in the log pipeline, Kafka being a typical example, to handle the transmission of large volumes of log data. The simplified architecture:

Log collection client: responsible for gathering log data and periodically writing it to the Kafka queue. Kafka message queue: responsible for receiving, storing, and forwarding the log data. Log processing application: subscribes to and consumes the log data in the Kafka queue.
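For the collection client, a minimal Kafka producer sketch could look like the following; the broker address, the topic name "app-logs", and the sample log line are placeholders, not from the article:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class LogShipper {
    public static void main(String[] args) {
        // Broker address and topic name are example values for this sketch.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The collection client ships log lines; key = host, value = log line.
            producer.send(new ProducerRecord<>("app-logs", "web-01", "GET /index 200 12ms"));
        }
        // A log processing application would subscribe to "app-logs" with a KafkaConsumer
        // (or Logstash's Kafka input, as in the ELK case below) and index into Elasticsearch.
    }
}
```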

Below is Sina's Kafka log processing application case:

(1) Kafka: The message queue that receives the user log.

(2) Logstash: parses the logs and outputs them in a unified JSON format to Elasticsearch.

(3) Elasticsearch: the core technology of the real-time log analysis service; a schemaless, real-time data storage service that organizes data through indexes and offers both powerful search and statistics functions.

(4) Kibana: a data visualization component based on Elasticsearch; its powerful visualization capability is an important reason many companies choose the ELK Stack.

5 Message Communication

Message communication refers to the fact that message queues generally have efficient communication mechanisms built in, so they can also be used for pure messaging, such as implementing point-to-point message queues or chat rooms.

Point-to-point communication:

Client A and Client B use the same queue for message communication.

Chat room communication:

Client A, client B, and client N subscribe to the same topic for publishing and receiving messages, achieving a chat-room-like effect.

These are in fact the two messaging modes of a message queue: point-to-point and publish/subscribe.
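A compact toy model of the difference (in-memory only; real brokers implement the same semantics): in point-to-point mode each message is taken by exactly one receiver, while in publish/subscribe mode every subscriber of the topic receives its own copy.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;

public class MessagingModes {

    // Point-to-point: one queue, each message delivered to exactly one consumer.
    static final BlockingQueue<String> QUEUE = new LinkedBlockingQueue<>();

    // Publish/subscribe: one topic, each message delivered to every subscriber.
    static final List<Consumer<String>> TOPIC_SUBSCRIBERS = new ArrayList<>();

    static void publish(String message) {
        TOPIC_SUBSCRIBERS.forEach(subscriber -> subscriber.accept(message));
    }

    public static void main(String[] args) throws InterruptedException {
        // Point-to-point: client A sends, only client B receives.
        QUEUE.put("hello from A");
        System.out.println("B received: " + QUEUE.take());

        // Publish/subscribe: clients A, B and N all see the same chat message.
        TOPIC_SUBSCRIBERS.add(msg -> System.out.println("A sees: " + msg));
        TOPIC_SUBSCRIBERS.add(msg -> System.out.println("B sees: " + msg));
        TOPIC_SUBSCRIBERS.add(msg -> System.out.println("N sees: " + msg));
        publish("welcome to the chat room");
    }
}
```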
