Cloud Computing Design Patterns (16) -- Priority Queue Pattern
Prioritize requests sent to services so that requests with a higher priority are received and processed more quickly than those with a lower priority. This pattern is useful in applications that offer different service level guarantees to individual clients.
Context and problem
An application can delegate specific tasks to other services, for example, to perform background processing or to integrate with other applications or services. In the cloud, a message queue is typically used to delegate tasks to background processing. In many cases the order in which requests are received by a service is not important. In some cases, however, specific requests may need to be prioritized. These requests should be processed by the application earlier than requests with a lower priority that may have been sent previously.
Solution
A queue is usually a first-in, first-out (FIFO) structure, and consumers typically receive messages in the same order in which they were posted to the queue. However, some message queues support priority messaging: the application posting a message can assign it a priority, and the messages in the queue are automatically reordered so that messages with a higher priority are received before messages with a lower priority. Figure 1 illustrates a queue that provides priority messaging.
Figure 1 - Using a queuing mechanism that supports message prioritization
Note:
Most message queue implementations support multiple consumers (following the Competing Consumers pattern), and the number of consumer processes can be scaled up or down as demand requires.
In systems that do not support priority-based message queues, an alternative is to maintain a separate queue for each priority. The application is responsible for posting messages to the appropriate queue. Each queue can have a separate pool of consumers. Higher priority queues can have a larger pool of consumers running on faster hardware than lower priority queues. Figure 2 illustrates this approach.
Figure 2 - Using separate message queues for each priority
A variation on this strategy is to have a single pool of consumers that check for messages on high priority queues first, and only then start to fetch messages from lower priority queues. There are some semantic differences between a solution that uses a single pool of consumer processes (either with a single queue that supports messages with different priorities, or with multiple queues that each handle messages of a single priority) and a solution that uses multiple queues with a separate pool of consumers for each queue.
In the single pool approach, higher priority messages are always received and processed before lower priority messages. In theory, messages with a very low priority could be continually superseded and might never be processed. In the multiple pool approach, lower priority messages will always be processed, just not as quickly as those with a higher priority (depending on the relative size of the pools and the resources that they have available).
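To illustrate the single pool variation, the following sketch shows a consumer loop that always checks higher priority queues before lower priority ones. The IMessageQueue interface, the Message class, and the polling interval are hypothetical placeholders used only for illustration; they are not part of any specific messaging library.

using System;
using System.Threading;

// Hypothetical abstraction over a messaging technology.
public interface IMessageQueue
{
    // Returns the next message, or null if the queue is currently empty.
    Message TryReceive();
}

public class Message
{
    public string Body { get; set; }
}

public class PriorityPollingConsumer
{
    // Queues ordered from highest to lowest priority.
    private readonly IMessageQueue[] queuesByPriority;

    public PriorityPollingConsumer(params IMessageQueue[] queuesByPriority)
    {
        this.queuesByPriority = queuesByPriority;
    }

    public void Run(CancellationToken cancellationToken)
    {
        while (!cancellationToken.IsCancellationRequested)
        {
            // Always scan from the highest priority queue downwards, so a lower
            // priority message is only taken when no higher priority message is waiting.
            Message message = null;
            foreach (var queue in queuesByPriority)
            {
                message = queue.TryReceive();
                if (message != null)
                {
                    break;
                }
            }

            if (message != null)
            {
                Process(message);
            }
            else
            {
                // No messages on any queue; back off briefly before polling again.
                Thread.Sleep(TimeSpan.FromMilliseconds(500));
            }
        }
    }

    private void Process(Message message)
    {
        // Application-specific work would go here.
    }
}

Note that this approach is exactly what produces the semantics described above: a very low priority message can be superseded indefinitely if higher priority messages keep arriving.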
Using a priority queuing mechanism can provide the following advantages:
• It allows applications to meet business requirements that demand the prioritization of availability or performance, such as offering different levels of service to specific groups of customers.
• It can help to minimize operational costs. In the single queue approach, you can scale back the number of consumers if necessary. High priority messages will still be processed first (although possibly more slowly), and lower priority messages may be delayed for longer. If you have implemented the multiple message queue approach with a separate pool of consumers for each queue, you can reduce the pool of consumers for lower priority queues, or even suspend processing for some very low priority queues by stopping all of the consumers that listen for messages on those queues.
• The multiple message queue approach can help to maximize application performance and scalability by partitioning messages based on their processing requirements. For example, vital tasks can be prioritized so that they are handled by receivers that run immediately, while less important background tasks can be handled by receivers that are scheduled to run at less busy periods.
Issues and considerations
Consider the following points when deciding how to implement this pattern:
• Define the priorities in the context of the solution. For example, a "high priority" message could be defined as one that should be processed within ten seconds. Identify the requirements for handling high priority items, and what resources must be allocated to meet these criteria.
• Decide whether all high priority items must be processed before any lower priority items. If the messages are processed by a single pool of consumers, it may be necessary to provide a mechanism that can preempt and suspend a task that is handling a low priority message when a higher priority message arrives (see the sketch after this list).
• In the multiple queue approach, when using a single pool of consumer processes that listen on all queues rather than a dedicated consumer pool for each queue, the consumer must apply an algorithm that ensures it always services messages from higher priority queues before messages from lower priority queues.
• Monitor the processing speed on high and low priority queues to ensure that messages in these queues are processed at the expected rates.
• If it is necessary to guarantee that low priority messages will be processed, implement the multiple message queue approach with multiple pools of consumers. Alternatively, in a queue that supports message prioritization, it may be possible to dynamically increase the priority of a queued message as it ages. However, this approach depends on the message queue providing this feature.
• Using a separate queue for each message priority works best for systems with a small number of well-defined priorities.
• Message priorities may be determined logically by the system. For example, rather than having explicit high and low priority messages, they could be designated as "fee paying customer" and "non-fee paying customer". Depending on your business model, your system may allocate more resources to processing messages from fee paying customers than from non-fee paying ones.
• There may be a financial and processing cost associated with checking a queue for a message (some commercial messaging systems charge a small fee each time a message is posted or retrieved, and each time a queue is queried for messages). This cost increases when multiple queues are checked.
• It may be possible to dynamically adjust the size of a pool of consumers based on the length of the queue that the pool services. For more information, see the Autoscaling Guidance.
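To illustrate the preemption point mentioned above, the following sketch uses the standard .NET CancellationTokenSource to cancel an in-flight low priority work item when a high priority message arrives. The class and method names are hypothetical and only show the general shape of such a mechanism, not a production-ready implementation.

using System;
using System.Threading;
using System.Threading.Tasks;

public class PreemptingConsumer
{
    private CancellationTokenSource lowPriorityCts = new CancellationTokenSource();

    // Called when a low priority message is dequeued.
    public Task HandleLowPriorityAsync(Func<CancellationToken, Task> work)
    {
        // The low priority work observes the token and stops promptly when cancelled.
        return work(lowPriorityCts.Token);
    }

    // Called when a high priority message arrives.
    public async Task HandleHighPriorityAsync(Func<Task> work)
    {
        // Preempt any in-flight low priority work, then process the high priority message.
        lowPriorityCts.Cancel();
        lowPriorityCts = new CancellationTokenSource();

        await work();

        // In a real system the preempted low priority message would be
        // abandoned or re-queued so that it can be retried later.
    }
}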
When to use this pattern
This pattern is ideally suited to scenarios in which:
• The system must handle multiple tasks that may have different priorities.
• Different users or tenants should be served with different priorities.
Example
Microsoft Azure does not provide a queuing mechanism that natively supports automatic prioritization of messages through sorting. However, it does provide Azure Service Bus topics and subscriptions, which support a queuing mechanism that provides message filtering together with a wide range of flexible capabilities, making them ideal for implementing almost all priority queue scenarios.
An Azure solution can implement a Service Bus topic to which an application can post messages in the same way that it would post them to a queue. Messages can contain metadata in the form of application-defined custom properties. Service Bus subscriptions can be associated with the topic, and these subscriptions can filter messages based on their properties. When an application sends a message to a topic, the message is directed to the appropriate subscription, where it can be read by a consumer. Consumer processes can retrieve messages from a subscription using the same semantics as a message queue (a subscription is a logical queue).
Figure 3 illustrates a solution that uses Azure Service Bus topics and subscriptions.
Figure 3 - Implementing a priority queue with Azure Service Bus topics and subscriptions
In the solution shown in Figure 3, the application creates several messages and assigns each message a custom property called Priority with a value of either High or Low. The application posts these messages to a topic. The topic has two associated subscriptions that both filter messages by examining the Priority property. One subscription accepts messages where the Priority property is set to High, and the other accepts messages where the Priority property is set to Low. A pool of consumers reads messages from each subscription. The high priority subscription has a larger pool, and these consumers may be running on more powerful (and more expensive) computers with more resources available than the consumers in the low priority pool.
Note that there is nothing special about the designations of high and low priority messages in this example. They are simply labels specified as properties in each message, and are used to direct messages to a specific subscription. If additional priorities are required, it is relatively easy to create further subscriptions and pools of consumer processes to handle those priorities.
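For context, a topic and its filtered subscriptions could be provisioned using the Service Bus NamespaceManager and SqlFilter classes, roughly as in the sketch below. The topic and subscription names are illustrative (they are not taken from the sample solution), and the filters assume the Priority property is stored as a string value of "High" or "Low".

using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

public static class PriorityTopicSetup
{
    public static void EnsureCreated(string connectionString)
    {
        var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

        // Create the topic that messages of both priorities are posted to.
        if (!namespaceManager.TopicExists("messages"))
        {
            namespaceManager.CreateTopic("messages");
        }

        // Each subscription filters on the application-defined Priority property,
        // so high and low priority messages end up in separate logical queues.
        if (!namespaceManager.SubscriptionExists("messages", "PrioritySubscriptionHigh"))
        {
            namespaceManager.CreateSubscription("messages", "PrioritySubscriptionHigh",
                new SqlFilter("Priority = 'High'"));
        }

        if (!namespaceManager.SubscriptionExists("messages", "PrioritySubscriptionLow"))
        {
            namespaceManager.CreateSubscription("messages", "PrioritySubscriptionLow",
                new SqlFilter("Priority = 'Low'"));
        }
    }
}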
The PriorityQueue solution that accompanies this guidance contains an implementation of this approach. The solution contains two worker role projects named PriorityQueue.High and PriorityQueue.Low. Both worker roles inherit from a class called PriorityWorkerRole, which contains the functionality for connecting to a specified subscription in the OnStart method.
The PriorityQueue.High and PriorityQueue.Low worker roles connect to different subscriptions, which are defined in their configuration settings. An administrator can configure different numbers of each role to run; typically there will be more instances of the PriorityQueue.High worker role than of the PriorityQueue.Low worker role.
The Run method in the PriorityWorkerRole class arranges for the virtual ProcessMessage method (also defined in the PriorityWorkerRole class) to be executed for each message received on the queue. The following code shows the Run and ProcessMessage methods. The QueueManager class, defined in the PriorityQueue.Shared project, provides helper methods for working with Azure Service Bus queues.
public class PriorityWorkerRole : RoleEntryPoint
{
  private QueueManager queueManager;
  ...

  public override void Run()
  {
    // Start listening for messages on the subscription.
    var subscriptionName = CloudConfigurationManager.GetSetting("SubscriptionName");
    this.queueManager.ReceiveMessages(subscriptionName, this.ProcessMessage);
    ...;
  }
  ...

  protected virtual async Task ProcessMessage(BrokeredMessage message)
  {
    // Simulating processing.
    await Task.Delay(TimeSpan.FromSeconds(2));
  }
}
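The QueueManager helper itself ships in the PriorityQueue.Shared project and is not reproduced here. A minimal sketch of how such a helper might pump messages from a subscription into the supplied callback, assuming the Microsoft.ServiceBus.Messaging SubscriptionClient API, could look like the following; the field and constructor shown are assumptions for illustration.

using System;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

public class QueueManager
{
    private readonly string connectionString;
    private readonly string topicName;
    private SubscriptionClient subscriptionClient;

    public QueueManager(string connectionString, string topicName)
    {
        this.connectionString = connectionString;
        this.topicName = topicName;
    }

    // Pumps messages from the named subscription into the supplied handler.
    public void ReceiveMessages(string subscriptionName, Func<BrokeredMessage, Task> handler)
    {
        this.subscriptionClient = SubscriptionClient.CreateFromConnectionString(
            this.connectionString, this.topicName, subscriptionName);

        var options = new OnMessageOptions
        {
            AutoComplete = true,      // complete each message when the handler returns successfully
            MaxConcurrentCalls = 10   // process several messages in parallel
        };

        this.subscriptionClient.OnMessageAsync(handler, options);
    }
}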
The PriorityQueue.High and PriorityQueue.Low worker roles both override the default functionality of the ProcessMessage method. The code below shows the ProcessMessage method for the PriorityQueue.High worker role.
protected override async Task ProcessMessage(BrokeredMessage message)
{
  // Simulate message processing for High priority messages.
  await base.ProcessMessage(message);
  Trace.TraceInformation("High priority message processed by " +
    RoleEnvironment.CurrentRoleInstance.Id + " MessageId: " + message.MessageId);
}
When an application posts messages to the topic associated with the subscriptions used by the PriorityQueue.High and PriorityQueue.Low worker roles, it specifies the priority by using the Priority custom property, as shown in the following code example. This code (implemented in the WorkerRole class of the PriorityQueue.Sender project) uses the SendBatchAsync helper method of the QueueManager class to post messages to a topic in batches.
// Send a low priority batch.
var lowMessages = new List<BrokeredMessage>();

for (int i = 0; i < 10; i++)
{
  var message = new BrokeredMessage() { MessageId = Guid.NewGuid().ToString() };
  message.Properties["Priority"] = Priority.Low;
  lowMessages.Add(message);
}

this.queueManager.SendBatchAsync(lowMessages).Wait();
...

// Send a high priority batch.
var highMessages = new List<BrokeredMessage>();

for (int i = 0; i < 10; i++)
{
  var message = new BrokeredMessage() { MessageId = Guid.NewGuid().ToString() };
  message.Properties["Priority"] = Priority.High;
  highMessages.Add(message);
}

this.queueManager.SendBatchAsync(highMessages).Wait();
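The SendBatchAsync helper and the Priority type used above are part of the sample's shared project and are not shown in this article. A plausible sketch, assuming the helper simply wraps TopicClient.SendBatchAsync from Microsoft.ServiceBus.Messaging and that Priority is a set of string constants compatible with the subscription filters, is shown below; the class name QueueSender is hypothetical.

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

// Assumed definition of the priority labels stored in the message properties;
// string constants keep the values compatible with SqlFilter comparisons
// such as "Priority = 'High'".
public static class Priority
{
    public const string High = "High";
    public const string Low = "Low";
}

public class QueueSender
{
    private readonly TopicClient topicClient;

    public QueueSender(string connectionString, string topicName)
    {
        this.topicClient = TopicClient.CreateFromConnectionString(connectionString, topicName);
    }

    // Posts a batch of messages to the topic in a single call.
    public Task SendBatchAsync(IEnumerable<BrokeredMessage> messages)
    {
        return this.topicClient.SendBatchAsync(messages);
    }
}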
MSDN: http://msdn.microsoft.com/en-us/library/dn589794.aspx