[ASP.NET Web API tutorial] 5.4 ASP.NET Web API batching handler

Note: This article is part of the [ASP.NET Web API series tutorial]. If this is the first time you are reading this series, please start with the earlier installments.




Batching handler for ASP.NET Web API

5.4 ASP.NET Web API batching handler



This article is cited from: http://bradwilson.typepad.com/blog/2012/06/batching-handler-for-web-api.html






Author: Brad Wilson | Date: June 20, 2012






While there is no batching standard built into the HTTP protocol, there is a standard for MIME-encoding HTTP request and response messages ("application/http" with "msgtype=request" and "msgtype=response", respectively). ASP.NET Web API has built-in support both for MIME multipart and for these encoded request and response messages, so we have all the building blocks we need to make a simple batch request handler.
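As a quick illustration of those building blocks (a minimal sketch of my own, not from the original post), wrapping an HttpRequestMessage in HttpMessageContent produces a part whose Content-Type is "application/http; msgtype=request", and MultipartContent with the "batch" subtype collects several such parts into a single multipart/batch body. The URL here is just the sample endpoint used later in the article:

// Minimal sketch: encoding a request as application/http and
// grouping it into a multipart/batch container.
var request = new HttpRequestMessage(HttpMethod.Get, "http://localhost/api/values");

var encoded = new HttpMessageContent(request);
Console.WriteLine(encoded.Headers.ContentType);  // application/http; msgtype=request

var batch = new MultipartContent("batch");       // multipart/batch container
batch.Add(encoded);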






All we need to make this work is an endpoint which can accept a multipart batch (a content type of our own invention), which then parses the requests, runs them sequentially, and returns the responses back in a multipart batch response.






Starting with a Web API project (built against the latest nightly build), I updated the Web API config to look like this:






var batchHandler = new BatchHandler(config);

config.Routes.MapHttpRoute("batch", "api/batch",
    null, null, batchHandler);

config.Routes.MapHttpRoute("default", "api/{controller}/{id}",
    new { id = RouteParameter.Optional });
I've inserted the handler for "api/batch" as our endpoint for batching requests, using the new "route-specific endpoint handler" feature in Web API. Note that since its URL is "api/batch", I made sure to add it before the default API route.

Using async & await in .NET 4.5 makes the implementation of BatchHandler fairly straightforward. All we need is an in-memory HttpServer which uses our existing configuration, so that the batched requests hit the exact same endpoints as requests from the Internet:

public class BatchHandler : HttpMessageHandler
{
    HttpMessageInvoker _server;

    public BatchHandler(HttpConfiguration config)
    {
        _server = new HttpMessageInvoker(new HttpServer(config));
    }

    protected override async Task<HttpResponseMessage> SendAsync(
            HttpRequestMessage request,
            CancellationToken cancellationToken)
    {
        // Return 400 for the wrong MIME type
        if ("multipart/batch" !=
                request.Content.Headers.ContentType.MediaType)
        {
            return request.CreateResponse(HttpStatusCode.BadRequest);
        }

        // Start a multipart response
        var outerContent = new MultipartContent("batch");
        var outerResp = request.CreateResponse();
        outerResp.Content = outerContent;

        // Read the multipart request
        var multipart = await request.Content.ReadAsMultipartAsync();
        foreach (var httpContent in multipart.Contents)
        {
            HttpResponseMessage innerResp = null;
            try
            {
                // Decode the request object
                var innerReq = await
                    httpContent.ReadAsHttpRequestMessageAsync();

                // Send the request through the pipeline
                innerResp = await _server.SendAsync(
                    innerReq,
                    cancellationToken
                );
            }
            catch (Exception)
            {
                // If exceptions are thrown, send back a generic 400
                innerResp = new HttpResponseMessage(
                    HttpStatusCode.BadRequest
                );
            }

            // Wrap the response in a message content and put it
            // into the multipart response
            outerContent.Add(new HttpMessageContent(innerResp));
        }

        return outerResp;
    }
}
Now we have an endpoint that we can send multipart/batch requests to, whose parts are assumed to be HTTP request objects (anything that isn't will yield a 400).

On the client side, we make a multipart request and push requests into the multipart batch, one at a time:

var client = new HttpClient();
var batchRequest = new HttpRequestMessage(
    HttpMethod.Post,
    "http://localhost/api/batch"
);

var batchContent = new MultipartContent("batch");
batchRequest.Content = batchContent;

batchContent.Add(
    new HttpMessageContent(
        new HttpRequestMessage(
            HttpMethod.Get,
            "http://localhost/api/values"
        )
    )
);

batchContent.Add(
    new HttpMessageContent(
        new HttpRequestMessage(
            HttpMethod.Get,
            "http://localhost/foo/bar"
        )
    )
);

batchContent.Add(
    new HttpMessageContent(
        new HttpRequestMessage(
            HttpMethod.Get,
            "http://localhost/api/values/1"
        )
    )
);
In a console application, we can log both the request and the response with code like this:

using (Stream stdout = Console.OpenStandardOutput())
{
    Console.WriteLine("<<< REQUEST >>>");
    Console.WriteLine();
    Console.WriteLine(batchRequest);
    Console.WriteLine();

    batchContent.CopyToAsync(stdout).Wait();

    Console.WriteLine();
    var batchResponse = client.SendAsync(batchRequest).Result;
    Console.WriteLine("<<< RESPONSE >>>");
    Console.WriteLine();
    Console.WriteLine(batchResponse);
    Console.WriteLine();
    batchResponse.Content.CopyToAsync(stdout).Wait();
    Console.WriteLine();
    Console.WriteLine();
}
When I run this console application, I see output similar to this:

<<< REQUEST >>>

Method: POST,
Request Uri: 'http://localhost/api/batch',
Version: 1.1,
Content: System.Net.Http.MultipartContent,
Headers:
{
    Content-Type: multipart/batch; boundary="3bc5bd67-3517-4cd0-bcdd-9d23f3850402"
}

--3bc5bd67-3517-4cd0-bcdd-9d23f3850402
Content-Type: application/http; msgtype=request

GET /api/values HTTP/1.1
Host: localhost

--3bc5bd67-3517-4cd0-bcdd-9d23f3850402
Content-Type: application/http; msgtype=request

GET /foo/bar HTTP/1.1
Host: localhost

--3bc5bd67-3517-4cd0-bcdd-9d23f3850402
Content-Type: application/http; msgtype=request

GET /api/values/1 HTTP/1.1
Host: localhost

--3bc5bd67-3517-4cd0-bcdd-9d23f3850402--

<<< RESPONSE >>>

StatusCode: 200,
ReasonPhrase: 'OK',
Version: 1.1,
Content: System.Net.Http.StreamContent,
Headers:
{
    Pragma: no-cache
    Cache-Control: no-cache
    Date: Thu, 21 Jun 2012 00:21:40 GMT
    Server: Microsoft-IIS/8.0
    X-AspNet-Version: 4.0.30319
    X-Powered-By: ASP.NET
    Content-Length: 658
    Content-Type: multipart/batch
    Expires: -1
}

--3d1ba137-ea6a-40d9-8e34-1b8812394baa
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

["Hello", "world!"]

--3d1ba137-ea6a-40d9-8e34-1b8812394baa
Content-Type: application/http; msgtype=response

HTTP/1.1 404 Not Found
Content-Type: application/json; charset=utf-8

{"Message": "No HTTP resource was found that matches the request URI 'http://localhost/foo/bar'."}

--3d1ba137-ea6a-40d9-8e34-1b8812394baa
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

"world!"
--3d1ba137-ea6a-40d9-8e34-1b8812394baa--
As you can see, our batch was run successfully, and the results show what we'd expect: the two real API calls returned 200 with their data, and the bogus request we threw in the middle returned a 404.
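The sample above only logs the raw multipart response, so to round things out here is a hedged sketch (my own addition, not from the original post) of how the client could unpack the multipart/batch response into individual HttpResponseMessage objects, using the same ReadAsMultipartAsync and ReadAsHttpResponseMessageAsync extensions the handler uses on the server side:

// Sketch: unpacking the multipart/batch response on the client.
// Assumes the client and batchRequest objects from the sample above.
var batchResponse = client.SendAsync(batchRequest).Result;
var parts = batchResponse.Content.ReadAsMultipartAsync().Result;

foreach (var part in parts.Contents)
{
    // Each part carries "application/http; msgtype=response"
    var innerResponse = part.ReadAsHttpResponseMessageAsync().Result;
    Console.WriteLine(innerResponse.StatusCode);
    Console.WriteLine(innerResponse.Content.ReadAsStringAsync().Result);
}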
