
How to Avoid 26 API Requests On Your Page?

19 Dec 2014

The Problem

Creating an application relying on web APIs seems to be quite popular these days. There is already an impressive collection of ready-to-use public APIs (see http://www.programmableweb.com ) that we can consume to create mashups or add features to our web sites.

In the meantime, it has never been easier to create your own REST-like API with node.js, ASP.NET Web API, Ruby or whatever technology you want. It's also very common to create your own private/restricted API for your SPA, cross-platform mobile apps or your own IoT device. The naïve approach when building a web API is to add one API method for every feature; in the end, we get a well-architected and brilliant web API following the Separation of Concerns principle: one method for each feature. Then we put it all together in the client… and it's a disaster in terms of web performance for the end user, with never fewer than hundreds of requests per second hitting the staging environment. Look at your page: there are 26 API calls on your home page!

I'm talking about web apps here, but it's pretty much the same for native mobile applications. Round-trip time and latency matter much more than bandwidth. It's impossible to create a responsive and efficient application on top of a chatty web API.

The Proper Approach

At the beginning of December 2014, I attended the third edition of APIdays in Paris. Among others, there was an interesting session on Scenario-Driven Design by @ijansch.

The fundamental concept in any RESTful API is the resource. It's an abstract concept and is quite different from a data resource. A resource should not be a raw data model (the result of an SQL query exposed over the web) but should be defined with client usage in mind. “REST is not an excuse to expose your raw data model.” With this kind of approach, you will create dumb clients and smart APIs holding the business logic.

A common myth is: “OK, but we don't know how our API will be consumed; that's why we expose raw data.” Most of the time, this is false. Let's take the example of Twitter timelines. A timeline is a list of tweets or messages displayed in the order in which they were sent, with the most recent on top. This is a very common feature, and you can see timelines in every Twitter client. Twitter exposes a timeline API, and API clients just have to call it to get timelines. In particular, clients don't have to compute timelines themselves by calling the Twitter API over and over for friends, tweets of friends, and so on.

I think this is an important idea to keep in mind when designing our APIs. Generally, we don't need to be so RESTful (what about HATEOAS?). Think more about API usability and scenarios than about RESTfulness.

The slides of this session are available here.

Another Not-So-New Approach: Batch Requests

Reducing the number of requests sent by a client is a common and well-known web performance optimization technique. Instead of several small images, it's better to use sprites. Instead of many JavaScript files, it's better to combine them. Instead of several API calls, we can use batch requests.

Batch requests are not REST-compliant, but we already know that we should sometimes break the rules to have better performance and better scalability.

“If you find yourself in need of a batch operation, then most likely you just haven't defined enough resources.” (Roy T. Fielding, father of REST)

What is a Batch Request?

A batch request packs several different API requests into a single POST request. HTTP provides a special content type for this kind of scenario: multipart. On the server side, the requests are unpacked and dispatched to the appropriate API methods; all the responses are then packed together and sent back to the client as a single HTTP response.

Here is an example of a batch request:

Request

POST http://localhost:9000/api/batch HTTP/1.1
Content-Type: multipart/mixed; boundary="1418988512147"
Content-Length: 361

--1418988512147
Content-Type: application/http; msgtype=request

GET /get1 HTTP/1.1
Host: localhost:9000


--1418988512147
Content-Type: application/http; msgtype=request

GET /get2 HTTP/1.1
Host: localhost:9000


--1418988512147
Content-Type: application/http; msgtype=request

GET /get3 HTTP/1.1
Host: localhost:9000


--1418988512147--

Response

HTTP/1.1 200 OK
Content-Length: 561
Content-Type: multipart/mixed; boundary="91b1788f-6aec-44a9-a04f-84a687b9d180"
Server: Microsoft-HTTPAPI/2.0
Date: Fri, 19 Dec 2014 11:28:35 GMT

--91b1788f-6aec-44a9-a04f-84a687b9d180
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

"I am Get1 !"
--91b1788f-6aec-44a9-a04f-84a687b9d180
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

"I am Get2 !"
--91b1788f-6aec-44a9-a04f-84a687b9d180
Content-Type: application/http; msgtype=response

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8

"I am Get3 !"
--91b1788f-6aec-44a9-a04f-84a687b9d180--

Batch requests are already supported by many web frameworks and offered by many API providers: ASP.NET Web API, a node.js module, Google Cloud Platform, Facebook, Stack Overflow, Twitter …

Batch Support in ASP.NET Web API

To support batch requests in your ASP.NET Web API, you just have to add a new custom route:

C#
// Register the batch endpoint. DefaultHttpBatchHandler unpacks the multipart
// request and dispatches each sub-request to the regular Web API pipeline.
config.Routes.MapHttpBatchRoute(
    routeName: "batch",
    routeTemplate: "api/batch",
    batchHandler: new DefaultHttpBatchHandler(GlobalConfiguration.DefaultServer)
);

Tip: DefaultHttpBatchHandler doesn't provide a way to limit the number of requests in a batch. To avoid performance issues, you may want to cap batches at 100/1000/… requests. To do that, you have to create your own implementation by inheriting from DefaultHttpBatchHandler, as sketched below.
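Here is a minimal sketch of such a handler (the LimitedBatchHandler name and the limit of 100 are arbitrary choices of mine, not part of the framework). It overrides ParseBatchRequestsAsync and rejects oversized batches with a 400 Bad Request before any sub-request is executed:

C#
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using System.Web.Http;
using System.Web.Http.Batch;

public class LimitedBatchHandler : DefaultHttpBatchHandler
{
    // Arbitrary limit; tune it for your own API.
    private const int MaxBatchSize = 100;

    public LimitedBatchHandler(HttpServer httpServer) : base(httpServer) { }

    public override async Task<IList<HttpRequestMessage>> ParseBatchRequestsAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Let the default handler split the multipart body into sub-requests.
        IList<HttpRequestMessage> requests =
            await base.ParseBatchRequestsAsync(request, cancellationToken);

        // Reject the whole batch if it contains too many requests.
        if (requests.Count > MaxBatchSize)
        {
            throw new HttpResponseException(request.CreateErrorResponse(
                HttpStatusCode.BadRequest,
                string.Format("A batch cannot contain more than {0} requests.", MaxBatchSize)));
        }

        return requests;
    }
}

Register it on the batch route by passing new LimitedBatchHandler(GlobalConfiguration.DefaultServer) as the batchHandler instead of the default one.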

This new endpoint allows clients to send batch requests, and there is nothing else to do on the server side. On the client side, you can build batch requests with libraries such as jquery.batch, batchjs, the angular-http-batcher module, …
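From a .NET client (the demo mentioned at the end of this article is a console app, for instance), a batch request can also be assembled by hand with MultipartContent and HttpMessageContent from the ASP.NET Web API client libraries. The following is only a sketch against the /get1, /get2 and /get3 endpoints of the earlier example, not code taken from the demo:

C#
using System;
using System.Net.Http;
using System.Threading.Tasks;

class BatchClientSketch
{
    static async Task RunAsync()
    {
        using (var client = new HttpClient())
        {
            // Pack three GET requests into a single multipart/mixed POST.
            var batchContent = new MultipartContent("mixed");
            batchContent.Add(new HttpMessageContent(
                new HttpRequestMessage(HttpMethod.Get, "http://localhost:9000/get1")));
            batchContent.Add(new HttpMessageContent(
                new HttpRequestMessage(HttpMethod.Get, "http://localhost:9000/get2")));
            batchContent.Add(new HttpMessageContent(
                new HttpRequestMessage(HttpMethod.Get, "http://localhost:9000/get3")));

            var batchRequest = new HttpRequestMessage(
                HttpMethod.Post, "http://localhost:9000/api/batch")
            {
                Content = batchContent
            };

            // One round trip for the three API calls.
            HttpResponseMessage batchResponse = await client.SendAsync(batchRequest);

            // Unpack the individual responses from the multipart body.
            var provider = await batchResponse.Content.ReadAsMultipartAsync();
            foreach (HttpContent part in provider.Contents)
            {
                HttpResponseMessage response = await part.ReadAsHttpResponseMessageAsync();
                Console.WriteLine("{0}: {1}",
                    response.StatusCode,
                    await response.Content.ReadAsStringAsync());
            }
        }
    }
}

The JavaScript libraries mentioned above do essentially the same thing from the browser.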

I will not explain all the details here, but DefaultHttpBatchHandler provides an interesting feature: the ExecutionOrder property lets you choose between sequential and non-sequential processing of the sub-requests. Thanks to the TAP programming model, it's possible to execute the API requests in parallel (for truly asynchronous API methods).
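For example, the batch route registered earlier can be switched to parallel processing simply by configuring the handler (this reuses the route from the previous snippet):

C#
// Same route registration as before, but the sub-requests are now
// processed non-sequentially (in parallel) instead of one after another.
config.Routes.MapHttpBatchRoute(
    routeName: "batch",
    routeTemplate: "api/batch",
    batchHandler: new DefaultHttpBatchHandler(GlobalConfiguration.DefaultServer)
    {
        ExecutionOrder = BatchExecutionOrder.NonSequential
    }
);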

Here is the result of async vs. sync batch requests for a batch of three web methods, each taking one second to process.

[Image: comparison of async and sync batch request timings]

Finally, batch requests are not a must-have feature, but they are certainly something to keep in mind; they can help a lot in some situations. A very simple demo application is available here. Run the console app or browse localhost:9000/index.html. From my point of view, here are some pros and cons of this approach.

Pros:
- Better client performance (fewer calls)
- Really easy to implement on the server side
- Parallel processing of requests on the server side
- Works with GET, POST, PUT, DELETE, …

Cons:
- May increase the complexity of client code
- Hides real client scenarios; not REST compliant
- Batch size should be limited at the server level for a public API
- Browser cache may not work properly


License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)