In the IT world, software applications are developed rapidly. Clients, and therefore employers, look for teams and individuals who can build an application quickly and get it live; but what often happens after an application goes live is that users start to use it and it doesn't respond well. At this point, clients start to lose users and business.
Coding an application is not a big deal; I believe it can be done by virtually anyone, without great knowledge or experience. Improving the performance of an existing application (especially one put together rapidly) can be quite risky, though, and can cause many ripple effects. Things must be planned first to avoid horrible results.
The following are a few points that can make a site scalable and reliable, though they may initially slow down development. I believe that overall, when maintenance and future changes are taken into account, total development time is reduced.
1. Minimize HTTP-Based Requests
Problem 1: Serving images - no matter if they are less than 1 KB - as separate web resources causes separate web requests to the server, which impacts performance.
Solution:
- Use image maps to merge images. Image maps can only merge images that sit in sequence, like navigation images, so whether they apply depends on your web site/page design.
- Use inline images. Inline images increase your HTML page size but cause fewer requests to the server.
- CSS sprites can also be used to merge images, setting their positions via background offsets.
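As a rough illustration of the inline-image idea, here is a minimal Python sketch (the helper name and sample bytes are made up for the demo): the raw image bytes are Base64-encoded into a data: URI that can be placed directly in an img tag's src attribute, avoiding a separate request.

```python
import base64

def to_data_uri(image_bytes: bytes, mime: str = "image/png") -> str:
    """Encode raw image bytes as an inline data: URI for an <img> src."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{encoded}"

# A tiny stand-in payload (just the PNG file signature, for demonstration):
pixel = b"\x89PNG\r\n\x1a\n"
uri = to_data_uri(pixel)
print(uri)  # ready to drop into <img src="...">
```

The trade-off stated above is visible here: the Base64 text is about a third larger than the raw bytes, but it ships inside the HTML instead of as an extra request.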
Problem 2: Using CSS is very good practice, but serving stylesheets as separate resources, thus causing separate requests, should be considered very carefully.
Solution:
- Try your best to combine all your CSS classes into a single .css file, as many .css files cause a large number of requests, regardless of the file sizes.
- .css files are normally cached by browsers, so a single, heavy .css file doesn't cause a long wait on each page request.
- Inline .css classes can make the HTML heavy, so again: go ahead with a single .css file.
External JavaScript files should likewise be used carefully, not only because of the extra requests they cause, but also because scripts can cause unpredictable performance issues. On the plus side, external .js files are normally cached by browsers, so they usually aren't requested each time the page is loaded.
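As a sketch of the single-file approach, a build step can simply concatenate stylesheets before deployment. This Python example (file names are invented for the demo) does exactly that:

```python
import pathlib
import tempfile

def combine_css(css_paths, out_path):
    """Concatenate several .css files into one, cutting the request count."""
    parts = [pathlib.Path(p).read_text(encoding="utf-8") for p in css_paths]
    combined = "\n".join(parts)
    pathlib.Path(out_path).write_text(combined, encoding="utf-8")
    return combined

# Demo with throwaway files in a temp directory:
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "layout.css").write_text("body { margin: 0; }")
(tmp / "theme.css").write_text("h1 { color: navy; }")
bundle = combine_css([tmp / "layout.css", tmp / "theme.css"], tmp / "site.css")
print(bundle)
```

The page then links only site.css: one request instead of two, with the same rules applied.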
2. HTTP Compression
HTTP compression is used to compress content from the web server. HTTP requests and responses can be compressed, which can result in great performance gains. Through HTTP compression, the size of the payload can be reduced by about 50% - great, isn't it?
HTTP Compression is now widely supported by browsers and web servers.
If HTTP compression is enabled on the web server, and the request includes an Accept-Encoding: gzip, deflate header (meaning the browser supports the gzip and deflate compression mechanisms), then the web server can compress the response in either of those formats to reduce the payload size. This leads to an increase in performance. The compressed response is later decompressed by the browser and rendered normally.
Detailed explanations of how HTTP compression works, and of how to enable it in IIS, are available online.
3. Correctly Formatted Images in the Right Place
Problem: Designers often use the JPG or GIF format quite randomly, and ignore other formats that would compress the image better.
Solution: The correct format should be used for the right purpose:
- If you have to place a background image, some large image, or a screenshot, then the suggested format is JPG/JPEG.
- If you have to use small graphics like button images, header images, footer images, navigation-bar images, or clip art, then the suggested format is PNG.
- If an image is not required to be in high or true color and 256 colors are enough, then GIF is preferred.
4. Compress CSS, JavaScript, and Images
Stylesheets and scripts are often shipped with whitespace, comments, unnecessary code, and such other things. A number of high-quality (and free) utilities are available to help you pre-compress your files: there are tools for compressing PNG and JPG images, and minifiers for .css and .js files.
I have used these utilities and seen results of about 50% reduction in file size from such lossless compression, so I recommend them.
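The idea behind such minifiers can be sketched in a few lines. This is a deliberately naive Python example (real tools handle far more edge cases, so treat it as an illustration only):

```python
import re

def minify_css(css: str) -> str:
    """Naive CSS minifier: strips comments and collapses whitespace.
    A sketch only; production minifiers handle many more edge cases."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # trim around punctuation
    return css.strip()

source = """
/* header styles */
h1 {
    color : navy ;
    margin : 0 ;
}
"""
print(minify_css(source))  # → h1{color:navy;margin:0;}
```

The output renders identically but costs fewer bytes on every uncached request.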
5. CSS at Top
The recommended approach is to put CSS links at the top of the web page, as this lets the page render progressively and efficiently. Since users want to see the contents of a page while it's loading, rather than white space, content and formatting should be delivered first. The HTML specification clearly says to declare stylesheets in the head section of a web page.
6. Scripts at the Bottom
When scripts are defined at the top of the page, they can take unnecessary time to load, and they don't show the contents that users are expecting after making a request to an HTTP web server. It's better to display the HTML contents of a page first, then load any scripting code (when possible, of course).
The script tag also supports a defer attribute, which runs the script at the end of page loading, but that is not the preferable approach, as it is not browser-independent. For example, Firefox doesn't support it and it can interfere with document.write, so only use it once you fully understand the implications.
7. Content Delivery Network (CDN)
When a browser makes a request to a web page - that is, when a user types the URL/URI of a web page or web site - the request goes through many hops (routers and computers) before it finally reaches its destination. This happens for both requests and responses, and it affects performance and can severely affect load time.
A Content Delivery Network is a collection of computers, distributed all over the world, which deliver data (contents). Through a CDN you can have your website's data on multiple servers distributed in different locations around the world, so that each request can be served from the nearest location, saving time (which means performance, and money as well).
8. Use Ajax Wisely
Problem: Ajax is being used more and more to improve usability, but oftentimes in a way which increases overall server load.
Solution:
- Preferably use the GET method for Ajax-based requests, because with the POST method the request header is sent first, followed by the data, which basically splits the request into two steps. A single-step request can be achieved with GET if the cookie is not too long and the URL is not larger than 2 KB.
- When using ASP.NET AJAX and the UpdatePanel control for partial page rendering, use multiple update panels to update small chunks of the page, but use them wisely. Don't set the UpdateMode property to Always unless needed; instead, set the update mode to Conditional, otherwise all the partial chunks are sent together after each asynchronous postback.
- Ajax-based requests can also be cached when the GET method is used. If the URL is the same, then cached data can be used from the client, and a round trip to the server can be avoided.
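That caching point can be sketched as follows; fetch_from_server here is a made-up stand-in for a real HTTP call, and the cache is keyed by URL, just as a browser keys cached GET responses:

```python
# Sketch of client-side caching for GET-style requests: responses are
# memoized by URL, so repeat requests skip the (simulated) server trip.

server_hits = 0

def fetch_from_server(url: str) -> str:
    """Stand-in for an actual HTTP GET; counts how often it is called."""
    global server_hits
    server_hits += 1
    return f"payload for {url}"

_cache: dict[str, str] = {}

def cached_get(url: str) -> str:
    if url not in _cache:          # only GETs are safely cacheable like this
        _cache[url] = fetch_from_server(url)
    return _cache[url]

cached_get("/api/users?id=7")
cached_get("/api/users?id=7")      # identical URL: served from the cache
print(server_hits)                 # one server hit, not two
```

A POST with the same URL could not be reused this way, which is one more reason to prefer GET for idempotent Ajax calls.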
9. Ajax vs. Callback
Problem: Ajax is a great solution for asynchronous communication between the client (web browser) and HTTP servers, but one solution can't be applied to every problem. Ajax is a great mechanism for sending requests to the server without making a full page postback, but what if you need to send a request to the server and don't need even partial rendering?
Solution: A lightweight client callback is the best solution in such cases. For example, if you need to check whether a user exists, or if a user has forgotten his/her password and you just need to ask the server whether the user name exists, there is no need for client-side rendering - just a server-side operation. Callbacks are explained in detail in several good articles.
10. Reduce Cookie size
Cookies are stored on the client side to keep information about the user (authentication and personalization). Since HTTP is a stateless protocol, cookies are common in web development for maintaining information and state. Cookies are sent with every HTTP request, so keep them small to minimize the load on each request.
- A cookie's size should be minimized as much as possible.
- Cookies shouldn't contain secret information. If such information is really needed, it should be encrypted or encoded.
- Try to minimize the number of cookies by removing unnecessary ones.
- Cookies should expire as soon as they become useless to the application.
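To see why size matters, this small Python sketch builds a couple of cookies with the standard http.cookies module and measures the Cookie header bytes that would accompany every single request (the names and values are made up):

```python
from http.cookies import SimpleCookie

# Every cookie travels with every request, so header bytes add up quickly.
jar = SimpleCookie()
jar["session_id"] = "abc123"
jar["theme"] = "dark"

# Reconstruct the Cookie request header a browser would send.
header_value = "; ".join(f"{k}={m.value}" for k, m in jar.items())
print(f"Cookie: {header_value} ({len(header_value)} bytes per request)")
```

Multiply those bytes by every image, stylesheet, and script request on the same domain and the overhead becomes visible; serving static assets from a cookie-free domain is a common follow-up tactic.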
11. Use Cache appropriately
Caching is a great way to save web-server round trips - and database-server round trips as well - as both are expensive operations. By caching data we can avoid hitting the server or database when it's unnecessary. Following are a few guidelines for implementing caching:
- Static content should be cached, like the “Contact us” and “About us” pages, and other pages which contain static information.
- If a page is not fully static and contains some dynamic information, it can leverage partial page caching, which the ASP.NET technology supports.
- If data is dynamically accessed and used in web pages - like data being accessed from some file or database - and even if that data changes regularly, it can still be cached by using the ASP.NET 2.0 cache-dependency features. As soon as the data is changed at the back end by some other means, the cache is updated.
Now that web technologies such as ASP.NET have matured and offer such great caching capabilities, there's really no reason not to make extensive use of them.
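As a technology-neutral illustration of expiration-based caching (a sketch only, standing in for ASP.NET's Cache object; expensive_lookup here simulates a database hit):

```python
import time

db_hits = 0

def expensive_lookup(key: str) -> str:
    """Stand-in for a database query; counts how often it runs."""
    global db_hits
    db_hits += 1
    return f"row for {key}"

# key -> (time stored, value)
_store: dict[str, tuple[float, str]] = {}

def get_cached(key: str, ttl_seconds: float = 60.0) -> str:
    """Return the cached value, refreshing it on a miss or after expiry."""
    now = time.monotonic()
    entry = _store.get(key)
    if entry is None or now - entry[0] > ttl_seconds:
        _store[key] = (now, expensive_lookup(key))
    return _store[key][1]

get_cached("product:42")
get_cached("product:42")   # within the TTL: no second database hit
print(db_hits)
```

Cache-dependency schemes refine this by invalidating the entry when the underlying data changes, rather than waiting for a timer to elapse.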
Good walkthroughs are available for caching full (static) pages, caching partial pages, and caching dynamic data with cache dependencies.
12. Upload compiled code rather than source code
Pre-compiled ASP.NET pages perform much better than source-code versions. Pre-compilation gives web sites a performance boost, especially on the first request made to a folder containing the resource. Uploading a pre-compiled version boosts performance since the server doesn't need to compile a page at request time.
Following are a few good practices to gain better performance:
- For HTTP compression, GZip is considered the most effective, and is the most popular in terms of browser and HTTP-server support. It can reduce file size by up to 70%.
- Avoid redirects unless needed. Server.Transfer is also available, so consider it as well, since it performs the transfer on the server without an extra client round trip.
- Minimize the use of iframes, as they are costly.
- Avoid try-catch blocks for control flow, as they perform poorly. Exceptions should be used only in truly exceptional circumstances.
- Minimize cookie and CSS sizes.
- Minimize DOM objects on the page, as they are heavyweight.
- Use link tags rather than @import to link up CSS.
- The favicon, being a static image requested by the browser, should be cacheable and compressed.
- Always prefer a cache-friendly folder structure. For example, create specific folders for static contents, like /static for static images and static pages.
- SSL content can never be cached, so minimize its usage. Keep it for those pages which need to be secure, rather than using it for the whole site.
- HTTP POST requests can't be cached, so choose the HTTP method appropriately.
- Prevent Denial of Service (DoS) attacks.
- Prevent SQL Injection.
- Prevent Cross-Site Scripting (XSS).
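The try-catch point above is easy to demonstrate in any language. In this Python sketch (the data is made up), an explicit check replaces exception-driven control flow; both return the same result, but the check expresses the expected case instead of treating a miss as an error:

```python
prices = {"apple": 3, "pear": 5}

def price_with_exception(item):
    """Exception as control flow: a miss is treated as an error path."""
    try:
        return prices[item]
    except KeyError:
        return 0

def price_with_check(item):
    """Explicit check: a miss is an expected, cheap case."""
    return prices.get(item, 0)

print(price_with_check("plum"))   # → 0, no exception machinery involved
```

Reserving exceptions for genuinely exceptional situations also keeps stack traces meaningful when something really does go wrong.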
I hope you have learned some good approaches and techniques for keeping your web application in good shape on an HTTP server. I personally don't think any of them is flat-out ignorable, nor is any too difficult to implement. As performance is a vital part of success for any web application, I have tried to be as general as possible, so that every web technology (ASP.NET, ASP, PHP, JSP, JSF, and so on) can follow these practices.