
Integrated Intelligent Research (IIR) International Journal of Web Technology
Volume: 02 Issue: 01 June 2013 Page No. 17-20
ISSN: 2278-2389

A Case Study for Improving the Performance of Web Application
Balamurugan Subrayen1, Gurumoorthi Elangovan2, Vasuki Muthusamy3, Angayarkanni Anantharajan4
Department of MCA, Sri Manakula Vinayagar Engg. College, Pondicherry, India
Email: [email protected], [email protected], [email protected], [email protected]

Abstract- In the IT world, software applications are being developed rapidly. Clients, and therefore employers, look for teams and individuals who can build applications quickly, concerned mainly with getting the application live; but what often happens after an application goes live is that users start to use it and it does not respond well. At this point, clients start to lose users and business. Coding an application is not a big deal; I believe it can be done by virtually anyone, meaning it is not necessary to have great knowledge or experience. Improving the performance of an existing application (especially one put together rapidly), however, can be quite risky and can introduce new issues. Things must be planned first to avoid horrible results.

Keywords- Performance, Web application, Improving, Software application, Optimizing.

I. INTRODUCTION
Web applications are an integral component of any website, especially a business website. These web-enabled applications streamline business processes and even go on to improve business performance. The best web applications are known for attracting a huge amount of traffic, which is always good news for a website. But there are times when even the best of applications do not perform along expected lines. They fall short of what users expect from them, which can ultimately lead to them being replaced or reengineered. Website performance work focuses on the techniques and tools that can be used to improve a website's performance. Optimizing web application performance is all about numbers and metrics, so before delving into optimization techniques, it is essential to understand what can be optimized and how to measure improvements in performance. In this paper, we review the five areas where website performance can be improved, how to establish a performance baseline, and how to measure progress. The overall goal of improving performance is to minimize the perceived delay the user experiences between the moment he clicks on a link and the moment the page is finally displayed. The reason we focus on minimizing user-perceived delay rather than any other metric is that, in the end, what matters is improving your user's/visitor's experience. Having a user-centric mindset is important because it gives us a clear way to prioritize where to spend our resources and focus our efforts. For example, even if implementing a new caching system seems exciting, it might not be useful if the resource-processing time only accounts for 5% of the user-perceived delay. Of course, if your servers are under heavy load, reducing that load is important, but I believe that only looking at performance from the user's point of view will give you the whole picture.

II. ANALYZING WEB APPLICATION PERFORMANCE
Even if a web page appears to the user as a single entity with a defined URL, behind the scenes it is an assemblage of web resources that the browser fetches and renders. Even the smallest page is composed of multiple resources that usually include images, JavaScript code, and cascading style sheets (CSS). Before the user sees the page and can interact with it, at least some of these resources must be fetched and processed by the browser. For this reason, we have four non-mutually exclusive strategies to reduce the perceived delay:
Reducing the time the browser takes to fetch a given resource - This can be done, for instance, by reducing the server processing time, using browser caching, and using HTTP pipelining.
Decreasing the number of requests - This can be done by reducing the number of resources, for example by using sprites and combining JavaScript files. To decrease the number of elements that actually need to be loaded, we can also leverage the browser cache and use a shared, commonly cached version of popular JavaScript libraries (e.g., jQuery).
Optimizing the rendering speed - This can be done, for example, by using more efficient CSS selectors, a better page layout, and optimized JavaScript code.
Making the loading time appear shorter - Leverage how humans perceive information to make the delay appear less than it really is by adding loading indicators, pre-caching, and deferred loading strategies (the familiar AJAX paradigm).

III. LIFE CYCLE OF WEB RESOURCE
For each resource the browser fetches, five steps occur: resolving, requesting, processing, transferring, and rendering (caching mechanisms are not covered here, as they deserve a separate discussion). In the first step, resolving, the browser needs to translate the URL into an IP address by performing DNS queries. Believe it or not, this step is not free, as it can take a couple of hundred milliseconds, depending on your DNS server and your visitor's physical location. In the requesting step, the browser connects to your web server to ask for the resource. The duration of this step does not depend on your server bandwidth but on the physical distance between your web server and your visitor. If your server is located in Europe and your visitor is on the east coast of the United States, no matter what you do, the connecting step will take at least 60 ms because of the speed of light. As discussed below, using a CDN (content delivery network) helps reduce this delay by relocating your content closer to your client. In the processing step, the server generates the content or reads the resource from disk. The duration of this step can be reduced by implementing server-side caching, using more powerful servers, and optimizing your web application code. Load balancing between servers is another optimization technique that can be applied at this stage. The duration of the transferring step is mainly dominated by how big the element is and how much bandwidth the client has to your server. Having smaller elements and compressing them reduces the duration of this step; for example, dedicated software exists to make JavaScript and CSS files smaller (a process called minification). The download speed can also be improved by having a better bandwidth provider, better peering, or using a CDN. The rendering step occurs on the browser side. The rendering speed can be improved by making the browser's job easier: this includes adding dimensions to images, writing faster JavaScript code, and linearizing the page layout.

IV. MEASURE WEB APPLICATION PERFORMANCE
We need the right tools to measure our progress, and there are a lot of performance/benchmark tools, each with its own utility. For example, recommendation tools such as YSlow, which analyze your page and give you recommendations on how to improve performance, are very useful. However, to start optimizing your website, you only need two kinds of tools: a browser performance monitor and a resource performance monitor; the ones mentioned below are used daily, and there are certainly many other tools worth mentioning. A browser performance monitor is the essential tool that allows you to understand how the browser spends its time rendering your page. Every major browser has either a built-in monitor (Chrome, Safari, and Internet Explorer 8/9) or an add-on that provides one (Firebug for Firefox). Analyzing resource loading times with a browser is useful to get a quick estimate of the performance, but be aware that it won't give you the full picture, for three reasons. First, it only tells you how fast the site is for you; for visitors from a different country and ISP, these numbers will differ in terms of latency and download speed. Second, the browser does not tell you how long it took to resolve the URL; while in some cases (e.g., browser DNS prefetching) this does not matter, it is still important to know. Last, but not least, you don't know how stable these numbers are, so you need multiple data points. Such data can be provided by any good monitoring service.
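As a quick illustration of the kind of numbers a browser performance monitor exposes, the standard Navigation Timing and Resource Timing APIs can be queried from JavaScript. The snippet below is an illustrative sketch, not part of the original paper; it prints how long the resolving, connecting, processing, transferring, and rendering phases took for the current page.

// Print how long each phase of loading the current page took, using the
// browser's built-in Navigation Timing API (run after the page has loaded).
window.addEventListener("load", function () {
  var nav = performance.getEntriesByType("navigation")[0];
  if (!nav) { return; }

  console.log("DNS lookup :", (nav.domainLookupEnd - nav.domainLookupStart).toFixed(1), "ms");
  console.log("TCP connect:", (nav.connectEnd - nav.connectStart).toFixed(1), "ms");
  console.log("Server time:", (nav.responseStart - nav.requestStart).toFixed(1), "ms");
  console.log("Transfer   :", (nav.responseEnd - nav.responseStart).toFixed(1), "ms");
  console.log("DOM render :", (nav.domComplete - nav.responseEnd).toFixed(1), "ms");

  // Per-resource timings (images, CSS, JS) are available the same way.
  performance.getEntriesByType("resource").slice(0, 5).forEach(function (r) {
    console.log(r.name, "took", r.duration.toFixed(1), "ms");
  });
});

Collecting these numbers from several locations over time addresses the "multiple data points" concern mentioned above.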
V. IMPROVE THE PERFORMANCE
The following are a few practices that can make a site scalable and reliable, but which may initially slow down development. I believe that overall, when maintenance and future changes are taken into account, total development time is reduced.

1. Minimize HTTP based Requests
Serving images - no matter if they are less than 1 KB - as separate web resources causes separate web requests to the server, which impacts performance.
Solutions: Use image maps to merge images, though image maps can only merge images that are in sequence, like navigation images, so it depends on your web site/page design. Use inline images; they can increase your HTML page size but cause fewer requests to the server. CSS sprites can also be used to merge images by setting their position and background. Using CSS is very good practice, but serving style sheets as separate resources, thus causing separate requests, should be considered very carefully.
Solutions: Try your best to combine all your CSS classes into a single .css file, as many .css files cause a large number of requests regardless of the file sizes. .css files are normally cached by browsers, so a single, heavier .css file does not cause a long wait on each page request. Inline CSS classes can make the HTML heavy, so again: go ahead with a single .css file. JavaScript is a powerful scripting language, but it should be used carefully, not only because of request-size issues but also because it can cause unpredictable performance problems. Inline JavaScript can make the HTML page heavy, so it is preferable to serve separate .js files, or a single JavaScript file that keeps all scripts in one place. JavaScript files also get cached automatically by browsers, so they usually are not requested each time the page is loaded.
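To make the "combine into a single file" advice concrete, the following Node.js build-step sketch concatenates several JavaScript files into one bundle so the page needs only a single script request. It is an assumed illustration, not something the paper prescribes, and the file and folder names are hypothetical.

// build-bundle.js - concatenate several JS files into one bundle so the
// browser makes a single request instead of one request per file.
// Run with: node build-bundle.js
const fs = require("fs");
const path = require("path");

// Hypothetical source files; adjust to your project layout.
const sources = ["menu.js", "carousel.js", "validation.js"];
const outFile = path.join("public", "bundle.js");

const bundle = sources
  .map((file) => {
    const code = fs.readFileSync(path.join("src", file), "utf8");
    // Keep a marker comment so the origin of each chunk stays visible.
    return "/* ---- " + file + " ---- */\n" + code;
  })
  .join("\n");

fs.writeFileSync(outFile, bundle);
console.log("Wrote", outFile, "(" + bundle.length + " bytes)");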
2. HTTP Compression
HTTP compression is used to compress content from the web server. HTTP requests and responses can be compressed, which can result in great performance gains; through HTTP compression, the size of the payload can typically be reduced by about 50%. HTTP compression is now widely supported by browsers and web servers. If HTTP compression is enabled on the web server and the request includes an Accept-Encoding: gzip, deflate header, the browser supports the gzip and deflate compression mechanisms, so the response can be compressed in either of these formats by the web server in order to reduce the payload size. This leads to an increase in performance. Later, the compressed response is decompressed by the browser and rendered normally.
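The paper does not tie this advice to a particular server, so the following Node.js sketch is only an assumed illustration of the mechanism: it checks the Accept-Encoding request header and gzips the response body with the built-in zlib module.

// Minimal HTTP server that gzips responses when the client advertises
// support via the Accept-Encoding request header.
const http = require("http");
const zlib = require("zlib");

http.createServer((req, res) => {
  const body = "<html><body>" + "Hello, compressed world! ".repeat(200) + "</body></html>";
  const acceptsGzip = /\bgzip\b/.test(req.headers["accept-encoding"] || "");

  if (acceptsGzip) {
    // Tell the browser the payload is gzip-compressed so it can inflate it.
    res.writeHead(200, { "Content-Type": "text/html", "Content-Encoding": "gzip" });
    res.end(zlib.gzipSync(body));
  } else {
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(body);
  }
}).listen(8080, () => console.log("Listening on http://localhost:8080"));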
3. Correctly Formatted Images at the Right Place
Normally designers use the JPG or GIF format quite randomly and ignore other good formats for compressing images.
Solution: The correct format should be used for the right purpose. If you have to place a background image, some large image, or a screenshot, the suggested format is JPG/JPEG. If you have to use small graphics like button images, header images, footer images, navigation bar images, or clip art, the suggested format is PNG. If an image does not need high or true color and 256 colors are enough, GIF is preferred.

4. Compress CSS, JavaScript and Images
CSS files (.css), images, and JavaScript (.js) files can be compressed, as .css and .js files normally contain unnecessary spaces, comments, unused code, and the like. A number of high-quality (and free) utilities are available to help you pre-compress your files. I have used these utilities and seen about a 50% reduction in file size after such loss-less compression, so I recommend them.
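The paper does not name a specific utility; as one assumed example, the terser npm package can minify a JavaScript file from a small Node.js script (the file paths are hypothetical):

// minify.js - strip whitespace and comments and shorten names in a JS file.
// Assumes the "terser" npm package is installed (npm install terser).
const fs = require("fs");
const { minify } = require("terser");

async function minifyFile(inputPath, outputPath) {
  const original = fs.readFileSync(inputPath, "utf8");
  const result = await minify(original);   // default compression options
  fs.writeFileSync(outputPath, result.code);
  console.log(inputPath, original.length, "bytes ->", outputPath, result.code.length, "bytes");
}

minifyFile("public/bundle.js", "public/bundle.min.js").catch(console.error);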
5. CSS at Top
The recommended approach is to put CSS links at the top of the web page, as it makes the page render progressively and efficiently. Since users want to see the contents of a page while it is loading rather than white space, content and formatting should be delivered first. The HTML specification clearly says to declare style sheets in the head section of a web page.

6. JavaScript at Bottom
When scripts are defined at the top of the page, they can take unnecessary time to load and hold back the contents that users are expecting after making a request to the web server. It is better to display the HTML contents of a page first and then load any scripting code (when possible, of course). Preferably link JavaScript-based scripts at the bottom of a web page. Alternatively, you can use the defer attribute, which runs the script at the end of page loading, but that is not the preferable approach as it is not browser-independent; for example, Firefox did not support it and it can interfere with document.write, so only use it once you fully understand the implications.
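As an illustrative sketch of loading a script only after the HTML content has been displayed (not taken from the paper; the script path is hypothetical), a script element can be appended once the DOM is ready:

// Load a (hypothetical) analytics script only after the document has been
// parsed, so it never blocks rendering of the page content.
document.addEventListener("DOMContentLoaded", function () {
  var script = document.createElement("script");
  script.src = "/static/js/analytics.js";   // hypothetical path
  script.async = true;                      // do not block anything that follows
  script.onload = function () {
    console.log("Deferred script loaded");
  };
  document.body.appendChild(script);
});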
7. Content Delivery Network (CDN)
When a browser makes a request to any web page - that is, when the user types the URL/URI of a web page or web site - the request goes through many hops (routers and computers) before it finally reaches its destination. This happens for both requests and responses. This affects performance and can severely increase load time. A content delivery network is a collection of computers, distributed all over the world, which deliver data (contents). Through a CDN you can have your website data on multiple servers distributed in different locations around the world. Distributing web application data in different places around the world means a request can be served from the nearest location, which saves time (and therefore improves performance and saves money as well).

8. Ajax
Ajax is being used increasingly to improve usability, but oftentimes in a way which increases overall server load.
Solutions: Preferably use the GET method for Ajax-based requests, because if you use the POST method the request header is sent first, followed by the data, which basically splits the request into two steps. A single-step request can be achieved with GET if the cookies are not too long and the URL is not larger than 2 KB. When using ASP.NET AJAX and the UpdatePanel control for partial page rendering, use multiple update panels to refresh small chunks of the page, but use them wisely; do not set the UpdateMode property to Always unless needed. Instead, set the update mode to Conditional, otherwise all the partial chunks are sent together after each asynchronous postback. Ajax-based requests can also be cached when using the GET method: if the URL is the same, cached data can be used on the client, and a round trip to the server can be avoided.
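As a browser-side sketch of caching Ajax GET responses on the client (an assumed illustration; the endpoint URL is hypothetical), a tiny in-memory cache keyed by URL avoids repeating identical requests:

// Cache identical Ajax GET responses in memory so repeated requests for the
// same URL are answered locally instead of hitting the server again.
var responseCache = {};

function cachedGet(url) {
  if (responseCache[url]) {
    // Served from the client-side cache: no round trip to the server.
    return Promise.resolve(responseCache[url]);
  }
  return fetch(url, { method: "GET" })
    .then(function (res) { return res.json(); })
    .then(function (data) {
      responseCache[url] = data;
      return data;
    });
}

// Hypothetical usage: the second call never reaches the network.
cachedGet("/api/products?category=books").then(function (items) {
  console.log("items:", items.length);
  return cachedGet("/api/products?category=books");
});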
9. Ajax vs. Callback
Ajax is a great solution for asynchronous communication between the client (web browser) and HTTP servers, but one solution can't be applied to every problem. Ajax is a great mechanism for sending requests to the server without making a full page postback, but what if you need to send a request to the server and don't even need partial rendering?
Solution: The best solution is a callback. For example, if you need to check whether a user exists, or if a user has forgotten his/her password and you just need to send a request to the server to check whether the user name exists, there is no need for any client-side rendering - just a server-side operation.

10. Reduce Cookie Size
Cookies are stored on the client side to keep information about users (authentication and personalization). Since HTTP is a stateless protocol, cookies are commonly used in web development to maintain information and state. Cookies are sent with every HTTP request, so try to keep them small to minimize the effect on each request and response. Cookie size should be minimized as much as possible. Cookies shouldn't contain secret information; if that is really needed, the information should be encrypted or encoded. Try to minimize the number of cookies by removing unnecessary ones, and let cookies expire as soon as they become useless for the application.
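As a small browser-side sketch of keeping cookies compact and short-lived (the cookie name and value are hypothetical, not from the paper):

// Store only a short opaque identifier in the cookie (not user data) and
// let it expire after one hour, keeping every HTTP request lightweight.
function setSessionCookie(sessionId) {
  var expires = new Date(Date.now() + 60 * 60 * 1000); // 1 hour from now
  document.cookie =
    "sid=" + encodeURIComponent(sessionId) +
    "; expires=" + expires.toUTCString() +
    "; path=/; secure";
}

function clearSessionCookie() {
  // Expiring a cookie in the past removes it, so it is no longer sent.
  document.cookie = "sid=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/";
}

setSessionCookie("a1b2c3"); // a few bytes instead of a serialized user object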
11. Use Cache Appropriately
Caching is a great way to save server round trips - and also database round trips - as both are expensive operations. By caching data we can avoid fetching it when unnecessary. The following are a few guidelines for implementing caching. Static contents should be cached, like "Contact us" and "About us" pages and other pages which contain static information. If a page is not fully static, it contains some dynamic information; such pages can leverage ASP.NET partial page caching. If data is dynamically accessed and used in web pages - for example, data read from a file or database - and even if that data changes regularly, it can still be cached using the ASP.NET 2.0 cache dependency features: as soon as the data changes at the back end by some other means, the cache is updated. Now that web technologies such as ASP.NET have matured and offer such great caching capabilities, there is really no reason not to make extensive use of them.
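The paper's caching examples are ASP.NET-specific; the following Node.js handler is only an assumed illustration of the same idea. It caches the rendered output of a mostly static page for ten minutes and also sets a Cache-Control header so browsers and proxies can reuse it.

// Cache the rendered "About us" page in memory for 10 minutes and tell
// clients they may reuse it too, avoiding repeated server/database work.
const http = require("http");

const TTL_MS = 10 * 60 * 1000;
let cachedPage = null;
let cachedAt = 0;

function renderAboutPage() {
  // Stand-in for an expensive render (templating, database lookups, ...).
  return "<html><body><h1>About us</h1><p>Rendered at " + new Date().toISOString() + "</p></body></html>";
}

http.createServer((req, res) => {
  if (req.url === "/about") {
    const now = Date.now();
    if (!cachedPage || now - cachedAt > TTL_MS) {
      cachedPage = renderAboutPage();   // refresh the server-side cache
      cachedAt = now;
    }
    res.writeHead(200, {
      "Content-Type": "text/html",
      "Cache-Control": "public, max-age=600"  // browsers/proxies may cache 10 min
    });
    res.end(cachedPage);
    return;
  }
  res.writeHead(404);
  res.end("Not found");
}).listen(8080);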
12. Upload Compiled Code Rather than Source Code
Pre-compiled ASP.NET pages perform much better than source-code versions. Pre-compilation gives a web site a performance boost, especially when the first request is made to a folder containing that resource. Uploading a pre-compiled version boosts performance because the server does not need to compile a page at request time.

VI. CONCLUSION

In this study, we have discussed best practices for speeding up a web site to gain better performance. For HTTP compression, GZip is considered the most effective and the most popular among browsers and HTTP servers; it can reduce file size by up to 70%. Always keep JavaScript and CSS in external files. Avoid redirects unless needed; Server.Transfer is also available, so consider it as well, since it performs better in some conditions. Minimize the use of iframes, as they are costly. Avoid try-catch blocks for control flow, as they perform poorly; exceptions should be used only in truly exceptional situations. Minimize cookie and CSS sizes. Minimize the number of DOM objects on the page, as they are heavyweight. Use link tags rather than @import to include CSS. The favicon, being a static image displayed in the browser's address bar, should be cacheable and compressed. Always prefer a cache-friendly folder structure; for example, create specific folders for static contents, like /static for static images, static pages, and so on. SSL content can never be cached, so minimize its usage; keep it for those pages which need to be secure rather than using it for all pages. HTTP POST requests can't be cached, so choose the HTTP method appropriately. Prevent denial-of-service (DoS) attacks, SQL injection, and cross-site scripting (XSS).
Mr. Balamurugan Souprayen received his Master's Degree in Computer Applications in 2003 and completed his M.Phil. (Computer Science) from Karpagam University, Tamilnadu, India. He is working as Assistant Professor in the Department of MCA, Sri Manakula Vinayagar Engg. College, Tamilnadu. His areas of interest are Web Technology.
