Optimizing javascript and css requests

I need to optimize the loading speed of several existing websites. One of the issues I have is the number of requests per page. The websites have 7 or more different types of pages, each of which should load a different set of CSS and JavaScript files because they contain different widgets or functionality. Currently each widget or piece of functionality has its own JavaScript file. I am planning to combine and minify the files to reduce the number of requests.

  1. Would it be good practice to combine and minify all the JavaScript necessary for each type of page into one file (and do the same for CSS)? e.g.
    • home page has just one homepage.js,
    • list pages have just listing.js,
    • detail pages have just detail.js,
    • etc.
  2. Is it better to combine only those files which are always used together? e.g.
    • jquery.js + jquery.cookie.js + common.js,
    • list.js + paging.js + favorite.js,
    • detail.js + favorite.js,
    • etc.
  3. What about having one file for all JavaScript that should load in the head and one file for all JavaScript that should load at the end of the body? e.g.
    • init.js goes to <head> and do.js goes to <body>.
  4. What about having one file for common functions and one for administrative functions which is loaded if the user has specific permissions?
  5. Are there any strategies for balancing between 1, 2, 3, and 4?
  6. What is a recommended number of JavaScript and CSS requests for a page?

I am considering large-scale websites, i.e. portals or social networks.

(BTW, there are some libraries whose requests I can't control, e.g. TinyMCE or Google Maps.)



Usually you can use the following pattern:

  1. main.js - bundle here all the scripts that are used by several pages on the website.
  2. page.js - all the JS specific to the page. This means bundling together the JS of all the widgets on that page.

With this practice, you have just 2 requests for your JS on each page, and you get a clear separation/structure in your JS. For all pages except the first one visited, it will be just one request, as main.js will already be cached.

You can use the same principle for the CSS as well. This is effective, but as mentioned in another answer, you can actually take this further and bundle everything into just one JS file. It's a matter of preference or style; I like to break it into 2 because it keeps things logical for me.

Ensure the following though:

  1. Namespace your JS, or you might end up with errors when you bundle the files together.
  2. Do your pages a favor and push your scripts to the bottom of the page.
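As an illustration of point 1, here is a minimal namespacing sketch (all names are made up for the example): with everything hanging off a single global object, bundling several widget files together cannot cause two top-level functions with the same name to overwrite each other.

```javascript
// One global object for the whole site; each widget file attaches to it.
var MyApp = MyApp || {};

// paging widget (would live in paging.js before bundling)
MyApp.paging = {
  pageSize: 20,
  pageCount: function (totalItems) {
    return Math.ceil(totalItems / this.pageSize);
  }
};

// favorite widget (would live in favorite.js) - no collision risk after bundling
MyApp.favorite = {
  toggle: function (id, current) {
    return !current;
  }
};
```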

EDIT: I thought I would update the answer to address some of your points.

Point 2: Is it better to combine only those files which are always used together?

Ans: Personally, I don't think so. If you are serving all the files that are used together, it doesn't matter which group they belong to or how they end up on the page. This is because we combine JS files to reduce the number of HTTP requests.

Once your JS is combined, minified, and in production, you are not expected to debug it or make sense of it. So binding together only logically related JS files is a moot point. It's in your development environment that you want all the logically related code files together.

Point 3: What about having one file for all javascripts that should load in the head and one file for all javascripts that should load at the end of body?

Ans: There are certain cases where you are somehow forced to inject JS in the HEAD. Ideally, you shouldn't do it, as SCRIPT tags are blocking in nature. So unless you really need to, place all your JS (one or multiple files) at the end of the BODY tag.

Point 4: What about having one file for common functions and one for administrative functions which is loaded if the user has specific permissions?

Ans: This seems like a reasonable approach to splitting your JS code. Depending on the user's privileges, you can fork your JS code.

Point 6: What is a recommended amount of javascript and css requests for a page?

Ans: This is a very subjective question. It depends on what you are building. If you are worried about too much JS being loaded at page load, you can always split it up and use on-demand script-injection methods to spread the load.
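For example, a minimal on-demand loader sketch (the function name and paths are illustrative): inject a script tag only when a feature is first needed, instead of paying for it on every page load. This relies on the script element's onload event, which modern browsers support.

```javascript
// Append a <script> tag to <head> and run a callback once it has loaded.
function loadScript(url, callback) {
  var script = document.createElement('script');
  script.src = url;
  script.onload = callback;
  document.getElementsByTagName('head')[0].appendChild(script);
}

// Usage: fetch the editor code only when the user actually opens the editor.
// loadScript('/js/editor.js', function () { initEditor(); });
```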


Like some others have said, put the scripts used on more than one page together into a main.js, and then add page-specific ones where needed: home.js, another_page.js, etc.

The only thing I really wanted to add was that for libraries like jQuery, you should use something like Google's Libraries API.

  • If your user has visited a different site which also uses Google's servers for the libraries, then they'll arrive with a primed cache! Win!
  • I'm going to go out on a limb here and bet that Google's servers are faster than yours.
  • Because the requests go to different servers, clients can simultaneously process two requests to the Google servers (e.g. jQuery & jQuery UI) as well as two requests to your servers (main.js & main.css)
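A common companion to the CDN approach is a local fallback, sketched here as a function (the local path is just an example): if the CDN copy of jQuery failed to load, window.jQuery is undefined, so you document.write a script tag pointing at your own copy.

```javascript
// Inline this right after the <script> tag that points at the CDN.
function ensureJQuery(localPath) {
  if (!window.jQuery) {
    // The escaped "<\/script>" keeps the HTML parser from ending the
    // surrounding script block early.
    document.write('<script src="' + localPath + '"><\/script>');
  }
}

// Usage (hypothetical path):
// ensureJQuery('/js/jquery.min.js');
```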

Oh, and finally -- don't forget to turn on gzipping on your server!


Depending on your development environment, you might consider automating the process. It is a fair bit more work up front, but I found it has been worth it in the long run. How you would go about doing that depends largely on your project and environment. There are several options, but I will explain (high level) what we did.

In our case, we have several ASP.NET based websites. I wrote an ASP.NET control that simply contains a list of static dependencies - CSS and JavaScript. Each page lists what it needs. We have some pages with 7 or 8 JS dependencies and 4 or 5 CSS dependencies, depending on what shared libraries/controls are being used. The first time the page loads, I create a new background worker thread that evaluates all the static resources, combines them into a single file (1 for CSS, 1 for JS), and then minifies them using the Yahoo YUI Compressor (which can handle both JS and CSS). I then output the file into a new "merged" or "optimized" directory.

The next time someone loads that page, the ASP.NET control sees the optimized version of the resource, and loads that instead of the list of 10-12 other resources.

Furthermore, it is designed to only load the optimized resources when the project is running in "RELEASE" mode (as opposed to DEBUG mode inside Visual Studio). This is fantastic because we can keep different classes, pages, controls, etc. separate for organization (and sharing across projects), but we still get the benefit of optimized loading. It is a completely transparent process that requires no additional attention (once it is working). We even went back and added a condition where the non-optimized resources were loaded if "debug=true" was specified in the query string of the URL for cases where you need to verify/replicate bugs in production.


I think it's necessary to perform the following optimisations:

  • Server-side:

    1. Use a fast web server like nginx or lighttpd for serving JS and CSS files, if you aren't already.
    2. Enable caching and gzipping for the files.
  • And client-side:

    1. Combine all common JS files into one minified file and enable aggressive caching. If your pages don't require running them before the onload event, you can add the file to the page dynamically; this will speed up loading.
    2. Place per-page code into separate files, and if it isn't necessary to do something before the onload event, add it dynamically as well.
    3. If you have no more than a few kilobytes of CSS in total, just merge it into one file. Otherwise, split it into common and per-page styles.
    4. If you need the best client-side performance, put only truly common styles (those applied to ALL pages) into the common CSS file, and load only the rules each page needs; this will speed up rendering.
    5. The simplest way to minify JS and CSS together is to use the YUI Compressor. If you want a bigger speedup, you can remove all evals and other "dynamic" code and use the Google Closure Compiler.

If that doesn't help, then things are really bad and you need more servers to serve the JS and CSS files.


As always: it depends. The bigger the page-specific files are, the more sense it makes to keep them separate. If they're not big (think 10 kB minified) it probably makes more sense to join, minimize and compress them, so you can save some requests and rely on caching.


There are a multitude of things to consider while deciding how to combine your JS and CSS files.

  1. Do you use a CDN to serve your resources? If you do, you might get away with more requests per page than otherwise. The biggest killer of performance with multiple downloads is latency, and a CDN will lower your latency.

  2. What are your target browsers? If your audience is mostly using IE8+, FF3 and WebKit browsers, which allow 6+ simultaneous connections to a given sub-domain, you can get away with more CSS files (but not JS). More than that: in modern browsers you may actually want to avoid combining all CSS into one big file. Even though the overall time spent downloading 1 large file is shorter than the time spent downloading 6 smaller files of equal summed size in sequence, the browser can download multiple files simultaneously, so the download of 6 smaller files will finish before the download of one large file (because your users will not max out their bandwidth on the CSS download alone).

  3. How many other assets do you serve from the same domain? Images, Flash, etc. all count against the browser's per-sub-domain connection limit. If possible, you want to move those to a separate sub-domain.

  4. Do you rely heavily on caching? Deciding how to combine files is always a tradeoff between the number of connections and caching. If you combine all files used on one page and 90% of those files are also used on other pages, your users will be forced to re-download the combined file on every page of the site because of the ever-changing last 10%.

  5. Do you use domain splitting? If you do, then again for CSS you can get away with more files if you serve them from multiple sub-domains. More available connections means faster downloads.

  6. How often do you update your site after it goes live? If you do a patch every few days that modifies CSS/JS files, you definitely want to avoid combining everything into one file, because it will destroy caching.

Overall, my generic suggestion, not knowing all the facts of your situation, is to combine the files that are used on all pages (jquery, jquery-ui, etc.) into one file; after that, if there are multiple widgets used on all or almost all pages, combine those. Then combine the files that are either unique to a page or used on only one or two pages into another group. That should give you the biggest bang for your buck, without having to resort to calculating the size of each file, hit counts for each page, and the projected standard path of a user through the site to arrive at the ultimate combining strategy (yeah, I had to do all of the above for very large sites).

Also, one other comment unrelated to your question, but about something you mentioned: avoid adding JavaScript files to the head of the document. JavaScript download, parsing, and execution will block all other activities in the browser (except in IE9, I believe), so you want your CSS to be included as early as possible on the page, and your JavaScript to be included as late as possible, right before the closing body tag if at all possible.

And one more comment: if you are that interested in getting the best performance from your site, I suggest looking at some more obscure optimization techniques, such as preloading assets for the next page after the current page completes, or, possibly even more obscure, using Comet to serve only the required JavaScript files (or chunks of JS code) when they are requested.


Assuming it's on Apache; if not, ignore this answer :)

I have not tried this myself yet but you might want to have a look at mod_pagespeed from google.

A lot of the features you want to implement by hand are already in it; take a look here as well.


I'm working on a really big project in the same area, which has a lot of style definitions and different JS scripts for different tasks. The site had the same problem: too many requests. Basically, we implemented a fix as follows:

  • The render component of the application remembers each necessary JS include file and puts them together, minified, at the end of the rendering process. The resulting output is cached by clients via caching headers and the HTTP ETag!
  • The style sheets of the application rely on each other. There is a huge basic style sheet that handles basic formatting and page objects (sizes, floats, etc.), which is extended by a basic color scheme, which may in turn be extended by a custom overriding style sheet for different customers to enable custom colors, layout, icons, etc. All these stylesheets put together can exceed, for example, 300k in size because there are a lot of icons as background images; each icon has two definitions, one as a GIF for IE6 and one as a PNG for all other browsers.

And here we ran into problems. At first the stylesheets worked with @import rules. We wrote a wrapper script which parsed all the styles and combined them into a single file. The result was that all IE versions broke the layout when a style sheet exceeded about 250-270k in size; the formatting just looked like crap. So we changed it back so that our wrapper only puts together two style sheets and writes all other sheets as @import rules at the top. The wrapper also uses caching headers and the ETag.

This solved the loading issues for us.


(Please, don't rant at me because we have a 300k stylesheet. Trust me, it just has to be for various reasons. :D)


One thing you should do is optimize your .htaccess file to compress and cache files properly:

# compress text, html, javascript, css, xml:  
AddOutputFilterByType DEFLATE text/plain  
AddOutputFilterByType DEFLATE text/html  
AddOutputFilterByType DEFLATE text/xml  
AddOutputFilterByType DEFLATE text/css  
AddOutputFilterByType DEFLATE application/xml  
AddOutputFilterByType DEFLATE application/xhtml+xml  
AddOutputFilterByType DEFLATE application/rss+xml  
AddOutputFilterByType DEFLATE application/javascript  
AddOutputFilterByType DEFLATE application/x-javascript

# cache media files
<FilesMatch "\.(flv|gif|jpg|jpeg|png|ico|swf|js|css|pdf)$">  
Header set Cache-Control "max-age=2592000"  
</FilesMatch>

In general, I concatenate and minify all the scripts on the site and serve only two requests: one for JS and one for CSS. The exception to that rule is when a certain page has a significantly sized script that is only run there; in that case it should be loaded separately.

Load all the JS scripts at the bottom of your page to prevent scripts from blocking page load.


Minification and combining JS are only part of the battle. The placement of the files matters just as much. Obtrusive JavaScript should be the last thing loaded on the page, as it will halt page load until it's done loading.

Consolidate what you can, but namespacing and using closures may help keep the global scope clean of your functions until they're needed.

There are tools that can test page load speed: the Net panel in Firebug and YSlow are both handy for debugging it.

Good luck and happy javascripting!

