I am considering large-scale websites a.k.a. portals or social networks.
(BTW, there are some libraries whose requests I can't control, e.g. TinyMCE or Google Maps).
Usually you can use the following pattern:
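A minimal sketch of that pattern (the file names here are hypothetical):

```html
<!-- main.js: everything shared across pages; cached after the first page view -->
<script src="/js/main.js"></script>
<!-- one extra file with the scripts specific to this page -->
<script src="/js/profile.js"></script>
```

On the first page view this costs two requests; on every later page only the page-specific file is an uncached request.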
With this practice, you just have 2 requests for your JS on each page and you get a clear separation/structure in your JS. For all pages except the first one, it will be just one request as main.js would be cached.
You can use the same principle for the CSS as well. This is effective, but as mentioned in another answer, you can take this further and bundle everything into just one JS file. It's a matter of preference or style; I like to break it into two as it keeps things logical for me.
Ensure the following though:
EDIT: I thought I would update the answer to answer some of your points.
Point 2: Is it better to combine only those files which are always used together?
Ans: Personally, I don't think so. If you are serving all the files that are used together, it doesn't matter which group they belong to or how they land on the page. This is because we combine JS files to reduce the number of HTTP requests.
Once your JS is combined, minified and in PROD, you are not expected to debug it or make sense of it, so binding together logically related JS files is a moot point. It's in your DEV environment that you want all these logically related code files together.
Point 3: Should scripts go in the HEAD or at the end of the BODY?
Ans: There are certain cases where you are somehow forced to inject JS in the HEAD. Ideally, you shouldn't do it, as SCRIPT tags are blocking in nature. So unless you really need to, place all your JS (one or multiple files) at the end of the BODY tag.
Point 4: What about having one file for common functions and one for administrative functions which is loaded if the user has specific permissions?
Ans: This seems like a reasonable approach to split your JS code. Depending upon the user privileges, you can fork your JS code.
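A sketch of that fork, assuming a hypothetical `isAdmin` flag on the user object (the bundle names are placeholders, not real files):

```javascript
// Hypothetical sketch: decide which script bundles to request based on the
// user's permissions. The file names are placeholders.
function bundlesFor(user) {
  var bundles = ['common.js'];      // everyone gets the common functions
  if (user && user.isAdmin) {
    bundles.push('admin.js');       // admins also get the admin functions
  }
  return bundles;
}
```

The returned list can then be fed to whatever script loader or template helper you use.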
Point 5: How much JS is too much on a single page?
Ans: This is a very subjective question. It depends on what you are building. If you are worried about too much JS being loaded on page load, you can always split it and use on-demand SCRIPT injection methods to split the load.
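On-demand SCRIPT injection usually looks something like the sketch below. The `document` is passed in as a parameter only so the function can be exercised outside a browser; in a real page you would use the global `document` directly.

```javascript
// Sketch of on-demand script injection: create a SCRIPT element, point it
// at the file, and append it so the browser starts downloading it.
function loadScript(doc, src, onLoad) {
  var s = doc.createElement('script');
  s.src = src;
  s.async = true;           // don't block parsing while it downloads
  if (onLoad) {
    s.onload = onLoad;      // run a callback once the script is ready
  }
  doc.head.appendChild(s);  // download starts here
  return s;
}
```

Call it only when the feature is actually needed (e.g. when a widget is first opened), so the initial page load stays light.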
Like some others have said, put scripts that are used on more than one page together into a main.js, and then load page-specific scripts separately on the pages that need them.
The only thing I really wanted to add was that for libraries like jQuery, you should use something like Google's Libraries API.
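For example, jQuery can be pulled straight from Google's CDN (the version number below is just an example; pick whichever release you target):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
```

The benefit is that visitors may already have the file cached from other sites that load it from the same URL, so it often costs you zero requests.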
Oh, and finally -- don't forget to turn on gzip compression on your server!
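On Apache, for instance, gzipping can be enabled with mod_deflate; a sketch, assuming the module is available (adjust the MIME types to what you actually serve):

```apache
AddOutputFilterByType DEFLATE text/html text/css application/javascript text/javascript
```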
Depending on your development environment, you might consider automating the process. It is a fair bit more work up front, but I found it has been worth it in the long run. How you would go about doing that depends largely on your project and environment. There are several options, but I will explain (high level) what we did.
The next time someone loads that page, the ASP.NET control sees the optimized version of the resource, and loads that instead of the list of 10-12 other resources.
Furthermore, it is designed to only load the optimized resources when the project is running in "RELEASE" mode (as opposed to DEBUG mode inside Visual Studio). This is fantastic because we can keep different classes, pages, controls, etc. separate for organization (and sharing across projects), but we still get the benefit of optimized loading. It is a completely transparent process that requires no additional attention (once it is working). We even went back and added a condition where the non-optimized resources were loaded if "debug=true" was specified in the query string of the URL for cases where you need to verify/replicate bugs in production.
I think it's necessary to perform the following optimisations:
If that does not help, then everything's bad and you need more servers to serve the JS and CSS files.
As always: it depends. The bigger the page-specific files are, the more sense it makes to keep them separate. If they're not big (think 10 kB minified) it probably makes more sense to join, minimize and compress them, so you can save some requests and rely on caching.
There are a multitude of things to consider while deciding how to combine your JS and CSS files.
Do you use a CDN to serve your resources? If you do, you might get away with more requests per page than otherwise. The biggest killer of performance with multiple downloads is latency, and a CDN will lower your latency.
What are your target browsers? If your audience is mostly using IE8+, FF3 and WebKit browsers, which allow 6+ simultaneous connections to a given sub-domain, you can get away with more CSS files (but not JS). In fact, for modern browsers you would actually want to avoid combining all CSS into one big file: even though the overall time spent downloading one large file is shorter than downloading 6 smaller files of equal combined size in sequence, the browser can fetch the smaller files simultaneously, so the 6 downloads will finish before the single large one (because your users will not max out their bandwidth on CSS downloads alone).
How many other assets do you serve from the same domain? Images, Flash, etc. will all count against the browser's per-sub-domain connection limit. If possible, you want to move those to a separate sub-domain.
Do you rely heavily on caching? Deciding how to combine files is always a tradeoff between the number of connections and caching. If you combine all files on one page and 90% of those files are used on other pages, your users will be forced to re-download the combined file on every page of the site because of the ever-changing last 10%.
Do you use domain splitting? If you do, then again for CSS you can get away with more files if you serve them from multiple sub-domains. More available connections means faster downloads.
How often do you update your site after it goes live? If you push a patch every few days that modifies CSS/JS files, you definitely want to avoid combining everything into one file, because it will destroy caching.
Overall, my generic suggestion, not knowing all the facts about your situation, is to combine the files that are used on all pages (jQuery, jQuery UI, etc.) into one file; after that, if there are multiple widgets used on all or almost all pages, combine those. Then combine the files that are either unique to a page, or used on only one or two pages, into another group. That should give you the biggest bang for your buck without having to resort to calculating the size of each file, hit counts on each page, and the projected standard path of a user through the site to achieve the ultimate combining strategy (yeah, I had to do all of the above for very large sites).
Assuming it's on Apache; if not, ignore this answer :)
I have not tried this myself yet but you might want to have a look at mod_pagespeed from google.
A lot of the features you want to implement by hand are already in it; take a look here as well.
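If you want to experiment, enabling it in the Apache config looks roughly like this (directive and filter names taken from the mod_pagespeed docs; verify them against your installed version):

```apache
ModPagespeed on
ModPagespeedEnableFilters combine_css,combine_javascript
```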
I'm working on a really big project in the same area, with a lot of style definitions and different JS scripts for different tasks. The site had the same problem: too many requests. Basically, we implemented a fix as follows:
And here we ran into problems. At first the stylesheets worked with @import rules. We wrote a wrapper script which parsed all the styles and combined them into a single file. The result was that all IE versions broke the layout when a style sheet exceeded about 250-270 kB in size; the formatting just looked like crap. So we changed it back so that our wrapper only puts together two style sheets and writes all other sheets as @import rules at the top. The wrapper also sends caching headers and an ETag.
This solved the loading issues for us.
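The combining logic the wrapper uses can be sketched roughly like this (names are illustrative; our real wrapper runs server-side):

```javascript
// Inline the first two sheets and emit @import rules for the rest, so no
// generated file grows past the size at which old IE broke the layout.
function buildWrapper(sheetNames, sheetContents) {
  var imports = sheetNames.slice(2).map(function (name) {
    return '@import url("' + name + '");';
  });
  var inlined = sheetNames.slice(0, 2).map(function (name) {
    return sheetContents[name] || '';
  });
  // @import rules must come before any other rules, so they go first.
  return imports.concat(inlined).join('\n');
}
```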
(Please don't rant at me because we have a 300 kB stylesheet. Trust me, it just has to be that way for various reasons. :D)
One thing you should do is optimize your .htaccess file to compress and cache files properly:
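A starting point for such an .htaccess, assuming mod_deflate and mod_expires are enabled (tune the expiry times to how often you deploy):

```apache
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```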
In general, I concatenate and minify all the scripts on the site and serve only two requests - one for JS and one for CSS. The exception to that rule is if a certain page has a significantly sized script that is only run there - in that case it should be loaded separately.
Load all the JS scripts at the bottom of your page to prevent scripts from blocking page load.
Consolidate what you can, but namespacing and using closures can help keep the global scope clean of your functions until they're needed.
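A small sketch of that approach, using the classic module pattern (`MyApp` and `cart` are made-up names):

```javascript
// Only MyApp is added to the global scope; `items` stays private inside
// the closure and can only be reached through the returned methods.
var MyApp = MyApp || {};
MyApp.cart = (function () {
  var items = []; // private state
  return {
    add: function (item) { items.push(item); },
    count: function () { return items.length; }
  };
}());
```

Everything the page needs hangs off one namespace, instead of dozens of loose global functions.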
There are tools that can test page load speed.