How to Speed Up sites like vancouver2010.com by more than 50% in 5 minutes
A standard approach for getting to a high-level analysis result in 5 minutes
By: Andreas Grabner
Feb. 25, 2010 10:13 AM
As the Winter Olympics are a hot topic right now, I checked out vancouver2010.com to see if there is any potential to improve the site's performance. It seems I found a perfect candidate for this 5-minute guide.
Minute 1: Record your dynaTrace AJAX Session
Before I start recording a session I always turn on argument capturing via the preferences dialog:
Now it's time to start tracing. I executed the following scenario:
Minute 2: Identify poorly performing pages
Here is what we can see:
Minute 3: Analyze Timeline of slowest Page
Here is what I can read from this timeline graph (moving the mouse over these blocks gives me a tooltip with timing and context information):
Minute 4: Identify poorly performing CSS Selectors
I highlighted the calls that have a major impact on the performance of this event handler. You can see that most of the time is actually spent in the $ method that is used to look up elements. Another thing I can see is that they change the class name of the body to “en”, which takes 550ms to execute.
The problem here is easy to explain. The site makes heavy use of CSS selectors to look up elements by class name. This type of lookup is not natively supported by Internet Explorer, so jQuery has to iterate through the whole DOM to find those elements. A better solution would be to use unique IDs, or at least to add the tag name to the selector string. This also helps jQuery, as it first finds all elements by tag name (which is natively implemented and therefore rather fast) and then only has to iterate through those elements. Instead of an average lookup time between 50ms and 368ms, this can be brought down to 5-10ms. Nice performance boost, eh?
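The effect of tag-qualifying a selector can be sketched with a small simulation (the toy DOM data, element counts, and the `highlight` class name are all hypothetical; in real jQuery code this is the difference between `$('.highlight')` and `$('div.highlight')`):

```javascript
// Toy "DOM": a flat list of element records, standing in for a page
// with 10,000 elements, of which 100 are divs and 20 carry the class.
const dom = [];
for (let i = 0; i < 10000; i++) {
  dom.push({
    tag: i % 100 === 0 ? 'div' : 'span',
    cls: i % 500 === 0 ? 'highlight' : ''
  });
}

// Class-only selector ('.highlight'): with no native class lookup in
// older IE, every element in the document must be inspected.
function byClassOnly(cls) {
  return dom.filter(el => el.cls === cls); // scans all 10,000 elements
}

// Tag-qualified selector ('div.highlight'): the tag lookup (native
// getElementsByTagName in a real browser) narrows the candidate set
// first, and only those few elements are checked for the class.
function byTagThenClass(tag, cls) {
  const candidates = dom.filter(el => el.tag === tag); // 100 divs
  return candidates.filter(el => el.cls === cls);      // scans only those
}

console.log(byClassOnly('highlight').length);           // 20 matches
console.log(byTagThenClass('div', 'highlight').length); // same 20 matches
```

With a unique ID, `$('#highlight')` maps to the browser's native `getElementById` and is faster still.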
Minute 5: Identify network bottlenecks
In the timeline I saw many image requests coming from the same domain. As most browsers limit the number of physical network connections per domain (e.g., IE7 allows 2), the browser can only download a few images in parallel; all other images have to wait for a connection to become available. Drilling into the Network View for page 4, I can see all 70+ images and how they have to wait to be downloaded. Once these images are cached this problem is no longer such a big deal, but for first-time visitors it definitely slows down the page:
The solution to this problem is domain sharding. Using two domains to host the images allows the browser to use twice as many physical connections and download more images in parallel. This can speed up the download of those images by roughly 50%.
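A minimal sketch of how image URLs could be spread across two shards (the `img1`/`img2` hostnames are placeholders, not domains the site actually uses). Hashing the path keeps each image on a fixed shard, so URLs stay stable and the browser cache remains effective on repeat visits:

```javascript
// Hypothetical shard hostnames; two shards doubles the number of
// parallel connections the browser will open for these images.
const shards = ['img1.example.com', 'img2.example.com'];

// Deterministic hash of the image path: the same image always maps
// to the same shard, so its URL (and cache entry) never changes.
function shardUrl(path) {
  let hash = 0;
  for (const ch of path) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // keep within 32 bits
  }
  return 'http://' + shards[hash % shards.length] + path;
}

console.log(shardUrl('/images/logo.png'));
console.log(shardUrl('/images/venue-map.png'));
```

The same idea can be applied server-side when the image tags are generated, so no client script is needed at all.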
Feedback on this is always welcome. I am sure you have your own little tricks and processes for identifying performance problems on your web sites. Feel free to share them with us at blog.dynatrace.com.