
8 Tips & Resources to Increase Site Speed


Web site speed has become a more important factor following Google’s recent announcement that page load speed will be added to its 200+ factor ranking algorithm. In a video interview with WebProNews, Google software engineer Matt Cutts said, “It should be a good experience, and so it’s sort of fair to say that if you’re a fast site, maybe you should get a little bit of a bonus. If you really have an awfully slow site, then maybe users don’t want that as much.”

Alongside the announcement, Google and many others have built tools to test the many characteristics involved in Web page load time. What you might not know is that most of these tools test the site itself and provide information you can act on, independent of your service providers.

This post covers three things: first, how Web site speed actually works and the factors involved in moving your data; second, the most common reasons you may experience poor site performance; and finally, a handful of great resources for testing your site.


Web Site Speed
Perceiving the speed of your site also involves many factors.

  • Depending on where you are in the world, your route to the server will take a different path made up of segments called hops. A major factor in Internet speed is the speed of the network hops between you (point A) and your destination (point B). Just because you connect quickly to one Web site does not mean you will connect with the same speed to another.

    This is illustrated by a traceroute from a server to Google.com:

    traceroute to www.google.com (74.125.19.99), 30 hops max, 60 byte packets
     1  69-36-160-13.WEST-DATACENTER.NET (69.36.160.13)  0.466 ms  0.43 ms  0.625 ms
     2  206.130.126.18.west-datacenter.net (206.130.126.18)  0.858 ms  0.82 ms  0.818 ms
     3  206.130.126.42.west-datacenter.net (206.130.126.42)  0.799 ms  0.78 ms  0.748 ms
     4  ge11-1-0d0.mcr1.saltlake2-ut.us.xo.net (65.46.56.217)  1.20 ms  1.193 ms  1.175 ms
     5  216.156.0.5.ptr.us.xo.net (216.156.0.5)  17.629 ms  17.855 ms  17.838 ms
     6  207.88.13.101.ptr.us.xo.net (207.88.13.101)  17.816 ms  17.882 ms  17.850 ms
     7  216.156.84.30.ptr.us.xo.net (216.156.84.30)  17.870 ms  17.845 ms  17.822 ms
     8  72.14.239.250 (72.14.239.250)  18.786 ms  18.759 ms  18.737 ms
     9  209.85.251.94 (209.85.251.94)  23.953 ms  24.177 ms  24.160 ms
    10  nuq04s01-in-f99.1e100.net (74.125.19.99)  18.646 ms  18.899 ms  18.854 ms

    As you can see, there were 10 servers hit, or hops, between me and the eventual IP of the server hosting Google.com. Each one had a different connection speed, measured in milliseconds (ms). In this case, nothing was abnormal; but sometimes a server between you and your destination can be extremely slow or overloaded, causing timeouts and other nasty errors.

  • Your personal Internet service provider connection could be slow. You can check this by viewing your Web site from a computer connected to a different network.

Common Issues
If your site is slow due to issues between the chair and the keyboard, the cause will typically be one of the following:

Factors involved in Web site optimization

  1. You may be using a greedy add-on or plug-in on one of the many content management systems (CMS) such as WordPress, Drupal, Joomla, etc. CMS applications have extensive, active communities that develop oodles of neat little add-ons and plug-ins for your CMS site; however, many of these are extremely greedy and eat up server resources, resulting in slow Web site speeds.
  2. Some developers create scripts (mail scripts, image management scripts, etc.) that, if developed improperly or used too frequently, can cause problems in the backend and affect the speed of your site load time.
  3. You are getting too much traffic. This simply means you have outgrown your current hosting package and it is time to upgrade. Congrats!
  4. Your pictures are too large! Many sites contain images that are much larger than necessary, causing slow load times. You can decrease the file size by compressing the image to reduce the resolution, and by shrinking the pixel dimensions of the image itself. Most programs have an option to save a file for the Web, which automatically compresses the image. It is also important to remember that most digital images are much larger than the area you are trying to fill. For example, if you take a picture of your family with a 3.1 megapixel camera, the image will contain more than 3 million pixels: a picture that is 2048×1536 pixels. Most of the time an image this size is much larger than necessary and will bog down your site.
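The arithmetic in tip 4 is easy to check. As a rough sketch in Python (the 640×480 target display size is just an assumed example, not a recommendation from any tool):

```python
# Uncompressed size of an RGB image: width x height x 3 bytes per pixel.
def raw_rgb_bytes(width, height):
    return width * height * 3

camera = raw_rgb_bytes(2048, 1536)   # a 3.1 megapixel photo
web = raw_rgb_bytes(640, 480)        # an assumed web display size

print(f"camera photo: {camera / 1024 / 1024:.1f} MB uncompressed")
print(f"web-sized:    {web / 1024 / 1024:.1f} MB uncompressed")
print(f"resizing alone sends {camera // web}x fewer bytes")
```

JPEG compression shrinks both figures dramatically, but the ratio between them stays; resizing before compressing is where most of the savings come from.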
Resources

  5. Google Code
    Google’s resource to make the Web faster. Through this link you’ll find hundreds of great articles, active forums, and several great tools providing real-time tips to discover site speed variables, script compilers, and more.
  6. Yahoo! Developer Network
    Yahoo!’s resource for developers, offering tools, Yahoo! APIs, and other resources to help developers build a better site. Perhaps the favorite tool here is YSlow, a Firefox add-on that grades each site against Yahoo!’s algorithm of best practices.
  7. WebPageTest
    Originally created by AOL for use on its own sites, WebPageTest provides a more technical, location-specific “waterfall of your page load performance as well as a comparison against an optimization checklist.” WebPageTest works with any URL and can be run from a few different US locations with two different browsers.
  8. Web Page Analyzer
    A private company offering a free tool providing information to increase site speed. You simply enter a URL and the system calculates page size, composition, and download time. The script then calculates the size of individual elements, sums up each type of Web page component, and, based on these page characteristics, offers advice on how to improve page load time. Its recommendations incorporate the latest best practices from Website Optimization Secrets, Web page size guidelines and trends, and Web site optimization techniques.

Another great post created by WebDesignBooth offers additional resources.

What resources have helped you develop a fast site?

About Jake Neeley

Jake Neeley is a content marketing and social media geek who loves learning, outdoor sports (especially those in Utah mountains), and time with the fam. Connect with Jake on Google+, Twitter, and LinkedIn.

This entry was posted in Marketing & Advertising, Web site Development. Bookmark the permalink. Both comments and trackbacks are currently closed.

15 Comments

  1. Posted December 26, 2009 at 7:18 pm | Permalink

    It’s great that Google is making this new move towards speeding up the web, but I just wish they would make some of the info more transparent. I guess they want us to search the web to find all the info on how to make our websites faster :D

  2. Posted December 28, 2009 at 11:54 am | Permalink

    Yes, it can be hard to find. Google does have a pretty good community at code.google.com/speed/community.html with two groups devoted to Web speed. One group, ‘Make the Web Faster’ seems to have more information about specific tools and good site development, etc. while the other, ‘page-speed-discuss,’ generally discusses page load time.

    Let me know if you find anything else that is helpful… I might have to write an addendum :)

  3. Posted January 3, 2010 at 1:56 am | Permalink

    I had a slow loading site when I first started, and realized my images had not been compressed. But once this was done, there was an amazing improvement. I noticed too, that you don’t really lose much in terms of image quality by compressing the images.

    • Posted January 6, 2010 at 4:08 pm | Permalink

      Thanks for the note, this is a great point to pass along!

  4. Posted January 6, 2010 at 4:23 pm | Permalink
  5. Posted January 12, 2010 at 1:23 pm | Permalink

    Just to elaborate on the image size optimization technique: use a photo editing application (such as Photoshop Elements, Paint Shop Pro, or even something like Google’s Picasa) to set the image size to exactly the same as it will be on the web page. Then, when you save the image (from the photo editing software), select the file type (JPEG (.jpg), GIF, or PNG) and set the image compression level to just above the level where the “compression artifacts” become noticeable. Most photo editing software includes a preview so you can see how different compression levels will affect the image quality.

    Also, it’s important to know the difference between non-destructive compression and the kind of compression that JPEG uses, where there is a trade-off between image quality and file size. GIF and PNG use a non-destructive compression technique: they can compress an image without changing its appearance in any way. But this kind of compression only works well with images that have large areas of solid color, typically images containing text and logos with lots of solid colors.

    Now, when I say “solid color” I mean where there are NO variations in color, where every pixel is set to EXACTLY the same color. This differs from a photo that has, say, a large area of blue sky. If you look very closely at the sky area of a photo, you will see that the colors vary ever so slightly (and in this definition of “color” we’re including levels of brightness and darkness as well as variations in what most people call color). You can also load it into a photo editor and use the eye-dropper tool to examine the individual pixels. In most cases you will see that each pixel has a different set of color values. ANY variation of color defeats the kind of compression used in GIF and PNG, UNLESS there are runs of the same color in the horizontal direction (hence the name: “run-length compression”). But I digress…

    JPEG (and PNG, because PNG can do BOTH forms of compression), on the other hand, does a kind of compression that works well for photos but not for images with a lot of solid colors and sharp changes in color. Try it! Use a photo editor or digital paint program to create an image with some large, bold, single-color text against a strongly contrasting background, say dark red text on a white background. Then save it as a JPEG with the compression set to some high value. Load it back into the photo editor and examine it closely (e.g. using the zoom tool): you will see all kinds of “crud” around the areas where the colors transition from red to white or white to red. Save the original, unsullied image as a GIF (not the one with the crud in it), and notice the file sizes. Depending on the level of compression on the JPEG, the file sizes will either be similar, or the GIF will be much smaller. Bottom line: even if the JPEG has a small file size, the GIF will be small AND PRETTY.

    Then try the same with a typical photograph, and notice that it’s the JPEG that is small and pretty.
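    The gap between the two compression styles is easy to demonstrate with any lossless compressor. As a sketch (the byte values are arbitrary, and zlib stands in here for GIF/PNG-style lossless compression):

```python
import random
import zlib

# A "solid color" region: 100,000 identical bytes (think one color channel
# of a flat-colored logo). Lossless compression thrives on this.
solid = bytes([200]) * 100_000

# A "photo-like" region: the same average value, but with the slight
# pixel-to-pixel variation typical of photographs.
random.seed(0)
noisy = bytes(random.randint(195, 205) for _ in range(100_000))

print(len(zlib.compress(solid)))  # collapses to a few hundred bytes or less
print(len(zlib.compress(noisy)))  # stays at tens of kilobytes
```

    Even though the two inputs look almost identical to the eye, the tiny variations in the second one defeat lossless compression almost entirely, which is exactly why photos belong in JPEG and flat-colored graphics in GIF or PNG.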

    In your HTML (the code that generates your web page), be sure to set the actual size of the image. This way, the browser doesn’t have to take time figuring out what size the image is before rendering it. Also, if the size in the web code matches the actual size of the image, the browser doesn’t have to take time resizing the image. Here’s an example of the HTML involved:

    <img src="myPhoto.jpg" width="100" height="90">

  6. Posted January 16, 2010 at 11:28 am | Permalink

    I suspect that Google has already figured out how to account for hops in their speed equations; otherwise their algorithm would simply measure the internet distance from the Googlebot.

    The comment about outgrowing a web site is a bit perplexing. Our accounts are given a large amount of bandwidth (500GB to 2000GB). It seems to me that a web host should be able to deliver a decent portion of the bandwidth a customer purchased.

    I wish there was more guidance on when an account has outgrown a service.

    Since most accounts are on shared servers, a better description is that a slow server is a sign that the web host oversold the server, or that some sites have outgrown their resources on it. It may be your site or another site on the same server.

    The suggestions on image compression are helpful. All formatting elements of a page should be compressed as a matter of good design; however, since images are often considered content, I doubt Googlebot will care. That is, a properly compressed image will improve the experience of human users, while Googlebot is likely to simply measure the speed at which the content comes over the wire, assuming that it is valuable content.

    It seems to me the thing that would help design-wise would be to optimize all server-side scripts. If there is an element on a page that is slow, deliver that piece of the page with JavaScript; Googlebot would measure the time to download the script and not the resource.

    Use robots.txt to block googlebot from any slow resources on your site.

    I suspect that most of the questions about speed will be technical issues at the hosting level. Hopefully this change will have web hosts competing on providing fast service.

    Sadly, what is likely to happen is that Google will decide pages generated with FrontPage are preferable to pages with custom-coded scripts, because they download faster.

  7. Posted January 21, 2010 at 6:26 pm | Permalink

    Hi Kevin

    There is more than just bandwidth when considering whether you’ve “outgrown” your hosting plan. It is very rare for our clients to near or exceed the bandwidth limitations of the hosting package. Bandwidth is almost never the limiting factor.

    For example, on WestHost 3.0 accounts, you are limited to 25 concurrent Apache processes (25 concurrent web server connections). See http://www.westhost.com/blog/2009/06/09/improving-apache-web-server-performance/ for more information on this.

    If your website is processing-intensive (e.g., dynamic PHP-based sites like a blog or a bulletin board), then as the traffic grows, your memory and CPU needs also grow. You can make use of caching to lessen the memory and CPU requirements of a popular site, but this can only take you so far. While it is definitely possible to run a very popular site from a WestHost 3.0 shared account for less than $20 per month, you probably won’t be able to do it with off-the-shelf software because you’ll need every database and coding optimization you can get.
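    To sketch what caching buys you, here is a minimal file-based page cache in Python (the cache directory, TTL, and render callback are illustrative placeholders, not any specific product’s API; a real PHP blog would use a caching plugin instead):

```python
import os
import time

CACHE_DIR = "page_cache"   # hypothetical on-disk cache location
TTL_SECONDS = 300          # serve cached copies for up to 5 minutes

def cached_page(key, render):
    """Return a cached rendering of a page if fresh, else rebuild and store it."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, key)
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < TTL_SECONDS:
        with open(path) as f:
            return f.read()   # cache hit: no database or CPU work at all
    html = render()           # cache miss: do the expensive work once
    with open(path, "w") as f:
        f.write(html)
    return html
```

    Every request inside the TTL window is served straight from disk, so the expensive render step (database queries, templating) runs at most once per interval no matter how much traffic arrives.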

  8. Posted January 28, 2010 at 12:56 am | Permalink

    It depends on your cable company’s speed. Where I used to live I used to get mad while waiting for a video on YouTube to load; now at my new home it runs so fast I’m shocked. If you go to YouTube it might tell you how to make your Firefox faster, etc.

  9. Posted January 31, 2010 at 6:23 pm | Permalink

    This is helpful – thanks for the tips.

  10. Posted February 26, 2010 at 1:57 am | Permalink

    Remember disk cleanup and disk defragmenting, and don’t keep too many unwanted programs on your computer; hopefully it can run a little faster. Good luck, buddy.

  11. Posted March 10, 2010 at 9:39 am | Permalink

    Image size? fast loading?
    Well, I would be happy if I could just simply display my web site. This is the third time in as many weeks that this impeccable service has left me flat, right in the middle of an advertising kick-off.
    And then they add insult to injury by offering “2 weeks free” for every sucker you can get to use this service??
    They need to fix their tech problems AND THEN HIRE A PR FIRM.

  12. Posted April 2, 2010 at 11:12 am | Permalink

    Perhaps this is somehow a push to get more people onto their Google Fiber internet service. Best response times to Google searches would be via their own network after all.

    Till then,

    Jean

  13. Posted July 27, 2010 at 2:58 am | Permalink

    Just found this post in a loooooong search to optimize our sites. Thanks for the tips. I want to add another simple trick:

    Add client-side caching of static content like CSS and your site’s logo. Apache has some very simple configuration settings for this:

    ExpiresActive On

    ExpiresByType text/css "access plus 1 day"
    ExpiresByType text/javascript "access plus 1 day"
    ExpiresByType application/x-shockwave-flash "access plus 1 day"

    The browser will not download a CSS, JavaScript, or Flash file again, or even check if it has changed. (This will not affect the first page the user visits, of course.)

    Catherine

  14. Posted July 27, 2010 at 3:01 am | Permalink

    The Directory tags were removed in my previous post. See the mod_expires documentation on apache.org for more information (http://httpd.apache.org/docs/2.0/mod/mod_expires.html)

    You need to load the mod_expires:

    LoadModule expires_module libexec/mod_expires.so
    AddModule mod_expires.c

    Catherine
