
Fasterfox: when Firefox Extensions go bad!

Fasterfox is a Firefox Extension which claims to speed up your browsing experience:

[with Fasterfox] Dynamic speed increases can be obtained with the unique prefetching mechanism, which recycles idle bandwidth by silently loading and caching all of the links on the page you are browsing.

Fasterfox logo

Wow, that sounds great – doesn’t it? Until you actually think about what it does. And how it achieves it.

Fasterfox, in essence, is a prefetcher. It reads the page you are currently browsing and attempts to download all of the other pages being linked to. When you do decide to click on a link on the page, the browser can then load that page from disk rather than fetch it from the Internet. It then starts the whole process again – downloading links from the new page into the cache, ready for the next page to be clicked on, etc…
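
To make the mechanism concrete, here is a minimal sketch (in Python, purely for illustration – this is not Fasterfox’s actual code, and the URL is a placeholder) of what a naive link prefetcher does: fetch the current page, extract every link, and quietly GET each one into a cache.

    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collects the href of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def prefetch_all_links(page_url):
        """Naive prefetcher: GET every link on the page into a local cache."""
        html = urlopen(page_url).read().decode("utf-8", errors="replace")
        collector = LinkCollector()
        collector.feed(html)
        cache = {}
        for href in collector.links:
            absolute = urljoin(page_url, href)
            try:
                cache[absolute] = urlopen(absolute).read()  # one GET per link, wanted or not
            except OSError:
                pass  # a failed prefetch is silently ignored
        return cache

    # cache = prefetch_all_links("http://example.com/")  # placeholder URL

Every one of those requests costs the server work and the user bandwidth, whether or not the fetched page is ever read.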

Now, that is actually a really bad idea. It’s not just bad, it’s dumb too.

It’s bad because who knows what kind of page you are looking at? Sure, if you’re looking at BBC News then great – an article linked from the front page will appear in no time. But what if you are currently using an e-commerce website? Or, even worse, a web-based admin tool?

It boils down to the fact that we use hyperlinks on webpages not just as a means for accessing documents, but for issuing commands too. Remove item from basket. Delete message from inbox. Etc.
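
As a sketch of why that matters, here is a toy shop written with Python’s standard wsgiref module (entirely hypothetical – the route names are made up). The “empty basket” action sits behind an ordinary GET link, so any client that blindly follows links, a prefetcher included, will trigger it:

    from wsgiref.simple_server import make_server

    basket = ["book", "kettle"]  # toy in-memory state

    def app(environ, start_response):
        path = environ.get("PATH_INFO", "/")
        if path == "/basket/empty":
            basket.clear()  # a state-changing side effect reached by a plain GET
            body = b"Basket emptied."
        else:
            # The link below is exactly the kind of thing a prefetcher follows blindly.
            body = (b"<p>Basket: " + ", ".join(basket).encode() + b"</p>"
                    b'<a href="/basket/empty">Empty basket</a>')
        start_response("200 OK", [("Content-Type", "text/html")])
        return [body]

    if __name__ == "__main__":
        make_server("localhost", 8000, app).serve_forever()

The conventional defence is to put state-changing actions behind a POST form, because prefetchers only follow links (plain GETs) and the HTTP spec treats GET as a “safe” method – but, as the examples below show, plenty of real sites don’t do that.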

Have a think about some of the shopping or admin sites you use regularly. What would happen if you literally clicked on every link on that page – opening each one into a new window? Ok, many commands are crap-filtered with an “are you sure?” message on the subsequent page. But many are not – here are some nightmare scenarios I can think of:

  • Amazon 1-Click buy
  • Trying to preserve unread items in your web-based mail client
  • Table-empty command in PHPMyAdmin – eeek!

My theoretical concerns are backed up by complaints on the Mozilla website. “Johnny” wrote the following yesterday:

It cost me

I subscribe to a music service. I use Linux and download only songs, never albums. I pay a monthly service fee for a set number of downloads and by using FasterFox’s cache system (or whatever) entire albums appeared to the music service web site as having been downloaded. For the FIRST TIME I exceeded my ADSL provider’s monthly quota. I took it off both my windoze box and the Linux system. No thanks FasterFox, and besides I NEVER noticed a speed increase, what’s with that?

As Johnny alluded to in his comment, not only can this extension do bad things but it also vastly increases your bandwidth usage. That’s bad for everyone.

It’s bad for your ISP (but that’s between you and your ISP).

It’s also bad for website owners – who could see their traffic multiply many times over. That could push them over their bandwidth quotas without any benefit, as few of those downloaded pages will ever be looked at by a human being.

And it’s bad for the Internet as a whole – what would happen if everyone used this plug-in? The additional load on our ISPs, the additional load on webservers and the additional load on the transit links that do all the hard work in between would noticeably slow down the Internet experience.

Check this gem from Bill Web on Lockergnome:


I haven’t played much with the custom settings. I just clicked the radio button next to “Turbo Charged – Maximum performance, exceeds RFC specs (increases load to Web servers),” and left it alone.

Say, what? I just can’t get over the casual disregard with which someone with a fairly technical background (they’re posting to Lockergnome) sets their browser to open more simultaneous connections to a web server than RFC 2616 – the HTTP/1.1 standard – recommends. That is just plain and simple selfishness.

But like I was saying, it’s also a really dumb approach too.

After all, it’s not inconceivable that you might visit a webpage with 50 or 100 links on it. You are probably only going to visit 1 of them next (and that’s assuming you do click on a subsequent page). That seems like a really inefficient mechanism to me. And if you’re using P2P (for legal downloads of course) or VoIP then the extra bandwidth you are consuming is going to be detrimental to the other uses of your connection.
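
To put a rough number on that inefficiency (every figure below is an assumption, purely for illustration):

    # Back-of-envelope waste estimate; every figure here is an assumption.
    links_on_page = 100          # links the prefetcher will follow
    avg_page_size_kb = 30        # assumed average size of each linked page
    pages_actually_visited = 1   # the one link you really click next

    fetched_kb = links_on_page * avg_page_size_kb
    useful_kb = pages_actually_visited * avg_page_size_kb
    wasted_fraction = 1 - useful_kb / fetched_kb

    print(f"Downloaded: {fetched_kb} KB, useful: {useful_kb} KB, "
          f"wasted: {wasted_fraction:.0%}")   # -> wasted: 99%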

At this point, you are probably thinking “Well, Google offers a similar service”. And they do.

But they only prefetch the top result on a search results page. I’m not really convinced there’s much of a benefit there either.

And finally, from a technical perspective it’s unlikely this will even work anyway. As Steve Olcott notes:

All this extension does is make multiple connections to webservers, causing them higher server load (doing the same thing as enabling pipelining through chrome, which is also a bad idea), and unless the webserver uses per-connection bandwidth throttling you don’t get any benefit from this. E.g. if your internet connection can download at 40 Kb/s and the website you are connecting to doesn’t throttle per connection, you will have 1 connection to the server at 40 Kb/s, but with this you would have 2 connections at 20 Kb/s, or 4 at 10 Kb/s, giving you no extra speed but four times the server load, because the server has to keep track of four times the amount of connected clients. However, almost all webservers limit the amount of connections per client, so the webserver’s anti-leech protection could end up banning your IP for making too many simultaneous connections.
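
Steve’s arithmetic is easy to check (using his illustrative numbers): splitting one already-saturated link across more connections changes the per-connection rate, not the total throughput.

    # Illustrative numbers from the quote above: a saturated 40 Kb/s connection
    # split across an increasing number of parallel connections.
    total_bandwidth_kbps = 40

    for connections in (1, 2, 4):
        per_connection = total_bandwidth_kbps / connections
        print(f"{connections} connection(s): {per_connection:.0f} Kb/s each, "
              f"{total_bandwidth_kbps} Kb/s in total")
    # Total throughput never changes; only the server's connection bookkeeping grows.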

There have been roughly 20 comments added to the Mozilla.org webpage for this extension in the last two days, and only 3 of them have been positive about it. Nevertheless, it’s currently being promoted as the 3rd most popular extension download this week, with 97,813 poor souls having downloaded it in the last 7 days.

It will be interesting to see what happens to the Fasterfox Extension – and whether the Mozilla folks end up doing something about its damaging ways.

Published in Thoughts and Rants

23 Comments

  1. No! My poor bandwidth :/

  2. Ohh my. I never realized. I turned off pre-fetching and I’m going to do the same on my co-workers machine. Thanks for the tip!

  3. tim – that argument is the same one used for Google Web Accelerator. The reality is REALITY, not what a document would recommend… and that means that prefetching and proxying applications are prone to erroneous requests as well as potential bandwidth-spiking activity.

    That said… My quick tests indicate that it doesn’t prefetch php pages (PHEW!), which would crush many sites like mine (as dynamic page generation is a big overhead for WP-based sites, as for other systems). And the docs indicate that as well. So asp/etc. pages should be safe too.

    HOWEVER, this isn’t all good news. It would seem that depending on your site’s construction, the prefetching of things like IMAGES could really hurt you. Imagine gallery pages, where all the BIG image versions are immediately downloaded (this is a good reason for linking to a script that embeds the large image, rather than linking directly to image files from thumbnails!).

    This isn’t new… it’s been done before in different ways. Flashgot can on-demand crawl for stuff. But FF has the potential, again if the link URLs ‘let it’, to be pretty ‘abusive’ to small site owners (not to mention the large ones!).

    Enough from me. I’m interested to hear from others.

    -d

  4. Mr Ooizo

    ….and your blog used to be so insightful.

  5. Spaghetti

    Uh, well, according to http://fasterfox.mozdev.org/faq.html#Can_prefetching_mess_things_up

    “Since prefetching is basically the same as clicking on a link, and clicking on a dynamic link can perform some action such as “logging you out” or “emptying your cart”, only static content is prefetched by Fasterfox. ”

    This would tend to invalidate your entire rant, and it’s something you really should have researched first, don’t you think?

  6. Ben

    Spaghetti: How would Fasterfox know whether a link pointed to “static content” or “dynamic content”?

    Sure, a URL like /dir/page.jsp?foo=bar&abc=zyz looks pretty dynamic, and as such I wouldn’t expect it to fetch that. But with the current URL fashion of having cruftless URLs, you can’t expect /page/logout/ or /page/droptable/tablex/ not to also perform dynamic functions.

    Ruby on Rails is totally set up so that these “cruftless” URLs are used throughout, even for dynamic actions.

    Finally, as part of my research before writing the post (yes, I did do research) I read a number of accounts from people who experienced first hand Fasterfox prefetching dynamic pages and thereby causing bad results.
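
    For illustration, a URL-based “is this static?” test might look like the sketch below (a guess in Python – Fasterfox’s real rules aren’t documented here). It copes with obvious query strings but says nothing useful about clean URLs:

        # Purely illustrative guess at a "don't prefetch dynamic pages" heuristic.
        # It shows why any URL-based test breaks down once a site uses clean URLs.
        DYNAMIC_EXTENSIONS = (".php", ".asp", ".aspx", ".jsp", ".cgi")

        def looks_static(url: str) -> bool:
            if "?" in url:                        # query string => treat as dynamic
                return False
            path = url.split("#", 1)[0]
            return not path.endswith(DYNAMIC_EXTENSIONS)

        print(looks_static("/dir/page.jsp?foo=bar"))   # False - obviously dynamic
        print(looks_static("/page/logout/"))           # True - yet logging out is a command!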

  7. Nobody is forced to use it! First one installs a plugin without reading the docs, then (s)he sets it to “turbo – violate RFC” and then (s)he complains…

  8. social drunk

    you can turn prefetching off, ya know. plus it gives you access to some of the about:config things most people wouldn’t want to bother with… pipelining, max memory, etc…

    i noticed a speed difference.

  9. Ben

    Social Drunk – yes, it is true that you can turn pre-fetching off. But the point of the whole post is that most people don’t (probably because they don’t know how to, more than anything else).

  10. Dan

    When I installed Fasterfox, the prefetch defaulted to off and I left it off. As for establishing extra connections to web servers, that can definitely help in SOME instances, like overseas websites where latency keeps me from using most of my bandwidth, slow websites that can only send so much per connection (just due to load, not throttling), etc.

  11. mindtempest

    1: prefetch defaults to off, and with the warning there few people turn it on
    2: it doesn’t prefetch pages that contain active content that causes you to buy stuff
    3: having 4 out of 100 connections to a server beats having 1 out of 100 connections to a server. You gain 3.9 times the speed that otherwise you wouldn’t have due to distance or load.

  12. Mark Ives

    I’ve used Fasterfox for quite a while now, and though it occasionally slows down new pages I visit, it speeds up surfing on pages I regularly visit a LOT!

    I’ve found a link for a tweaked version of Fasterfox, called rather cheesily Fastererfox. It allows you to grab web pages even faster with some new options, which can be changed by the rather cool-sounding “Thrash” option in the properties box.

    http://rdc.untamed.co.uk/fastererfox/frame.htm

  13. To me, the question is not, “is this a good idea?” (I never thought it was, though I have considered downloading it a few times) — the question is, how, as a webmaster, do I block it?

    For example, does it honor robots.txt? Is there a setting I can add to that file that would tell Fasterfox (or Fastererfox, blech!) not to prefetch or spawn multiple connections against my server?

    If robots.txt is not an option, what about some Apache directives inside of .htaccess files or IIS directives inside of global.aspx (or whatever it’s called in IIS)?

    If anyone has answers to these questions, it’d be helpful. I like David’s idea above about using PHP scripts to serve up the images instead of just linking to images. I’m not sure how to do that, but it’s now on my “must learn” list. Thanks David for the tip, and thanks Ben for the article.
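
    One server-side option, sketched with Python’s standard wsgiref module rather than Apache or IIS directives: Firefox’s built-in prefetch service marks its requests with an “X-moz: prefetch” header, so if Fasterfox’s prefetching goes through that service (an assumption – check your own access logs), you can simply refuse those requests. This is an illustration of the idea, not a drop-in recipe.

        from wsgiref.simple_server import make_server

        def deny_prefetch(app):
            """WSGI middleware: refuse requests flagged as prefetches."""
            def wrapper(environ, start_response):
                # The HTTP header "X-moz: prefetch" arrives in WSGI as HTTP_X_MOZ.
                if environ.get("HTTP_X_MOZ", "").lower() == "prefetch":
                    start_response("403 Forbidden", [("Content-Type", "text/plain")])
                    return [b"Prefetching is not allowed here."]
                return app(environ, start_response)
            return wrapper

        def site(environ, start_response):
            start_response("200 OK", [("Content-Type", "text/html")])
            return [b"<p>Normal page for normal visitors.</p>"]

        if __name__ == "__main__":
            make_server("localhost", 8000, deny_prefetch(site)).serve_forever()

    The same check can be expressed as an Apache or IIS rule keyed on that header; whether a browser extension honours robots.txt at all is a separate (and less reliable) question.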

  14. Hey! I googled myself and found out that some people actually read the Extension comments 🙂

    Thanks for the inclusion in your blog.

    -Steve

  15. Fasterfox makes my surfing on an unstable wireless connection actually bearable (AND faster, i.e. more enjoyable). Your perspective has holes in it, and you don’t seem entirely informed on what the extension does, how it works, or how it affects people’s browsing experience (other than the web hosts). And with respect to the last, well, as one myself, get used to it.

  16. Hello! Good Site! Thanks you! uylfgllabgrpwg

  17. This has been gone over often after the whole Google Web Accelerator debacle. Websites need to use POST for non-idempotent operations, but they don’t, so it is irresponsible to put out a tool like this.

    Ignoring ?foo=bar type URLs definitely won’t solve this, and relying on robots.txt is stupid, because you should be guarding against all potential clients like this by putting your “Order Ferrari” buttons behind a POST form. As a “webmaster”, you should definitely know this. No hackery with Apache or IIS is going to solve this problem for you.

Comments are closed.