Some of My Favourite wget Tricks


Simple Load Testing with wget

To do a simple load test, wget can be used like this: it downloads full pages (with all their requisites) recursively to the default depth of five, and repeats everything 30 times. This simple load test is a useful starting point for ironing out the most obvious performance issues.

Once this doesn't cause any problems, the time is ripe to move on to more serious stress-testing tools like httperf and siege.

$ for i in $(seq 30); do
    wget -o /dev/null \
    -r \
    -p \
    http://mysite.com
done
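
If the site handles this comfortably, a first step with siege could look something like the sketch below. The concurrency, duration and URL are only placeholders; pick whatever makes sense for your site:

# 25 concurrent users hitting the front page for one minute
$ siege -c 25 \
  -t 1M \
  http://mysite.com/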

Populating Your Caches with wget

This is how I populate my caches: application caches, distributed memory caches and cache server caches can all be primed with wget. The command traverses the site recursively to the default depth of 5 and deletes the files after downloading them.

$ wget -o /dev/null \
  -r \
  --delete-after \
  http://mysite.com

Please note that if you give /dev/null to the -O parameter (big 'O'), --delete-after actually removes the /dev/null file, if your OS and user rights allow it! The command above, which uses the small 'o' (log file), is safe, though. I just wanted to warn you, as I've made this mistake before ;-)
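
To make the difference between the two spellings explicit, here is a small sketch. Only the first form is what the commands in this article use; the second is the dangerous one described above:

# small 'o': send wget's log to /dev/null - the downloaded files are untouched
$ wget -o /dev/null -r --delete-after http://mysite.com

# big 'O': write the downloaded content itself to /dev/null - together with
# --delete-after, wget may then try to remove /dev/null, as warned above
$ wget -O /dev/null --delete-after http://mysite.com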

Timing Your Site Delivery Time

This is how I measure the delivery time of a web site. It's important to time it several times, as you may hit the server(s) at a bad moment when they're doing garbage collection, invalidating their caches and so on. Thus, I always do 10 fetches to determine the delivery speed:

$ for i in $(seq 10); do
  time wget -p \
  --delete-after \
  -o /dev/null \
  http://mysite.com/
done
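
If you want numbers that are easy to average afterwards, a small variation (a sketch, assuming bash) is to set TIMEFORMAT so that the time keyword prints only the elapsed wall-clock seconds:

# Print only the real (wall-clock) time, in seconds, for each fetch
$ TIMEFORMAT='%R'
$ for i in $(seq 10); do
  time wget -p \
  --delete-after \
  -o /dev/null \
  http://mysite.com/
done

The timings end up on standard error, so they can be redirected to a file and averaged with awk if you like.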
