At my current location, my (business) cable provider does not like you fully using your bandwidth, so they throttle your downloads after the first 20 or 30 megabytes; at peak hours they throttle to less than 10% of your real bandwidth, which is quite unacceptable. So when I needed to download a multi-gigabyte file from a CDN, I decided to do some ‘bashing’ …
#!/bin/bash
COOKIES=''  # Here your cookies to authenticate or whatever
HOST='here.goes.the.hostname.com'
URL='http://1.2.3.4/sample/url/for/your_file.dmg'
FILE=$(basename "$URL")
MINSPEED=8000  # min KB every 10s

while true; do
    # Start (or resume, thanks to -C-) the download in the background
    curl -C- -o "$FILE" -H "Host: $HOST" -H "Cookie: $COOKIES" "$URL" &
    CURLPID=$!
    LSIZE=$(du -ks "$FILE" 2>/dev/null | cut -f1)
    LSIZE=${LSIZE:-0}  # the file may not exist yet on the first pass
    # Watch the file; if it grows too slowly, assume we are being throttled
    while true; do
        sleep 10
        SIZE=$(du -ks "$FILE" 2>/dev/null | cut -f1)
        SIZE=${SIZE:-0}
        if [ $(( SIZE - LSIZE )) -lt "$MINSPEED" ]; then break; fi
        LSIZE=$SIZE
    done
    if ! kill $CURLPID; then break; fi  # Not running, already ended!
    wait $CURLPID 2> /dev/null
done
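For reference on the threshold: MINSPEED=8000 means the watchdog expects at least 8000 KB every 10 seconds, i.e. roughly 800 KB/s (about 6.4 Mbit/s), so you would set it a bit below what your unthrottled line actually sustains. Running it is the usual routine (the filename here is just an example; fill in COOKIES, HOST and URL first):

chmod +x dodge.sh
./dodge.sh   # safe to re-run: -C- resumes from where the file left off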
This was the second version; the first just killed the curl process after 10s, but this one is better, as sometimes they forget to throttle you. Also, I specify an IP in the URL as it makes things faster: well-implemented CDNs should cache the file and keep it hot on at least the IP serving you, and if you jump from one IP to another, you are a bad net-citizen 😀
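If you prefer to keep the hostname in the URL instead of hand-editing an IP into it, curl’s --resolve option pins the hostname to a fixed address for the transfer. A minimal sketch, assuming dig is available and reusing the placeholder host and path from the script above:

# Pick one A record for the CDN host and stick with it across retries,
# so every resumed transfer hits the same (hopefully hot) edge server.
HOST='here.goes.the.hostname.com'
IP=$(dig +short "$HOST" | head -n1)
curl -C- --resolve "$HOST:80:$IP" -o your_file.dmg "http://$HOST/sample/url/for/your_file.dmg"

This way curl connects to the pinned IP while the URL still supplies the correct Host header, so the -H "Host: …" trick from the script is no longer needed.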
The ETA for the download is now below 1 hour instead of 4. Really love UNIX!