
bash and wget to download a lot of files

Problem:

  • Download 500 large files.
  • Files follow the pattern data_1.zip, data_2.zip, … data_500.zip.
  • All files live under “https://examples.com/downloads”.
  • Files range from kilobytes to gigabytes in size.

Solution:

  • A simple Bash script that downloads the files with wget.
  • It uses wget with -c (--continue), which skips files that are already fully downloaded and resumes files whose transfer was interrupted. Resuming only works with HTTP servers that support the “Range” request header (or FTP servers that support the REST command).
#!/bin/bash
set -euxo pipefail

PREFIX="https://examples.com/downloads/data_"

for i in $(seq 1 500); do
  wget -c "${PREFIX}${i}.zip"
done
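A variant of the loop above is to build the URL list up front and hand the whole list to wget in a single invocation with -i, adding --tries for transient failures. This is a sketch under the same assumptions (the hypothetical https://examples.com/downloads server and the data_N.zip naming); the actual download line is left commented out so the script can be dry-run safely.

```shell
#!/bin/bash
set -euo pipefail

PREFIX="https://examples.com/downloads/data_"

# Brace expansion generates data_1.zip ... data_500.zip; printf repeats
# its format string for every argument, writing one URL per line.
printf "%s\n" "$PREFIX"{1..500}.zip > urls.txt

# -c resumes partial files, -i reads the URL list,
# --tries=3 retries transient network errors.
# Uncomment to actually download:
# wget -c --tries=3 -i urls.txt
```

Running everything in one wget process avoids 500 separate connection setups and lets wget reuse the connection where the server allows it.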
