Data that mimics real-world data is usually best, but sometimes we need an assortment of files of varying content and size for validation testing, and we need them without delay. Imagine that you have a web server running an application that accepts files for storage, with a size limit enforced on uploads. Wouldn't it be great to whip up a batch of test files in an instant?
To do this, we can use a few file system features, such as /dev/random, and a useful program called dd. The dd command is a utility that converts and copies files (including devices, thanks to Linux's "everything is a file" concept, more or less). We will use it in a later recipe to back up data on an SD card (remember your favorite Raspberry Pi project?) or to "chomp" through files byte by byte without losses. A typical minimal dd invocation looks like this: $ dd if="inputFile" of="outputFile" bs=1M count=10
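As a quick sketch of how this pattern can be put to work, the following script generates several test files of different sizes in one go. The directory name and sizes are arbitrary choices for illustration, and /dev/urandom is used instead of /dev/random because it does not block when the entropy pool runs low:

```shell
#!/bin/sh
# Sketch: generate test files of 1, 5, and 10 MiB filled with random bytes.
# /dev/urandom is non-blocking, unlike /dev/random, so it suits bulk output.
mkdir -p testfiles
for size in 1 5 10; do
    # bs=1M sets the block size to 1 MiB; count repeats that many blocks,
    # so each file ends up exactly ${size} MiB long.
    dd if=/dev/urandom of="testfiles/test_${size}M.bin" bs=1M count="$size" 2>/dev/null
done
ls -l testfiles
```

Running the script leaves three files in testfiles/ whose sizes you can verify with ls -l or wc -c, which is handy when testing an upload size limit just above or below a threshold.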
From this command, we can see the following parameters:
if=: Stands for the input file, that is, the source dd reads from