For stress testing we need to generate a large number of files to simulate the origin site, per the requirements below. (Previously this was done with just five fixed-size files behind an nginx rewrite map.)
1. Large files: 488,888 in total

Large file distribution (overall average size: 9 MB)

| Class | File size | Share of count (%) |
|---------|-----------|--------------------|
| Class 1 | 0-1 MB | 43 |
| Class 2 | 1-5 MB | 34 |
| Class 3 | 5-10 MB | 2 |
| Class 4 | 10-50 MB | 19 |
| Class 5 | >50 MB | 2 |
2. Small files: 18,500,000 in total

Small file distribution (overall average size: 66 KB)

| Class | File size | Share of count (%) |
|---------|------------|--------------------|
| Class 1 | 0-10 KB | 64 |
| Class 2 | 10-50 KB | 24 |
| Class 3 | 50-100 KB | 6 |
| Class 4 | 100-500 KB | 5 |
| Class 5 | 500-1000 KB | 1 |
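One detail worth checking before generating anything: the script derives each per-class count with integer arithmetic (`let num0_1M=$bigfilenum*43/100`), which truncates every class, so the classes sum to slightly fewer files than the stated total. A small sketch using the large-file percentages from the table above:

```bash
#!/bin/bash
# Per-class counts for the large-file total, using the same integer
# arithmetic as the generation script, to see how far the truncated
# classes fall short of the requested total.
bigfilenum=488888
total=0
for pct in 43 34 2 19 2; do
    let n=$bigfilenum*$pct/100
    total=$((total + n))
done
echo "requested: $bigfilenum  generated: $total  shortfall: $((bigfilenum - total))"
```

If the exact total matters for the test plan, the simplest fix is to compute the last class as the remainder (`total requested minus the classes already generated`) instead of as a percentage.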
To produce randomly sized files matching this distribution: the publishing host has 12 disks, and to keep any single disk from becoming an I/O bottleneck during the stress test, the files must be spread evenly across all 12 disks. I wrote the following generation script:
```bash
#!/bin/bash
# Generate test files matching the required size distribution,
# spread round-robin across 12 disks, with symlinks under the web root.

bigfilenum=488888
smallfilenum=18500000
disknum=12
basedomain="http://www.tyumen.cn"
basefiledir="/data/www/wwwroot"
urllistfile="$basefiledir/urllist.txt"

# Draw a text progress bar: current count, total count, task name.
function ProcessBar() {
    now=$1
    all=$2
    proname=$3
    percent=$(awk "BEGIN{printf \"%f\", ($now/$all)}")
    len=$(awk "BEGIN{printf \"%d\", (100*$percent)}")
    bar='>'
    for ((i = 0; i < len - 1; i++)); do
        bar="=$bar"
    done
    printf "$proname:[%-100s][%03d/%03d]\r" "$bar" "$len" 100
}

# Return a pseudo-random integer in [min, max].
function Rand() {
    min=$1
    max=$(($2 - $min + 1))
    num=$(head -n 10 /dev/urandom | cksum | awk '{print $1}')
    echo $((num % max + min))
}

# Create $3 files of random size in [$1, $2] bytes, round-robin over
# the 12 disks, with symlinks under the web root and URLs appended
# to the URL list.
function fileclass() {
    filesizemin=$1
    filesizemax=$2
    filenum=$3
    allnum=$3
    basefiledir=$4
    filetype=$5
    filedir=$6
    mkdir -p "$basefiledir/$filetype/$filedir"
    for dirnum in $(seq 1 $disknum); do
        mkdir -p "/data/cache$dirnum/$filetype/$filedir"
    done
    while [ $filenum -ge 1 ]; do
        fsize=$(Rand $filesizemin $filesizemax)
        [ $fsize -lt 1 ] && fsize=1           # dd rejects bs=0
        fdirnum=$((filenum % disknum + 1))    # round-robin disk assignment
        dd if=/dev/zero of="/data/cache$fdirnum/$filetype/$filedir/$filenum.gz" bs=$fsize count=1 > /dev/null 2>&1
        ln -s "/data/cache$fdirnum/$filetype/$filedir/$filenum.gz" "$basefiledir/$filetype/$filedir/$filenum.gz"
        let process=$allnum-$filenum
        ProcessBar $process $allnum "$filetype-$filedir"
        echo "$basedomain/$filetype/$filedir/$filenum.gz" >> "$urllistfile"
        let filenum--
    done
    printf "\n"
}

function makebigfile() {
    let num0_1M=$bigfilenum*43/100
    fileclass 0 1000000 $num0_1M $basefiledir bigfile num0_1M
    echo "num0_1M finished, BigFile 43% finished"
    let num1_5M=$bigfilenum*34/100
    fileclass 1000000 5000000 $num1_5M $basefiledir bigfile num1_5M
    echo "num1_5M finished, BigFile 77% finished"
    let num5_10M=$bigfilenum*2/100
    fileclass 5000000 10000000 $num5_10M $basefiledir bigfile num5_10M
    echo "num5_10M finished, BigFile 79% finished"
    let num10_50M=$bigfilenum*19/100
    fileclass 10000000 50000000 $num10_50M $basefiledir bigfile num10_50M
    echo "num10_50M finished, BigFile 98% finished"
    let num50_100M=$bigfilenum*2/100
    fileclass 50000000 100000000 $num50_100M $basefiledir bigfile num50_100M
    echo "num50_100M finished, BigFile 100% finished"
}

function makesmallfile() {
    # Note: counts here must come from $smallfilenum, not $bigfilenum.
    let num0_10K=$smallfilenum*64/100
    fileclass 0 10000 $num0_10K $basefiledir smallfile num0_10K
    echo "num0_10K finished, SmallFile 64% finished"
    let num10_50K=$smallfilenum*24/100
    fileclass 10000 50000 $num10_50K $basefiledir smallfile num10_50K
    echo "num10_50K finished, SmallFile 88% finished"
    let num50_100K=$smallfilenum*6/100
    fileclass 50000 100000 $num50_100K $basefiledir smallfile num50_100K
    echo "num50_100K finished, SmallFile 94% finished"
    let num100_500K=$smallfilenum*5/100
    fileclass 100000 500000 $num100_500K $basefiledir smallfile num100_500K
    echo "num100_500K finished, SmallFile 99% finished"
    let num500_1000K=$smallfilenum*1/100   # last class is 1%, per the table
    fileclass 500000 1000000 $num500_1000K $basefiledir smallfile num500_1000K
    echo "num500_1000K finished, SmallFile 100% finished"
}

makebigfile
makesmallfile
```
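One caveat about the script's `Rand` helper: piping `/dev/urandom` through `head | cksum | awk` spawns three processes per file, which adds up at 19 million files, and the final `% max` step carries a slight modulo bias. A lighter drop-in sketch, assuming GNU coreutils `shuf` is available on the host:

```bash
#!/bin/bash
# Drop-in alternative to Rand: a uniform integer in [min, max]
# drawn in a single process via GNU shuf.
function Rand() {
    shuf -i "$1-$2" -n 1
}

r=$(Rand 10000 50000)
echo "sample: $r"
```

`shuf -i LO-HI -n 1` draws uniformly over the closed range, so the bias and two of the three subprocesses per call disappear.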
The generated URLs (domain + file path) are written to /data/www/wwwroot/urllist.txt. Other details:
The 12 disks are mounted at /data/cache{1..12}.
The origin site's document root is /data/www/wwwroot.
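The even spread across the 12 disks falls out of the script's `filenum % disknum + 1` expression: consecutive file numbers cycle through /data/cache1 … /data/cache12, so each disk receives the same share (to within one file) of every size class. A quick sketch of that property, with `disknum` as in the script and a small hypothetical run of 24 file numbers:

```bash
#!/bin/bash
# Verify that filenum % disknum + 1 distributes consecutive
# file numbers evenly over the disks (requires bash 4+ for
# the associative array).
disknum=12
declare -A count
for filenum in $(seq 1 24); do
    fdirnum=$((filenum % disknum + 1))
    count[$fdirnum]=$(( ${count[$fdirnum]:-0} + 1 ))
done
for d in $(seq 1 $disknum); do
    echo "disk $d: ${count[$d]} files"
done
```

With 24 consecutive file numbers, every disk ends up with exactly 2 files; the same balance holds per size class in the full run, which is what keeps any one disk from saturating during the test.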