grandad
Member
I recently wrote a wee shell script to run on my [Linux Mint] laptop.
This has worked perfectly, but today I ran into a problem. On the first of the month some rather large files are involved. The script cycled through the first few [files under 50 MB] all right but baulked at the next, a file of 385 MB. I get my "150 Opening BINARY mode data connection" message and the download completes, but then the session just hangs. A Ctrl + C makes it skip on to the next cycle, but it hangs repeatedly on the larger [over 200 MB] files.
The purpose of the script is to cycle through a series of servers and FTP down some files. This is done by setting up an array of variables and then cycling through the array -
Code:
# Cycle through every server in the SERVER array; USER, PASSW and
# LOCAL are parallel arrays holding the matching login details and
# local download directory for each server.
# (Note: an array named USER shadows the shell's own $USER login
# variable - a name like FTPUSER would be safer.)
count=0
while [[ $count -lt ${#SERVER[@]} ]]
do
    # -v verbose, -i no per-file mget prompts, -n no auto-login
    ftp -v -i -n "${SERVER[$count]}" <<END_OF_SESSION
user ${USER[$count]} ${PASSW[$count]}
cd webspace/httpdocs/backups
lcd ${LOCAL[$count]}
mget $REMOTEFILE
mdelete $OLDFILE
bye
END_OF_SESSION
    (( count++ ))
done
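For what it's worth, the same cycle can be written as a `for` loop over the array indices, which does away with the manual counter. This is only a dry-run sketch with made-up server names and paths - each pass records what it would do, and you would swap the `printf` for the real `ftp` call in the actual script:

```shell
#!/bin/bash
# Dry-run sketch (hypothetical server names): one pass per entry in
# the parallel arrays, indexed with "${!SERVER[@]}".
SERVER=(ftp.example-one.net ftp.example-two.net)
LOCAL=(/tmp/one /tmp/two)
REMOTEFILE='backup-*.tar.gz'

plan=()
for i in "${!SERVER[@]}"; do
    # Record what this pass would do instead of calling ftp.
    plan+=("connect ${SERVER[$i]}, lcd ${LOCAL[$i]}, mget $REMOTEFILE")
done
printf '%s\n' "${plan[@]}"
```

The quoted `"${!SERVER[@]}"` expands to the list of indices, so the loop always matches the array length even if entries are added or removed.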
Has anyone any ideas?
I should add that I am a complete novice when it comes to shell scripting, and the above is my first attempt!