Question : FTP versus xcopy
I have a central FTP server and 30 remote servers. Although each server is on a separate domain, they all have VPN connections to the central server. All are running either Windows 2003 or Windows 2000. The remote servers' internet connections are either DSL or cable (some of them are flaky). The central server runs Cerberus FTP Server.
Goal: Every night, each server needs to connect to the central server, transfer a zip file, and pick up a couple of files. This transfer must happen EVERY night. Having one or two fail once a week is OK, but not one or two every night.
Problem: I am using an FTP-Performer script that works as long as there are no connection issues. I have tried MANY different script changes to resume broken uploads/downloads. It is very close: the REST command seems to work, but the STOR right after it fails with "Unable to open file - Access denied". Also, the script language seems to send QUIT far too often, so I have to check the connection status many times in the script and reconnect.
A solution could come in any of several forms: 1) Help me find and fix what is causing the access-denied error (the FTP server is set to time out after 60 seconds; the client script waits 300 seconds before it reconnects and tries to resume). 2) Another FTP client and/or server that would make this work. 3) Would a simple batch file with xcopy, etc. work just as well with flaky connections (speed is an issue, too)?
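For reference, the resume-and-retry logic described above can be sketched with Python's ftplib, which sends REST automatically when you pass a byte offset via the `rest=` parameter. This is only a minimal sketch: the host name and credentials below are placeholders, not values from the question, and the same `rest=` argument works for `storbinary` uploads (which is where the "Access denied" appears in the original script).

```python
import ftplib
import os
import time

HOST = "ftp.example.com"   # placeholder for the central server
USER = "user"              # placeholder credentials
PASS = "secret"
REMOTE = "nightly.zip"     # file to pick up from the server
LOCAL = "nightly.zip"      # local copy, appended to on each resume

def resume_download(retries=5, delay=60, timeout=30):
    """Fetch REMOTE, resuming from the current local file size after a drop."""
    for attempt in range(retries):
        try:
            # Resume point = however many bytes we already have on disk.
            offset = os.path.getsize(LOCAL) if os.path.exists(LOCAL) else 0
            with ftplib.FTP() as ftp:
                ftp.connect(HOST, timeout=timeout)
                ftp.login(USER, PASS)
                ftp.voidcmd("TYPE I")  # REST requires binary (image) mode
                with open(LOCAL, "ab") as f:
                    # rest=offset makes ftplib issue REST before RETR,
                    # so the server starts sending from byte `offset`.
                    ftp.retrbinary(f"RETR {REMOTE}", f.write, rest=offset)
            return True
        except ftplib.all_errors:
            # Flaky DSL/cable link: back off, then reconnect and resume.
            time.sleep(delay)
    return False
```

An upload resume would use `ftp.storbinary(f"STOR {REMOTE}", f, rest=offset)` after seeking the local file to `offset`; note that some servers refuse REST on STOR unless the logged-in account has append/resume permission on the target file, which is one possible cause of an access-denied reply.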
Answer : FTP versus xcopy
Try SyncBack; it's inexpensive and supports networks and FTP. It has much-improved logic for figuring out what to copy and what to do when disconnected.