FTP automation

I am trying to connect through FTPS and download a large file (about 1 GB). The server is FileZilla Server.
The transfer gets interrupted at different points each time.
Any ideas?



PASV
227 Entering Passive Mode (69,10,38,226,234,103)
Opening data connection to 69.10.38.226 Port: 60007
RETR 20090616.rar
Connected. Exchanging encryption keys...
150 Connection accepted
Session Cipher: 128 bit Unknown
TLS encrypted session established.
Transfer Timeout (30s). Closing data connection.
738452075 bytes transferred. (785 KB/s) (00:15:18)
426 Connection closed; transfer aborted.
***** GETFILE Error #426: Cannot download remote file 20090616.rar.
***** The server said: Connection closed; transfer aborted.
CWD /
250 CWD successful. "/" is current directory.
PWD
257 "/" is current directory.

CLOSEHOST
QUIT
221 Goodbye
Server closed connection
Disconnected.
I have been reading more, and the problem might be that the router is killing the connection. FTP uses two connections, control and data, so while the data connection is busy transferring, the control connection sits idle, and the router kills it after 15 minutes.

Does ScriptFTP support a "keep alive" feature?

This problem doesn't happen if I use a client like FileZilla.
Hello,

ScriptFTP does not have a keep-alive function, sorry. It is a must-have feature I should include in the next version. In the meantime, the connection errors can be handled with the GOTO command. For example:
Code:

# Count the attempts so the script gives up after three failures.
$attempts = 0

:download
$result = GETFILE("*.txt")
$attempts = $attempts + 1
# Check if $result is different from "OK"
IF($result!="OK")
    # If this is the third attempt, stop execution
    IF($attempts==3)
        STOP
    ELSE
        PRINT("Cannot transfer! Trying again.")
        # Jump to the label :download to retry the transfer
        GOTO :download
    END IF
END IF
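
Since the router may also have dropped the idle control connection by the time the retry runs, the same idea can be combined with a reconnect before each new attempt. This is only a minimal sketch: the host name and credentials are placeholders, and any FTPS/TLS settings your script already uses would go before OPENHOST as usual.

Code:

# Sketch: reconnect before retrying, so a control connection the router
# has already dropped does not block the next attempt.
$attempts = 0

:connect
OPENHOST("ftp.example.com","user","password")   # placeholder host/credentials

:download
$result = GETFILE("*.txt")
IF($result!="OK")
    $attempts = $attempts + 1
    # Give up after the third failed attempt
    IF($attempts==3)
        STOP
    END IF
    PRINT("Cannot transfer! Reconnecting and trying again.")
    CLOSEHOST
    GOTO :connect
END IF

CLOSEHOST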


The problem with this approach is that every file is downloaded again if one fails. Using SYNC or downloading the files in different groups is a possible solution.
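
For instance, a minimal sketch of the "separate groups" idea, using only the commands already shown above (the wildcard patterns are made-up examples):

Code:

# Each group retries on its own, so a failed transfer only repeats
# that group instead of everything downloaded so far.
$attempts = 0
:group1
$result = GETFILE("200906*.rar")
IF($result!="OK")
    $attempts = $attempts + 1
    IF($attempts==3)
        STOP
    END IF
    PRINT("Group 1 failed. Trying again.")
    GOTO :group1
END IF

$attempts = 0
:group2
$result = GETFILE("200907*.rar")
IF($result!="OK")
    $attempts = $attempts + 1
    IF($attempts==3)
        STOP
    END IF
    PRINT("Group 2 failed. Trying again.")
    GOTO :group2
END IF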