Hello,
While I try to download a large web site I always get the error messages below. How can I solve this problem? Is it possible to skip the timed-out file and proceed with the other files?
***** GETFILE Error #4: Operation timed out.
***** GETFILE Error #3: Not connected and not logged in to remote server.
Code:
# Connect to FTP server
OPENHOST("64.191.125.x","x","x")
# Change the local directory to the backup folder
LOCALCHDIR("C:\www\!BackUp\Server 3\")
# Download all the files from the FTP server, including subdirectories
GETFILE("*.*",SUBDIRS)
# Transfer finished, close the connection
CLOSEHOST
Hello,
You can skip the errors by downloading each file separately with FOREACH. The only drawback is that you cannot recurse into subdirectories. For example:
Code:
# Retrieve the remote file listing
$result=GETLIST($list,REMOTE_FILES)
# If GETLIST failed stop the script
IF($result!="OK")
STOP
END IF
# For each file in $list...
FOREACH $item IN $list
# Download the file
$result=GETFILE($item)
END FOREACH
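Note that the loop above does not check the result of each GETFILE call. Below is a sketch of a more defensive version that opens the connection, checks each transfer, and explicitly skips failed files instead of stopping. It assumes GETFILE returns "OK" on success (as GETLIST does) and uses ScriptFTP's PRINT command for logging; adjust the host, user, and password placeholders to your own values:

Code:
# Connect to the FTP server (placeholder credentials)
OPENHOST("64.191.125.x","x","x")
# Retrieve the remote file listing
$result=GETLIST($list,REMOTE_FILES)
# If GETLIST failed stop the script
IF($result!="OK")
    STOP
END IF
# Download each file, continuing past any that time out
FOREACH $item IN $list
    $result=GETFILE($item)
    IF($result!="OK")
        # Log the failure and move on to the next file
        PRINT("Could not download a file, skipping it.")
    END IF
END FOREACH
# Transfer finished, close the connection
CLOSEHOST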
It is a pity that there is no built-in solution for the timeout itself.