I'm using gFTP (GNU FTP v2.0.19) on Puppy Linux to FTP archives from web sites. My package comes with ssh (which my host service does not support, AFAIK) and gftp (GUI mode only), which looks beautiful and works great. But I need a batch mode so I can cron it. I tried lftp (failed to build), then got the gFTP source and built it myself. And it works (as gftp-text), UNTIL...

Every time I do a "get filename", it opens a new connection to the site, even though it already opened one from the command-line parameters. So after 3 files, my hosting site bounces me. I checked all the online documentation and all the .gftprc resource parameters, and can't see anything amiss. gFTP will do simultaneous transfers through the GUI, but it's not trying that here: each get waits for the previous one to complete; it just fails to use the existing connection when it starts. My current workaround is to do each get in a separate session, but that's not very sweet for a lot of files, and my hoster will probably complain eventually.

I have about 120 client sites, each of which runs an HTML and MySQL backup every night and keeps the last 5 days' worth in case I miss a collection, plus some appended logs. I'm trying to visit each site nightly and collect just the new files: I visit once to list the available files, then figure out what has changed and visit again to get only the new/changed files. I also visit periodically just to detect whether the client has accidentally locked FTP access, so we can call him to manually unlock it during the daytime.

The gftp -d option works, and does not show my problem. However, it's non-selective, so my data volume goes up five times over in the usual case. I might be able to fake a wildcard for just the files I want to fetch. There is mget with wildcards, but I have not tried it with multiple file names on one line (the gftp-text command set is hard to find documentation on). Wget looks interesting - it does the selective part. I need to download, build, and validate it. I also plan to give the clients the ability to put files - there is a Wput project as well, but it looks very early days. It's just a pain that I have 99% of a solution, broken by a trivial bug. Thanks again.

One reply in the thread suggested:

For your first pass, you can use the ftp "dir filespec" command and capture its output to a local file, processing that to build your list of new/changed files; that output is then used as the arguments to mget. I'd **really** be surprised if gFTP doesn't support the mget command. My experience is that mget differs by client build (AIX, Solaris, GNU, etc.). Some will expand a Unix-type wildcard for hundreds of files, others require individual filenames spelled out, AND, as a separate issue/complication, some clients have a command-line size limit, so then you're executing multiple calls on the ftp process to mget a set of files. You do know about the ftp command-line option -n (maybe -i, too), or maybe the ftp command "prompt" preceding mget, right? (The ftp "prompt" command toggles batch processing for mget/mput.) Between these two command-line options and mget with a variablized filespec, you shouldn't have any trouble - something along the lines of `ftp -in | nawk 'findNewChangedFiles > mgetTargets < open $`
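The reply's approach - capture a `dir` listing, compute the new/changed files, then drive a single `ftp -in` session with `prompt` toggled off - can be sketched in shell. This is a minimal illustration, not code from the thread: the host name, credentials, file names, and the size-based comparison in awk are all assumptions, and the two listing files stand in for output captured on successive nights.

```shell
#!/bin/sh
# Sketch: diff two ftp 'dir' listings, then build ONE batch of ftp
# commands so every get reuses the same session. In real use, a listing
# would be captured per site with something like:
#   ftp -in <<EOF > today.listing
#   open ftp.example.com        (hypothetical host)
#   user backupuser secret      (hypothetical credentials)
#   dir backups
#   quit
#   EOF

# Stand-in listings in classic ls -l style, as 'dir' typically produces.
cat > prev.listing <<'EOF'
-rw-r--r-- 1 www www 1024 Jan 01 02:00 site.tar.gz
-rw-r--r-- 1 www www  512 Jan 01 02:00 db.sql.gz
EOF
cat > today.listing <<'EOF'
-rw-r--r-- 1 www www 1024 Jan 02 02:00 site.tar.gz
-rw-r--r-- 1 www www  640 Jan 02 02:00 db.sql.gz
-rw-r--r-- 1 www www  100 Jan 02 02:00 access.log
EOF

# New or changed = name absent from the old listing, or size differs.
# Field 5 is the size and the last field is the name in ls -l output;
# the /^-/ guard keeps only regular-file lines (skips "total", dirs).
awk '
    NR == FNR { if ($0 ~ /^-/) old[$NF] = $5; next }
    /^-/ && old[$NF] != $5 { print $NF }
' prev.listing today.listing > changed.txt

# Emit one ftp command batch: "prompt" toggles off per-file
# confirmation, so the whole fetch runs in a single connection.
{
    echo "open ftp.example.com"      # hypothetical host
    echo "user backupuser secret"    # hypothetical credentials
    echo "prompt"
    echo "binary"
    sed 's/^/get /' changed.txt
    echo "quit"
} > fetch.ftp

cat fetch.ftp
# Then run the batch (not done here) with:  ftp -in < fetch.ftp
```

With the sample listings above, `changed.txt` ends up holding `db.sql.gz` (size changed) and `access.log` (new), while the unchanged `site.tar.gz` is skipped - which is exactly the selectivity the `gftp -d` mirror option lacks.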