> I am trying to distribute some work among several machines. The basic
> task is to acquire a file (I'm doing it via Net::FTP), then run my perl
> script on it, then send the results back (probably via ftp as well).
> i.e.:
> repeat {
> FTP next file from server;
> system("perl calc.pl filename");
> FTP results back to server;
> }
> The input files are 60-150 MB. The calculations take about the same time
> as transferring a file, often less. So I would like to FTP the next
> file while doing the calculations on the current one. The calculations
> are basically CPU-bound anyway, so this seems like a good fit that
> wouldn't slow things down very much.
> i.e.:
> repeat {
> while (FTP'ing the next file)
> do calcs on previous file;
> wait (if needed) until FTP is done;
> FTP results back to server;
> }
> Any pointers on where to look? I've browsed through threads, exec() and
> system() so far, but haven't figured out how to put it together yet. This
> would be running on Win32 / ActiveState Perl.
> thanks
You might want to consider "fork". I'd recommend you fork two separate
streams of work: let the parent keep doing the FTP transfers while a
child process runs the calculation on the file fetched in the previous
pass.
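A minimal sketch of that fork-based pipeline, assuming a list of remote
filenames and two hypothetical helpers, get_file() and put_results(),
standing in for your Net::FTP download/upload code (note that on Win32 /
ActivePerl, fork is emulated with interpreter threads, so test it with
your Net::FTP code before relying on it):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @files = @ARGV;   # remote files to process
my $prev;            # file fetched on the previous pass

# One extra undef pass at the end so the last file still gets calculated.
for my $file (@files, undef) {
    my $pid;

    # Start the calculation on the previously fetched file in a child.
    if (defined $prev) {
        defined( $pid = fork ) or die "fork failed: $!";
        if ( $pid == 0 ) {
            system( 'perl', 'calc.pl', $prev ) == 0
                or die "calc.pl failed on $prev";
            exit 0;
        }
    }

    # Meanwhile the parent fetches the next file over FTP.
    get_file($file) if defined $file;    # hypothetical Net::FTP wrapper

    # Wait (if needed) until the calculation is done, then upload.
    if ( defined $pid ) {
        waitpid $pid, 0;
        put_results("$prev.out");        # hypothetical Net::FTP wrapper
    }

    $prev = $file;
}
```

On the first pass there is nothing to calculate, so the parent just
fetches; on every later pass the FTP transfer and the calculation
overlap, which is exactly the pipeline sketched in the second
pseudocode loop above.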