<div dir="ltr">Use rsync unless that's not fast enough; in that case, scripting cp + tar to run in parallel would probably work.<br></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Nov 5, 2014 at 10:53 AM, Raj Wurttemberg <span dir="ltr"><<a href="mailto:rajaw@c64.us" target="_blank">rajaw@c64.us</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Looking for some advice on parallel data transfers in Linux.<br>
<br>
We regularly get large amounts of data to import into a development<br>
environment. Usually about 10TB, but the data is spread across many files.<br>
<br>
Is there a command-line utility to copy several files at once? It's usually<br>
100+ files, so I can't really specify a single file name. I like the way FileZilla<br>
copies files, but we don't need SFTP or FTP since we are copying directly to<br>
NFS.<br>
<br>
Kind regards,<br>
Raj Wurttemberg<br>
<a href="mailto:rajaw@c64.us">rajaw@c64.us</a><br>
<br>
<br>
_______________________________________________<br>
Ale mailing list<br>
<a href="mailto:Ale@ale.org">Ale@ale.org</a><br>
<a href="http://mail.ale.org/mailman/listinfo/ale" target="_blank">http://mail.ale.org/mailman/listinfo/ale</a><br>
See JOBS, ANNOUNCE and SCHOOLS lists at<br>
<a href="http://mail.ale.org/mailman/listinfo" target="_blank">http://mail.ale.org/mailman/listinfo</a><br>
</blockquote></div><br></div>
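<div dir="ltr">One common way to script the parallel cp suggested above is to fan the copies out with xargs -P (a GNU/BSD findutils extension, not POSIX). This is only a sketch: the mktemp directories below are stand-ins for the real source directory and the NFS mount point, and the job count of 8 is an arbitrary placeholder to tune against your network.<br>
<pre>
```shell
# Sketch: copy many files in parallel with xargs -P (assumes GNU findutils).
# The mktemp dirs stand in for the real source dir and the NFS mount.
SRC=$(mktemp -d)
DEST=$(mktemp -d)

# Fabricate 100 small files for the demo (the real case is ~10TB in 100+ files).
for i in $(seq 1 100); do echo "payload $i" > "$SRC/file$i"; done

# Run up to 8 cp processes at once; -print0/-0 keep odd filenames safe.
find "$SRC" -maxdepth 1 -type f -print0 | xargs -0 -P8 -I{} cp {} "$DEST"/

ls "$DEST" | wc -l
```
</pre>
Swapping rsync in for cp gives each job resumability over NFS at the cost of some per-file overhead; GNU parallel offers the same fan-out with nicer job control if it's installed.<br></div>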