<html>
<head>
<meta content="text/html; charset=ISO-8859-1"
http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<div class="moz-cite-prefix">On 10/10/2013 02:11 PM, Brandon Wood
wrote:<br>
</div>
<blockquote
cite="mid:CADs2hrUMW51oz2ou7dMY8o+c7UT6J5QK=b0_wZV7crHHHnrREg@mail.gmail.com"
type="cite">
<div dir="ltr">Any specific examples you care to share with the
group?</div>
</blockquote>
<br>
I had to back up 150 GB of stuff to BD-R. Without something "in
between" tar and the burner program, tar eventually dies (SIGPIPE)
and you can't easily resume the backup on the next media.<br>
<br>
With splitpipe, you can say "run this program repeatedly and send
(up to) this much data to it each time, until there is no more input
data". What this means is (there's an example pipeline after the
list):<br>
<ul>
<li>You have something like tar running to back up a very large set
of data, and its output is piped into splitpipe.</li>
<li>Splitpipe then runs the command you specify and shunts its own
standard input (the stream from tar) to that command's standard
input. (In my case, I used growisofs to send raw data to a
BD-R.)</li>
<li>When the first media is full, splitpipe will stop accepting
new data from tar (but hold the pipe open!) and let you change
media. Because tar is blocked on writing, it will not die while
waiting for the stream to resume.</li>
<li>When you tell splitpipe that the next media is ready, it will
spawn the specified command again, and resume shunting its stdin
to the command's stdin, until finished.</li>
</ul>
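<p>To make that concrete, the pipeline I used looked roughly like
the sketch below. This is from memory: the splitpipe options shown
are placeholders rather than its real flags (check the man page),
and the archive path and burner device are just examples for your
own system.<br>
</p>
<pre>
# Stream the tree as a tar archive, have splitpipe dole it out in
# roughly one-BD-R-sized chunks, and spawn growisofs once per disc.
# The /dev/fd/0 trick makes growisofs burn its standard input as a
# raw data track.
tar -cf - /srv/archive | \
    splitpipe --size 23G -- \
        growisofs -Z /dev/sr0=/dev/fd/0
</pre>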
<p>The concept could use some improvement, and I think I might
actually go ahead and do it because I've thought about it several
times before:<br>
</p>
<ol>
<li>A program that could take an arbitrary input and split it
amongst CD/DVD/BD media, and <i>only do that</i>, would be very
useful. You could then do things like end-of-media detection
based on the actual media size. (This would mean the program
would need to handle the burning to CD/DVD/BD itself, or via a
specially-written helper utility.)</li>
<li>Building that program would involve writing a library, and
the same library could then be used to write a splitpipe clone
that works the way the current splitpipe does. The only real
addition would be the special-purpose
single-input-to-multiple-discs utility.</li>
<li>The program could automatically handle parallel compression
using as many cores as are available (either on the local host
or across an entire network). This is necessary to be able to
burn a vast amount of data compressed with, e.g., xz in real
time; even gzip can't keep up with 4x BD-R rates on an 8-core
system using the pigz utility. (See the sketch after this
list.)</li>
</ol>
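<p>For point 3, here is the sort of thing I mean, with pigz doing
parallel gzip on the local host only (no network distribution).
Again, the splitpipe options are placeholders and the paths are
examples; the point is that even this setup can starve the
burner.<br>
</p>
<pre>
# Compress on all available cores before splitting across discs.
# 4x BD-R is on the order of 18 MB/s of payload, so the compressor
# has to sustain at least that or the burner starves.
tar -cf - /srv/archive | \
    pigz -6 -p "$(nproc)" | \
    splitpipe --size 23G -- \
        growisofs -Z /dev/sr0=/dev/fd/0
# Swapping in xz (even with xz -T0) gives a better ratio but is
# much slower per core, which is exactly the problem.
</pre>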
<p> — Mike<br>
</p>
<div class="moz-signature">-- <br>
<table border="0">
<tbody>
<tr>
<td> <img src="cid:part1.04000103.01010709@naunetcorp.com"
alt="Naunet Corporation Logo"> </td>
<td> Michael B. Trausch<br>
<br>
President, <strong>Naunet Corporation</strong><br>
☎ (678) 287-0693 x130 or (855) NAUNET-1 x130<br>
FAX: (678) 783-7843<br>
</td>
</tr>
</tbody>
</table>
</div>
</body>
</html>