[ale] Presentation challenge

JD jdp at algoloma.com
Sat Feb 2 17:25:41 EST 2013


On 02/02/2013 04:04 PM, Ron Frazier (ALE) wrote:
> I actually did that recently.  I thought I had saved an image of the screen
> but couldn't find it.  I had 4 terminal windows open, tiled on the screen.  I
> was using a different wget command in each window to download a bunch of mp3
> podcast files.  As soon as one process would finish, I'd go to that window
> and change the file name and start a new download.  That way, I was able to
> keep 3-4 downloads going at all times and eventually get 60 files or so in a
> couple of hours.  It's possible I could have automated it with a script, but
> didn't know how.
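For what it's worth, the scripted version Ron mentions could have been a
one-liner.  A sketch, assuming the podcast URLs had been collected one per
line in a file (urls.txt here is hypothetical):

$ xargs -n 1 -P 4 wget < urls.txt

GNU xargs' -P flag keeps up to four wget processes running at once, replacing
the four hand-managed terminal windows.  But wget alone can do better, as
shown below.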

According to either Google or the wget man page:

SYNOPSIS
       wget [option]... [URL]...
...
       -A acclist --accept acclist
       -R rejlist --reject rejlist
           Specify comma-separated lists of file name suffixes or patterns to
           accept or reject. Note that if any of the wildcard characters, *, ?,
           [ or ], appear in an element of acclist or rejlist, it will be
           treated as a pattern, rather than a suffix.


But is posting man pages on this list really useful?

$ wget -r -l1 -nd -A mp3 {URL}
would find all the mp3 links on a specific URL and download them, in turn.
(-A only takes effect during recursive retrieval, so -r is needed; -l1 limits
the recursion to links on that one page, and -nd keeps wget from recreating
the site's directory tree locally.)
If you want to further limit which files - perhaps there are high-quality and
low-quality mp3 files there ... then

$ wget -r -l1 -nd -A hq.mp3 {URL}
would download only the links whose file names end in "hq.mp3" (without
wildcards, each -A entry is matched as a suffix, per the man page above).
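
If the "hq" marker can land in the middle of a file name rather than at the
end, the wildcard rule quoted above applies instead.  A sketch, with the
pattern quoted so the shell doesn't expand it first:

$ wget -r -l1 -nd -A '*hq*.mp3' {URL}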

Thirty seconds reading the man page saves how much time doing it manually?

Best of all, ZERO scripting required.

This happens over and over and over with UNIX: the man page explains how to
use a command efficiently.  Man pages are already on your Linux system.  Why
not use them, especially to save time?
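
A quick pointer for anyone new to them (assuming man uses the default less
pager):

$ man wget

then type /accept and press Enter to jump straight to the -A/--accept entry;
n moves to the next match and q quits.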

Learning to read man pages took me some time, but since then it has paid off
1,000,000-fold.


