[ale] Presentation challenge

Ron Frazier (ALE) atllinuxenthinfo at techstarship.com
Sat Feb 2 19:19:38 EST 2013


Oops!  Er, well, that is handy info to know for this type of application.  I'll definitely save that for next time.  Thanks for sharing.

And perhaps I'll be more likely to look at the man page.

Since you mentioned it, I have a question.  Many web servers don't allow you to list the contents of their directories, such as:

http://podcastsite.com/specialpodcast/

Instead, you have to reference a specific file like index.html or jan2013podcast.mp3.

So, how would wget even be able to find all the mp3s if you didn't provide the full path and file names?
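
(My guess is that wget must crawl the links on the page itself rather than a directory listing.  Something like the sketch below, reusing my made-up URL, is what I have in mind; -r turns on recursion, -l 1 keeps it one level deep, -np stops it from climbing to the parent directory, and -A mp3 keeps only the mp3s:

$ wget -r -l 1 -np -A mp3 http://podcastsite.com/specialpodcast/

But I'd welcome confirmation that that's how it actually works.)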

Sincerely,

Ron


JD <jdp at algoloma.com> wrote:

>On 02/02/2013 04:04 PM, Ron Frazier (ALE) wrote:
>> I actually did that recently.  I thought I had saved an image of the
>> screen but couldn't find it.  I had 4 terminal windows open, tiled on
>> the screen.  I was using a different wget command in each window to
>> download a bunch of mp3 podcast files.  As soon as one process would
>> finish, I'd go to that window and change the file name and start a
>> new download.  That way, I was able to keep 3-4 downloads going at
>> all times and eventually get 60 files or so in a couple of hours.
>> It's possible I could have automated it with a script, but didn't
>> know how.
>
>According to either Google or the wget man page:
>
>SYNOPSIS
>       wget [option]... [URL]...
>....
>-A acclist --accept acclist
>-R rejlist --reject rejlist
>       Specify comma-separated lists of file name suffixes or patterns
>       to accept or reject.  Note that if any of the wildcard
>       characters, *, ?, [ or ], appear in an element of acclist or
>       rejlist, it will be treated as a pattern, rather than a suffix.
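>
>(So, as a sketch: -A mp3 is treated as a suffix, while something like
>-A "pod*.mp3", because of the wildcard, is treated as a pattern.)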
>
>
>But is posting man pages on this list really useful?
>
>$ wget -r -A mp3 {URL}
>would find all the mp3 links on the page at {URL} and download them
>in turn.  (The -A filter only takes effect during a recursive
>retrieval, hence the -r.)  If you want to further limit which files
>(perhaps there are high-quality and low-quality mp3 files), then
>
>$ wget -r -A hq.mp3 {URL}
>would download only those files whose names end in "hq.mp3".
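>
>(A fuller sketch, with {URL} still a placeholder:
>
>$ wget -r -l 1 -np -nd -A mp3 {URL}
>
>Here -l 1 stays one link deep, -np keeps wget from climbing to the
>parent directory, and -nd saves the files into the current directory
>instead of recreating the site's tree.)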
>
>Thirty seconds reading the man page saves how much time doing it
>manually?
>
>Best of all, ZERO scripting required.
>
>This happens over and over and over with UNIX.  The man page explains
>how to use a command efficiently.  man pages are already on your
>Linux systems.  Why not use them, especially to save time?
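>
>(For instance, finding the -A option above took one search:
>
>$ man wget        # then /accept and Enter; n jumps to the next match
>
>assuming your pager is less, where "/" searches forward.)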
>
>Learning to read man pages took me some time, but since then, it has
>paid off 1,000,000-fold.
>


--

Sent from my Android Acer A500 tablet with bluetooth keyboard and K-9 Mail.
Please excuse my potential brevity.


Ron Frazier
770-205-9422 (O)   Leave a message.
linuxdude AT techstarship.com



