[ale] How to drive Linux browser to make a campground sniper?
Mark@markulmer.com
mark at markulmer.com
Sun Jan 14 12:24:07 EST 2018
Neal,
Have you looked at getting a head start by having someone build the website-scraping part for you? Take a look at Fiverr.com and search for website scraping and/or automation. You should be able to get an initial custom script built for anywhere from $5-$25. I’ve had great experiences with these freelancers. Be sure to include Linux in your requirements.
Mark Ulmer
>> On January 13, 2018 1:16:34 PM Neal Rhodes via Ale <ale at ale.org> wrote:
>>
>> Boring "Real-World Details":
>>
>> So, we are planning a summer trip to Glacier National Park in Montana.
>>
>> We would really like to camp at Many Glaciers Campground in the park. However, at present, all the sites are already reserved. They are reserved through recreation.gov, starting 6 months to the day from today.
>>
>> But we have a really strong impression that people initially book a long stretch, then later either reduce the duration once they get more specific plans, or cancel.
>>
>> SO, we really want to detect if/when sites become available over the next 6 months and jump on it before someone else does.
>>
>> There is someone who offers this as a service for $40 a reservation, irrespective of whether they are successful.
>>
>> Exciting Technology Application:
>>
>> Initially I looked at the HTML for their search page, with the thought of using "wget" to simulate the reservation request. That increasingly looks like a fool's errand, assuming that they may have session cookies related to sign-on and other magic handshake crap that would be difficult to simulate. And what happens when they alter their data fields?
>>
>> Then I thought: All I want to do is:
>> Set up a browser window on our CentOS 6 desktop, any browser that understands https;
>> Run that browser through the responses to get it to the search window on this campground, and put in all the dates and related input.
>> Then:
>> Run SOMETHING that will automate:
>> Hit the Search Submit button;
>> See if the resultant page contains "No Suitable availability"
>> IF Not: Email me
>> Sleep 15 minutes
>> Rinse, lather, repeat
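The loop above is simple enough to sketch in Python. A minimal version, assuming something like Selenium supplies the page text (e.g. `driver.page_source`); the `fetch_page` and `notify` callables here are placeholders for the browser read and the email step, and only the marker string comes from the post:

```python
import time

# The "nothing free" marker text, as quoted in the post.
MARKER = "No Suitable availability"

def sites_available(page_text):
    """True when the results page no longer shows the marker string."""
    return MARKER not in page_text

def poll(fetch_page, notify, interval=15 * 60, max_checks=1):
    """Check the search results every `interval` seconds; call notify() on a hit.

    fetch_page -- callable returning the current results page as text
                  (e.g. a wrapper around Selenium's driver.page_source)
    notify     -- callable that sends the email
    """
    for _ in range(max_checks):
        if sites_available(fetch_page()):
            notify()
            return True
        time.sleep(interval)
    return False
```

Wrapping the browser read and the email step in callables keeps the polling logic testable without a live session; in practice `max_checks` would be large and `interval` left at 15 minutes.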
>>
>> This sounds to me like a very elemental application of a test/control manager for a GUI interface. If I can automate an existing browser, we can eliminate all the complexities of trying to fake out their web server.
>>
>> Since this just sits on my desk in the basement, I can live with hard-coded screen coordinates.
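For the hard-coded-coordinates route, xdotool can drive the pointer in an existing X session. A sketch of the idea, assuming xdotool is installed and the button's screen coordinates have been measured by hand; the actual click is in a separate function since it needs a live desktop:

```python
import subprocess

def click_cmd(x, y):
    """Build the xdotool argv: move the pointer to (x, y), then left-click."""
    return ["xdotool", "mousemove", str(x), str(y), "click", "1"]

def click_submit(x, y):
    """Click the Search Submit button on a running X desktop (needs xdotool)."""
    subprocess.run(click_cmd(x, y), check=True)
```

xdotool chains commands in one invocation, so a single call moves the mouse and clicks; since this runs on a basement desktop with a fixed layout, the hard-coded coordinates are tolerable, though any browser-window move breaks them.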
>>
>> What tools exist in Linux to do this?
>>
>> regards,
>>
>> Neal Rhodes
>> MNOP Ltd
>>
>>
>> _______________________________________________
>> Ale mailing list
>> Ale at ale.org
>> http://mail.ale.org/mailman/listinfo/ale
>> See JOBS, ANNOUNCE and SCHOOLS lists at
>> http://mail.ale.org/mailman/listinfo
>>