GraniteBalls Aging fast 12262 Posts
How would I go about downloading all the images in a specific web folder if they're not indexed?
like, let's say I wanted all the PA comics from '05, the web folder would be
http://www.penny-arcade.com/images/2005/(yyyymmdd.jpg)
How could I go about getting copies of all the image files in that folder? Is it possible to automate?
[Edited on July 28, 2006 at 4:27 PM. Reason: naming convention] 7/28/2006 4:25:28 PM
30thAnnZ Suspended 31803 Posts
you could write a php script to grab them all. they're named following a convention so it wouldn't be hard.
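something like this would do it (just a rough sketch, assuming the yyyymmdd naming holds all year and allow_url_fopen is on):

<?php
// rough sketch: try every possible date in 2005 and save whatever exists
for ($m = 1; $m <= 12; $m++) {
    for ($d = 1; $d <= 31; $d++) {
        $name = sprintf('2005%02d%02d.jpg', $m, $d);
        $url = "http://www.penny-arcade.com/images/2005/$name";
        // file_get_contents returns false on a 404, so the
        // dates that don't exist just get skipped
        $img = @file_get_contents($url);
        if ($img !== false) {
            file_put_contents($name, $img);
        }
    }
}
?>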
unless they have that disabled, like tww does 7/28/2006 4:26:45 PM
darkone (\/) (;,,,;) (\/) 11610 Posts
There are programs out there that do this. I've never used any personally, so I can't suggest one. 7/28/2006 4:29:05 PM
El Nachó special helper 16370 Posts
I used to use a program called offline commander to do this.
I don't know if it's the best (or even if you can download it anymore, as I haven't used it in years), but it's worth a shot. go to download.com and search for "site rippers" and you should find tons of programs to do what you want. 7/28/2006 4:31:04 PM
GraniteBalls Aging fast 12262 Posts
excellent.
thanks. 7/28/2006 4:32:35 PM
dFshadow All American 9507 Posts
will firefusk do this?
downthemall would work if there was an html page showing them all
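and if there isn't one, a few lines of php could spit out a page of links to feed it (same yyyymmdd assumption as the script above):

<?php
// rough sketch: print a link for every possible 2005 date so
// downthemall has a page full of links to grab
for ($m = 1; $m <= 12; $m++)
    for ($d = 1; $d <= 31; $d++)
        printf("<a href=\"http://www.penny-arcade.com/images/2005/2005%02d%02d.jpg\">2005%02d%02d</a><br>\n", $m, $d, $m, $d);
?>
7/28/2006 5:58:08 PM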
joe17669 All American 22728 Posts
curl -O "http://www.penny-arcade.com/images/2005/2005[01-12][01-31].jpg"
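curl expands those bracket ranges itself (the leading zeros keep the zero-padding) and -O saves each hit under its remote filename. might want to add -f too, so the dates that don't exist get skipped instead of saved as 404 pages. 7/28/2006 6:08:48 PM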
Noen All American 31346 Posts
blackwidow will do it, but since they are all in the same folder and named in an organized manner, a script would be much easier. 7/28/2006 6:10:02 PM