servo888 Apprentice
Joined: 22 Feb 2004 Posts: 293
Posted: Wed Jun 09, 2004 4:18 pm Post subject: Website "getting"
I found a website with a whole bunch of family pictures inside folders. The server has directory indexing enabled in Apache, so I can browse through all of them. I'm wondering if there's any kind of program that would fetch all of these images and folders automatically...
regards,
servo
moocha Watchman
Joined: 21 Oct 2003 Posts: 5722
Posted: Wed Jun 09, 2004 4:27 pm
Code:
wget -r -k http://url/
See the wget man page for more details.
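If you only want the pictures rather than the whole site, something along these lines might be closer (a sketch; http://example.com/pics/ stands in for the actual folder URL):

Code:
# Recurse from the given folder only: -np keeps wget from climbing to
# parent directories, -nH drops the hostname directory level, and
# -A keeps only the listed image extensions.
wget -r -np -nH -A jpg,jpeg,png,gif http://example.com/pics/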
Another alternative is httrack.
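From the command line it's roughly this (an untested sketch; the URL, output directory, and filter patterns are placeholders):

Code:
# Mirror the folder into ./pics-mirror; the "+" patterns are httrack
# scan-rule filters that additionally allow matching image URLs.
httrack "http://example.com/pics/" -O ./pics-mirror "+*.jpg" "+*.jpeg" "+*.png" "+*.gif"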
If you want a GUI tool, give khttrack a try. It's a KDE application, so it'll pull the whole KDE package in (looong emerge). I don't know whether there are Gnome httrack front-ends too.
"Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety."
-- attributed to Benjamin Franklin |
neysx Retired Dev
Joined: 27 Jan 2003 Posts: 795
Posted: Wed Jun 09, 2004 4:28 pm
I'm quite certain wget is already installed on your box. Run man wget and look for --recursive.
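For example, one quick way to check both (just a sketch, there are other ways):

Code:
# Confirm wget is installed, then jump to the option in the man page:
which wget
man wget | grep -A 2 -- '--recursive'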
servo888 Apprentice
Joined: 22 Feb 2004 Posts: 293
Posted: Wed Jun 09, 2004 5:16 pm
lol, good point.
Yea, wget works like a charm! Hehe, I don't think I've ever used wget on the HTTP protocol, so I didn't even think of it.
Thanks a bunch