pops45042 Apprentice
Joined: 07 Jan 2004 Posts: 278 Location: Cincinnati, Ohio
Posted: Wed Jan 05, 2005 5:32 pm Post subject: Copy a Website
Guys, I need some help!
I'm trying to make a copy of a website. I don't own it; I just need to make a copy of it. I've found a Windows-based tool called "Elitesys SuperBot" here:
http://www.sparkleware.com/superbot/manual.htm
But I need something that can run on my Gentoo box. Anyone have any ideas?
Paul _________________ "That which does not kill me only postpones the inevitable"
Laugh at my old system!
AMD Opteron 165 @ 2750 MHz, 2 GB 400 MHz CAS2 Crucial Ballistix, Abit NF8 Ultra MB, WD 80 GB (system), WD 120 GB (storage), GeForce 8800GTS 320 MB, Mushkin 550 W PSU
ToeiRei Veteran
Joined: 03 Jan 2005 Posts: 1191 Location: Austria
Posted: Wed Jan 05, 2005 5:36 pm
wget ?
Rei _________________ Please stand by - The mailer daemon is busy burning your messages in hell...
Pink Veteran
Joined: 24 Jul 2003 Posts: 1062
Posted: Wed Jan 05, 2005 5:37 pm
File → Save?
pops45042 Apprentice
Joined: 07 Jan 2004 Posts: 278 Location: Cincinnati, Ohio
Posted: Wed Jan 05, 2005 5:42 pm
Well, I was going to do the ol' File → Save routine, but if I go that route I'll likely be sitting here saving web pages for a few days non-stop! I'm looking for something a bit more automatic that will go down through the website's local links and copy those pages.
What complicates it a bit is the fact that quite a few pages are written in PHP and aren't static.
Paul
Dizzutch Guru
Joined: 09 Nov 2004 Posts: 463 Location: Worcester, MA
Posted: Wed Jan 05, 2005 5:47 pm
wget -rx website.com
That might do it.
(edit: added the -x flag)
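For anyone finding this later: a fuller wget invocation, a sketch only; the URL is a placeholder and the extra flags are my additions, not part of the suggestion above. `-r` is short for `--recursive` and `-x` for `--force-directories`; the remaining flags also fetch each page's images/CSS and rewrite links so the copy is browsable offline.

```shell
# Placeholder URL -- substitute the site you actually want to mirror.
url="http://www.example.com/"

# Build the command as a string so it can be inspected before running.
cmd="wget --recursive --force-directories --convert-links \
--page-requisites --no-parent --wait=1 $url"

echo "$cmd"   # show the full command line first
# $cmd        # uncomment to actually run the mirror
```

Note that wget honours robots.txt by default, and `--wait=1` pauses between requests so you don't hammer the server.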
pops45042 Apprentice
Joined: 07 Jan 2004 Posts: 278 Location: Cincinnati, Ohio
Posted: Wed Jan 05, 2005 5:58 pm
Dizzutch! Amazing! I think I love you man!
That worked like a charm! Thanks a bunch!
Paul
ToeiRei Veteran
Joined: 03 Jan 2005 Posts: 1191 Location: Austria
Posted: Wed Jan 05, 2005 6:03 pm
It's amazing how well the simple tools Linux provides work, eh?
Rei
Pink Veteran
Joined: 24 Jul 2003 Posts: 1062
Posted: Wed Jan 05, 2005 10:09 pm
pops45042 wrote:
Well, I was going to do the ol' File → Save routine, but if I go that route I'll likely be sitting here saving web pages for a few days non-stop! I'm looking for something a bit more automatic that will go down through the website's local links and copy those pages.
What complicates it a bit is the fact that quite a few pages are written in PHP and aren't static.
Paul
I see. Well, the PHP won't matter; all you'll get is the HTML output anyway (not the PHP code behind it).
But at least you can view the resulting HTML fine.
Naib Watchman
Joined: 21 May 2004 Posts: 6052 Location: Removed by Neddy
Posted: Wed Jan 05, 2005 10:19 pm
"httrack" is a good program for this; it's in portage.
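A minimal httrack invocation might look like this; it's a sketch only, with the URL and output directory as placeholders. `-O` sets the output directory, the `"+..."` filter keeps the crawl on the original domain, and `-v` prints verbose progress.

```shell
# Placeholder URL and output dir -- substitute your own.
cmd='httrack "http://www.example.com/" -O ./mirror "+*.example.com/*" -v'

echo "$cmd"   # inspect the command first
# eval "$cmd" # uncomment to actually run the mirror
```

Unlike plain wget, httrack is built specifically for mirroring: it rewrites links, resumes interrupted crawls, and lets you fine-tune what gets followed via those `+`/`-` filters.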