|
|
freqy
on 2011-02-25 05:37 [#02407125]
Points: 18724 Status: Regular | Show recordbag
|
|
hi
Does anyone know of a program that can download all the information on every web page saved in a browser's bookmarks?
reason : I need to get lots of reading material onto a laptop that doesn't have internet access. So I thought a program could download every bookmarked page, plus every page linked from it, into a neat little folder for exploring later on.
would be great to download the entire internet onto my laptop, but not enough bandwidth just yet.
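something along these lines is roughly what i'm imagining, if it exists (untested sketch; bookmarks.html would be an exported bookmarks file and offline-reading is just a made-up folder name):

# untested sketch: feed an exported bookmarks file to wget, follow links one level deep
wget --force-html -i bookmarks.html --recursive --level=1 --span-hosts \
     --page-requisites --convert-links --directory-prefix=./offline-reading

--level=1 keeps it to the bookmarked pages plus whatever they link to, and --page-requisites pulls in the images and stylesheets so the pages display offline.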
(hope Indeksical revived all his stuff by the way)
|
|
freqy
on 2011-02-25 05:42 [#02407126]
Points: 18724 Status: Regular | Show recordbag
|
|
i wanna download the internet.
|
|
freqy
on 2011-02-25 05:44 [#02407127]
Points: 18724 Status: Regular | Show recordbag
|
|
some dude is doing it already.
I wonder if he could make a floppy disc of the internets for my laptop?
|
|
freqy
on 2011-02-25 05:48 [#02407128]
Points: 18724 Status: Regular | Show recordbag
|
|
there is no program for downloading bookmark content into a folder anywhere on the entire net. end of thread.
|
|
freqy
on 2011-02-25 05:57 [#02407131]
Points: 18724 Status: Regular | Show recordbag
|
|
program : httrack
downloads an entire website to hard disk. like wikipedia lol :P
i wonder how big that site is today and how long it would take?
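a basic run seems to be something like this (untested sketch; example.com and the folder name are just placeholders for whatever site you pick):

# untested sketch: mirror one site into a local folder, staying on its domain
httrack "http://example.com/" -O ./example-mirror "+*.example.com/*" -v

-O sets the output folder, the +*.example.com/* filter keeps the crawl on that domain, and -v gives verbose output.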
|
|
-crazone
from smashing acid over and over on 2011-02-25 06:03 [#02407133]
Points: 11233 Status: Regular | Show recordbag
|
|
Hi freqy, what's the use of your thread?
|
|
zoomancer
from Kabul (Afghanistan) on 2011-02-25 11:16 [#02407139]
Points: 1215 Status: Regular
|
|
i remember there used to be software you could do it with, but I am not sure that software would be able to hack it with all of the dynamic content and FLVs these days.
however, you can download an entire website in PDF form.. all the pages and all the hyperlinked stuff.. if you have the full version of Adobe Acrobat... not just the Reader but the whole shemozzle...
|
|
freqy
on 2011-02-25 12:34 [#02407142]
Points: 18724 Status: Regular | Show recordbag
|
|
cheers zoomancer, i shall check it out. sounds good with the linked stuff also.
i managed to get httrack working on linux, man. i am downloading a site but i don't know if it will ever end, people may be uploading quicker than i am downloading, lol. but anyway, loads of info, which is great.
peace.
|
|
cx
from Norway on 2011-02-25 15:22 [#02407148]
Points: 4537 Status: Regular
|
|
It could literally go on to crawl large portions of the web if you just set it to download all links and all content. The key is to find filters that prohibit certain pages from being downloaded while still allowing the exceptions. One filter is to set a recursion depth of 1 for all domains: then it won't go into all the menus of the sites and download a bunch of stuff you don't want, but will grab only the bookmarked page and one page per link. There is a program called Teleport Pro for Windows, and wget for Linux; both work pretty well.
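with wget the depth-1 idea would look roughly like this (untested sketch; the URL and folder name are placeholders):

# untested sketch: save one page plus the pages it links to, nothing deeper
wget --recursive --level=1 --convert-links --page-requisites --wait=1 \
     --directory-prefix=./grab http://example.com/some-article

--level=1 is the recursion-depth filter: wget saves the page itself and the pages it links to, then stops instead of crawling the whole site; --wait=1 pauses a second between requests to go easy on the server.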
|
|
Advocate
on 2011-02-25 17:37 [#02407149]
Points: 3319 Status: Lurker
|
|
imagine downloading xltronic...!
wowee zowee.. a wealth of nonsense right there!
|
|
nightex
from Šiauliai (Lithuania) on 2011-02-25 22:33 [#02407156]
Points: 1275 Status: Lurker
|
|
I believe you need to download the wiki. Other projects are too "small" for analysis.
There is a lot of software for downloading sites, and you know it.
|
|
zoomancer
from Kabul (Afghanistan) on 2011-02-26 13:40 [#02407184]
Points: 1215 Status: Regular
|
|
LAZY_WebRipper : here is a version of the kind of software I used to use.
also, any wiki pages you want, you can download as a handy PDF file; they have a link on every page... just have to look for it in the left pane.
|
|
freqy
on 2011-02-26 15:32 [#02407192]
Points: 18724 Status: Regular | Show recordbag
|
|
thanks again zoomancer
: )
|
|
|