|
|
freqy
on 2015-01-10 00:08 [#02482122]
Points: 18724 Status: Regular | Show recordbag
|
|
A possibility?
Funds to help hire more hands to repair the code and upgrade the site to V2?
|
|
EpicMegatrax
from Greatest Hits on 2015-01-10 00:31 [#02482132]
Points: 25264 Status: Regular
|
|
i've offered to help a couple times; most of what i do for a living is PHP nonsense.
however, it's phobs' baby. even aside from whether he'd trust a member to work on it, he may not want anyone else tinkering around with his code, the same way you wouldn't want other people driving your car.
a v2, though? let's not forget what happened when myspace deleted everyone's photos and expected that to bring back its glory days. once you chuck the roots, the old members won't have a reason to come back, and the glory days won't come back either.
|
|
freqy
on 2015-01-10 00:36 [#02482136]
Points: 18724 Status: Regular | Show recordbag
|
|
There are still many... many more glory days to be enjoyed.
|
|
EpicMegatrax
from Greatest Hits on 2015-01-10 05:07 [#02482164]
Points: 25264 Status: Regular
|
|
it's sort of like a really, really long feedback delay loop. after it's been going for a few years, it starts to be in sync with things on a deeper level. the chap manning the loop machine gets home at roughly 6pm every day, but not exactly, so there's a constellation of slamming doors around 6pm. then the man pulls the plug, upgrades to a new tape machine with fancier trim, but... the loop is gone.
|
|
belb
from mmmmmmhhhhzzzz!!! on 2015-01-10 06:32 [#02482165]
Points: 6384 Status: Lurker
|
|
this make sense to anyone? been a good while since I did any real code, console-run obvs
wget -erobots=off --no-parent --wait=3 --limit-rate=20K -r -p -U "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)" -A htm,html,css,js,json,gif,jpeg,jpg,bmp http://example.com
"this will grab a site, wait 3 seconds between requests, limit how fast it downloads so it doesn't kill the site, and mask itself in a way that makes it appear to just be a browser so the site doesn't cut you off using an anti-leech mechanism. Note the -A parameter that indicates a list of the file types you want to download. You can also use another tag, -D domain1.com,domain2.com to indicate a series of domains you want to download if they have another server or whatever for hosting different kinds of files. There's no safe way to automate that for all cases, if you don't get the files. wget is commonly preinstalled on Linux, but can be trivially compiled for other Unix systems or downloaded easily for Windows: GNUwin32 WGET"
get this place a fuckin ambo or on the getgo. HALP
|
|
belb
from mmmmmmhhhhzzzz!!! on 2015-01-10 06:36 [#02482166]
Points: 6384 Status: Lurker
|
|
I mean, let's not fuck about here, the dead DEADline is still unknown. GO. even if it's just ricey yugomab having a crack at it, better than this sad apathy
|
|
EpicMegatrax
from Greatest Hits on 2015-01-10 06:49 [#02482167]
Points: 25264 Status: Regular
|
|
i used a perl script writing everything into a database (mariadb). i haven't run it in a month or two; i should do that, for completionism i suppose. i didn't get around to spidering the user info pages, though, because those require login. not sure if i'll have time for that part. people will just have to email me
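if anyone fancies a crack at the user pages, wget can carry a login session via cookies. rough sketch, untested, and the login URL and form field names (user/pass) are guesses:
# log in once and keep the session cookie
wget --save-cookies cookies.txt --keep-session-cookies --post-data "user=NAME&pass=PASS" -O /dev/null http://example.com/login.php
# then spider the user pages with the cookie jar
wget --load-cookies cookies.txt -erobots=off --wait=3 --limit-rate=20K -r -p http://example.com/userinfo.php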
|
|
EpicMegatrax
from Greatest Hits on 2015-01-10 06:53 [#02482168]
Points: 25264 Status: Regular
|
|
LAZY_TITLE
|
|
Junktion
from Northern Jutland (Denmark) on 2015-01-10 08:33 [#02482170]
Points: 9713 Status: Lurker
|
|
just let it go
|
|
goDel
from ɐpʎǝx (Seychelles) on 2015-01-10 09:32 [#02482174]
Points: 10225 Status: Lurker | Followup to Junktion: #02482170
|
|
sounds like Frozen ;D
|
|