Word Scrambling Wikipedia for Money

revlimiter

$333.33/day is my goal!
I am reading the Wikipedia copyright terms and they seem to be OK with copying and pasting all of their content as long as you include a link back:
http://en.wikipedia.org/wiki/Wikipedia:Copyrights

Which led me to thinking..

If I created a bot to scrape all of Wikipedia's content.. maybe put it through a word scrambler to help SEO out a little bit.. do you think it would have any chance of making money online? This would be an enormous strain on the scraper bot, since we are talking about millions of pages, but I dunno... maybe it has some potential? And I am not talking about millions of dollars either.. just a few bucks here and there..
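For what it's worth, the "word scrambler" part is the easy bit. A minimal sketch of a synonym-swapping spinner might look like this — the `SYNONYMS` table and the `spin` function are purely hypothetical stand-ins, a real spinner would pull from a full thesaurus:

```python
import random

# Toy synonym table -- purely illustrative; a real spinner would
# load a thesaurus with thousands of entries.
SYNONYMS = {
    "large": ["big", "huge"],
    "small": ["little", "tiny"],
    "fast": ["quick", "rapid"],
}

def spin(text, rng=None):
    """Replace known words with a randomly chosen synonym; keep the rest."""
    rng = rng or random.Random()
    out = []
    for word in text.split():
        lower = word.lower()
        if lower in SYNONYMS:
            out.append(rng.choice(SYNONYMS[lower]))
        else:
            out.append(word)
    return " ".join(out)

print(spin("a large cat and a small dog"))
```

Whether naive word swapping actually helps with SEO (or just produces unreadable text that gets flagged as duplicate content anyway) is another question entirely.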

Anyways.. what are your thoughts / opinions on this?

Thanks,
 


do you realize how much content is on wikipedia? I highly doubt you have enough storage space and servers to scrape even 40-50% of it :)

how would you be making $ from it? making a website that just spits the info back out? nah, don't think so really. maybe if you wanted to make $x or $xx/day, sure, np, have fun with that.

edit also,
http://en.wikipedia.org/wiki/Wikipedia#Reusing_Wikipedia.27s_content
However, while access by human beings is easy, obtaining the full content of Wikipedia for reuse is not. Direct cloning by web bots is forbidden. The various dumps that are meant as a replacement contain no images and may be significantly out of date.[81]
 
Yeah.. there is a lot of content on Wikipedia.. lots and lots of free content.. :)

The whole idea is to automate income and have my webserver do the bulk of the work. Even if it's only an extra $XX per day.. it's still an extra $XX per day that I wouldn't get if I didn't try it.. And for not doing a lick of work to make that money, I think that's a pretty good investment, don't you?

But yeah.. if the webserver(s) and content spinner computers are gonna cost $XXX per day and I'm only making $XX per day then it isn't worth it..

If each article on Wikipedia averages 100KB (and that's a big if), the total storage for Wikipedia would be about 336GB. That's based on their current 3,523,488 articles. Bah.. finding a host with unmetered bandwidth will be hard enough ;P I'll have to shop around...
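The back-of-envelope math checks out, assuming the 100KB-per-article figure and binary (1024-based) units:

```python
articles = 3_523_488           # article count quoted above
avg_size_kb = 100              # assumed average article size (a big if)

# KB -> GB using binary units: 1 GB = 1024 * 1024 KB
total_gb = articles * avg_size_kb / (1024 * 1024)
print(f"{total_gb:.0f} GB")    # roughly 336 GB
```

Of course the real average article size (and whether you store page history, images, etc.) would swing that number a lot.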
 
You can actually download Wikipedia in an archive form. Don't have the link, but I know they had it somewhere.