I was looking to download the Wikipedia data dumps so that I always have them to hand wherever I am, since I don't always have an internet connection.
Anyways... When I go to the page I just can't figure out which file to download. Could anyone direct me to the correct file?
How about the page that talks about it? Google is awesome.
Yeah, I found that page, but I got confused by the long list of files. I was expecting just links to the dumps.
I've actually read it properly now and got it. So thanks.
Er, you do realise that the download, already several gigabytes by itself, is compressed at a ratio of up to 20:1? The uncompressed source could be something like 30GB.
I have lots of space to spare, so it doesn't really bother me that much. Plus the download was only about 2.5GB, and uncompressed it turned out to be just under 10GB. I was expecting it to be much bigger actually. lol
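For reference, decompressing it was just this (-d decompresses, -k keeps the original archive; the filename here is only an example, use whichever dump you grabbed):

    bzip2 -dk enwiki-latest-pages-articles.xml.bz2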
I am having a problem getting it into the MediaWiki install I've set up, though.
I'm using xml2sql to create pages.sql, revision.sql and text.sql.
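Roughly like this, as far as I can reconstruct it (the -m switch is supposed to make xml2sql emit MySQL INSERT statements rather than tab-delimited files, but I'm going from memory, so check its usage message):

    xml2sql -m enwiki-latest-pages-articles.xml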
I used the mysql command line to import pages.sql with no problem, but when I try revision.sql I keep getting an error saying the column count doesn't match the value count, or something like that.
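For what it's worth, the imports themselves are just this (the database and user names are from my own setup):

    mysql -u wikiuser -p wikidb < pages.sql
    mysql -u wikiuser -p wikidb < revision.sql

From what I can gather, the message is MySQL's "Column count doesn't match value count" error, which would fit if the generated INSERT statements don't name their columns while my revision table has extra ones (maybe xml2sql targets an older MediaWiki schema than the one I installed?). A sketch of the difference, with made-up values and column names from MediaWiki's revision table:

    -- Fails when the table defines more columns than the values supplied:
    INSERT INTO revision VALUES (1, 10, 1, 'imported', 0, 'Admin', '20060101000000', 0);

    -- Naming the columns explicitly works, as long as the remaining
    -- columns have defaults:
    INSERT INTO revision (rev_id, rev_page, rev_text_id, rev_comment,
        rev_user, rev_user_text, rev_timestamp, rev_minor_edit)
    VALUES (1, 10, 1, 'imported', 0, 'Admin', '20060101000000', 0);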