Wikipedia offline reader: put all of Wikipedia on your laptop

Wikipedia periodically publishes full data dumps of the encyclopedia’s content. If you wanted to make your own copy of the Wikipedia site for offline viewing, you’d typically convert and import that content into MySQL using MediaWiki’s importDump.php utility. The initial import process can take over a day. Building the indexes for searching articles takes even longer.
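For reference, the conventional route looks roughly like this. It is a sketch: the dump filename and the MediaWiki install path are illustrative, and `importDump.php` is the standard MediaWiki maintenance script that reads page XML from standard input.

```shell
# Conventional (slow) route: expand the XML dump into a MediaWiki/MySQL install.
# Filenames and paths are illustrative.
bunzip2 -k enwiki-latest-pages-articles.xml.bz2

cd /var/www/mediawiki
php maintenance/importDump.php < /path/to/enwiki-latest-pages-articles.xml

# Rebuild derived tables after a bulk import.
php maintenance/rebuildrecentchanges.php
```

It is this import-and-index step, multiplied across millions of pages, that eats the day-plus of processing time.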

Thanassis Tsiodras came up with a better way of using the Wikipedia dump for offline reading:

Wouldn’t it be perfect if we could use the Wikipedia “dump” data JUST as they arrive after the download? Without creating a much larger (space-wise) MySQL database? And also be able to search for parts of title names and get back lists of titles with “similarity percentages”?

The end result is a Wikipedia reader that indexes the entire dump in under 30 minutes, stores the data in its original (though segmented) bz2-compressed format, and comes complete with a lightweight web interface for searching and reading entries.
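The two core ideas can be sketched in a few lines of Python. This is a simplified illustration, not Tsiodras’s actual code: the index layout (`title -> (chunk file, offset)`) and all names here are hypothetical, and standard-library `difflib` stands in for whatever similarity measure the real reader uses.

```python
# Sketch of the reader's two core ideas (hypothetical names and layout):
#  1. keep the dump as small, independently decompressible bz2 chunks and
#     record, per article title, which chunk holds the article text;
#  2. answer queries with fuzzy title matching and "similarity percentages".
import bz2
import difflib

# Hypothetical index built in one pass over the dump.
index = {
    "Python (programming language)": ("chunk-0042.bz2", 1337),
}

def search(query, titles, limit=5):
    """Return (title, similarity %) pairs for a partial/fuzzy title query."""
    scored = [
        (t, round(100 * difflib.SequenceMatcher(None, query.lower(), t.lower()).ratio()))
        for t in titles
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:limit]

def fetch(title):
    """Decompress only the one small chunk containing the article,
    instead of the whole multi-gigabyte dump."""
    chunk, offset = index[title]
    text = bz2.BZ2File(chunk).read().decode("utf-8")
    return text[offset:]
```

Because each chunk is tiny relative to the full dump, a lookup costs one small decompression rather than a database round-trip, and the on-disk footprint stays at the size of the original download.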

Building a (fast) Wikipedia offline reader – Link

WikipediaFS – a Linux MediaWiki file-system – Link

6 thoughts on “Wikipedia offline reader: put all of Wikipedia on your laptop”

  1. naikrovek says:

    That first link is broken, the HTML looks funky.

  2. jason_striegel says:

    Whoops! Thanks, naikrovek. It should be fixed now.

  3. prashanthellina says:

    Wikipedia dumps are a great idea. I am exploring interesting ways to use them. I am posting progress here.
