Evopedia is an offline Wikipedia viewer. It allows you to search for articles by title or by geographic location on a map. If an internet connection is available, images are displayed as well.

The information here is partly outdated. Please visit the project website.

Since downloads of the dumps caused 2 TB of data transfer on the server in one week, the big dumps are now only available via BitTorrent. Please install a BitTorrent client (if not already installed on your system) to use the .torrent files.

New dumps:

Note that you need evopedia version 0.3.0 (NOT 0.3.0 RC 3) for the search function to work properly for languages not based on the Latin alphabet.

Dumping Wikipedia is a very time-consuming task. Please be patient or help create the dumps.

Please report any errors you find in the dumps.

Distributed dump processing

Dump Installation

Extract one (or more) of the .zip files to your device and select it from within evopedia. The dump in use can be changed using the (second) link at the top left of the search screen.

Note that you cannot directly download dumps greater than 4 GB onto your device (FAT file size limitation). Please download them to a computer with a non-FAT filesystem and then unzip them directly onto your device (via USB, for example).
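For example, on a Linux computer with the device mounted over USB (the archive name and mount point below are placeholders, not actual names):

unzip dump.zip -d /media/device/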


Configuration

Evopedia creates a configuration file at /home/user/.evopediarc when it is first started. Most of the options are self-explanatory.

By changing listen_address to 0.0.0.0, you (or anybody else, so be careful) can also access evopedia from a different computer using the URL http://<ip address of your device>:8080/
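A minimal sketch of the relevant line in /home/user/.evopediarc, assuming the same key = value syntax as the maptile_repositories example below (0.0.0.0 makes evopedia listen on all network interfaces):

listen_address = 0.0.0.0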

maptile_repositories is a string that specifies which map types are available and where the tile images are stored. If it is empty, tiles are not stored at all and only OpenStreetMap is available. If it is not set automatically, you can set it to

maptile_repositories = [OpenStreetMap I|http://tile.openstreetmap.org/%d/%d/%d.png|/home/user/MyDocs/.maps/OpenStreetMap I/|0,OpenStreetMap II|http://tah.openstreetmap.org/Tiles/tile/%d/%d/%d.png|/home/user/MyDocs/.maps/OpenStreetMap II/|0,OpenCycleMap|http://c.andy.sandbox.cloudmade.com/tiles/cycle/%d/%d/%d.png|/home/user/MyDocs/.maps/OpenCycleMap/|0,Public Transport|http://tile.xn--pnvkarte-m4a.de/tilegen/%d/%d/%d.png|/home/user/MyDocs/.maps/Public Transport/|0]

(without line breaks) to use the files also used by maep.
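Judging from the value above, each comma-separated entry seems to follow the pattern name|URL template|tile cache directory|flag, with the three %d placeholders presumably standing for the zoom level and tile coordinates. A configuration with a single map type would then be:

maptile_repositories = [OpenStreetMap I|http://tile.openstreetmap.org/%d/%d/%d.png|/home/user/MyDocs/.maps/OpenStreetMap I/|0]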

Creating a Wikipedia Dump

Please consider joining the distributed dump system mentioned above. Also, the information below is a bit outdated.

Creating a Wikipedia dump is unfortunately a rather lengthy process (several days). If you have any suggestions for speeding it up, please step forward. As for disk space, 20 GB should be enough for creating the dump.

For advanced users: the process can be distributed over several computers by calling dumpWiki on each computer with the number of the slice it should process.
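A hypothetical sketch of such a distributed run, assuming dumpWiki takes the number of the slice to process as its only argument (the exact interface and slice numbering are assumptions):

# on the first computer (assuming slices are numbered from 0)
./dumpWiki 0
# on the second computer
./dumpWiki 1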

1. Install the needed software (please add packages if some are missing):

apt-get install php5-cli php5-mysql python mysql-server mysql-client wget zip tidy git-core texlive texlive-math-extra 

On some distributions, texlive must be replaced by tetex and texlive-math-extra by tetex-extra.

2. Download the evopedia source code


Note: please don't use the GitHub address anymore; it is deprecated:

git clone git://github.com/crei/evopedia.git (deprecated)

3. Adjust settings

Set PHP's memory_limit to a higher value (e.g. 128 MB) in /etc/php5/cli/php.ini (or similar).
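The relevant line in /etc/php5/cli/php.ini would then read:

memory_limit = 128M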

Create an empty MySQL database and grant all rights to some user, e.g. for a database named wikidb, user thomas and password x1lx:

> mysql -p
mysql> create database wikidb;
mysql> grant all privileges on wikidb.* to thomas@localhost identified by 'x1lx';
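To verify the grant, connect to the new database as that user (you will be prompted for the password):

mysql -u thomas -p wikidb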

Edit the paths and database settings in dumpscripts/createdump.sh (especially dbuser, password, REPODIR and DUMPDIR) and make sure DUMPDIR points to an area with plenty of space.

Change the dump language (the "de" in the "for" statement at the bottom of dumpscripts/createdump.sh).
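A purely hypothetical excerpt of dumpscripts/createdump.sh to illustrate the settings above (only the variable names and the "de" come from this page; the real script contents differ):

# database credentials and paths to adjust
dbuser=thomas
password=x1lx
REPODIR=$HOME/evopedia
DUMPDIR=/mnt/storage/dumps   # point this to an area with plenty of space

# at the bottom of the script: replace "de" with the language to dump
for lang in de; do
    echo "dumping $lang"     # the real script invokes the dump tools here
done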

Make the ~/evopedia/evopedia/evopedia/*.py files executable (chmod a+x ~/evopedia/evopedia/evopedia/*.py).

4. Start the dump process

Call createdump.sh.
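Assuming you are in the top-level directory of the source checkout:

sh dumpscripts/createdump.sh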

(Don't be too optimistic, though. After part of a day, the database is filled with several gigabytes of downloaded Wikipedia content, but something may still be broken and the final dump may give no results.)