Evopedia
Evopedia is an offline Wikipedia viewer. It allows searching for articles by their title or by their geographic location on a map. If there is an internet connection, images are displayed as well.
Apart from the software, you also need to download a specially prepared dump of the articles, available in the Dumps section below.
Dumps
Dumping Wikipedia is a very time-consuming task. Please be patient or help create the dumps.
Please report if you find errors in the dumps.
New dumps:
- German Wikipedia, 2010-01-17 (2.2 GB)
Converted dumps:
These dumps are from older versions of evopedia; some features may be missing.
- Dutch Wikipedia, 2009-10-16 (0.8 GB)
- English Wikipedia, 2009-02-28 (4.9 GB)
- Esperanto Wikipedia, 2009-11-17 (0.1 GB)
- French Wikipedia, 2009-10-11 (1.9 GB)
- Italian Wikipedia, 2009-11-09 (1.7 GB)
- Japanese Wikipedia, 2009-03-01 (1.5 GB) (pages are a bit broken but content should be there)
- Spanish Wikipedia, 2009-11-16 (1.4 GB)
Dump Installation
Extract one (or more) of the .zip files to your device and select it from within evopedia. The dump that is used can be changed using the (second) link at the top left of the search screen.
Note that you cannot directly download dumps larger than 4 GB onto your device (FAT file size limitation). Download such a dump to a computer with a non-FAT filesystem and then unzip it directly onto your device (via USB, for example).
Tweaks
Evopedia creates a configuration file at /home/user/.evopediarc when it is first started. Most of the options are self-explanatory.
By changing listen_address to 0.0.0.0 you (or anybody else, so be careful) can also access evopedia from a different computer using the URL http://<IP address of your device>:8080/
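For example, assuming the same key = value format used for the other options in .evopediarc, the relevant line would read:
listen_address = 0.0.0.0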
maptile_repositories is a string that specifies which map types are available and where the tile images are stored. If it is empty, tiles are not stored at all and only OpenStreetMap is available. If this value is not set automatically, you can set it to
maptile_repositories = [OpenStreetMap I|http://tile.openstreetmap.org/%d/%d/%d.png|/home/user/MyDocs/.maps/OpenStreetMap I/|0,OpenStreetMap II|http://tah.openstreetmap.org/Tiles/tile/%d/%d/%d.png|/home/user/MyDocs/.maps/OpenStreetMap II/|0,OpenCycleMap|http://c.andy.sandbox.cloudmade.com/tiles/cycle/%d/%d/%d.png|/home/user/MyDocs/.maps/OpenCycleMap/|0,Public Transport|http://tile.xn--pnvkarte-m4a.de/tilegen/%d/%d/%d.png|/home/user/MyDocs/.maps/Public Transport/|0]
(without line breaks) to use the files also used by maep.
Creating a Wikipedia Dump
Rumor has it that an automatic distributed dump system will be released soon, so the following information is somewhat outdated.
Creating a Wikipedia dump is unfortunately a rather lengthy process (several days). If you have any suggestions for speeding it up, please step forward. Concerning disk space, 20 GB should be enough for creating the dump.
For advanced users: the process can be distributed over several computers by calling dumpWiki on each computer with the number of the slice it should process.
1. install needed software (please add packages if I forgot some):
apt-get install php5-cli php5-mysql python mysql-server mysql-client wget zip tidy git-core texlive texlive-math-extra
On some distributions, texlive must be replaced by tetex and texlive-math-extra by tetex-extra.
2. download evopedia source code
git clone git://github.com/crei/evopedia.git
3. adjust settings
set php's memory_limit to some higher value (128 MB) in /etc/php5/cli/php.ini (or similar)
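For example, the relevant line in php.ini would then read (128M is only a suggestion, higher values also work):
memory_limit = 128M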
create empty mysql database and grant all rights to some user
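A minimal sketch of this step using the mysql client (the database name, user name and password are just examples; use whatever you configure in createdump.sh):
mysql -u root -p -e "CREATE DATABASE evopedia; GRANT ALL PRIVILEGES ON evopedia.* TO 'evopedia'@'localhost' IDENTIFIED BY 'secret';"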
edit paths and database settings in dumpscripts/createdump.sh
change the dump language (bottom of dumpscripts/createdump.sh)
4. start the dump process
call createdump.sh