Task:Wiki dumps
This task is in the list of maemo.org development proposals; please help plan it and get it ready for a sprint. Put a note on the talk page if you're interested in helping work on this task. Please see the talk page for discussion.
Public weekly SQL dumps of the maemo.org wiki should be made available. This would allow the community to make their own copies of the dumps for backups and for use with offline viewers such as evopedia.
There are two types of dump of MediaWiki:

- A database dump, which includes account data and site information, made with:

  ```
  mysqldump -h<hostname> -u<user> -p <dbname>
  ```

- An XML dump of the page contents, made on the shell on the server with:

  ```
  php $MEDIAWIKI_DIR/maintenance/dumpBackup.php --full > mediawiki_dump.xml
  ```
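As a rough sketch of how these two dumps could be produced on a weekly schedule, a script along the following lines could be run from cron. The database name, credentials file, install path, and output directory here are placeholders for illustration, not the actual maemo.org configuration:

```sh
#!/bin/sh
# Hypothetical weekly dump script -- all paths, names, and credentials below
# are examples, not the real maemo.org setup.
set -e

MEDIAWIKI_DIR=/usr/share/mediawiki   # assumed MediaWiki install directory
OUTDIR=/var/www/dumps                # assumed publicly served download directory
STAMP=$(date +%Y%m%d)

# SQL dump of the whole database (note: this includes account data).
mysqldump --defaults-extra-file=/etc/mediawiki/dump.cnf wikidb \
    | gzip > "$OUTDIR/wiki-$STAMP.sql.gz"

# Full XML dump of all page revisions.
php "$MEDIAWIKI_DIR/maintenance/dumpBackup.php" --full \
    | gzip > "$OUTDIR/wiki-$STAMP.xml.gz"
```

The credentials file passed via --defaults-extra-file would hold the database user and password so the script can run unattended; a weekly crontab entry pointing at the script would complete the setup.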
All image files and attachments are stored on the filesystem in `/var/lib/mediawiki/images` and can be archived as a .tar.gz file.
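For instance, a dated archive of that directory could be produced with the following command; the output file name is only an example:

```sh
# Archive the uploaded images and attachments from the path given above.
tar -czf "mediawiki_images_$(date +%Y%m%d).tar.gz" -C /var/lib/mediawiki images
```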