Help:Export is a quick and easy way to save all pages on your wiki.

MediaWiki stores important data in two places:

Database: pages and their contents, users and their preferences, metadata, search index, etc.
File system: software configuration files, custom skins, extensions, images (including deleted images), etc.

Consider making the wiki read-only before creating the backup; see $wgReadOnly. This makes sure all parts of your backup are consistent (some of your installed extensions may write data nonetheless).

You will have to choose a method for transferring files from the server where they are:

- Non-private data you can simply publish on and/or in a dumps/ directory of your webserver.
- SCP (or WinSCP), SFTP/FTP, or any other transfer protocol you choose.
- The hosting company might provide a file manager interface via a web browser; check with your provider.

Most of the critical data in the wiki is stored in the database. If your wiki is currently offline, its database can be backed up by simply copying the database file.

When using the default MySQL or MariaDB backend, the database can be dumped into a script file which can be used later to recreate the database and all the data in it from scratch. While dumping, you can make the wiki read-only by setting:

$wgReadOnly = 'Dumping Database, Access will be restored shortly';

This can be removed as soon as the dump is completed.

Example of the command to run on the Linux/UNIX shell:

mysqldump -h hostname -u userid -p --default-character-set=whatever dbname > backup.sql

Substitute hostname, userid, whatever, and dbname as appropriate. All four may be found in your LocalSettings.php (LSP) file. Hostname may be found under $wgDBserver; by default it is localhost. Userid may be found under $wgDBuser, and whatever may be found under $wgDBTableOptions, where it is listed after "DEFAULT CHARSET=". If whatever is not specified, mysqldump will likely use the default of utf8, or, if using an older version of MySQL, latin1. Dbname may be found under $wgDBname.

After running this line from the command line, mysqldump will prompt for the server password (which may be found under Manual:$wgDBpassword in LSP). See mysqldump for a full list of command line parameters.

The output from mysqldump can instead be piped to gzip, for a smaller output file, as follows:

mysqldump -h hostname -u userid -p dbname | gzip > backup.sql.gz

Some newer versions of MySQL might show an error about tablespaces and the PROCESS privilege. The solution is to add the --no-tablespaces option to the command:

mysqldump --no-tablespaces -h hostname -u userid -p dbname | gzip > backup.sql.gz

A similar mysqldump command can be used to produce XML output instead, by including the --xml parameter:

mysqldump -h hostname -u userid -p --xml dbname > backup.xml

And to compress the file with a pipe to gzip:

mysqldump -h hostname -u userid -p --xml dbname | gzip > backup.xml.gz

Remember to also back up the file system components of the wiki that might be required, e.g., images, logo, and extensions.

Cron is the time-based job scheduler in Unix-like computer operating systems. Cron enables users to schedule jobs (commands or shell scripts) to run periodically at certain times or dates. A sample command that you may run from a crontab may look like this:

nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+%Y%m%d').sql.gz

Use valid values for $USER, $PASSWORD, and $DATABASE. The nice -n 19 lowers the priority of the process. This will write a backup file with the date in the filename, so you will have a rolling set of backups. If you want to save the files and extensions as well, you might want to use a script that covers those too.

Do not attempt to back up your MediaWiki database using mysqlhotcopy. The table format used by MediaWiki cannot be backed up with this tool, and it will fail silently!

If you want to add this task to cron through cPanel, you must escape the character "%":

/usr/bin/mysqldump -u $USER --password=$PASSWORD $DATABASE -c | /bin/gzip > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz

Otherwise cron will report errors such as:

/bin/sh: -c: line 0: unexpected EOF while looking for matching `''
/bin/sh: -c: line 1: syntax error: unexpected end of file

Some of the tables dumped have different degrees of temporariness, so to save disk space (beyond just gzipping), although those tables need to be present in a proper dump, their data does not. However, under certain circumstances the disadvantage of having to rebuild all this data may outweigh saving disk space (for example, on a large wiki where restoration speed is paramount).

See the mailing list thread "mysql5 binary schema" about the topic. See the relevant section of the upgrading page for information about this process. Also see the talk page for more information about working with character sets in general.

You can use the pg_dump tool to back up a MediaWiki PostgreSQL database, for example dumping the mywiki database to mywikidump.sql.
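The dated crontab pipeline described above can be wrapped in a small script. A minimal sketch, assuming mysqldump is installed; wikiuser, secret, and wikidb are illustrative stand-ins for the $wgDBuser, $wgDBpassword, and $wgDBname values from LocalSettings.php:

```shell
#!/bin/sh
# Sketch of a dated MediaWiki database backup (placeholder values).
USER=wikiuser        # $wgDBuser in LocalSettings.php
PASSWORD=secret      # $wgDBpassword
DATABASE=wikidb      # $wgDBname
BACKUPDIR="$HOME/backup"

# Dated filename, e.g. wiki-wikidb-20240101.sql.gz
DUMPFILE="$BACKUPDIR/wiki-$DATABASE-$(date '+%Y%m%d').sql.gz"

# Only attempt the dump if mysqldump is actually available.
if command -v mysqldump >/dev/null 2>&1; then
    mkdir -p "$BACKUPDIR"
    nice -n 19 mysqldump -u "$USER" --password="$PASSWORD" "$DATABASE" -c \
        | nice -n 19 gzip -9 > "$DUMPFILE"
fi
```

Because the filename embeds the date, each run produces a new file instead of overwriting yesterday's dump.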
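The gzip step in the pipelines above is lossless, so a compressed dump restores byte-for-byte. A quick self-contained check using a stand-in file (fake.sql is an illustrative stand-in, not a real MediaWiki dump):

```shell
#!/bin/sh
# Demonstrate that gzip -9 compression of a dump is lossless:
# compress a stand-in SQL file and verify the round trip.
printf 'CREATE TABLE page (page_id INT);\n' > fake.sql
gzip -9 -c fake.sql > fake.sql.gz
gunzip -c fake.sql.gz > roundtrip.sql
cmp -s fake.sql roundtrip.sql && echo "round trip OK"
```

The same `gunzip -c` invocation is what you would use to feed a compressed dump back to the mysql client when restoring.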
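The pg_dump command itself did not survive in this copy of the article; under the standard pg_dump CLI, a dump of the mywiki database to mywikidump.sql would look like the following sketch (a hypothetical reconstruction, to be run as a database user with read access to mywiki):

```shell
#!/bin/sh
# Hypothetical reconstruction of the elided pg_dump command:
# dump the "mywiki" PostgreSQL database into mywikidump.sql.
if command -v pg_dump >/dev/null 2>&1; then
    pg_dump mywiki > mywikidump.sql
fi
```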