Backup data from DokuWiki

Use this if you have lost your FTP access, for example. You MUST allow PHP in pages to use this. It relies on PEAR and Archive_Tar, and works in SAFE mode. It is completely insecure: the tar.gz file remains on the server after use.

USE ONLY WHEN ALL OTHER MEANS HAVE FAILED. Copy and paste the contents of this file into a DokuWiki page on your wiki.

One-time backup

This backs up the entire DokuWiki directory into a bzip2-compressed tar file. It is a simplified version of the script below, and the commands are meant to be run manually. They assume that you are in the parent directory of the DokuWiki install. My entire DokuWiki is in ~/wiki 1), so I ran this from ~. The second and third commands are not strictly needed, but they help if you want to keep multiple backups.

cd ~
mkdir wikibackup
cd wikibackup/
tar -cjf wiki.tar.bz2 ../wiki

Alternate command

A version of the above, with the addition of the current date to the filename.

tar -cjf backupDir/wiki.`date '+%d%m%y'`.tar.bz2 ../wiki
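
To restore from such an archive, here is a minimal sketch (assuming GNU tar, which strips the leading ../ when creating the archive, so the contents are stored under wiki/):

cd ~
tar -xjf wikibackup/wiki.tar.bz2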

Backup Script

This small Bash shell script (it probably works with other shells too) creates backups of your wiki's data/ and media/ directories. It keeps several old backups (see the config section) as well as archive backups created on the first day of each month. You should put it in cron.daily, otherwise it does not really work as intended; see the scheduling sketch after the script. – dp paul [at] thewall [dot] de

dw-backup.sh
#!/bin/sh
# $Id: dw-backup.sh 328 2004-12-22 13:15:20Z dp $
 
# config 
WIKIPATH="/home/dp/www/dokuwiki" # path to your wiki, no symbolic links are allowed!
# for debian etch: /var/lib/dokuwiki
 
BACKUPPATH="/home/dp/dw-backup" # where do you save the backups?
DAILY_DATA_BACKUPS="8" # keep this amount data backups
DAILY_MEDIA_BACKUPS="3" # and media backups
# no more config
 
# creates $1 if it does not exist
checkDir()
{
	if [ ! -d "${BACKUPPATH}/$1" ]
	then
		mkdir -p "${BACKUPPATH}/$1"
	fi
}
 
# 1 -> path
# 2 -> name
# 3 -> number of backups
rotateDir()
{
	for i in `seq $(($3 - 1)) -1 1`
	do
		if [ -f "$1/$2.$i.tar.bz2" ]
		then
			mv "$1/$2.$i.tar.bz2" "$1/$2.$((i + 1)).tar.bz2"
		fi
	done
}
 
 
# make sure everything exists
checkDir "data"
checkDir "data/archive"
checkDir "data/daily"
 
checkDir "media"
checkDir "media/archive"
checkDir "media/daily"
 
# first step: rotate daily.
rotateDir "${BACKUPPATH}/data/daily" "data" "$DAILY_DATA_BACKUPS"
rotateDir "${BACKUPPATH}/media/daily" "media" "$DAILY_MEDIA_BACKUPS"
 
# then create our backup
#   --exclude is not accepted for Linksys NSLU2 box, any alternative?
tar --exclude=".*" -cjf "/tmp/data.1.tar.bz2" -C "${WIKIPATH}" "data"
tar --exclude=".*" -cjf "/tmp/media.1.tar.bz2" -C "${WIKIPATH}" "media"
 
# for debian etch, replace "media" by "data/media" in line above
# and add --exclude="media" to first tar line
 
 
# create an archive backup?
if [ `date +%d` = "01" ]
then
	cp "/tmp/data.1.tar.bz2" "${BACKUPPATH}/data/archive/data-"`date +%m-%d-%Y`".tar.bz2"
	cp "/tmp/media.1.tar.bz2" "${BACKUPPATH}/media/archive/media-"`date +%m-%d-%Y`".tar.bz2"
fi
 
# add them to daily.
mv "/tmp/data.1.tar.bz2" "${BACKUPPATH}/data/daily"
mv "/tmp/media.1.tar.bz2" "${BACKUPPATH}/media/daily"

An rsync alternative

The script above works perfectly fine, thanks!

That being said, for bigger wikis the archive can grow very large when a daily tar file of the whole site is kept, especially because of the media directory.
Furthermore, I would like to be able to run the backup script on a trusted and more secure server. So here is another way of doing the backup, possibly from another server, that keeps only one copy of the site plus daily differences, using rsync.

This script is far from well written, so please feel free to make any improvements you like!

#!/bin/bash
#========================
# backup script for stuff on wiki to be run periodically through a crontab on a trusted server
# > crontab -e
#
# daily : 1 1 * * * /path/to/script/wwwbackup.sh
# weekly: 1 1 * * 0 /path/to/script/wwwbackup.sh # a weekly backup is probably enough for an infrequently updated wiki
#========================
here=`pwd`
mydate="`date '+%Y%m%d.%H%M'`"
 
# Define wiki location
myhost='www'                       #name of the server hosting your wiki
myuser='user'                      #name of the user that has an ssh access to "myhost"
relpathtowiki='relpath/to/wiki/'   #relative path to your wiki from "myhost" base
backupsource="${myuser}@${myhost}:${relpathtowiki}"
#backupsource="/abs/path/to/wiki/" #Use this line instead of the above if you run the backup script directly on your www server.
 
# Define location of backup
backupdir="/path/to/backup/${myhost}"
logfile="${backupdir}/backup.log"
excludelist="${here}/wwwbackup-exclude-list.txt"
 
bkname="backup"
 
nbbackup="7" # keep this amount of old backup
 
#-- creates $1 if it does not exist
checkDir() {
    if [ ! -d "${backupdir}/$1" ] ; then
        mkdir -p "${backupdir}/$1"
    fi
}
 
# 1 -> path
# 2 -> name
# 3 -> number of backups
rotateDir() {
    for i in `seq $(($3 - 1)) -1 0`
      do
      if [ -d "$1/$2-$i" ] ; then
          /bin/rm -rf "$1/$2-$((i + 1))"
          mv "$1/$2-$i" "$1/$2-$((i + 1))"
      fi
    done
}
 
#-- make sure everything exists
checkDir "archive"
checkDir "daily"
 
#-- first step: rotate daily.
rotateDir "${backupdir}/daily" "$bkname" "$nbbackup"
 
mv ${logfile} ${backupdir}/daily/${bkname}-1/
 
cat >> ${logfile} <<_EOF
===========================================
  Backup done on: $mydate
===========================================
_EOF
 
#-- Do the backup and save difference in backup-1
mkdir -p ${backupdir}/daily/${bkname}-1/
mkdir -p ${backupdir}/daily/${bkname}-0/
cd ${backupdir}/daily/${bkname}-0
rsync -av --whole-file --delete --force \
    -b --backup-dir ${backupdir}/daily/${bkname}-1/ \
    --exclude-from=${excludelist} \
    $backupsource . \
    1>> ${logfile} 2>&1
 
#-- create an archive backup every month
lastarchivetime="0"
if [ -r ${backupdir}/lastarchivetime ] ; then
  lastarchivetime=`cat ${backupdir}/lastarchivetime`
fi
now=`date +%j`
let diffday=$now-$lastarchivetime
if [ $diffday  -ge 30 -o $diffday -lt 0 ] ; then
    echo $now > ${backupdir}/lastarchivetime
    cd ${backupdir}/daily
    tar -cjf ${backupdir}/archive/${bkname}-${mydate}.tar.bz2 ./${bkname}-0
fi

The script uses an exclude file listing all file and directory name patterns that you do not want included in the backup. Here is a copy of my wwwbackup-exclude-list.txt file:

data/.cache/*
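
To push the most recent backup back to the web server, a sketch along these lines should work (reusing the placeholder paths and names from the script above; adjust them to your setup):

rsync -av /path/to/backup/www/daily/backup-0/ user@www:relpath/to/wiki/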

Download a copy of your wiki installation

Place this script in your DokuWiki directory and give your web server the rights to execute it, and you will be able to download a compressed copy of your entire DokuWiki installation. Note: the script uses an exclude file to keep certain files out of your backup (for example your cache directory); a hypothetical example of such a file is shown after the script.

#!/bin/bash
 
echo 'Content-Type: application/octet-stream; name="dokuwiki.tar.bz2"'
echo 'Content-Disposition: attachment; filename="dokuwiki.tar.bz2"'
echo
 
tar --exclude-from=.excluir -cvjf - ../dokuwiki/
How would you actually download the tarball? A link in the wiki? – samaritan dshackel [at] arbor [dot] edu
You can just enter the URL manually. It should look like this: http://yourhost.com/yourwikidir/backupscript.cgi – ChristopherArndt 2005-10-17 13:12
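
The exclude file referenced by --exclude-from (.excluir) is not shown here; as a purely hypothetical example, it could simply list the patterns tar should skip, one per line:

data/.cache/*
data/locks/*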

Download a copy of your wiki installation (PHP version)

I like the elegance of the bash script, but some installations may not have .cgi or shells as a processing option. Since PHP is a given, it makes sense to work from there.

Header redirects often confuse browsers, so we'll sacrifice some file space rather than just using a pipe. This has the side effect of leaving behind a file we can grab again if the download chokes, which can be useful.

<?php
$archFile = "dokuwiki.tar.gz";
$dumpFile = "/tmp/dokuwiki.dump.".gmdate("His");
$fullPath = realpath('.');
// remove a leftover archive from a previous run, if any
if (file_exists("$fullPath/$archFile")) unlink("$fullPath/$archFile");
// build the tarball from the parent directory into a temp file, then move it into place
exec("tar -cz -C .. ".basename($fullPath)." > $dumpFile");
rename($dumpFile, "$fullPath/$archFile");
 
$host  = $_SERVER['HTTP_HOST'];
$uri   = rtrim(dirname($_SERVER['PHP_SELF']), '/\\');
header("Location: http://$host$uri/$archFile");
exit;
?>

A more agnostic version of this could be done with the PEAR Tar class, but this works fine on Unix-type installations. – Baavgai 2007-06-08 22:24

Download a copy of your wiki installation (pure PHP version)

For completeness, here's a PEAR Tar version of the above. It should work on any box, including Windows, that has the PHP libraries. Baavgai 2007-06-13 08:37

<?php
require("Archive/Tar.php");
$archFile = "dokuwiki.tar.gz";
$dumpFile = "/tmp/dokuwiki.dump.".gmdate("His");
$fullPath = realpath('.');
if (file_exists("$fullPath/$archFile")) unlink("$fullPath/$archFile");
$tar = new Archive_Tar($dumpFile, "gz");
 
# make relative path
$tar->createModify(array($fullPath), "dokuwiki", $fullPath);
 
# keep absolute
# $tar->create($files);

rename($dumpFile, "$fullPath/$archFile");
 
$host  = $_SERVER['HTTP_HOST'];
$uri   = rtrim(dirname($_SERVER['PHP_SELF']), '/\\');
header("Location: http://$host$uri/$archFile");
exit;
?>

Download a copy of your wiki installation (modified PHP version)

A slightly modified version of the above. It does not use a header(“Location: …”) redirect. Save it as, e.g., “dump.php” in the root of your DokuWiki installation and open the corresponding URL in your web browser. After saving dokuwiki_YYYY-MM-DD_hh-mm-ss.tar.gz to your local computer, remove both files to avoid security risks.

<?php
require("Archive/Tar.php");
 
$title = "Dump my DokuWiki folder";
$archFile = "dokuwiki_".strftime("%Y-%m-%d_%H-%M-%S").".tar.gz";
$dumpFile = "/tmp/dokuwiki.dump.".gmdate("U");
$fullPath = realpath('.');
 
echo <<<END
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<title>$title</title>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
</head>
<body bgcolor="#FFFFFF">
<h1>$title</h1>
<p>Dumping local folder "$fullPath"<br /> to archive "$archFile"...</p>
END;
 
$tar = new Archive_Tar($dumpFile, "gz");
$tar->createModify(array($fullPath), "dokuwiki", $fullPath);
 
if (file_exists($dumpFile)) {
  rename($dumpFile, "$fullPath/$archFile");
  echo <<<END
<p>Result: successful</p>
<p>Download: <a href="$archFile">$archFile</a></p>
END;
}
else {
  echo <<<END
<p>Result: failed</p>
END;
}
 
echo <<<END
</body>
</html>
END;
?>

(Yet Another) PHP script to download a copy of your DokuWiki installation

  • Does not use an intermediate $dumpFile
  • Uses Content-Disposition
  • Deletes the backup file prior to exit
<?php
$archive = realpath('.') . '/dokuwiki-data.tar.gz';
 
if (is_file($archive)) unlink($archive);
 
exec('tar cz data > ' . basename($archive));
 
header('Content-type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($archive) . '"');
readfile($archive);
 
unlink($archive);
?>
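
To fetch the archive you can simply request the script's URL, for example with a browser or with curl (the script name backup.php and the host are hypothetical; use whatever name and URL you saved it under):

curl -o dokuwiki-data.tar.gz http://yourhost.com/yourwikidir/backup.php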

dreamlusion 2008/3/9 16:00:00

Windows command line script

Although the Windows command shell is not as powerful as Unix shells, it can handle a lot of problems with or without third-party programs (some examples). For this script you must install (or unzip) the ncftp client executables, and then you can run the following script:

ncFtpGetDokuWikiBackup.bat
@echo off
rem place this script in the parent directory of dokuBackup folder.
IF EXIST dokuBackup (
  echo dokuBackup already exists
  echo renaming ...
  IF EXIST dokuBackup_old (
    echo are you confused? dokuBackup_old already exists!
    exit /B 1
  ) ELSE (
    move /Y dokuBackup dokuBackup_old
    mkdir dokuBackup
  )
) ELSE (
  mkdir dokuBackup
)
 
ncftpget -R -u yourUsername -p yourPassword yourWebspaceFTPurl "./dokuBackup" "yourWebSpaceFtpDokuwikiPath/conf"
ncftpget -R -u yourUsername -p yourPassword yourWebspaceFTPurl "./dokuBackup" "yourWebSpaceFtpDokuwikiPath/data/pages"
ncftpget -R -u yourUsername -p yourPassword yourWebspaceFTPurl "./dokuBackup" "yourWebSpaceFtpDokuwikiPath/data/meta"
ncftpget -R -u yourUsername -p yourPassword yourWebspaceFTPurl "./dokuBackup" "yourWebSpaceFtpDokuwikiPath/media"
ncftpget -R -u yourUsername -p yourPassword yourWebspaceFTPurl "./dokuBackup" "yourWebSpaceFtpDokuwikiPath/attic"
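
To run the batch file automatically, one option is to register it with the Windows task scheduler, for instance as below (the path C:\backups is hypothetical; since the script uses relative paths, change into its directory first):

schtasks /create /tn "DokuWikiBackup" /sc daily /st 02:00 /tr "cmd /c cd /d C:\backups && ncFtpGetDokuWikiBackup.bat"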

Via FTP using wget

FTP login stored in $HOME/.netrc:

machine example.net login USER password PW

#!/bin/sh -e
# backup data from wiki (FTP)
unset
url="ftp://example.net/"
backup="/path/to/backup"
wget -q --no-cache -nH -c -t0 --mirror -P$backup -i- <<EOF
        $url/data/pages
        $url/data/meta
        $url/data/media
        $url/data/attic
        $url/conf
EOF

See wget --help for details.
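
Since .netrc contains your FTP password in plain text, it is a good idea to make it readable only by your own user:

chmod 600 $HOME/.netrc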

1)
the tilde ~ means the home directory; the actual full path is longer