

Backup Options




Posted by twooly, 05-18-2007, 10:13 AM
So we all know we need to back up our files and not depend on the hosting provider alone. I want to get a feel for how people back up their stuff (MySQL databases, files) on a reseller/standard hosting account, where we don't have access to the server to back up using rsync or cool things like that (only FTP/control panel access). I've been manually backing up my few sites (7 total). I have a way to automatically dump the MySQL databases now, but I'm looking for help with the files themselves: a way to automatically download them, ideally only the files that have changed (one site is rather large, a photo gallery that people can upload files to). Any pointers on what you use (free/paid) would be appreciated. Thanks

Posted by cartika-andrew, 05-18-2007, 12:31 PM
Hello twooly, yes, you can automate this. 1) Use a script, run via cron, that backs up your MySQL databases into your FTP space. 2) Use an FTP tool that is capable of scheduling downloads, and schedule it to pull your entire FTP space down to your PC as often as you want, so you keep a local copy (though this will count against your hosting plan's transfer quota). Additionally, there are providers that give end users access to their backups; ask your host and they should have these tools available for you.
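For step 1, a minimal sketch of such a cron script; DB_USER, DB_PASS, DB_NAME and the paths are placeholders, not real values:

    #!/bin/sh
    # Dump one database, compress it, and keep it inside the account's
    # FTP-accessible space so a scheduled download can pick it up.
    STAMP=$(date +%Y%m%d)
    mkdir -p "$HOME/backup"
    mysqldump -u DB_USER -pDB_PASS DB_NAME | gzip > "$HOME/backup/DB_NAME-$STAMP.sql.gz"

Schedule it from the control panel's cron section, e.g. once a night, and point your FTP tool at the backup/ directory.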

Posted by vantage255, 05-18-2007, 12:55 PM
I used to use an expect script on my workstation to SSH into an account and FTP the user's home dir to an FTP server elsewhere. I cronned it and got a message every time it failed.
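If you'd rather not script the interactive logins with expect, the same idea works with key-based SSH in a plain cron script. A rough sketch; user@host, the backup path, and the notification address are placeholders:

    #!/bin/sh
    # Pull the remote home dir as a tarball over SSH (key auth, so no
    # interactive prompts), and mail a notice if anything fails.
    ssh user@host 'tar -czf - -C "$HOME" .' > "/backups/home-$(date +%Y%m%d).tar.gz" \
      || echo "home dir backup failed" | mail -s "backup failed" admin@example.com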

Posted by twooly, 05-18-2007, 01:18 PM
Thanks for the tip, Andrew; just looking to see if other options exist to only download what has changed. Thanks for replying, vantage255, but I'm just a reseller, so I don't have SSH access.

Posted by anatolijd, 05-18-2007, 01:44 PM
I have set up a couple of cron jobs in my control panel: - the first does a daily mysqldump of all my MySQL databases, gzips them, and puts the file into the ./backup/ directory in my account - another cron job archives my website content (only the files and directories that are really needed) and also puts it in ./backup/ Since I have FTP access to my ./backup directory, I wrote a cron task on another server of mine which just retrieves the backup files via FTP. Maybe not so smart, but I use it on my H-Sphere and cPanel accounts and I'm 100% satisfied. My sites are not so large and I use gzip for now, but I think it should not be a problem to use "tar -u -p" instead. Last edited by anatolijd; 05-18-2007 at 01:54 PM.
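A minimal sketch of those crontab entries plus the pull from the second server; the database credentials, paths, and hostname are placeholders:

    # on the hosting account:
    # 1) nightly dump of all databases into the FTP-visible backup dir
    0 2 * * * mysqldump -u DB_USER -pDB_PASS --all-databases | gzip > $HOME/backup/db-$(date +\%Y\%m\%d).sql.gz
    # 2) archive only the site files that really need backing up
    30 2 * * * tar -czf $HOME/backup/site-$(date +\%Y\%m\%d).tar.gz -C $HOME public_html

    # on the other server: mirror the backup directory over FTP
    0 4 * * * wget -q -m --user=FTP_USER --password=FTP_PASS ftp://example.com/backup/

(The % signs are escaped because cron treats a bare % as a newline.)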

Posted by twooly, 05-18-2007, 02:45 PM
anatolijd, that is really good. I'm also on H-Sphere, but all my sites are running on Windows; do you know how I can get it to archive all the files? Because I can then have my PC here at home download them.

Posted by anatolijd, 05-18-2007, 03:17 PM
No, I have no idea how to create a web content archive on Windows. For now the only solution I see is to retrieve all the web content to your local PC once, and then periodically download only new files and files that have changed. I used Total Commander to synchronize local and remote directories manually; I think you may try WinSCP, as far as I can see it provides some sort of command-line automation for this: http://winscp.net/eng/docs/task_synchronize_full
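A rough sketch of what that looks like, based on the page linked above; the host, credentials, and folders are placeholders. Save it as a WinSCP script file and run it from Windows Task Scheduler:

    # sync.txt - WinSCP script: mirror the remote site into a local folder,
    # transferring only new and changed files on each run
    open ftp://FTP_USER:FTP_PASS@example.com/
    synchronize local C:\sitebackup /
    exit

Then schedule something like: winscp.com /script=sync.txt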

Posted by twooly, 05-18-2007, 06:49 PM
That is perfect, thanks much everyone; this gets me what I needed.

Posted by joshi_at, 05-21-2007, 06:06 PM
For the MySQL backup I'm very comfortable with MySQLDumper (mysqldumper.de). They have a good support forum, AFAIK an English interface too, and it can dump very big databases as well (no worries about the 30-second time limit in PHP)... I even donated some money to the guys because of the good work... Hope I could help you a little. Cheers, Hannes

Posted by Nnyan, 05-21-2007, 06:54 PM
I use this script for backups, but I have to say I like anatolijd's, so I may need to update mine a bit. = )

    #!/bin/bash
    # timestamp used in the archive name
    mydate=$(date +%m%d%Y%I%M%p)
    PATH=$PATH:/usr/local/bin
    export PATH
    # dump the database to a working directory
    /usr/bin/mysqldump --flush-logs --opt -u USERNAME -pPASSWORD DBNAME > /dir1/dir2/backup.sql
    cd /dir1/dir2/
    # only archive the dump if it was actually created
    if [ -e backup.sql ]
    then
        tar -czvf $mydate-sqlback.tar.gz backup.sql
    fi
    lftp <
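The lftp command at the end was cut off in the archive. For reference, a minimal sketch of what such an upload step might look like; the host, credentials, and remote path are placeholders, not from the original post:

    # push the newest archive to an off-site FTP server via a here-document
    lftp <<EOF
    open -u FTP_USER,FTP_PASS ftp://backup.example.com
    cd /backups
    put $mydate-sqlback.tar.gz
    bye
    EOF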


