Web Hosting Forums

Page 1 of 3 1 2 ... LastLast
Results 1 to 15 of 39

This is a discussion on Backup with cron job in the Hosting Talk & Chit-chat forum
How would I back my site up with a cron job at midnight each day. ...

  1. #1
    zaw
    Loyal Client
    Join Date
    Sep 2001
    Location
    Someplace
    Posts
    157

    Backup with cron job

    How would I back up my site with a cron job at midnight each day?

  2. #2
    Community Leader jason's Avatar
    Join Date
    Sep 2001
    Location
    Rochester, NY
    Posts
    5,884
    In CP:

    Minute: 0
    Hour: 0
    Day: *
    Month: *
    Weekday: *
    Command: tar -czf /home/username/backup.tar.gz /home/username

    That will run tar every night at midnight. The command creates an archive (c), gzips it (z), and saves it to a file (f) called backup.tar.gz containing all of the files inside /home/username.

    --Jason
    Jason Pitoniak
    Interbrite Communications
    www.interbrite.com www.kodiakskorner.com

  3. #3
    Loyal Client
    Join Date
    Dec 2001
    Posts
    175
    That's cool,

    Where does it save the backup to? And can we move the backup to another server elsewhere on the web?

  4. #4
    Community Leader jason's Avatar
    Join Date
    Sep 2001
    Location
    Rochester, NY
    Posts
    5,884
    Here's a breakdown of the command I posted above:

    tar - the name of the program that creates the archive
    -czf - configuration options
    • c - create a new archive
    • z - GZIP the archive after it is created (tar only archives the files, it doesn't compress them)
    • f - save it to a file


    /home/username/backup.tar.gz - this is the filename you want the archive saved as and the path of where to save it

    /home/username - this is the directory you want to be saved in the tar file

    Of course, change username to your username. My example will place the archive in your home directory. With a bit of scripting you could set it up to ftp the file to some other server or you could email it to yourself.
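    The scripting can stay very small. For instance, here is a date-stamped variant of the same tar command, so each night's run keeps its own archive instead of overwriting backup.tar.gz (the function name and paths are illustrative, not part of any control panel):

```shell
#!/bin/sh
# backup_dir SRC DEST -- gzip-tar SRC into DEST with a date stamp,
# so each night's archive keeps its own file.
backup_dir() {
    src=$1    # directory to back up, e.g. /home/username
    dest=$2   # where to save the archive
    stamp=$(date +%Y%m%d)
    # -C stores relative paths in the archive, which also avoids
    # tar's "Removing leading '/'" warning.
    tar -czf "$dest/backup-$stamp.tar.gz" -C "$(dirname "$src")" "$(basename "$src")"
}
```

    In cron you would call it the same way, e.g. `backup_dir /home/username /home/username/backups`; just remember to prune old archives or they will eat your disk quota.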

    --Jason
    Jason Pitoniak
    Interbrite Communications
    www.interbrite.com www.kodiakskorner.com

  5. #5
    Loyal Client
    Join Date
    Dec 2001
    Posts
    175
    Thanks for the detailed explanation Jason,

    I have never used cron before so I'm off to figure how to ftp or email it.

    thanks again.
    Tony.

  6. #6
    Loyal Client
    Join Date
    Mar 2002
    Posts
    93
    Thanks everyone for the great questions and thanks jason for the quality replies.

    Since a backup typically isn't time sensitive (such as things that must change on a website at midnight like sale items, new coupons, etc.), I set my backup cron for 2 am instead.

    I would imagine there are probably many cron jobs that are set for midnight.

    You might also consider not running the backup every night unless you plan to download it each night. Why? So the server only performs the backup when it is necessary. For example, if you only plan to download the backup file once a week, you might set the cron as...

    0 0 * * 0
    The last spot is the day of the week. An * in the last spot means every day, so replacing it with a 0 (zero) runs the job only on Sunday.

    * = Every day
    0 = Sunday
    1 = Monday
    etc.
    6 = Saturday (remember, we started with zero).

    So, to run a cron job that performs your backup every Sunday at 12:20 am...
    20 0 * * 0

    The second "spot" indicates the hour...
    12am = 0
    1am = 1
    2am = 2
    etc.
    12pm = 12
    1pm = 13
    11pm = 23 (the hours run 0 through 23)

    For some reason, the minutes are listed first, followed by the hour, go figure.

    Minute | Hour | Day of Month | Month | Day of Week

    If you needed multiple dates, you could get really fancy...
    30 5 * * 2,5
    Would run every Tuesday AND Friday at 5:30 am
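    Putting the five fields and the command together as one raw crontab line (using the tar command from earlier in the thread; username is a placeholder):

```
# min  hour  dom  month  dow  command
20     0     *    *      0    tar -czf /home/username/backup.tar.gz /home/username
```

    That is the same "every Sunday at 12:20 am" schedule, written the way cron itself reads it.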

    I know that JaguarPC currently allows crons, but I mention this because many hosts don't allow cron jobs, as they can quickly become overused (run too often, perform too big a task, etc.). I know some hosts only allow cron jobs on a case-by-case basis.

    Anyway, just a thought...

    Thanks again,
    Roger
    Last edited by Roger; 03-24-2002 at 06:32 PM.

  7. #7
    Loyal Client
    Join Date
    Mar 2002
    Posts
    93
    I also got to thinking...

    Regarding emailing the backup file to you, keep in mind that...

    1. The original backup file will reside on the server. Let's say it's 5 megs.
    2. If you email the file to yourself, your mailbox counts as part of your space, so until you download and delete the message that's another 5 megs of your web space gone. That's now a total of 10 megs.
    3. It's best to send it via email only if you have a fast connection for downloading a file that size.
    4. Downloading a large attachment is slower than transferring the file via FTP.

    You might log in to your account via SSH/Telnet and perform the tar command manually to see how large your backup file will be.

    Other solutions...

    1. You could place the file in your FTP folder instead of below the root. You could then download the file from your browser and not consume email space.

    This would be risky, though, since anyone might find the file and download it as well. A password on the folder would help; I haven't experimented with doing so.

    2. Instead of emailing it to yourself, you might also consider simply logging in via FTP and downloading the updated backup file. Keep in mind my post above: if you ask the server to back up daily, make sure you are also downloading daily. If you update the file weekly, then download weekly.

    3. You could create a CGI/PHP script that, after you log in with a username and password, lets you download the file. It could then either automatically delete the server-side backup file or ask whether you want to delete it.

    4. You could have the cron job store the backup file in a hard-to-find folder on your web space, like /public_html/2343/234/1s3j2l/whatever, and then password protect the very first folder (not public_html, but the next folder down).

    Some of the ideas above are better than others, and some carry more risk than others; that's up to you.
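    For the ideas that rely on password-protecting a folder, the usual mechanism is Apache basic auth; a minimal sketch, assuming an Apache host (the paths and names are placeholders):

```apache
# .htaccess placed in the first folder under public_html
# Create the password file once with: htpasswd -c /home/username/.htpasswd yourname
AuthType Basic
AuthName "Backups"
AuthUserFile /home/username/.htpasswd
Require valid-user
```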

    Thanks,
    Roger

  8. #8
    Loyal Client
    Join Date
    Dec 2001
    Posts
    175
    Thanks Roger,

    I am learning PHP at the moment, so I was thinking of writing a page that I could log into, do a backup to a file, and FTP it off to some other secure server (like the one we have at work!).

    At least that way I could back up from wherever I am in the world.

    Security is the main issue with this one, though, so I am going to have to think about how I lock it down.

    I could lock the directory it's in via CP, but I am not sure how secure that is.

    Kind Regards,
    Tony.

  9. #9
    Loyal Client
    Join Date
    Oct 2001
    Posts
    12
    An automated solution

    Download the World Wide Backup code from

    www.worldwidecreations.com (Under the Free Code section)

    You can configure it to do a number of different things and it works with cron - very useful and works like a charm
    Last edited by Hoonz; 03-25-2002 at 07:46 AM.

  10. #10
    Loyal Client
    Join Date
    Sep 2001
    Posts
    32

    The Better Backup...

    Don't forget to backup your MySQL databases, if you're using any! The idea of setting up the cron job is excellent, as it automates your backup procedure. However, instead of directly calling the tar program, you might actually want to write a shell script (the UNIX equivalent of a batch file) that dumps the database to a file, and then incorporates that into your site backup.

    Create a file called "backup.sh" that contains:
    Code:
    #!/bin/sh
    # Dump the databases, then archive them together with the site files.
    cd /home/{username}
    mysqldump -a --add-drop-table --add-locks -l -p{DBpassword} -u {DBusername} -B {database1} {database2} ... {databaseN} > database.mysql
    tar zcf backup.tgz database.mysql public_html
    {username} = your login ID. Type "pwd" at the shell prompt after you log in for the exact path.
    {database1} ... {databaseN} = The names of the databases you want to back up. Use the CPanel to determine this.
    {DBpassword} and {DBusername} = The password and username you use for the databases, which were set up through the CPanel.

    At the shell prompt, make the script executable with "chmod 755 backup.sh"

    Now, set up your cron job to run the shell script instead of running tar directly.

    One extra thing for the true "uber-geek" would be to transfer the file to another system. Because I have a cable modem at home, I have a firewall. The firewall is not just a Linksys router; it is my old Pentium-233MMX system running FreeBSD. Secure Shell (SSH) is part of the base install. All I do is create a set of crypto keys on my web server using "ssh-keygen", add the public portion of the key to my account on the FreeBSD firewall, and then add one more line to the bottom of that shell script:
    Code:
    scp backup.tgz {my home IP address}:~/backup.tgz
    The crypto keys mean that scp will never prompt for a password, and the entire transmission is encrypted. When I wake up in the morning, there is a file in my home directory on the firewall, with a complete backup of my site and site databases.
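    The key setup described above, sketched as commands (the key path and hostname are placeholders; the steps on the receiving machine are shown as comments since they run elsewhere):

```shell
# One-time setup on the web server: generate a passwordless key pair
# so scp can run unattended from cron.  The key path is illustrative.
mkdir -p "$HOME/.ssh"
KEYFILE="$HOME/.ssh/backup_key"
ssh-keygen -q -t rsa -N "" -f "$KEYFILE"

# Copy the public half to the receiving machine once, by hand:
#   cat "$KEYFILE.pub" | ssh you@your-home-ip \
#       'mkdir -p ~/.ssh; cat >> ~/.ssh/authorized_keys'

# Then the extra line at the bottom of backup.sh becomes:
#   scp -i "$KEYFILE" backup.tgz you@your-home-ip:~/backup.tgz
```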

    If this interests you, drop me a PM with your e-mail address, and we can discuss it further. Who knows, I might end up with a little tutorial for doing this on my web site!
    Last edited by pahowes; 03-25-2002 at 08:29 AM.

  11. #11
    Loyal Client
    Join Date
    Dec 2001
    Posts
    175
    Hi pahowes,

    I have a database called fertilit_yabbse.

    Each night (since the NS17 crash!!) I do a backup including the complete structure. I use phpMyAdmin to do this.

    After looking at your shell script, I went over to the MySQL site to try to understand your code a little better (I am a total newbie!).

    I thought that it would be cool if I could create a backup, zip it then email it to myself.

    I think I would need to use sendmail to do this (I'll look that up in a minute).

    After reading the MySQL manual, I have noticed a couple of other options...

    "Note that if you run mysqldump without --quick or --opt, mysqldump will load the whole result set into memory before dumping the result. This will probably be a problem if you are dumping a big database. "

    How can I integrate these options (and should I?) into your script? (Sorry if I sound stupid; as I said, I'm a newbie.) Also, what options would I need to set to do a full backup complete with structure?

    Hope you don't mind all these questions.

    Kind Regards,
    Tony.

  12. #12
    Loyal Client
    Join Date
    Sep 2001
    Posts
    32
    As one of my old high school mathematics teachers told me, "There's no such thing as a stupid question."

    You're absolutely right about the addition of the "--opt" argument on the command line. You can add that anywhere on the mysqldump command line, as long as it appears before the "-B" option. The mysqldump program assumes that the -B parameter is the last one on the line, and from that point on the names of databases are listed. Also, because the "--opt" parameter actually encompasses several of the other parameters I was using, the entire command can be shortened to this:

    Code:
    mysqldump --opt -p{DBpassword} -u {DBusername} -B {database1} {database2} ... {databaseN} > database.mysql
    Thanks for the pointer! I'm going to update my script right now.

    As for sending it to yourself in an email, that's a bit tougher. Personally, I use a secure copy command to push it to the server behind my cable modem. If you wanted to, you could get an FTP program and transfer the Zip file out of your account, but that's not very secure. If the file is e-mailed to yourself, it's still sitting on the same server until you check your e-mail. Also, the attachment could be huge. I backed up my wife's site with the same script, and the Zip file was almost 10MB because of all the image files.

    Let me know if you have any other questions!

  13. #13
    Loyal Client
    Join Date
    Dec 2001
    Posts
    175
    Thanks for the quick answer!!

    About emailing it out... I meant that I would email it to an external account (like my work email).

    If I did that... is there a way I can password-protect (or encrypt) the zip before I send it?

    You are right, of course; it's a database and as such will get bigger and bigger, and that is something I will have to think about in the future... when I can afford to host another server with FTP (on a different box!!).

    I haven't heard of a 'secure copy' command before - where can I find out more about this?

    Thanks again.
    Tony.

  14. #14
    Loyal Client
    Join Date
    Sep 2001
    Posts
    32
    Secure copy is a part of the Secure Shell suite. Most UNIX-like systems use the open-source version, known as OpenSSH. The site for the group responsible for the product can be found at http://www.openssh.org/

    The secure copy command (scp) works exactly the same as the remote copy command (rcp) but has much better security. Both the authentication and the data transfer are fully encrypted.

    If you want a command-line port of OpenSSH for Windows, go to http://www.networksimplicity.com/

    If a GUI version of the tools is more to your liking, you can get the excellent "PuTTY" suite from http://www.chiark.greenend.org.uk/~sgtatham/putty/

    Zip files can have a password applied, but GZip files cannot. However, if the server has a copy of PGP or GnuPG available (I haven't checked) then you can use one of those products to encrypt the backup archive file in a variety of ways.
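    If GnuPG does turn out to be available on the server, a symmetric passphrase is the simplest route. Here is a sketch, not a hardened recipe (the filenames and passphrase are placeholders, and a passphrase on the command line is visible to other users on a shared box, so treat this as illustrative only):

```shell
# Make a small dummy archive so the example is self-contained; in
# practice backup.tgz already exists from the cron job.
echo demo > demo.txt
tar -czf backup.tgz demo.txt

# Encrypt with a passphrase; produces backup.tgz.gpg.
# --batch and --pinentry-mode loopback let it run without a prompt
# (loopback needs GnuPG 2.1 or later).
gpg --batch --yes --pinentry-mode loopback --passphrase 'changeme' \
    --symmetric --cipher-algo AES256 backup.tgz

# To recover it later on your own machine:
#   gpg --batch --pinentry-mode loopback --passphrase 'changeme' \
#       --output backup.tgz --decrypt backup.tgz.gpg
```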

    You do not need to pay for a second hosting service. Just download the backups to your PC at home. Think about it: You're probably doing most of your web page development locally and uploading the HTML files to your account with JaguarPC right now. The only thing that you don't have a copy of is the database. Use cron to create a nightly dump of whatever you want to back up, and then configure the Windows program scheduler to grab a copy of the archive. You could always place the archive in your public_html directory and grab it using Internet Explorer.

  15. #15
    Loyal Client
    Join Date
    Mar 2002
    Posts
    93
    I have my cron job setup to perform a backup, but I received the following email...

    tar: Removing leading `/' from member names
    tar: /home/owensbor/backup.tar.gz: file changed as we read it
    tar: Error exit delayed from previous errors

    Since it indicates "exit delayed from previous errors," do you think the backup is still good, or did it just die?

    It appears that tar is trying to back up its own output file. Any ideas how to get around this with command switches?

    Thanks,
    Roger

