I changed my MT configuration to make use of Umask directives to set permissions more sensibly. I also had a colleague run a security scanner against my site, and it turned up no vulnerabilities, so the site seems to be in sound condition at present. I suppose if there were a way to copy these files to my local drive, that would be even better.
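For reference, Movable Type reads its permission settings from mt-config.cgi. A fragment along these lines sets umasks for generated files and uploads (directive names as I recall them, and the values are only examples; verify against your MT version's documentation):

```
# mt-config.cgi -- permission-related directives (example values)
HTMLUmask   0022    # published pages: world-readable, owner-writable
UploadUmask 0022    # uploaded files
DirUmask    0022    # directories MT creates
DBUmask     0077    # database files (SQLite): owner-only
```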
But that seems beyond my ability; it would require writing scripts and programming, and I don't know how I'd automate backing up the sites on my webspace and copying them to my local drive. Is that even possible?

Static content that doesn't change only needs to be backed up when you make changes. As for the database your blog is using, that can be automated to email you a backup every night. I do it with my blog and three others using a script called dbsender.
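A minimal sketch of the same idea, independent of dbsender: dump the database, compress it, and mail it to yourself. The database name, credentials, and address here are placeholders, and the `DUMP_CMD`/`MAIL_CMD` variables are hooks I added so the dump and mail tools can be swapped out; adjust for your own setup.

```shell
#!/bin/sh
# Nightly "email me the database" job, in the spirit of dbsender.
# DB_USER, DB_PASS, and the arguments below are placeholders.

nightly_backup() {
    db_name="$1"
    recipient="$2"
    stamp=$(date +%Y-%m-%d)
    dumpfile="/tmp/${db_name}-${stamp}.sql.gz"

    # Dump and compress. In practice, prefer credentials in ~/.my.cnf
    # over a password on the command line.
    ${DUMP_CMD:-mysqldump} -u "$DB_USER" -p"$DB_PASS" "$db_name" \
        | gzip > "$dumpfile" || return 1

    # Mail the dump as an attachment (mutt; mailx attachment flags vary).
    echo "Nightly backup of $db_name" \
        | ${MAIL_CMD:-mutt} -s "DB backup $stamp" -a "$dumpfile" -- "$recipient"

    # Keep a week of dumps; delete anything older.
    find /tmp -name "${db_name}-*.sql.gz" -mtime +7 -delete
}
```

Run it from cron every night, e.g. `nightly_backup myblog you@example.com`.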
I wrote detailed instructions on setting it up. There are also links in that thread worth reading.
Here is how to copy everything. Open a terminal, create an empty shell script, add the copy commands to it, then save and close the file.
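A minimal sketch of such a script, with the copy wrapped in a function so the source and destination can be passed in; `/var/www/html` and `/backup` below are placeholder paths, not anything from your setup:

```shell
#!/bin/sh
# backup.sh -- copy the whole site to a date-stamped backup directory.

backup_site() {
    src="$1"
    dest_root="$2"
    dest="$dest_root/site-$(date +%Y-%m-%d)"
    mkdir -p "$dest" || return 1
    # cp -a preserves permissions, timestamps, and symlinks;
    # plain cp -r would lose that metadata on some systems.
    cp -a "$src/." "$dest/" || return 1
    echo "Backed up $src to $dest"
}

# Example invocation (edit the paths):
# backup_site /var/www/html /backup
```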
If you want to run this script automatically every day, just create a cron job for it. Open your crontab for editing and add a line to run the shell script daily at 10 am; adjust the schedule to suit. Use full paths when running shell scripts as cron jobs to avoid errors, and run them with sufficient privileges (e.g. sudo or the root crontab) to avoid permission-denied errors.
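Assuming the script was saved as /home/user/backup.sh (a placeholder path), the entry would look like this; open the crontab with `crontab -e` and add:

```
# minute hour day-of-month month day-of-week  command
0 10 * * * /bin/sh /home/user/backup.sh >> /home/user/backup.log 2>&1
```

The first two fields mean minute 0 of hour 10, i.e. 10:00 am daily; redirecting output to a log file makes failures visible, since cron otherwise runs silently.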