
Back Up Google Drive with CrashPlan

Today, as I put another spreadsheet with Store Locator Plus and Enhanced Results features onto Google Drive, I realized something. If Google Drive crashes (very unlikely, but stranger things have happened), I don't have a backup copy of ANY of my Google Drive documents. There are a lot of things out there that I really don't want to re-create. It matters even more now that I am using Google Drive to store spreadsheets that are an integral part of my WordPress plugin documentation.

While I’m thinking of it, let’s go down that road for a moment.

Using Google for WordPress Tables

A few days ago I wanted to start building an "add-on pack comparison" for the site. It helps me organize my thoughts on which features belong in which plugins, reminds me of where I put those features, and also educates the consumer about which plugin may interest them. I decided a wide table with side-by-side columns for each plugin was the best option. Since it is not a true price comparison I needed a flexible grid display.

I tried a number of the table plugins in the WordPress extensions directory. Unfortunately a large number of them were defunct, many not updated in years. The few that were maintained were adequate, but getting them to look just the way I wanted required extensive CSS updates and HTML man-handling. Sure, some of them, like TablePress, had options to make those efforts easier, but still not effortless.

Then I stumbled across a post that discussed inserting a Google spreadsheet in the middle of a page. You create a spreadsheet, format it how you like, and then publish it to the web. Select the "embed code" option to get a unique iframe tag to put on your site. While I was leery of the iframe idea, it worked beautifully. Now I can format the colors, fonts, and column data exactly how I want, with the ease of updating a Google spreadsheet. It is far easier to click a cell and a color box and watch the background change than to tweak CSS all day.

If you are trying to put tables in WordPress you may want to check that out. Create the spreadsheet in Google, go to "publish to web", publish, go to "get embed code", copy the iframe HTML snippet, and paste it into your page or post (in text mode). Tweak the width and height parameters to fit your site. Done.

embedded google spreadsheet

Check off "auto-republish" and every change you make will be reflected on the website within a few hours (or you can force a manual republish if you need it faster).


Backing Up Google Drive

So, back to the backup issue. I have a lot of documents, some integral to my public site, on Google Drive. I NEED to back those up. How did I do it? It turned out to be easier than I thought.

First, I run CrashPlan as my backup service. MUCH better than Mozy, which is over-priced, IMO. A MILLION TIMES better than Carbonite, which is slow as heck, throttles the computer, has horrible restore times, and worse support response times. In fact, if you are considering backup services my one key recommendation is: do NOT use Carbonite. There is a reason you hear about them all the time; they are hiding a poor design and poor service behind a huge marketing budget.

Second, get the Google Drive content into the CrashPlan backup. Easy. Install Google Drive for PC. When you log into Google Drive there is a subtle link in the left sidebar for this app. It is a program that is installed on your local computer and creates a folder that is the "local sync" of your Google Drive content. You can select which folders you want to keep in sync. I just let it sync everything, since I have plenty of space on the 1TB drive in my notebook computer.

Now I have a local copy (first stage backup) of everything on Google Drive.   Even better, if I create something in that folder OR on Google Drive it will be auto-replicated on both sides.   That makes for a good first stage backup strategy.

Next, since this folder goes under your user directory by default, CrashPlan should automatically notice the new content and mark it for backup, which it did on my system. If it does not, you can manually add the Google Drive sync folder to the backup plan.

crash plan and google drive


I like easy.


Henry Houh contacted me last week about an issue with this type of backup. It turns out CrashPlan will not back up ANYTHING by default in this configuration. Why? CrashPlan runs as the user "SYSTEM", not as your normal login user. Google Drive runs as you.

In my case the Google Drive folder was created with “Full Control” permission for me but NO permissions for the system user “System”.

The fix?

Go to the Google Drive folder, not the shortcut.

Right-click and select properties.

Select the security tab.

Click the Edit button.

google drive properties

Type “System” in the add user box.

Click “Check Names”.

Click OK.

Click on System in the list of users.

Check off “Full Control” under the allow column.

Click OK.

Click OK.

google drive with system properties

Now your Google Drive content will be backed up to CrashPlan.
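For the command-line inclined, the same grant can likely be made from an elevated Command Prompt with Windows' icacls tool; this is my shorthand for the GUI steps above, and the folder path is an assumption, so substitute your actual Google Drive location:

```
:: Grant SYSTEM full control; (OI) makes files inherit the grant,
:: (CI) makes subfolders inherit it, F means Full Control.
:: The path below is illustrative; use your own Google Drive folder.
icacls "C:\Users\YourName\Google Drive" /grant "SYSTEM:(OI)(CI)F"
```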


Linux : Find All Files Older Than…

I recently needed to clean up a directory on my Linux box that included hundreds of files. I wanted to get rid of all the files that hadn’t been updated in over a year. At first I decided just to list the files by date:

ls -lt

This lists the files in long format, sorted by time (newest files listed before old files). It shows me all the details, with the oldest files scrolling to the bottom of the window, so the last few files above my command prompt are the oldest.

There are hundreds of files more than a year old.

Employing Find

Find is one of the tools I keep in my Linux tool belt. I don’t need it often, but when I do it saves me quite a bit of time. Find is the Swiss Army Knife of Linux search tools. It is complete, thorough, and comes with just about every “doo-dad” (a technical term) for finding files. It does real-time system searches, so unlike locate it does not rely on a secondary database which may become outdated and not give complete results.

The downside of find is that there are so many options. It is easy to choose the wrong one or, more likely, to string options together in a way that makes the search take forever and return no results.

The upside, thanks to how command shells work, is that you can use the output of find to drive other applications, like ls or rm. The latter two are how we'll employ find.

Find Files Not Touched In A Year

First we can find all the files in our current directory that are ‘stale’ like this:

find ./ -ctime +365

In English: "find stuff in this directory (./) where the change time (ctime) is more than 365 days ago". A note of caution: despite the popular misreading, ctime is the inode change time, not a creation time; most classic Unix filesystems do not record a creation date at all.

The sister option is mtime, the "modification time", which updates only when a file's contents change. ctime also updates on metadata changes such as permissions or ownership, so choose whichever better matches what "stale" means for your cleanup.

Now we can combine this with ls to list the results. It may seem redundant, but I like to test the parameter passing of find to another shell command using something innocuous such as ls. So we test like this:

ls -l `find ./ -ctime +365`

The back-ticks take the output of find, which is a simple relative-path list of the files it located, and pass it as arguments to ls.

If all looks good we can now force a remove of those files. Be careful with rm -f; you can do irreparable harm with it. There are other options, and if you are not comfortable with power tools that can take a limb off with one keystroke, drop the -f or use one of the myriad Linux admin tools to help you out. I'll roll the dice and hope all my limbs remain intact:

rm -f `find ./ -ctime +365`
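If the back-ticks make you nervous (they break on file names containing spaces), GNU find can do the deleting itself with -delete. A minimal sketch in a throwaway directory; the /tmp path is illustrative, and it uses mtime because touch can backdate modification times but not ctime:

```shell
#!/bin/sh
# Work in a scratch directory (illustrative path).
mkdir -p /tmp/stale-demo && cd /tmp/stale-demo

# Backdate one file's modification time by two years (GNU touch).
touch -d "2 years ago" old.txt
touch new.txt

# Dry run first: print what would be removed.
find . -mtime +365 -type f -print

# Then let find remove the matches itself; no back-ticks,
# so file names with spaces are handled safely.
find . -mtime +365 -type f -delete
```

Running the dry-run line before the -delete line is the habit worth keeping; it costs seconds and saves limbs.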

Other Find Options

There are a lot of ways to find files by other attributes such as “delete all files larger than ? MB” or “delete all files older than <this file>”. This is a good resource that explains some of the options and how to perform different types of find operations:

Good luck & keep your limbs on!


Using Find To Help Manage Files On Linux

We found a system administration problem on a server today that was caused by incorrect directory permissions. Email passing through the server-wide spam filter was not getting through because of the permissions on the /home/<domaindir-here>/etc directories. Each of those directories needs to be group-owned by mail.

Here is a quick way to update those directories:

 [root@host:home]# cd /home

The find command lists only directories (much, much faster when you know you only need a certain file type, like 'd'), descends at most 2 levels deep, and matches the name etc…

 [root@host:home]# chgrp mail `find /home -maxdepth 2 -type d -name etc`

Now we pass the output of find as an argument list to the ls command to see what we touched. The 'd' on ls restricts it to directory-level output only, so we don't descend into those directories and list their contents.

 [root@host:home]# ls -ld `find /home -maxdepth 2 -type d -name etc`
drwxr-x---  3 aaron    mail 4096 Feb 10  2008 /home/aaron/etc
drwxr-x---  2 abundatr mail 4096 Oct 20  2009 /home/abundatr/etc
drwxr-x---  3 alutask  mail 4096 Feb 10  2008 /home/alutask/etc
drwxr-x---  3 banks    mail 4096 Feb 21  2008 /home/banks/etc
drwxr-x---  4 chasvol  mail 4096 Feb 10  2008 /home/chasvol/etc
drwxr-xr-x  3 cyberspr mail 4096 May  7 11:24 /home/cyberspr/etc
drwxr-x---  2 daedalus mail 4096 Mar 27  2008 /home/daedalus/etc
drwxr-x---  7 dolphin  mail 4096 Jul 30  2008 /home/dolphin/etc
drwxr-x---  3 dutchbul mail 4096 Feb 10  2008 /home/dutchbul/etc
drwxr-xr-x  2 eatchas  mail 4096 May 10 21:59 /home/eatchas/etc
drwxr-xr-x  2 fireant  mail 4096 May 25 21:16 /home/fireant/etc
drwxr-xr-x  4 jrsint   mail 4096 Jan 11  2008 /home/jrsint/etc
drwxr-x---  3 lance    mail 4096 Jul  9  2007 /home/lance/etc
drwxr-xr-x  2 memoryve mail 4096 Feb 16 10:29 /home/memoryve/etc
drwxr-x---  2 michaelc mail 4096 May 13  2008 /home/michaelc/etc
drwxr-x---  3 modelloc mail 4096 Dec 18 19:22 /home/modelloc/etc
drwxr-x---  3 monstrss mail 4096 Feb 10  2008 /home/monstrss/etc
drwxr-x---  3 nicolas  mail 4096 Feb 10  2008 /home/nicolas/etc
drwxr-x---  3 outdoor  mail 4096 Aug 26  2008 /home/outdoor/etc
drwxr-xr-x  2 perks    mail 4096 Jun  6 15:17 /home/perks/etc
drwxr-x---  2 pout     mail 4096 Jun 15 12:08 /home/pout/etc
drwxr-x---  3 ravenel  mail 4096 Aug 12  2007 /home/ravenel/etc
drwxr-x---  4 remodel  mail 4096 Feb 10  2008 /home/remodel/etc
drwxr-x---  2 saveag   mail 4096 Oct  9  2008 /home/saveag/etc
drwxr-xr-x  2 shoppout mail 4096 Jun 15 16:46 /home/shoppout/etc
drwxr-x---  3 southern mail 4096 Feb 10  2008 /home/southern/etc
drwxr-x---  2 tbcustom mail 4096 Jun 20  2008 /home/tbcustom/etc
drwxr-x---  3 thebicyc mail 4096 Jun 16  2008 /home/thebicyc/etc
drwxr-xr-x  3 theenerg mail 4096 Feb  9  2008 /home/theenerg/etc
drwxr-x---  2 unclelue mail 4096 Dec 14  2009 /home/unclelue/etc
drwxr-x---  2 vanjean  mail 4096 Feb 16  2009 /home/vanjean/etc
drwxr-x---  3 wwwbrea  mail 4096 Dec 18 01:22 /home/wwwbrea/etc

This same technique can be used with any number of commands when you need to work on directories. Just be careful with it; it can wreak as much havoc as it can repair damage done by other command-line tools that have been wielded without care.
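A related option: find's -exec flag runs a command on each match directly, sidestepping the back-tick word-splitting pitfalls. A sketch with illustrative paths, using chmod to stand in for the chgrp above so it runs on a machine without a "mail" group:

```shell
#!/bin/sh
# Build a miniature /home-like tree (illustrative path).
mkdir -p /tmp/exec-demo/aaron/etc /tmp/exec-demo/banks/etc

# Same search as above, but -exec runs the command on the matches
# directly; the trailing "+" batches many paths into one invocation.
# chmod stands in for chgrp here so the sketch runs anywhere.
find /tmp/exec-demo -maxdepth 2 -type d -name etc -exec chmod 750 {} +

# Verify what we touched, directory entries only.
ls -ld /tmp/exec-demo/aaron/etc /tmp/exec-demo/banks/etc
```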

This Red Ryder BB gun is loaded. Be careful out there! "You'll shoot your eye out, kid"…


Finding Which Linux Packages Provide Which Files

There have been multiple situations where I find out that I need a particular file to continue with something I am doing. Most of the time this happens when I am compiling a program: I will be missing a library, or a header file, or something. So I end up on search engines looking for whatever package I need to 'apt-get install'. Well, it turns out there is a command-line tool that will tell you this information, on systems that use Apt, that is.

Enter ‘apt-file’.

I use Ubuntu, and it doesn't come with that platform by default, or at least not on the 10.04 release I'm using. But you should know how to get it: a simple 'apt-get install apt-file'.

Once you have it installed, you will have to update the cache it uses for searching. I was prompted to do this automatically, but if you are not then you can run ‘apt-file update’ to do so.

With that done, the command ‘apt-file find’ will let you list packages that include the given file. For example, I was looking for the program ‘xpidl’, which I didn’t have. Easy to find:

    $ apt-file find xpidl
    kompozer: /usr/lib/kompozer/xpidl
    sunbird-dev: /usr/lib/sunbird/xpidl
    thunderbird-dev: /usr/lib/thunderbird-3.0.3/xpidl
    xulrunner-1.9.1: /usr/lib/xulrunner-
    xulrunner-1.9.1-dbg: /usr/lib/debug/usr/lib/xulrunner-
    xulrunner-1.9.2: /usr/lib/xulrunner-
    xulrunner-1.9.2-dbg: /usr/lib/debug/usr/lib/xulrunner-

You can provide the argument ‘-x’ to use a Perl regular expression as your search query.

You can also see what files are in a package by using the command ‘list’ instead of ‘find’. Unlike the ‘dpkg -L’ command, ‘apt-file list’ will work even if you don’t have the package installed or cached on your system.

I wish I had found this tool years ago.


Linux File Management

Note About Security: Best practices dictate that the security of your directories and files should be set to least privileges granted to get the job done. That means don’t open up the security of a directory or a file just to make it work. Know what you need to accomplish your goal and don’t open security beyond that.

Listing Files

By Date

“list long time reverse” = show detailed listing, by date, in reverse

ls -ltr *

By Size

“list long by size reverse human” = show detailed listing by size, reverse order, human readable (kb,Mb instead of blocks)

ls -lSrh

Finding Files

“find all files, starting at the root directory whose size is more than 4096k”

find / -size +4096k
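GNU find also understands M and G suffixes, which saves the kilobyte arithmetic. A quick sketch in a scratch directory; the paths and file sizes are illustrative:

```shell
#!/bin/sh
mkdir -p /tmp/size-demo
# Create one ~5 MB file and one ~10 KB file.
dd if=/dev/zero of=/tmp/size-demo/big.bin   bs=1024 count=5000 2>/dev/null
dd if=/dev/zero of=/tmp/size-demo/small.bin bs=1024 count=10   2>/dev/null

# Both spellings match only the 5 MB file:
find /tmp/size-demo -type f -size +4096k
find /tmp/size-demo -type f -size +4M
```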

Script to find big files

This script will locate all files in the specified directory that are more than 4M (4096k).

  • The back-ticks around the find command gather the results of the find and pass them as arguments to the ls command.
  • The $1 in the find command is replaced with the first argument typed after our bigfiles command.

The script below says: “list by size in reverse with human readable file sizes the stuff that the find command returns in the specified directory where the size is more than 4096k”.

  • vi /usr/sbin/bigfiles (you’ll need to be root to do this)
  • enter this into that file
ls -lSrh `find $1 -size +4096k`
  • chmod +x /usr/sbin/bigfiles

Run the command like this:

bigfiles /home/backup

The Fancy Version

# Set the directory to look in to the current directory
# if the user did not give us a place to look
if [ "$1" = "" ]; then
  DIR="."
else
  DIR="$1"
fi

# Set the size limit, default to 25M
# user must specify in "k" or other
# format that is "find friendly"
if [ "$2" = "" ]; then
  SIZE="25000k"
else
  SIZE="$2"
fi

# list files by size
# that are at or below the directory provided
# whose size is at least the size specified
ls -lShr `find $DIR -size +$SIZE`

Run it like this to find files under /home over 100M:

bigfiles /home 100000k

Sharing (Read/Write) A Directory With Members Of Your Group

  • The immediate parent in the tree needs to allow members of your group read/write privileges.
  • The file(s) you want to share with the group must be set to read/write privileges.
  • The example below assumes your primary login is part of the group "dev" and that you want to share a lower-level directory in its entirety with your group.
  • See #File and Directory Permissions for details on reading permissions.
  1. Log in as a privileged user (root)
  2. Start at the root of the file system and look up our directory tree permissions
 cd /
 ls -lhd /home
 ls -lhd /home/<targetdir>
 ls -lhd /home/<targetdir>/public_html

The file listings:

 drwxr-xr-x  64 root root 4.0K Mar  8 09:10 /home
 drwxr-xr-x  64 <user1> <group1> 4.0K Mar  8 09:10 /home/<targetdir>
 drwxr-xr-x  64 <user1> <group1> 4.0K Mar  8 09:10 /home/<targetdir>/public_html

Since we are only trying to grant access to files within the public_html directory we can ignore the settings for the /, /home, and /home/<targetdir> directories. We are only concerned with the setting for the /home/<targetdir>/public_html directory itself and the files within.

  1. Set the group for the directories
  2. Set the group for the files
  3. Change the permissions
 chgrp dev /home/<targetdir>/public_html
 chgrp dev /home/<targetdir>/public_html/*
 chmod g+w /home/<targetdir>/public_html
 chmod g+w /home/<targetdir>/public_html/*
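One optional refinement beyond the steps above (my suggestion, not part of the original recipe): setting the setgid bit on the shared directory makes new files created inside it inherit the directory's group automatically, so you don't have to re-run chgrp for every new file. A sketch with an illustrative path:

```shell
#!/bin/sh
mkdir -p /tmp/share-demo
# 2775 = rwxrwsr-x: the leading 2 is the setgid bit, the 775 is
# the owner/group/world read-write-execute mask used in the article.
chmod 2775 /tmp/share-demo
stat -c '%a %n' /tmp/share-demo
```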

Note About “Immediate Parent”

One small but important note about directory permissions: the ENTIRE tree does not need to be opened up for things like r-x to work, only the IMMEDIATE PARENT and the file itself. In other words, if I want /home/cyberspr/public_html/xyz.php to be read/write/execute for owner & group, but read-execute for world, I need to do this:

chmod 775 /home/cyberspr/public_html
chmod 775 /home/cyberspr/public_html/xyz.php

File and Directory Permissions

drwxr-xr-x  64 root root 4096 Mar  8 09:10 /home
  • 1st character is the directory flag (d = directory, – = a file)
  • next 3 characters (rwx in this example) are the OWNER privilege flags
  • next 3 characters (r-x) are the GROUP privilege flags
  • next 3 characters (r-x) are the WORLD privilege flags
  • next digits are something we don’t care about right now
  • the next item is the OWNER of the file (root)
  • then comes the GROUP the file belongs to (also root in this case, which is NOT the same as the user named root)

In our example above the following permissions are set up:

  • The user named “root” has full privileges to the file: read, write, and execute
  • The group named “root” can only read and execute files
  • And anybody at all can read or execute the files within the home directory
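Reading those flag groups as octal digits ties this back to the numeric chmod modes used earlier: r=4, w=2, x=1, so rwx adds up to 7 and r-x to 5, making rwxr-xr-x the familiar 755. A quick sketch in a scratch directory (illustrative path):

```shell
#!/bin/sh
mkdir -p /tmp/perm-demo
# rwx r-x r-x  ->  (4+2+1)(4+0+1)(4+0+1)  ->  755
chmod 755 /tmp/perm-demo
ls -ld /tmp/perm-demo
stat -c '%a' /tmp/perm-demo
```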

Notes From The Real World

If you plan on using the subversion repos on the server you’ll want to add yourself to two groups: dev and svn. The reason is that the /home/<targetdir>/public_html/pipe/repo directory must be owned by svn for svnserve to work. However, the parent directory (/home/<targetdir>/public_html) has to be in the group dev. Obvious conflict here without being part of both groups.

Related Links