#! /usr/bin/ksh – Backup Admin Life: Automation

If you read that title as ‘Pound sign, exclamation point, forward slash, U-S-R, forward slash, B-I-N, forward slash, K-S-H’, then you may not get the nuances of my blog post. Or, forget the nuances, you may not “get it” at all.

No offense.

However, if you read the title, “pound-bang user-bin-korn shell” or something very similar, you just might have lived in a time long ago when dinosaurs ruled the earth, just like me, and you are an old Unix guy, gal, person, er uh, you get the point.

Chances are, you probably steered clear of, or questioned, those people you suspected of posing as Unix “experts” because they pronounced vi as “vie”. I did. And you probably were in awe of those who could actually use emacs with ease; I tried it once but never grew accustomed to that interface. And you probably looked at your Windows NT “counterparts” kind of weird when they said their at-jobs were similar to cron jobs. They were, but you didn’t want to let on that there were similarities in our worlds, and you definitely didn’t want to tell them that we had at-jobs too…that would be just too much alignment. I mean, drinking at the same bar was about as far as you would go with an NT admin, right? Maybe?

I joke about this because it was true back in the 80s and 90s, and it was silly, to be honest. We both were trying to accomplish the same things as admins, just with different operating environments. I never was an NT guy, but a Novell NetWare guy who jumped to Unix as soon as I could in the late 80s. Like many, I ran Windows on my laptop but always had Cygwin installed so I could go to my happy place. And when the Mac finally moved to a derivative of FreeBSD, I found a new home.

Back in the 80s and 90s I was a Unix guy (and I still am) who learned Unix on the fly and from a number of the reference books you see in the image in this blog. Having a programming background, I found Unix shell scripting to be a very quick and easy way to hack together functions I would find myself repeating over and over. In other words, I would ‘automate’ these things.

One of the first things I started to automate was the monitoring of the tape drives. For you youngsters…oh, nevermind.

I would oftentimes find that the tape drives would just go DOWN for whatever reason. If you didn’t automate this check and have it run every 15 minutes or so during your backup window, you might, in fact, find yourself missing some backups or, worse yet, all of the backups. To keep track of the number of times this was occurring, I had the script email $someone_who_cares. I loved that variable name, by the way, and typically $someone_who_cares was me, the consultant, or the backup admin on staff. After a while of receiving these emails for the same drive index, I decided I needed to modify my script. You see, on occasion, a drive was DOWN for good reason. So if the drive stayed down for a predetermined number of iterations, the script would instead send an email to $NOC_cares, which included, you guessed it, folks in the NOC who would go physically inspect the system and create a ticket if necessary. That was real-time-ish automation!
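For the curious, here is a minimal sketch of that monitor in spirit. This is a reconstruction under assumptions, not the original script: list_drive_status is a hypothetical stand-in for whatever drive-status command your backup product shipped (every vendor had one), and the addresses, state file, and threshold are all placeholders.

```ksh
#!/usr/bin/ksh
# check_drives.ksh - run from cron every 15 minutes during the backup window.
# ASSUMPTION: list_drive_status is a placeholder that prints "<index> <state>"
# lines; substitute your backup product's actual status command.

SOMEONE_WHO_CARES="backupadmin@example.com"
NOC_CARES="noc@example.com"
STATEFILE=/var/tmp/drive_down.counts   # consecutive DOWN count per drive index
THRESHOLD=4                            # ~1 hour at a 15-minute cadence

touch "$STATEFILE"

list_drive_status | while read index state
do
    if [[ "$state" = "DOWN" ]]; then
        # bump the consecutive-DOWN count for this drive index
        count=$(grep "^$index " "$STATEFILE" | awk '{print $2}')
        count=$(( ${count:-0} + 1 ))
        grep -v "^$index " "$STATEFILE" > "$STATEFILE.new"
        print "$index $count" >> "$STATEFILE.new"
        mv "$STATEFILE.new" "$STATEFILE"

        if (( count >= THRESHOLD )); then
            # still DOWN after several passes: time for a human to walk over
            print "Drive $index DOWN for $count consecutive checks" | \
                mailx -s "ESCALATION: drive $index still DOWN" "$NOC_CARES"
        else
            print "Drive $index is DOWN (check $count)" | \
                mailx -s "drive $index DOWN" "$SOMEONE_WHO_CARES"
        fi
    else
        # drive is healthy again; clear any running count
        grep -v "^$index " "$STATEFILE" > "$STATEFILE.new"
        mv "$STATEFILE.new" "$STATEFILE"
    fi
done
```

Nothing fancy: check, count, escalate. The state file is what turned a dumb alert into one that knew the difference between a blip and a drive that was down for good reason.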

Moving beyond the simple monitoring and conditional-action type of scripts, I recognized a need to build more “intelligence” into some of my scripts. For example, the backup admin who was ultimately responsible for the environment was oftentimes dismayed to get emails on a Saturday that none of the backups ran because the tapes were FULL. So it was time to write a capacity planning automation script (yes, I called it that). This was a necessity because this particular customer would not buy a larger tape library to handle the increased data load placed on the backup server environment. The script would run (out of cron) as often as the backup admin liked, query the tape library via the backup catalog, and calculate the available capacity. It would then look at the upcoming scheduled backup job types, compare them to previous backup jobs of the same types, and make a recommendation on whether tapes needed to be removed and replaced with blank media to accommodate the upcoming jobs. It would even create a picklist that the admin could feed to another script, which would move those tapes to the I/E slot without having to open the library and cause it to go offline. That was pretty slick, for the time, and it provided real proactive data for the admin team to act upon. If they paid attention to the emails sent to $someone_who_cares, they would never get the panicked phone calls on Saturdays from the NOC personnel.
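The shape of that script looked something like the sketch below. Again, heavily hedged: catalog_free_mb, upcoming_jobs, job_history_mb, and full_tapes_by_age are hypothetical placeholders for the catalog queries your backup product actually exposed; the forecasting logic (next run of a job type consumes roughly what the last run did) is the real idea.

```ksh
#!/usr/bin/ksh
# capacity_plan.ksh - run from cron; warn before the weekend jobs run out of tape.
# ASSUMPTION: catalog_free_mb, upcoming_jobs, job_history_mb, and
# full_tapes_by_age are placeholders for your product's catalog queries.

SOMEONE_WHO_CARES="backupadmin@example.com"
PICKLIST=/var/tmp/tape_picklist.$$

free_mb=$(catalog_free_mb)          # blank + appendable capacity in the library
needed_mb=0

# Forecast: assume each scheduled job type will consume roughly what the
# last run of the same type consumed.
for jobtype in $(upcoming_jobs)
do
    last_mb=$(job_history_mb "$jobtype")
    needed_mb=$(( needed_mb + last_mb ))
done

if (( needed_mb > free_mb )); then
    shortfall=$(( needed_mb - free_mb ))
    # Build a picklist of full/expired tapes to eject; a second script feeds
    # this list to the library's move-to-I/E-slot command so the door never
    # opens and the library never goes offline.
    full_tapes_by_age > "$PICKLIST"
    {
        print "Upcoming jobs need ~${needed_mb}MB; library has ~${free_mb}MB free."
        print "Recommend swapping the tapes on this picklist for blank media:"
        cat "$PICKLIST"
    } | mailx -s "capacity planning: short ${shortfall}MB" "$SOMEONE_WHO_CARES"
fi
```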

To many people, Unix seemed very foreign and difficult to use. We Unix types liked the job security and the exclusive club it placed us in. I found the Unix community to be one of close camaraderie, where tips and tricks were freely shared, oftentimes in lieu of progressing on the tasks at hand. However, those times were short-lived, as the systems, applications, and platforms we were administering grew more and more complex, with fewer experienced bodies to manage them. We, therefore, had to find ways to automate or die. The easier we made these solutions for others to monitor and manage, the better for us. The same is true today. Just think of all the complexities hidden behind a slick GUI or dashboard; while some of us die-hards still pine for the command line, we are thankful for these GUIs and dashboards so we can focus on more critical functions within our organizations.

The automation I just outlined is great, as it helps remove some of the potential for human error, but it is still only “conditional automation”, of the kind found in IFTTT; intelligent automation changes everything. How wonderful would it have been to have a daemon actively monitoring NOAA feeds for hurricanes or massive snowstorms, making intelligent decisions about proactively protecting my data while at the same time deciding where to move my data and workloads in the event a storm took out a datacenter?

So, when you look at data protection solutions today, think about how each one will, or could, become the intelligent data management platform for your environment. Choose one that is open to a greater range of partners to enhance its capabilities and support your ecosystem.

I have worked in data protection for the majority of my professional career, and something I have come to accept is that many view my particular vocation as not the “sexiest job in IT”. I’m okay with that, actually; it may not be the flashiest job in IT, but it is still absolutely required. Stay tuned to my blog as I continue to reminisce about the days of old and look at how the vendors in this market are addressing the challenges we still face today as backup admins (yes, I still consider myself a backup admin).

Oldtimer, signing off.

-Chapa
