Every once in a while I head over to CertCities to see if they’ve finally gotten around to another Hottest Certs for xxxx (we haven’t seen one since 12/2005) or to updating their certification salary surveys, which have fallen behind the times as well. I collect certifications now and then. Initially I picked up a bunch of Microsoft certifications to get a foothold in the Seattle market after moving here. Now they’re not so important, because I work for startups where Microsoft comprehension is essential but the challenges lie elsewhere, mostly in Open Source. A while back I went and got an LPIC-1 and LPIC-2, feeling like I should have a Linux certification but not having the time or money for the RHCE lab, or any respect for the CompTIA Linux+. I got an email from LPI today about a survey they’re conducting on where LPI should go from here, which sent me back to CertCities, where I found a number of recent articles by Emmett Dulaney about Linux that prompted me to send him a couple of emails.
One, “Pondering Ubuntu 8.04”, subtitled “Did the few minor tweaks included in the latest version of Ubuntu actually warrant a new release? Emmett’s not so sure.”, argues that the lack of new features in hardy doesn’t justify the release. It misses every point of the release cycle, and even comments on how everyone hated Microsoft for making regular releases. Well, maybe because we had to pay for them each time?
To the folks who think upgrading from Server 2000 to Server 2003 is good simply because it’s new, you present Ubuntu the same way: 7.10 and 8.04. When interacting with colleagues we usually refer to releases by short name, such as ‘gutsy’ or ‘hardy’, which lets us interject Debian releases like ‘etch’ and ‘lenny’ without having to specify the distribution explicitly.
Of course, reason enough for 8.04 is the release cycle itself. Debian has an amazing framework, but releases are slow. Debian etch was initially released in April 2007 and we’re hoping that lenny will be out this year, but we’ll see. Just yesterday I had to backport packages from lenny to etch, because each release gets security updates, not version updates, so you either wait for the next release or go through the trouble of doing the backport yourself.
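Roughly, the backport goes something like this (the package name is only an example, and it assumes a deb-src line for lenny has been added to sources.list on the etch box):

# Rebuild a lenny source package against etch so it installs as a normal deb.
apt-get update
apt-get build-dep git-core              # pull in the build dependencies
apt-get source git-core/lenny           # fetch the lenny source package
cd git-core-*
dpkg-buildpackage -rfakeroot -us -uc    # build unsigned binary packages
dpkg -i ../git-core_*.deb               # install the backported deb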
One might question why backport a deb package when you can simply install the new software, and the answer is configuration management. Whenever I inherit a network full of Linux systems I have to figure out what software was installed where. There are many instances where the same software has been installed as different versions by different people over the years, and it’s difficult to tell which is actually in use. Packaging solves this because (slotting aside) there’s one version installed, and you can use the packaging tools to tell exactly what files belong to that package and where they are. While this may not seem of immediate benefit to a single user, it is, because it’s essential to troubleshooting user problems for those that provide support, which in the case of Ubuntu is mostly done for free.
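A couple of quick dpkg queries show what I mean (apache2 is just an example package):

dpkg -l | grep apache            # what’s installed, and at which version
dpkg -L apache2                  # every file the apache2 package owns
dpkg -S /usr/sbin/apache2        # which package a given file belongs to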
While Hardy may not have any visually apparent, stunning changes, I assure you there are lots of updates behind the desktop that are well worth appreciating.
The other, “Linux Certs and the Cutting Edge”, subtitled “Some certifications seem stuck in the Dark Ages. Plus, Book of the Week toes the command line.”, talks about how “df, du, kill, ls, mv, rm, tar, umask, vi and so on” are on all the tests and offers that it’s because of the “commonality between the distributions”, not because these are all essential utilities. Anyways, here’s my email:
CompTIA is always a terrible example of certifications because it’s so entry level. I can’t complain a whole lot because it’s respected, and aside from questions I consider obscure to my job role (like fixing laser printers), the tests are pretty easy to pass.
“df, du, kill, ls, mv, rm, tar, umask, vi and so on”
These are all -essential-. I would never hire someone who couldn’t explain exactly what each of these tools does. I feel like the LPI certifications may go a little overboard in expecting you to know what certain flags do for each command, when you can always look them up in the man page. But knowing the difference between tar -z and tar -j is always a good thing.
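For the record, that particular difference is just which compressor tar filters the archive through (the file names here are only examples):

tar -czf backup.tar.gz /etc      # -z: compress with gzip
tar -cjf backup.tar.bz2 /etc     # -j: compress with bzip2
tar -xzf backup.tar.gz           # and the matching flags to extract
tar -xjf backup.tar.bz2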
For example, we have a fairly complex configuration: Debian Linux both hosting and running as a guest on vmware-server, with configuration management by puppet backed by git, and capistrano for system administration. While experience with these specific tools is good, the following is a piece of a puppet recipe I wrote:
# set linux clock algorithm
# non-rescue (single) kernel lines in the grub config that don't have a clock algorithm set get set to pit
# best to run this regularly (it will run every time) so that new kernel installs get this added
# there is the edge case that a kernel is upgraded and we don't wait for puppet to run before the reboot
exec { "set-vmware-clock":
    command => "/bin/sed -ie '/clock\|single/! s/^kernel.*/& clocksource=pit/' /boot/grub/menu.lst",
    onlyif  => "/bin/grep '^kernel' /boot/grub/menu.lst | /bin/grep -v 'single' | /bin/grep -v 'clock'",
}

If someone can’t look at that and tell me what it does, they’re not getting hired here. They don’t need to know that much about puppet; that they can figure out, and even just by reading the recipe you get a good idea of what the puppet portion of the configuration is for. But if you’re not familiar with the standard tools, you’re not going to get much done, regardless of how much you may know about something like puppet. If you look at that and know that ‘grep’ returns matching lines of text, but don’t know that ‘-v’ makes it exclude those lines instead, you’re going to miss the point of the recipe.
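To spell out what the recipe is checking, here is the onlyif pipeline unrolled, with an example kernel line (the version string below is made up for illustration):

# The check only produces output, and so only triggers the sed, when some
# kernel line still needs a clock source added.
grep '^kernel' /boot/grub/menu.lst \
  | grep -v 'single' \
  | grep -v 'clock'
# Before the sed:  kernel /boot/vmlinuz-2.6.18-6-686 root=/dev/sda1 ro
# After the sed:   kernel /boot/vmlinuz-2.6.18-6-686 root=/dev/sda1 ro clocksource=pit
# Rescue ('single') entries and lines that already set a clock are left alone.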
The key isn’t that these tools are distribution neutral, giving you a lot of common ground. The key is that these tools are extremely powerful, provided you know how to use them. The more familiar you are with them, the more you can chain them together into more powerful solutions.
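That last point deserves a concrete illustration; something along these lines (the path and the formatting are made up for the example) is the kind of chaining I mean:

# Report the ten largest directories under /var using nothing but the
# everyday tools: du to measure, sort to rank, head to trim, awk to format.
du -xk /var 2>/dev/null | sort -rn | head -10 \
  | awk '{printf "%8.1f MB  %s\n", $1/1024, $2}'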