Choosing My Tools in Switching to Linux

I’m moving from Windows to *nix for server programming and am finally getting comfortable. In this post, I’m laying out my switching strategy for others who want to become powerful by going old-school with a modern twist. There’s simply too much to know, even with a basic Linux install, so the trick is to trim it down. I think back to my Amiga days, when a multitasking graphical operating system could (and had to) boot from a 400K floppy. It was possible to know every directory and file, and to mix your own AmigaDOS distributions, choosing replacement commands, alternative shells, and even GUI navigators! The entire startup sequence was defined in a file named… startup-sequence! Ahh, those were the days.

Today, even the smallest Linux distributions seem to be about 50 MB. To learn Linux, therefore, you have to wade through tons more files, and it takes that much longer to know what the heck is going on. To make it worse, each command-line tool in the standard Linux toolbox, such as vi, is a world of learning in itself. A single command can be 400 KB, and the non-graphical version of a package like Vi IMproved (vim) is 20 MB on its own. Vim was actually born on the Amiga, and I remember the month it was distributed at the user group meeting on a Fred Fish disk. In those days, it could be thrown on a floppy right along with the entire Amiga OS. Today, that 20 MB descendant is my primary tool in Linux. More on that in a moment.

Nothing gives you familiarity with a system like learning its boot process. It is also a key to power, because a primary benefit of Linux is that it will run on such a diverse range of hardware. A goal here is to develop the ability to create new Linux systems on demand on almost any hardware available. THAT’S power. You learn things like how hardware is hard-wired to look in a certain place for a bootloader (lilo or grub), which points to the kernel, which points to an initialization script, which gets everything else going. Anyone can run an OS installer, but understanding how to bootstrap a system from scratch provides real insight. Knowledge of and control over embedded systems arises from this, which is valuable because just about everything contains embedded *nix these days. Many of the powerful tools in the Linux toolbox will run just as readily on a $100 microserver like an AppleTV or SheevaPlug as they will on a $5,000 server, or a $50K server for that matter. Master system administration on a $100 box, and you have some ability to stand toe-to-toe with professional sysadmins.
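If you want to see that hand-off for yourself, here is a minimal sketch in Python (my eventual language choice, more on that below), assuming a Linux system with the usual /proc filesystem mounted. It simply reads back what the bootloader passed along to the kernel:

```python
#!/usr/bin/env python
# Minimal sketch: peek at the hand-off from bootloader to kernel.
# Assumes a Linux system with /proc mounted (true on virtually any distro).

def read_proc(path):
    """Return the contents of a /proc entry, or a note if it is missing."""
    try:
        with open(path) as f:
            return f.read().strip()
    except IOError:
        return "(not available on this system)"

if __name__ == "__main__":
    # The parameters the bootloader (grub or lilo) handed to the kernel,
    # including root=, which tells the kernel where the root filesystem lives.
    print("Kernel command line:", read_proc("/proc/cmdline"))
    # The kernel that the bootloader actually loaded.
    print("Kernel version:     ", read_proc("/proc/version"))
```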

After understanding the boot process, the type-in command-line interface, or “shell”, is the next thing to learn, but not necessarily to master. The shell is not the OS; it’s just one way of interacting with it. And there are many shells, just as there are many windowing systems. Bash (the “Bourne-again” shell) is the most popular, and usually the default. While you don’t need to master it, it is still a powerful command-execution and even programming environment common to every *nix system. You don’t need to know every shell command, because even simple ones like “ls”, which lists the contents of a directory, have too much to learn! But it’s important to learn the WAY of learning commands. For example, “ls -la” is the most useful invocation of ls. There’s almost no way of knowing this without being told, or playing Sherlock Holmes. What’s worse, to actually read output that scrolls past too fast, you have to “pipe it to more”: ls -la | more. To even understand this, you must differentiate lowercase l’s from capital I’s from the | pipe character. You need to learn, and learn to love, a few of *nix’s personality quirks like this, mostly expressed in the shell.
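For the curious, here is a small sketch of what the shell is doing with that pipe, written in Python since that is where I ended up (more below). It assumes a Linux box with ls and more on the PATH, and it wires the two programs together exactly the way the | does:

```python
#!/usr/bin/env python
# Minimal sketch of what the shell does for "ls -la | more":
# it starts two processes and connects stdout of one to stdin of the other.
# Assumes ls and more are on the PATH (true on any normal Linux system).
import subprocess

# Start "ls -la" with its output going to a pipe instead of the terminal.
ls = subprocess.Popen(["ls", "-la"], stdout=subprocess.PIPE)

# Start "more" reading from that pipe, so long listings get paged.
more = subprocess.Popen(["more"], stdin=ls.stdout)

ls.stdout.close()  # let ls see a broken pipe if more quits early
more.wait()        # wait until the pager finishes
```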

Okay, so understanding the boot process and getting around the shell are the two most important starting points in the switch to Linux for server programming. But what’s next? A language, right? Nope! Even more important than language is the timeless text editor that is always there and becomes as powerful as you allow. This usually means either vi or emacs, and making the choice is like choosing a religion. I use Vi IMproved (vim) because vi is almost always on the system, while emacs is not. And while either one takes considerable customization to suit your precise needs, vi has an almost video-game feel of blasting text around the screen, even without any customization. And frankly, that makes you look and feel bad-ass. Getting good at vi or emacs is a skill that will serve you well for the rest of your life, and free you from the tyranny of vendor lock-in IDEs like Visual Studio or Xcode. Your text editor is your rock of stability in a tumultuous sea of programming trends.

And now we’re up to language, right? Not quite. First, we solve a thousand little problems that arise in coding, job-switching, and life in general by choosing a distributed version control system! This is the “new twist on old school” that I mentioned, and the one thing that has dramatically changed recently that is more than just a trend. The old guard, CVS and SVN, are not actually distributed systems; listen to Linus Torvalds rant on the topic for the full story. You should have an infinite undo of all of your code edits living on a variety of hardware, so that even catastrophic equipment failure will not lose you a line of code or project history. Even the code on your main repository machine is no more special than any other instance of your code; that machine just happens to also be a webserver. The end result is whack-a-mole-like survival of your code and easy replication of your life’s work. The de facto standard DVCS appears to be git, Linus’ system that’s used for the Linux kernel. I use the fractionally easier (because it’s better documented) Mercurial system, whose command set closely resembles git’s. I may end up switching to git someday, depending on how support evolves. However, Mercurial is used for the Mozilla project, and is written in Python… foreshadowing my language choice.
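Since Mercurial is what I use, here is a minimal sketch of that “no copy is special” idea, assuming hg is installed and a committer name is configured; the clone at the end carries the complete history, not just a snapshot of the latest files:

```python
#!/usr/bin/env python
# Minimal sketch of "every copy is a full repository" with Mercurial.
# Assumes hg is installed and ui.username is configured for commits.
import os
import subprocess
import tempfile

work = tempfile.mkdtemp()
original = os.path.join(work, "project")
backup = os.path.join(work, "project-backup")

def hg(repo, *args):
    """Run an hg command inside the given repository directory."""
    subprocess.check_call(["hg"] + list(args), cwd=repo)

# Create a repository and make one commit.
os.mkdir(original)
hg(original, "init")
with open(os.path.join(original, "hello.py"), "w") as f:
    f.write("print('hello')\n")
hg(original, "add", "hello.py")
hg(original, "commit", "-m", "first commit")

# Clone it: the clone carries the complete history, not just a checkout,
# so losing the original machine loses nothing.
subprocess.check_call(["hg", "clone", original, backup])
hg(backup, "log")
```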

So we’re up to language, finally! If you want to devote large chunks of your brain and soul to getting the most out of hardware, then C++ is your language. A tiny group of meta-programming elitists will choose some dialect of LISP so they can write programs that write programs. The rest of us will make a choice based on preference and support. Java is easier than C++, but still requires too large a chunk of your soul for too little return. PHP is the total opposite: it makes building Web apps easy, but has no soul. And of course, you can’t get away from JavaScript, which is built into most Web browsers but whose server-side support is lacking. Finally, there are Perl, Ruby and Python. Perl is the pee in the pool of *nix due to its heavy use in installers, but its pathological eclecticism makes it difficult for large projects. By contrast, Ruby is Perl perfected, and its popular Rails framework makes large projects a joy. Ruby would have been my choice had Google offered client API libraries for GData and AdWords in it.

And then there’s what ended up being my choice, Python. I’ll admit that I initially chose it out of defeat, having exhausted every viable way of running JavaScript on the server, and cursing that so few client libraries are ever released for Ruby. My perception was that there was nothing special about Python except that Google somehow preferred it. But I have subsequently come to love it. It costs only a small portion of your soul for a very high return, promotes beautiful coding, is full of elegant shortcuts, and has tons of support. Above all, it has no curly brackets or keywords for code-block delineation; instead, it uses indentation, which is how it promotes beautiful coding. In the end I probably would have been happy with JavaScript or Ruby if they had worked out, but I am somehow very happy to count Python, alongside Bash, vim and Mercurial, among my new primary tools.
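To make the indentation point concrete, here is a tiny, purely illustrative example; the block structure is carried entirely by the indents, and the one-line list comprehension is the kind of elegant shortcut I mean:

```python
#!/usr/bin/env python
# Tiny, hypothetical example: indentation alone delineates the blocks,
# and the list comprehension is one of Python's elegant shortcuts.

def classify(sizes):
    """Label each file size in KB as 'small' or 'large'."""
    labels = []
    for size in sizes:          # the loop body is defined by indentation,
        if size < 1024:         # not by curly brackets or begin/end keywords
            labels.append("small")
        else:
            labels.append("large")
    return labels

# The same thing as a one-line list comprehension.
def classify_short(sizes):
    return ["small" if size < 1024 else "large" for size in sizes]

if __name__ == "__main__":
    print(classify([400, 20480]))        # ['small', 'large']
    print(classify_short([400, 20480]))  # ['small', 'large']
```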
