Learn Computer History When Becoming a Tech

by Mike Levin SEO & Datamaster, 01/30/2013

I’ve started teaching a friend to become technical—someone who for years now has demonstrated the aptitude and interest, but who for whatever reason has not yet taken the plunge. Or shall we say—has never stayed in the water and committed to a swim after taking the plunge. With my mission to turn my personal Linux distribution into my secret weapon at work and into a groundswell movement in education, I’m taking the opportunity to turn him into a tech and take some notes.

After the first couple of lessons, my observation—at least with this particular learner—let’s call him Boris—is context, context, context! This was an unexpected lesson for me. Even though the computer-history context of what I do has always fascinated me, I assumed it would make others tune out. For me, these rich histories have eclipsed sports, music, politics, and a ton of other areas in which people revel in knowing the players, following the careers, reciting the stats. If you talk to me about programming syntax, I’m as likely as the next guy to zone out. But talk to me about the stranger-than-fiction tales of Ada Lovelace, Alan Turing, and Charles Babbage, and I’m engaged. I’m glad to learn it’s not only me.

Who knows if it’s the same way with all learners. But it is with Boris. He’s a lover of stories. In fact, I give Boris credit for turning me back onto SciFi after a very long hiatus, insisting that Neal Stephenson’s Diamond Age was written almost specifically for me—given the issues of raising my daughter in a technologically accelerating world that he and I sometimes discussed over our “SEO Babble Lunches”. Boris was right about Diamond Age. Now, the Young Lady’s Illustrated Primer is a cornerstone of my own recent thinking. Truth be told, Boris had conquered the obtuse and heady Stephenson tome Cryptonomicon, which I have yet to get through—and that tells me he is as able to be a “tech” as anyone. While merely reading doesn’t make you a tech, actually understanding Neal’s perpetually challenging concepts qualifies you to make the attempt.

Perhaps it’s Boris’ predilection for stories that made the context of what I was trying to teach him as interesting as (maybe more interesting than) the subject matter itself. Specifically, I’m trying to teach him very old-school, yet amazingly still relevant, information tech that underlies most other tech. Some call it the “short stack”. Basically, you strip away all graphical window managers (Windows, OS X) and monolithic frameworks (Objective-C, .NET), and work with nothing more than what you need to make your program (a short stack). Then you carefully add things on top of the base of the stack (which is usually Unix or Linux) to get your job done—and no more!

I fully disclosed to Boris that this was the utter opposite of the HTML5 / CSS3 / JavaScript (and sometimes PHP) approach, which is where the jobs are—and different from the Instagram / Angry Birds mobile-app approach, which is also where the jobs are AND could let you hit the billionaire jackpot.

My way is different. Those who read my articles know that I advocate the fundamentals: stripping out every dependency you reasonably can, until all you have left is a very small version of Unix/Linux. Then you build up just on that until you have precisely what you need to accomplish the job at hand. This is not the “client software” approach of mobile apps, and it is not really the Web development approach. Writing the actual client-facing user interface is a different, and admittedly sexier, world. Instead, my way is more of an API-publishing, API-consuming approach—by machines and for machines. It lets you build Internet and Web services that other things use, which is occasionally useful for automating tasks you are called upon to perform. My way makes you a citizen who can automate tasks on the broad variety of machinery around you today.

You might describe my way as old-school, or lowest-common-denominator. I like to think of it as timeless and obsolescence-proof. Some have even called it bad-ass. It involves logging into remote servers and using only a text-based interface and one of the most cryptic—yet most worth mastering—text editors known to man: vi(m). This approach was born of a “got screwed by Microsoft / fool me twice” mentality (I was an Active Server Pages guy). Any way you slice it, my way is highly effective in the modern world and elevates you to being a sort of super-citizen. Yet it won’t necessarily get you hired into the best, most in-demand jobs. It’s not trendy. But it is undeniably useful, and it underlies all other, more complicated information technology. I basically tried to scare Boris away.

But Boris asks why, so I start to tell him why I take such an approach. I tell him about the Multics consortium of the ’60s and Ken Thompson of AT&T—his frustration with Multics, his rebel streak, how he just went off and made Unix, and the tradition of wordplay deeply ingrained in Unix culture that was born in that moment. I tell him about Berkeley Unix (BSD), which tried to free Unix from commercialism, and the legal quagmire that kept it imprisoned—and about the serendipitous work of a young Finn named Linus Torvalds, how Linus’s kernel came together with Richard Matthew Stallman’s (rms) GNU to form the world’s first FOSS operating system, why rms hates the term “open source”, and why he hated Steve Jobs and was happy when he died.

I showed Boris Google Knowledge Graph pics of Richard—instant understanding when you just LOOK at him in contrast to Steve. And I showed him Ken, who lands somewhere in between. Boris noticed Ken was still alive, and I explained how he now works at Google on the Go language. Boris doesn’t understand the need for new languages, so I explain Go, and compiled languages versus interpreted ones, and the time wasted during C compiles, and how you multiply that by 20K to see why long compile times plague Google.

Compiled versus interpreted is a difficult discussion these days. You have to skip over a lot of confusing details about hybrid interpreters, bytecode, JIT compilers, virtual runtime engines, and the like, and focus on the core fact: if you know everything ahead of time, you can generate the fastest, most optimized code reasonably possible without working in machine code.
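You can get a taste of the difference without leaving Python (the language Boris and I get to shortly). The sketch below is an illustration, not a benchmark: the built-in sum() runs as compiled C inside the CPython interpreter, while the hand-written loop is executed instruction by instruction, so the gap between the two timings hints at what knowing everything ahead of time buys you. The exact numbers will vary by machine.

```python
# A rough illustration, not a benchmark: the same arithmetic two ways.
# sum() is compiled C inside CPython; the explicit loop is interpreted
# bytecode. Expect the loop to be several times slower.
import timeit

def slow_sum():
    total = 0
    for n in range(1000000):
        total += n
    return total

def fast_sum():
    return sum(range(1000000))

print(timeit.timeit(slow_sum, number=10))  # interpreted loop
print(timeit.timeit(fast_sum, number=10))  # C-compiled built-in
```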

So Boris asks: compiled languages are best, then? I say: well, if you’re programming an operating system and drivers, sure. But if you’re programming robotic cars to autonomously cross the desert, you want something entirely different, because your problem is so unique and your problem domain so particular. You probably want to write your own language in a language for writing languages. In other words, you want LISP.

And Boris is like: so LISP is not compiled? I’m like: No. Not until you write a compiler in LISP to compile your own code. BAM! Head explodes. Back up! Simplify! If you want to write solid, fast code, use C. If you’re solving a very difficult problem that no one else has addressed before and you hit a lot of obstacles that you feel may be due to the limitations of the tools you’re using, try LISP.
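To give Boris a glimpse of what a “language for writing languages” even means, here is a toy sketch of the idea, written in Python (since that is where we are headed) rather than in any real LISP: programs are just nested lists, so an evaluator for a tiny language of your own fits in a dozen lines. Everything in it is made up for illustration.

```python
# A toy flavor of the LISP idea, sketched in Python for illustration:
# code is data. A program is a nested list, and evaluating it is a
# short recursive function. This is not real LISP.
import operator

OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def evaluate(expr):
    """Evaluate a nested-list expression like ['+', 1, ['*', 2, 3]]."""
    if isinstance(expr, list):
        op = OPS[expr[0]]                     # the operator comes first
        left, right = [evaluate(e) for e in expr[1:]]
        return op(left, right)
    return expr                               # bare numbers evaluate to themselves

print(evaluate(['+', 1, ['*', 2, 3]]))  # prints 7
```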

There is always a better way of looking at a problem, and LISP could get you there. But both C and LISP require quite a large commitment, and I wouldn’t recommend going that route unless you plan on becoming a career programmer. You don’t need a career programmer’s pulsing alien brain just to be a tech. There are better language choices for a tech. The one thing they all have in common is sitting on top of the Unix heritage. In other words, no matter how you slice it, Unix or Linux should be at the base of the technologies you “stack” on top of each other to arrive at your solutions.

So, what about non-compiled languages? What are they, and why would you use them? Well, there’s a bunch, and they’re popular because they’re usually easier to get started with, letting you be productive fast. You may have heard of several: PHP, which Facebook and WordPress are built on. There’s Ruby, Python, and Perl—all of which play a very big role on the Web. Oh, and there’s JavaScript. Each of these has its own fairly rich story behind it—why you would use it, and how in many cases it can approach the performance of a compiled language like C.

There’s literally a whole Neal Stephenson-like book here, based on REAL history. And the real history is still playing out. Wait, the guy at Google helping to get rid of C compile times is the guy who invented Unix? Yes. What about the guy who invented C? Oh, Dennis Ritchie invented C—sadly, he just passed away. By the way, Dennis co-invented Unix with Ken Thompson, and he created C as a successor to Ken’s earlier language B (itself derived from BCPL) so that Unix could be ported to different hardware more easily. Because C made Unix portable, it is just as important an ingredient in Unix’s success as Unix itself. And Ken’s making a replacement for C? Yes. For Google? Yes. And Google really needs this? And they’re tapping the inventor of Unix? Yes. Wow.

Context, context, context! What about the language we’re focusing on now, Python? How does that fit in? Well, a young Dutchman named Guido van Rossum decided, as a holiday project, to take the best of the Unix and C worlds and make a language with the strengths of a prior academic language called ABC—one as pragmatic, human-understandable, and real-world problem-solving as possible. Guido asked: can one language be equally loved by newbies and compsci-freaks? His answer was: yes! It’s all the lovable stuff about C and Unix with none of the crazy syntax and overhead—plus Python can be spot-optimized with C for maximum punch.
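Here’s the kind of thing I mean (a small made-up example, not anything of Guido’s): counting the words in a sentence reads almost like plain English, and the dictionary doing the work is one of those C-optimized pieces under the hood.

```python
# A small made-up example of Python's readability: count how often
# each word appears in a sentence. The dict and its methods are
# implemented in C inside the interpreter.
sentence = "the quick brown fox jumps over the lazy dog the end"

counts = {}
for word in sentence.split():
    counts[word] = counts.get(word, 0) + 1

for word, count in sorted(counts.items()):
    print("%s %d" % (word, count))
```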

Oh, and the Google empire was built on Python. Yep, PageRank was originally written as a Python program. Tons of Google systems were written in Python—at one point, rumored to be as much as 25%. They’re phasing Python-for-systems out now, because Google really needs a compiled systems language like C for that work (hence Go), but Google’s Python-love runs deep. Many of their API client libraries are still made available in Python, despite the groundswell popularity of JavaScript as the one language to rule them all. Oh, and MIT even dumped a legendary decades-long tradition of teaching LISP to incoming students as their introduction to computer science (6.001, Structure and Interpretation of Computer Programs) in favor of Python. So, what’s Guido van Rossum doing now? Oh, he works for Google.

Well, this is a whole lot of talking, and I’m all about doing. Seth Godin reminds me today that too much talk doesn’t do anyone any good—and points to a classic Jeffrey Pfeffer piece on why we can’t get anything done. And so, it’s time to move on to the new problem at hand: tutorials to get over the unexpected and interesting hurdles I encountered with Boris while trying to introduce him to Python from a Windows laptop—and why even just getting an SSH program to log into a remote server is a challenge, because Windows is not part of the Unix lineage, so everything is subtly harder. It was so hard, in fact (copying and pasting with PuTTY on a Windows laptop with no mouse), that I switched to just showing him a native install of Python under Windows. I’ll switch back to the remote-server login approach when I have a reasonable way (WinSCP?). And when I really get around to the first “course” for my Linux distribution, Levinux, it’s going to be a doozy.
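For the record, here is the sort of first script Boris could run from that native Windows install. It is a minimal sketch rather than the actual lesson; the install path in the comment is an assumption (Python 2.7 was current at the time), and any recent Python will do.

```python
# first_steps.py: a minimal first script for a native Windows Python
# install. The path below is an assumption; adjust for your install.
# Run it from IDLE, or from a command prompt:
#   C:\Python27\python.exe first_steps.py
import platform
import sys

print("Python %s on %s" % (sys.version.split()[0], platform.system()))
print("Running from: %s" % sys.executable)
print("2 + 2 = %d" % (2 + 2))
```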