Plan For Obsolescence-proofing Before Learning to Program

by Mike Levin SEO & Datamaster, 01/14/2013

Are you getting ready to learn to program? The purpose of this article is to equip you to fend off the fear, uncertainty and doubt (FUD) that is about to be thrust upon you as you go researching which programming language to start with. I’m here to tell you: start with something else. Buy yourself some time, get familiar with the hardware issues surrounding code obsolescence and career obsolescence, and plan for neither to happen to you.

Obsolescence is a fact of life in technology. This is a sobering truth to people who are already in the field—but all the more so for people just getting into the field. There are so many good technologies heavily used online, guaranteeing you a fresh set of skills for life—I mean like Macromedia Flash… oh wait, no sorry. I mean like .NET… oh wait, no sorry. Neither one is really a good idea to take up as your primary language these days.

On Windows, there’s new thing after new thing after new thing. Joel Spolsky famously calls this effect “Fire and Motion,” in reference to how platform vendors keep developers working hard just to stand still, while THEY forge ahead with truly new products that everyone is forced to adopt, renewing their revenue streams on a predictable recurring schedule. Becoming a developer should be about INCREASING your freedom and independence from “the man,” not increasing your dependency. On Windows, just in recent memory, you’ve had to move from Win32 to .NET to WPF to Silverlight, and now to whatever the “Windows Runtime” is that underlies Windows 8.

Info-tech is filled with the carcasses of technologies that were once the “must-know” tech darlings, which did indeed ensure you a job and a place in a trendy, in-demand job market… up until the moment they no longer did. The change may come with a new platform darling, like the switch from desktop to mobile. The change may come from a “Steve Jobs-like” declaration, such as the pronouncement that shipping MacBook Airs with Flash preinstalled was a bad idea. Or it may come from common-sense security, like the realization that merely having Java on your machine puts you at significant risk.

The corollary rule to this is that if you take great risks and bet correctly, you can reap great rewards. A great current example is Objective C on the iOS platform, because you could build an Angry Birds empire or set the stage for a billion-dollar Instagram buyout by Facebook. But make no mistake: Objective C is a vendor-specific, philosophically-laden version of C that’s going to turn you into a dedicated developer for products in the Apple ecosystem, not a hacker able to range across a broad variety of hardware platforms and problems. Adopting Objective C is a risk, and not everyone hits the jackpot. Think about your game!

There are a few important concepts to grasp when deciding to become technical, learn programming, or whatever else you want to call it. In short, you’re trying to take a little more control over your own life, because so many creative and business ideas immediately call for programming. By not taking it up, you are forever relegating yourself—and I would argue deliberately subordinating yourself—to the folks who can program. By some readings, they are an objectively more capable class of citizens than those who can’t. Business Insider has an article that sums it up with this brilliant quote from a guitarist speaking of learning to sing:

You need to learn to sing. Because if you don’t, you’re always going to be at the mercy of some asshole singer.

When you learn to sing, no one can take your voice-box away from you. It is a skill and an asset for life, and the more you practice, the better you’re going to get. Your voice-box is a mostly static tool that gets seasoned with time, as your mastery creeps ever closer to perfect. There are a few things you can’t do about the ravages of age, but you should be retired on your fortune by then anyway, living off your proceeds and personal brand (sans new performances). A similar mastery dynamic is at play with most musical instruments, like violins, which undergo very little turbulence from technological advancement over the ages, and much less within any 10-year span.

Technology isn’t the same way. You’re lucky if you’re programming in the same programming language in 10 years, much less on anything that even resembles the computer or keyboard you’re using today. The carpet simply gets yanked out from under you in tech every 10 to 20 years. There’s very little avoiding it. Being a tech is a lot more like being an auto mechanic, working on the new models of cars, than it is like being an artist, creating pure beautiful work that transcends the tools and the ages. Google is very much a 1997-to-2017 thing. Facebook, likewise. And things like Instagram can be measured in heartbeats by comparison.

Shit changes. I hardly need to make the argument beyond spewing out a bunch of names. IBM. Netscape. Yahoo. Sun. MySpace. Microsoft. The list goes on. If you latch onto “one thing” that can make it very big as a sign of the times, then you’re playing the lottery to become that one big thing before anyone else does, locked into the obsessive game of being first and remaining best, lest you be knocked off the top of the hill by the next guy. After that, you have to aggressively re-invent and re-invent yourself, and basically go insane playing that game.

There is another way… a better way that lets you live a much more sane life, trading your capacity for modern currency. You don’t have to be super-brilliant, be your own tech, be the first-mover in an undefined space, and turn yourself into a throbbing-brain alien to maintain that lead in life. Leave that to Larry and Sergey. Leave that to Mark. Leave that to Jeff Bezos. We will soon see whether Tim Cook is one of them. And we will soon attach a name to whatever crazy brilliance is going on over at Samsung. But I digress. Enough about the pulsing brain-aliens. What about us?

For most of you, all my talk about Objective C, .NET, Java and all the rest is coming off like blah, blah, blah. You’re just sort of trying to stay tuned into the current best wisdom, which is leading you down the path to JavaScript, CSS and HTML5. With just these 3 technologies, you can do modern programming on either the server or the browser—and arguably even build native-ish desktop and phone apps that approach the performance of what people call “native” apps. And much of that is true. And the pitfalls you will fall into are generally smaller and less dangerous than on other paths. But it is still not the whole story.

The whole story begins with understanding hardware and layers, and WHY certain software runs on certain hardware, and exactly WHAT layers are, and how to pick your layers intelligently, because you are indeed picking your poison. Bad choices here are the very things that will kill you 5 or 15 years down the pike. A whole generation of Flash developer carcasses is there to prove it.

Flash is the perfect example of a whole series of pitfalls—both numerous and fatal. Flash is one big whopping LAYER between you and the hardware that superimposes various very constrictive presumptions which are not always true. One of these is that you have very powerful hardware, and can use it all to make little circles spin and men move on your screen. Needless to say, mobile blew a hole in this supposition.

Other Flash suppositions included that you would have a mouse that enabled point-hover-click, which is not true on phones and tablets. Tablets don’t have point-and-hover. They go right to click. Think about it: there’s no mouse. This is a huge reason why so few Flash games worked well on mobile. Flash is also—no matter how great a job Adobe did of getting it distributed pre-installed with browsers—a proprietary vendor format.

Adobe controls Flash. There’s no giant, open-source, standards-minded industry association pushing Flash forward, as there is behind the C language, HTML5, or any number of other technologies. That makes Flash fragile and vulnerable. Flash is not just fragile due to shifting platform form-factors and user interface systems. Flash is also (now famously) vulnerable to corporate bitch-slapping. No platform owner wants an outsider inserting itself between them and their own future revenues. No one wants their product options limited by some uppity software company that managed to extract the troll-toll from a prior generation of PostScript printers, and is trying to reproduce that success today with animation and video software.

Apple may be squeezing Flash off the future generation of mobile platforms: not just iOS but, by extension, Android too. But bigger games are afoot. Android is tomorrow’s Flash. Google is very much setting itself up to be something like tomorrow’s Microsoft, but 1000 times more powerful, because they also store your data in the cloud, provide both your physical and digital goods (if they have their way against Amazon), have your entire search history, and own a significant portion of your overall online behavior and profile. Having been through it all before, companies like Samsung are hedging their bets by putting out non-Android/non-Microsoft versions of their phones, on Tizen and the like.

The prize in this grand new game is becoming the next GE/NBC-like company—but for the Internet age. During the days when GE owned NBC, GE was in a position to predispose your world-view through the programs you watched on television (both news and sitcoms), and literally set your expectations for life, thereby turning you into a well-trained GE customer for life. It was the last time there was such a clear total-lifestyle provider.

First cable television, and then the Internet, put an end to the GE/NBC-style total-lifestyle-provider company, because the fragmentation of channels into narrower niches of interest made lifestyle-peddling too expensive and inefficient. The Internet created a historic and social anomaly that bubbled and amplified throughout the late ’90s and 2000s, setting back lifestyle-peddling by decades. In its place came the great playing-field leveler that was Google. Hence my getting into the field of search engine optimization. I was intent on riding this wonderful anomalous bubble as long as it would last, until corporate America could reassert control over our lifestyles.

Enter the “new” Google, Apple, Amazon, and maybe Facebook, Microsoft and Samsung. There are a certain number of fabric-of-life checklist items that go into knitting-together a new lifestyle-peddling company. It’s way more like an artistic process today than it is like a formula, which is why the grandmaster artist, Steve Jobs, was so good at it. Steve had the critical and counter-intuitive insight that everybody else balked at… right until Apple was the most valuable company on the planet.

And that realization is that hardware matters. The physical object that you hold in your hand, and all the unlimited nuanced ways that you interact with that object, are important. A great deal of your day gets plowed into those interactions, and a great deal of your identity (verging on religion) goes into “getting good at” the platform. This thought-process extended to everything that went into the hardware too, such as the programming language, the pinch-and-zoom use of touch-screens, and the like.

See, Steve hated layers. He grew up with the computer industry and saw what made computers really run great. He saw game consoles. He saw the anomalous Commodore Amiga computer, and how “hitting the hardware” made the miracles happen—a machine that was really the brainchild of ex-Atari engineers. He also saw the Amiga’s failure: how Commodore squeezed profits from total vertical integration like no one else on the planet, and how the lack of good APIs to the hardware, and of developer good-will, spelled a doomed future for the platform.

Steve tried reproducing the purest form of that vertical-integration love with the NeXT computer and failed, then figured out which parts needed to remain standard commodities, and which parts needed to be secret sauce. He learned how to treat, win over, and wrangle developers. He learned all the many moving parts that needed to be synchronized in one beautiful dance… to kick Bill Gates’ ass. And he did. Steve had over 30 years of lessons to learn from, and was a very good student of history when it came to his passions—if not his health.

And Steve helped to create one of the most beautiful proprietary platforms, one that really does improve your life. The default (and unfortunate) champion of the free software movement called the iPhone a “beautiful prison”—and proceeded to ungracefully comment on Steve’s death as something positive. This highlights one of the most important issues of our day, one that must be understood if you want to be a truly powerhouse tech, in control of your own life and immune to sudden obsolescence.

Beautiful prisons can be good—IF the warden remains benevolent, AND the prison remains profitable. The place you don’t want to end up is somewhere that you THOUGHT was a beautiful prison, but really wasn’t—Flash, Java, .NET, etc. There are at least two remedies for this: 1) bet correctly on the beautiful prison, and 2) focus on the fundamentals that are immune to such industry turbulence.

And it’s probably best if you do both, since so few mobile apps are stand-alone anymore; nearly all have a significant server piece, which is dominated much more by the fundamentals of tech than by the glitzy graphics-processing of consumer devices like phones and tablets. While Objective C will make you a wizardly front-end developer on Apple devices, it won’t necessarily make you wizardly when you have to burst out 100 more instances of your server on the Amazon EC2 cloud to accommodate sudden, unexpected (but accounted-for) spikes in usage. Think of getting suddenly featured in the App Store with an app that connects to servers (Twitter, Instagram, etc.).

Focusing on the fundamentals helps you more with the server side of things than with the glitzy phone-and-tablet side, but it’s also the side that is less susceptible to fads, trends, and the accompanying sudden obsolescence. To focus on fundamentals, you strip out LAYERS—the very kind that Steve Jobs hated. But you can’t strip out all layers. Everything has layers. Layers are not bad. You couldn’t do anything in tech without layers, unless you were willing to program in machine code. And when you do program in machine code (or the next level up, which is assembler), you are tying your expertise to that very particular piece of hardware, which may not be around in a few years.

So, layers are the bits that “thunk” the capabilities of what you’re trying to do down to SOMETHING LESS than the particular hardware you’re working on, but they extend the viable life of the piece of code you wrote, because it will be able to run on more things over the coming years. There is therefore a balance to be struck between limiting the capabilities of what you can do and extending the applicability of what you’re creating. If you were doing a one-time, one-piece art installation that your reputation for life would be based on, you would learn and hit the hardware hard, to eke out maximum performance.

Conversely, if you were designing the software to run Google’s systems across unlimited, unknown future server hardware in endlessly evolving eco-conscious datacenters, you would ensure your software could flow seamlessly from Intel-based servers to ARM-based servers, to whatever else might come in the future. In other words, you write to the lowest-common-denominator reliable OPERATING SYSTEM—and not the hardware. And THAT’S why Unix was invented. THAT’S why Unix is standard. THAT’S why Linux and OS X are valid options for a development platform, and the totally non-standard Windows operating systems, which are not derived from Unix, are not.
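To make “write to the operating system, not the hardware” concrete, here is a minimal sketch of my own (not tied to any particular project): a few POSIX shell commands that behave the same on an Intel laptop, an ARM-based Raspberry Pi, an OS X machine, or a cloud instance, because they only talk to standard Unix interfaces rather than to any particular chip.

    # Runs unchanged on any Unix-like system, whatever the underlying hardware.
    uname -s    # which kernel: Linux, Darwin, ...
    uname -m    # which chip architecture: x86_64, armv6l, ...
    df -h /     # how much disk is left on the root filesystem

The point is not these particular commands; it’s that the operating system is the stable layer you target, while the hardware underneath remains free to change.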

If you’re just becoming technical, THROW OUT YOUR WINDOWS BOXES—or at least install Ubuntu on them to start your transition to standards-land. Ubuntu isn’t 100% standard. It doesn’t meet the industry-wide POSIX-compliance measures for Unix. Linux in general doesn’t. However, Linux matches them ENOUGH for you to be on very solid ground in battling obsolescence and keeping your skills relevant.

It sounds crazy, but it’s more important to instantiate your hardware-of-choice than it is to learn to program (at first). It’s more important still to instantiate multiple ROUGHLY EQUIVALENT versions of your hardware-of-choice than it is to learn to program (at first). What the heck am I talking about? My recently released Levinux is a virtual Linux that you can instantiate with a double-click from your Windows, OS X or Ubuntu desktop. It is a stripped-down Linux installation that has almost NONE of the developer-predisposing layers installed.

So for example, if you want Python on Levinux, you need to (easily) install it. If you want to develop in C++ or Standard C, you have to install the GNU toolchain. It’s easy to do these installs, and in fact, Levinux has a “Recipe” system to cook up exactly the server and dev platform you need. But it makes you consciously think through the issues, instead of diving blindly ahead. You at least have to pick which Recipe you’re going to run. My personal dev recipe installs Python and Mercurial. Another very popular one, I expect, would be the GNU compiler and git. And yet another would be JavaScript with node.js.
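To give a feel for what such a recipe amounts to, here is a hypothetical hand-rolled equivalent on a Debian-based system such as Ubuntu (Levinux’s own Recipe scripts work differently under the hood, but the spirit is the same: you consciously choose each layer you add).

    # Hypothetical "Python + Mercurial" dev recipe on a Debian-based Linux.
    sudo apt-get update
    sudo apt-get install -y python mercurial

    # Or, a "C/C++" recipe: the GNU toolchain plus git.
    sudo apt-get install -y build-essential git

Nothing exotic is going on; the value is in deciding, line by line, which layers you are willing to depend on.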

I am writing this article to put perspective on Levinux. It is just ONE (and I think the easiest) way to get yourself a quick barebones, standard-enough, few-layered place to begin thinking about building YOUR stack to run code—in a way designed to fight obsolescence SERVER-SIDE. It’s hard to advocate anything client-side. Maybe just do whatever you want there, and get some parallel learning experience, because there’s nothing I can propose that will compete with the coolness of Objective C native apps on iOS or HTML5/CSS/JavaScript apps on any Web-centric platform. So, just do those that way.

Other places where you should probably think about being able to run your code, in addition to the little virtual machine I provide, include at least one cloud instance. Cloud instances can be quite generic in nature. Amazon peddles its “Amazon Linux AMI,” its version of a stripped-down, small-footprint Linux that serves as a good starting point for a small, efficient virtual instance. Rackspace also has a very fine offering (which is what I use) starting at $10/mo for a very modest machine. And finally, the last place should be some sort of real, dedicated hardware, such as a Raspberry Pi or some other hardware you have sitting around the house that you can throw to the cause.
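The beauty of a generic cloud instance is that you reach it and provision it the same way you would any other Unix box. A quick sketch, with a hypothetical hostname and key file, assuming a Debian/Ubuntu-based image:

    # Hypothetical hostname and key; substitute your own instance's details.
    ssh -i ~/.ssh/mykey.pem ubuntu@your-instance.example.com

    # Once logged in, the same kind of recipe applies as on Levinux:
    sudo apt-get update && sudo apt-get install -y python mercurial

Same skills, same commands, just a different place for the code to live.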

And so there. Now you’ll have three places you can run the same code—once you eventually start writing it: the Levinux server, a Rackspace (or EC2 or other) cloud instance, and some piece of hardware you breathed life into. To breathe life into hardware, you simply get a Debian or Ubuntu Linux install image, boot your system from USB, and follow the install instructions. Almost anything can become a Linux server, from old laptops to $35 parts you can order off the Internet. My favorite Linux server hardware these days is the $35 Raspberry Pi computer.
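For the USB route, the whole trick on an existing Linux machine is writing the installer image to the stick. A minimal sketch, assuming a hypothetical image filename; be careful to confirm the device name, because dd will happily overwrite whatever you point it at:

    lsblk                      # identify your USB stick (e.g. /dev/sdX)
    sudo dd if=debian-installer.iso of=/dev/sdX bs=4M
    sync                       # flush writes before unplugging

Then boot the target machine from that stick and follow the installer’s prompts. For a Raspberry Pi, the same dd approach applies, except you write a Raspbian/Debian image to an SD card instead of a USB stick.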

And so there you have it. It’s just as important to know where and how your code is going to run, and how you’re going to keep it running over the years, as it is to pick your programming language. Beware the pitfalls designed to lock you into particular vendor solutions, or which are fads tied to a particular state of hardware. Instead, strive for a sort of timelessness—but that timelessness realistically only exists on the server side of the equation, and not on the sexy client hardware. So just accept that. There are at least two separate, parallel endeavors in becoming technical: learning something like mobile app development, where you can’t really get away from trends and style, and server back-end development, where you want anything but trendy fashions—and that means a barebones Unix-like operating system, plus something else.

Precisely what that something else is will be a subject of future articles.