Note: If you want to go down the true righteous path of learning Unix/Linux system stuff first, then building on top of that with an appropriate programming language, then try out Levinux, my own Linux distro designed to get you started right.
When it’s time to become technical, one of the overwhelming questions is where to begin. Your decisions at this point could determine whether you become genuine technical talent, or just another code monkey. There’s a difference between folks who fundamentally know things and make the right decisions for the right reasons, versus those who can slap out some PHP to customize WordPress. The former can be a tour de force techie, and the latter, a hack, in the not-good sense.
There is quite literally too much to learn, and a lot of it is driven by fads. Yep, fads exist in tech just like everywhere else, and what sounds like fundamental progress today can seem like a huge waste of time tomorrow. Plus, even among the completely enlightened and valid paths, there are dramatic splits based on how you want to specialize, and what “secret-weapon” advantages you wish to have.
You’re going to face an onslaught of keywords, with each one’s “camp” claiming theirs is the one true path, such as Java, .NET (the two biggest, most major camps), and a series of smaller camps that generally align to programming languages or specific platforms, like Objective-C for iOS development. Based on what choices you make now, you are more-or-less setting yourself on a path that is very difficult to change, because you are going to be fundamentally re-wiring your brain, and the first few things you learn are going to predispose you towards everything that follows.
There is no one true enlightened path for programming languages. That is because different programming languages have different strengths. Your choice of programming language should align to both your personal strengths and the types of problems you’re going to be working on. Google invented their own programming language called Go because their problems in creating Google “systems” are so particular, their requirements so precise, and the number of people working together so large, that it was worth it for them. The shortcomings of other languages were just too expensive, and Go works for them. But it isn’t for everyone.
On the other extreme of a language just invented for a particular problem is a language almost as old as electronic computers themselves, called LISP. It’s still alive and active in various dialects, including Clojure, that have a small but fanatical following. The “personality” of LISP couldn’t be more opposite to Go. Go is new and LISP is old. Go is designed to very specifically solve Google-scale concurrency and execution-speed problems. LISP is a language for creating other languages to solve problems that could barely be solved in any other way. Go was born to be collaborative, where LISP results in hard-to-decipher personal bizarro worlds (you may have written your own language, for example).
When you’re just getting started in becoming technical, it’s almost too much to ask to pick a language as a first step. There is so much predisposing that’s going to occur there. Today’s assumptions are tomorrow’s regrets. Languages and their execution environments are always being retired or going out of style, as happened with the highly touted VBScript on Microsoft’s Active Server Pages platform. But then, the stage was set for the stellar rise of PHP, a similar Web development environment whose most compelling attribute was that it was there when ASP died.
So, what about PHP? It seems easy to learn. Facebook and WordPress are based on it, so it has to be a good choice, right? Well, it does in fact have a tremendous Web-development-centric philosophy, and it is easy to set up, configure, and get your first sample code running. Problem solved, right? Well, for a certain class of problems, yes—like writing the original versions of Facebook. But when you have 500 million users, it turns out Facebook had to rewrite PHP itself. Twitter had a similar problem with Ruby on Rails and rewrote portions in Scala.
And it’s not just programming languages where you can go wrong. It’s programming style too. For example, even the object-oriented programming that has been preached as the gospel for so many years is being increasingly challenged as just one programming style that is not well-suited for every situation. It’s great if you’re programming a very large game requiring “emergent” behaviors. But if you’re just trying to automate some simple tasks in your life, you’re adding all this mental overhead to stuff that could otherwise be trivially easy in Python or other languages that don’t insist on object-oriented notation and all the mental gymnastics that come with it.
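To make that concrete, here’s a small Python sketch (file names and extensions are made up for illustration) of the same trivial automation task written both ways. The plain-function version does the job; the object-oriented version wraps the very same few lines of logic in a class, a constructor, and state that nothing else ever uses:

```python
import os

def rename_logs(folder, old_ext, new_ext):
    """Rename every file in `folder` ending in old_ext to new_ext."""
    for name in os.listdir(folder):
        if name.endswith(old_ext):
            base = name[:-len(old_ext)]
            os.rename(os.path.join(folder, name),
                      os.path.join(folder, base + new_ext))

# The object-oriented version of the exact same task, with the
# ceremony that OOP notation insists on:
class LogRenamer:
    def __init__(self, folder, old_ext, new_ext):
        self.folder = folder
        self.old_ext = old_ext
        self.new_ext = new_ext

    def run(self):
        for name in os.listdir(self.folder):
            if name.endswith(self.old_ext):
                base = name[:-len(self.old_ext)]
                os.rename(os.path.join(self.folder, name),
                          os.path.join(self.folder, base + self.new_ext))
```

For a one-off chore, `rename_logs(folder, ".txt", ".md")` says everything there is to say; the class buys you nothing until you genuinely need objects with long-lived state.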
Rather than facing the giant smorgasbord of technology choices head-on, it is better to look at what’s thought of as the technology “stack,” with hardware at the bottom and the apps you want to write at the top. In between are many layers put in to make it easier for you to write apps and move them between different pieces of hardware. Those layers consist of (among many other things) operating systems and a programming language’s execution environment.
From the hardware on up, each layer provides an additional level of abstraction, and a wider array of choices where you can go wrong. Abstraction just means simplifying and standardizing how things work together. If your application were written directly for a particular piece of hardware, you would have to rewrite it every time Intel put out a new chip or motherboard.
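Here’s a toy Python sketch of what that kind of abstraction buys you (every name here is hypothetical, purely for illustration). The “application” code talks only to one standard interface, so what sits underneath it can be swapped out without rewriting the app:

```python
import os

def app(save_bytes):
    """'Application' code: it knows the interface, not the backend."""
    save_bytes("report.txt", b"quarterly numbers")
    return "saved"

def make_disk_backend(folder):
    """One backend: writes real files under `folder`."""
    def save_bytes(name, data):
        with open(os.path.join(folder, name), "wb") as f:
            f.write(data)
    return save_bytes

def make_memory_backend(store):
    """Another backend with the same interface: just a dict."""
    def save_bytes(name, data):
        store[name] = data
    return save_bytes
```

The same `app` runs against either backend; that interchangeability, scaled up enormously, is what an operating system does for applications sitting above the hardware.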
That is in fact how things once were, in the days of a line of computers from the ’60s and ’70s called PDPs. One of the first great abstractions in the computer industry came from Ken Thompson and Dennis Ritchie in the form of Unix, an OS that could run on a broad array of hardware more easily than other OSes of the day. Therefore, you could write your apps for Unix rather than just the PDP-11 hardware, and share your code worldwide.
Unix represents something akin to the fundamental fabric of the information technology world, as opposed to a fad. Other operating systems come and go, but only Unix in its various forms survives and thrives. You could still get software written for DOS or Amiga or Windows 3.1 to run if you put in enough effort, but software written for Unix has a sort of timeless, forever quality to it. It could probably be compiled and run on any modern Macintosh with minimal effort. So, keep your stack short, and make sure Unix (or Linux) is in it.
The point here is that Unix is the one layer in the technology stack that should matter most to you as a budding technology powerhouse. Trust me, you WANT this timeless, momentum-sustaining juggernaut of an abstraction layer underpinning all other work you do. You’ll be able to swap out hardware, such as Intel, ARM, MIPS or Cell, and most of your code will still run with minimal effort. You could even swap out flavors of Unix, like Linux, Solaris, BSD or OS X, and your code will still run.
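As a small illustrative sketch (the function names are my own), here’s Python code that leans only on interfaces exposed the same way across Unix flavors, which is exactly why it runs unchanged on Linux, BSD, OS X or Solaris:

```python
import os
import platform

def describe_host():
    """Report OS name and machine architecture via portable calls,
    instead of poking at platform-specific files."""
    return platform.system(), platform.machine()

def home_config_path(app_name):
    """Build a per-user config path the portable way, rather than
    hard-coding /home/<user>, which differs across Unix flavors
    (e.g. /Users/<user> on OS X)."""
    return os.path.join(os.path.expanduser("~"), "." + app_name)
```

Swap the box underneath from an Intel Linux server to an ARM board or a Mac, and nothing here needs to change; that is the abstraction layer doing its job.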
This helps to illuminate all the other decisions you will make. For example, choose a programming language that has existed long-term and is well supported on Unix, and many language pitfalls are avoided. Focus on learning what the code execution context is for the languages you’re interested in, what it requires, and how you can recreate it.
That’s not to say that you have to become a Unix guru. But you should be able to install it (or Linux) on just about anything, and configure it to your requirements for the application you’re developing. Recreate it. Run commands. Edit text files. Learn your way around the Unix file structure. That’s often more important than coding. It sets you on the path to having some invaluable sysadmin, netadmin and devops skills.
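Those basics carry straight over into your programming language, too. Here’s a small Python sketch (the paths and names are illustrative, not prescriptive) of running a command and walking part of the file structure, the same way on any Unix flavor:

```python
import os
import subprocess

def run_command(args):
    """Run a command and capture its output, much like typing it
    at a Unix shell prompt."""
    result = subprocess.run(args, capture_output=True, text=True)
    return result.stdout

def list_tree(root, max_entries=10):
    """Walk a directory tree the way you'd explore /etc or /usr,
    stopping after max_entries files."""
    entries = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            entries.append(os.path.join(dirpath, name))
            if len(entries) >= max_entries:
                return entries
    return entries
```

Something like `run_command(["uname", "-a"])` or `list_tree("/etc")` works identically whether the Unix underneath is Linux, BSD or OS X, which is the whole point of learning your way around at this layer.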
You can get tutorials on getting started with programming anywhere. Once you’ve picked your language, you could pick up any number of books, or start down some online course. But I recommend taking a step backwards to focus on understanding all the various ways your code can actually get run: what makes that possible, and what the relative merits of the different approaches are.