We are currently living the most interesting computer history since the 80’s
by Mike Levin SEO & Datamaster, 07/26/2012
As a forty-something geezer, it’s self-evident to me that this current moment is the second-most-interesting point in the history of computers that I’ve lived through—or at least was conscious for. But I don’t know if this is so obvious to all you younguns, so I wrote this article to point out the awesomeness of what we are currently living through, from my perspective as an ex-Commodore employee and fan.
I’m typing this article on my Nexus 7 on the New York subway, in the Gmail app, because I haven’t installed a good cloud notes app yet—maybe Evernote. It really astounds me that an any-platform, offline-or-online notes app is still an unsolved problem. Apple comes close with its built-in Notes app, but it’s not web-editable, and of course there’s no Android version. I’m caught in the battle over ecosystems—but things haven’t been this cool since the late 80’s.
We are privileged to be part of the second personal computer renaissance. The first was in the 80’s, when Microsoft’s dominance wasn’t yet established and it could have been anyone’s game. I was an Amiga person back then, and nothing in the 25 intervening years has measured up… until recently. For me, I guess it started five years ago, standing in line for the first iPhone on launch day. Over the subsequent 5-year journey, I’ve felt the spirit of the Amiga recaptured (I know rabid Amiga fans would dispute this—but the Amiga was no less proprietary than the iPhone). The coolness is finally becoming much more cross-industry than just Apple, and I’m hopeful this computer renaissance won’t turn out as badly as the one from the 80’s.
I simply love Apple these days, and I started swapping out my “tech religion” when they performed the miracle of porting the entire Mac line to a new CPU architecture (PowerPC to Intel). Even Bill Gates marveled at Apple’s ability to pull that off. Now, I’m no fan of Intel, but that move demonstrated a level of savvy and strategic brilliance unlike anything I had seen since my Commodore days. And Apple one-upped anything Commodore ever did by pairing the Intel port with a move to sit the Mac on top of the most beloved OS among computer science and information technology people worldwide: Unix. Combine that with incorporating NVidia graphics into standard Macs to placate the gamer snobs, and finally getting the price down to be comparable to an equivalent-quality PC… well, I already called it a miracle.
Those 2006 Mac Intel changes were quickly followed by the iPhone in 2007, a wholly new platform. But a number of other things were going on that set the stage for today’s awesomeness. Circa 2006, the One Laptop Per Child initiative (OLPC) set $100 as the magical price-point for laptops in developing nations. Well, they fell short of that goal (landing closer to $200), but they started the ball rolling: circa 2007, Asus released the first $400 netbook, the Eee 700-series, to meet the worldwide hunger for such a device that the OLPC initiative had sparked. I bought one and still have it. I was feeling history play out. Things were getting cool then, but weren’t quite there yet.
In 2010, about three years after Asus altered worldwide expectations for a laptop price-point, Apple released the iPad and demonstrated to the world how, for just $100 more than the Asus Eee, you could get a much richer, more pleasant experience. The screen is not only much bigger and higher quality, it’s also a touchscreen. The graphics scroll smoother and are video-capable. Getting apps that work well is just a few clicks away in the App Store or iTunes. And the very OS is built to be usable in such a small form-factor, so you don’t have to cramp your hands on a chiclet keyboard or squint at a tiny screen. Apple created a market with a product they produced far less expensively than anyone else at the time could have, so they made a tidy margin too. I got two on the first day—one each for me and my wife.
Part of Apple’s secret capability came from owning nearly every step of the process, from the CPU to the circuit board to the OS to the touchscreen glass and casework. Granted, they used Samsung to fabricate the CPUs and Foxconn to make and assemble the parts, but Apple exercised near-total control, as if those companies were an extension of itself. It is a case of nearly total vertical integration that hasn’t been seen since the days of Commodore and MOS Technology, the chip design and fabrication company Commodore owned.
Left unchecked, Apple could have become the dominant supplier of electronic goods on the planet in a way that would have made Microsoft’s accomplishments look like child’s play. Without former Apple employee and Danger Sidekick co-founder Andy Rubin developing Android—which Google subsequently acquired and gave away free to mobile manufacturers—who knows whether anything could have competed with iOS devices. But it wasn’t just phones and tablets. Apple mounted an industry-changing assault on the laptop space too, with its MacBook Air.
The svelte form-factor, coupled with a superior OS and an okay price, threw the laptop industry into crisis. Everything else looked like a clunky brick in comparison, and no one could put out a credible contender. (I’m now up to three MacBook Airs.) If you combine MacBook Air sales with iPad sales, Apple became the #1 mobile-computer company on the planet, surpassing HP, Lenovo, Dell and all the other usual suspects. Because nobody else could bring a credible competing device to market, Intel had to release a product specification for PCs that could actually compete with the Air—the Ultrabook.
But there’s more remarkable stuff going on that led to me tapping out this article on the Nexus 7. Remember Apple teaching the world you don’t need Intel to have an awesome tablet experience? Remember Intel telling the world you don’t need an NVidia graphics card for an Ultrabook (game-capable graphics are built into Ivy Bridge processors)? What do you do if you’re NVidia, getting double-whammied? You make a system-on-a-chip, the Tegra, plus a whole reference design for the industry for a $200 tablet, Kai; you get Google to adopt it as their tablet product; and you get Asus, the guys who kicked off the netbook craze in the first place, to build it.
They hit a magical $200 price point at a magical 7-inch form-factor on a device with no cellular radio—a brilliant decision that keeps the carriers out of the picture and simply makes an appealing Android device available as a household’s second or third tablet, or possibly tenth Internet device. The Galaxy Note tempted me, but the Nexus 7 tipped me over the edge into becoming Android-literate and multi-lingual, on tablets at least, if not phones. But I’m sure I’ll be able to get around on an Android phone now too.
But I’m not only multilingual in OS X, iOS and Android. My main day-to-day computer runs Ubuntu Linux. The Web browser is the great equalizer, and any good OS these days just magically fades into the background and lets you focus on the apps. Ubuntu does that particularly well. I’m actually running Ubuntu on a Commodore 64x, a retro re-issue of the C64—so on my desk I’ve got an Apple and a Commodore (echoes of the 80’s). I’m able to embrace Android quickly because it too fades into the background once you load an app.
Also on my desk, begrudgingly, is a Windows 7 HP laptop that I keep docked for the occasional must-have-Office moment. But every time I go to it, I smell the desperation of a company that stole then botched windowing interfaces, stole then botched Web browsers, and is struggling with what to steal and botch next. And to add insult to injury, when I open a command line, it’s not the old, familiar Unix terminal but a non-standard, underpowered thing from which I can’t even run an ssh command to reach a Unix/Linux computer. Proprietary is only worth it if the experience is so much better than the free and open alternative that there’s really no choice—and it’s not. With how polished Ubuntu is now, I would never touch Windows again if I had the choice.
It is the 80’s all over again: this could be anyone’s game. But hopefully, unlike the 80’s, overpriced mediocrity won’t win out. I don’t suppose it can anymore, with the Internet showing everyone what’s what, and chip fabrication plants and the other components no longer being rare. Plus, the thing that makes this new generation of devices really special is high multimedia ability at a low price—and the people who loved the old Commodore Amiga loved it for its innovative custom graphics hardware, including the blitter. Today, the dark horse leading the charge with these devices (or at least the non-Apple charge) is NVidia, the graphics coprocessor company! Multimedia awesomeness assured!
This requires a touch of explanation. Old tower PCs were extremely modular, breaking functions out into discrete and often interchangeable components. That made them upgradable and spawned an awesome peripheral-expansion industry, but it sacrificed the elegance, the cost savings, and the enhanced “default” performance that come with tight integration. This was the state of things through most of the 90’s and early 2000’s.
But laptops were different. Each one was a work of art—not very expandable, but packing everything it needed into a tiny package. That made them more custom, more expensive and more disposable (non-upgradable). Unfortunately, the components crammed in there were the same old power-hungry schlock. But the Wintel consortium’s grip and the public’s sheepishness were such that Wintel tower PCs and laptops had a lock on the industry for a quarter century.
A series of events unravelled this. I already covered OLPC, Asus, Apple and NVidia. But you can’t forget how AMD took the wind out of the sails of Intel’s MHz/GHz arms race with low-power x86-compatible chips that many preferred for laptops. Also, a lot of the serious gamers driving the super-powered PC market switched to Xbox 360 and PS3 game consoles. And finally, Google started offering a free office suite, Google Docs, which solved one of the greatest pain points of MS Office—collaboration and zillions of versions of the same file.
So basically, all the killer apps driving traditional PC sales flocked to other, more appropriate platforms—the cloud, game consoles, tablets and phones. All that’s left for PCs is the vast infrastructure investment of companies around the world that locked themselves into a particular vendor’s solution and must continue with it, like it or not. This is similar to the IBM/Novell lock-in of big business in the 80’s, and the Microsoft lock-in amounts to just about the same thing today. Steve bet right. Bill bet wrong.
Once upon a time, business drove consumer devices, à la the PC compatible. You felt safe getting a PC at home because it’s what the office had, and you could bring pirated software home. Today, consumer devices are just so much cheaper and cooler, and apps are like $1 apiece in the App Store, or free in the Debian repository. And companies that were totally clueless in the 90’s and 2000’s have finally learned their lessons, forged unholy alliances (Google + Asus + NVidia), and are artistically knitting together the best possible products with chip-level integration to produce things like the $200 Google Nexus 7. That’s what makes what’s going on now as interesting as the 80’s.
And perhaps it’s appropriate to end this article with that certain intangible something that makes all the difference: all the genius little things, layered upon each other, that make a wholly remarkable device. Despite my infatuation with the low cost and 7” form-factor of the Nexus 7, I had to write this article in Gmail because the device has no built-in notes app. I couldn’t switch to Google Drive, because I couldn’t edit offline. I even tried installing Evernote as the guaranteed fix to the offline/online cloud-document problem—but the Android version of Evernote sucks, only letting you edit a paragraph in a strange user interface!
Ugh! So what I ended up doing was copying the draft of this document out of Gmail on my iPhone and pasting it into Notes on my iPhone—a comfortable, always-there writing environment I’ve been getting used to for the past five years, right back to iPhone day one—knowing that when I got into work it would be sitting in the Notes folder of the Mail program on my Mac… but it wasn’t! I had upgraded to Mountain Lion yesterday, and so there was an unfamiliar Notes icon in my Mac’s Launcher bar. I opened it, and voilà! There was my article, right up to the latest word. Apple still has the best, most seamless cloud writing arrangement. I just wish you could use it through any old browser, and on Android.
I guess I’ll stop this article here. This is the stuff that excites me, and I could go on forever. Suffice it to say, we live in exciting computing times—times the OLPC triggered in 2006 and which, after six years, have acquired a momentum the likes of which I have not seen since the 80’s. There are no take-backs now, and things are about to change even more dramatically, ending 25 years of doldrums. I think it’s important to have a sense of the history that’s playing out. When you know the deeply rooted reasons for things, it helps you become multi-lingual technology-wise and focus on the bits that transcend platforms and time, so that your skill-set always remains relevant and valuable. And hopefully, things will get as fun and interesting as they were during the 80’s. The deck is being reshuffled, and it could once again be anyone’s game.