Setting the Stage: Context for the Curious Book Reader
This journal entry is a raw, stream-of-consciousness exploration of a foundational concept for the book: the idea of technological ‘bedrock.’ As I grappled with the rapid pace of change, particularly with AI, I found myself returning to the enduring principles of the Unix/Linux world. This piece connects the competitive need for process acceleration with the stability of mastering core, timeless skills. It’s a dive into a bit of tech history, the powerful 80/20 rule for learning, and an argument for why humans with deep technical understanding will remain essential. This is the ‘why’ behind the ‘what’ of the skills I believe are truly future-proof.
Acceleration is key. Things are getting so competitive out there that if the things you’re doing aren’t also somehow accelerating your overall process with some sort of self-improving iterative loop, you’re going to fall behind the competitors that are doing exactly that. Processes are being optimized.
The Acceleration Imperative
I wrote the last article on pivoting, just as I myself am pivoting on how I push these daily technical journal articles out to my website as the fodder for future AI processes to scrape and shape into countless versions of the book: “Future-proofing Your Tech Skills In the Age of AI”.
It seems an impossible task, but it is not, because information technology has a sort of bedrock that isn’t changing fast. Oh, just as bedrock actually floats on tectonic plates, and just as nobody really believed in plate tectonics (it was ridiculed until it wasn’t), so too does tech drift glacially, slowly enough that your hard-won muscle memory and spontaneous mastery of your tools doesn’t have to be pulled out from under you every 2 to 5 years, as it seems to be today… even faster now, because AI.
Finding Bedrock in the Tech Quake
That Bedrock is the Unix-like Linux terminal on which everything meaningful is built today, according to the 80/20-rule. Oh there’s plenty that’s not built on Linux, but you get 80% of the benefit by focusing on the first 20% of the stuff you should learn. And within Linux there’s a lifetime of learning, so once again we apply the 80/20-rule so that you only focus on the 20% of what you need to learn about Linux to get 80% of the benefit.
A Brief, Tangled History of Modern Computing
This is true because we are not living in a LISP-hardware world running LISP operating systems, in which LISP programs blend together with LISP databases and LISP file systems, blurring the lines between every best-practice separation of concerns we know today, thumbing its nose at the John von Neumann computer architecture, and giving every AI perfectly persistent memory evolving since the 60s, back when we already had neural networks in the Mark I Perceptron and the tech was ready. The AI Winter didn’t have to happen, but because it did, an opening was made for the static world, one with its own also very substantial set of benefits, but quite different from the bizarro alternate reality where LISP won. We live in a Unix world, relabeled Linux.
Long story short, compiling stuff specifically for the hardware you’re working on, so you get a nifty compatibility layer and lots of ability to port your software between different hardware platforms, is neat too. That’s what UNIX allowed. UNIX was all upper-case in those days because it was a competitor to MULTICS before…
Well, think of it as the player-piano aficionado and rebel Ken Thompson flying Red Team into the air-ducts of the AT&T cheese-grater Deathstar, blowing it up and leaking the plans to his alma mater Berkeley so it was functionally public domain (though SCO didn’t think so), then eventually Linus Torvalds reverse-engineers the thing, slaps on the also reverse-engineered GNU (which stands for “GNU’s Not Unix”) command-set, and bippity boppity boo… the Web.
Something like that, at least. Stuff that inherently spreads helps stuff that inherently spreads. I think I’m skipping over MINIX, the TCP/IP stack and the BIND DNS system, and I’m not giving BSD its due. Certainly the GNU General Public License (GPL) plays in there too. But all of this came together to make something with a kind of grassroots groundswell force. Small-time players with the motivation and webmastering vibe could take up the LAMP stack and build an empire. And many did. And all of this together is a decent consolation prize for the high-tech future that was promised in the 1950s but never delivered, because LISP went away… and because nobody layered the recognizers in neural nets.
Sooner or later Moore’s Law would have made even the single-layer neural network bottlenecks workable under software simulation. But parallel processing happened because John Carmack wrote DOOM for PCs, and almost a decade later, in the late 90s, it was all about ATI Radeon vs. Nvidia GeForce to make the descendants of DOOM run well. Then around 2013 Demis Hassabis’s DeepMind trained a computer to master Atari games, Google in 2017 told us that Attention Is All You Need with Transformers, OpenAI took the ball and ran with the “T” of Transformers, giving us ChatGPT at the end of 2022, and here we are.
The Human Mandate in an AI World
Everything was getting optimized before. Optimization is one of the big tricks of the Industrial Revolution, with business-process geniuses like W. Edwards Deming telling us about overall equipment effectiveness (OEE) to tweak out the effectiveness of assembly lines, and total quality management (TQM) to actually listen to the people on those assembly lines and plow all the learnings from the trenches back into process and quality product. And BAM! Japan goes from a reputation for low quality to one for high quality, Harley-Davidson makes a comeback, and now we can pretty much peak-efficiency anything. Now, in the age of AI, we can go even further and figure out what to peak-efficiency that we didn’t think of before, where everybody is doing something wrong.
Today, the thing everybody is doing wrong is focusing on tech that is not bedrock as the place where they develop their skills, at least if they want to remain technical in nature and not constantly go obsolete. I think a lot of people have this idea that all technical skills are becoming obsolete because AI, and that everyone should deliver pizzas, go into plumbing, or at most master the soft skills for prompting AIs to do all the technical stuff for them. The reason this isn’t true is that AI isn’t 100% reliable, but non-AI tech stuff (which you can use AI to help you build) can be.
Tools that make tools that make tools.
At the highest level you have humans and AIs collaborating, because for a good long while the humans are going to be in charge as the bosses. It’s not all humans being bosses, though. It’s a small handful of humans who can work at that boss-of-AI level who hold those jobs: systems analysts, domain experts and the like. If you’re both a systems analyst and also a domain expert (in some area other than systems analysis), all the better.
Jensen Huang talks about this. Go into biology. Go into anthropology. Robots are not going to single-handedly be doing the biology experiments or the anthropological excavations. Humans are going to get their hands dirty, especially where high levels of oversight by something with strong human sensibilities are required. So sure, become expert in something like that. But then also learn Python, no matter how much Jensen Huang says that’s silly. If you really parse the interviews with him on this topic, it always comes out in the wash that hand-waving away the need to be technical is not actually true. Somebody’s got to “own” the code the AI makes.
Somebody has to internalize it as if they wrote the code themselves. Even if there’s more complexity than a human could ever understand, that complexity needs to be properly packaged up into building blocks and rigorously tested. Even if rigorously testing means envisioning a testing framework and having an AI step in to help with the actual tedium of the testing. This reasoning goes on and on as a regressive argument. It never stops at “just trust the machines”. It always has a human as the last step in the regression. Be that human.
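To make “be that human” a little more concrete, here is a minimal sketch in the pytest style: a toy `slugify()` standing in for code an AI might have drafted, with human-owned tests pinning down the behavior the human actually signs off on. The function and its rules are hypothetical illustrations, not anything from a real project.

```python
# test_slug.py -- a human-owned contract around code an AI might have drafted.
# The slugify() below is a toy stand-in; in practice the implementation might
# be AI-generated, but the tests are the part the human internalizes and owns.
import re


def slugify(title: str) -> str:
    """Lowercase a title, drop punctuation, and join the words with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)


def test_basic_title():
    # The human's expectation, stated plainly, regardless of who wrote slugify().
    assert slugify("Future-Proofing Your Tech Skills") == "future-proofing-your-tech-skills"


def test_whitespace_is_collapsed():
    assert slugify("  spaces   everywhere  ") == "spaces-everywhere"
```

Run it with `pytest test_slug.py`; if the implementation ever gets regenerated, the tests keep holding it accountable to the human’s understanding.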
The Modern Technologist’s Toolkit: Python, Rust, and the 80/20 Rule
And it doesn’t have to be god-awful stuff, either. This is perfectly love-worthy technical subject-matter. Old-school Unix, packaged up as Linux today as the bedrock of tech, can actually be quite interesting. Using that as the platform to run Python on is also interesting, as is managing Python code as the perfect Lego building-block glue language that ties everything together. Being expressive in Python is just as poetic and lovely as being expressive in any spoken or written human language. In fact, because of its legacy as a descendant of ABC, a language designed for teaching, it’s very much geared for beginner expressiveness. And because of the innovations of Guido van Rossum, the creator of Python who took the ABC language and ran with it, it takes you almost as far as you want to go.
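To give a taste of that glue-language expressiveness, here is a minimal sketch along the lines of this very workflow: gathering a folder of daily journal entries and pulling out their first-line titles. The `journal/` directory and the “first line is the title” convention are assumptions for illustration only.

```python
# glue.py -- Python as the Lego glue between a folder of text and a table of contents.
# The journal/ directory and the "first line is the title" convention are assumed
# purely for illustration.
from pathlib import Path


def collect_titles(folder: str = "journal") -> list[tuple[str, str]]:
    """Return (filename, title) pairs for every Markdown file in the folder."""
    pairs = []
    for path in sorted(Path(folder).glob("*.md")):
        lines = path.read_text(encoding="utf-8").splitlines()
        title = lines[0].lstrip("# ").strip() if lines else "(untitled)"
        pairs.append((path.name, title))
    return pairs


if __name__ == "__main__":
    for name, title in collect_titles():
        print(f"{name}: {title}")
```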
I say Python only almost takes you as far as you want to go because it’s another 80/20-rule language, built for the most common use cases. It helps you do most of the things you want to do with the least effort possible, until you come to edge cases. Oh, Python handles plenty of edge cases just fine, but sometimes it doesn’t, and that’s where whatever other language you choose comes in. There’s a huge trend right now to rewrite stuff in the Rust language for no particular reason except that you can get 10x or 100x or 1000x better performance out of Rust than you can from Python in those particular edge cases.
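A hedged sketch of how you find out you have actually hit one of those edge cases: measure first, then decide whether the hot 20% deserves a compiled rewrite. The workload below is an arbitrary stand-in and the numbers will vary by machine; the point is the habit, not the benchmark.

```python
# hotspot.py -- check whether you are really in Rust territory before rewriting.
# The workload is an arbitrary stand-in for "tight numeric loop Python is slow at."
import time


def pure_python_work(n: int) -> int:
    """A deliberately naive hot loop: sum of squares."""
    total = 0
    for i in range(n):
        total += i * i
    return total


if __name__ == "__main__":
    start = time.perf_counter()
    pure_python_work(10_000_000)
    elapsed = time.perf_counter() - start
    print(f"pure Python: {elapsed:.2f}s")
    # If a profile shows a loop like this dominating your runtime, that is the
    # edge case where a Rust (or C, or Go) extension can pay for itself;
    # everywhere else, plain Python is the "good enough" that wins on effort.
```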
So why not write everything in Rust? Because 80/20 again. Good enough is good enough is good enough… until it’s not. You have to get past those first two good-enoughs before reaching the third, at which time you choose between Rust, Google’s Go language (written in part by Ken Thompson) and some variation of the C programming language (written by Dennis Ritchie so Ken Thompson could rewrite Unix in it).
I began this article with the concept of accelerating. Ken threw a lot of the accelerant onto the flames of tech. LISP stalled out. The optimizations of Ken’s stuff made Sun Microsystems and Silicon Graphics (SGI) workstations the ones used to render Jurassic Park dinosaurs and the like. And those were both Unix. Oh, there was LISP hardware at the time from companies like Symbolics and TI, but the cost/performance factor kicked in, then Linux kicked in, and game over. Unix/Linux has some sort of accelerant magic to it. The UNIX-HATERS Handbook called it a virus, and it’s true in all the good ways. That became the bedrock of tech.
Part of this acceleration trick that’s very tied to Unix is feeding the output of one process (aka program) into the input of another process (aka program). If you want to accelerate, you master this trick of a well-planned pipeline so that each step in your process (each program) can be individually examined and optimized. It’s all about containerization, program boundaries, roles, responsibilities and interfaces. You design it all well so that batons can be passed down the chain, and you get lots of transparency, which is the ability to examine what’s coming out of the thing on the left and going into the thing on the right, so you can hold a strong mental model of it all. Having that strong mental model of each component gives you the ability to iteratively improve each step… optimizing… accelerating.
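To make the baton-passing concrete, here is a minimal sketch of the same trick driven from Python, wiring three ordinary programs together the way the shell’s `|` does. The specific commands (counting Markdown files in the current directory) are just an illustrative assumption.

```python
# pipeline.py -- the Unix trick in miniature: stdout of one process feeds stdin of the next.
# Shell equivalent (illustrative): ls -1 | grep '\.md$' | wc -l
import subprocess

ls = subprocess.Popen(["ls", "-1"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", r"\.md$"], stdin=ls.stdout, stdout=subprocess.PIPE)
wc = subprocess.Popen(["wc", "-l"], stdin=grep.stdout, stdout=subprocess.PIPE)

# Close our copies of the upstream pipes so each program sees end-of-stream correctly.
ls.stdout.close()
grep.stdout.close()

count, _ = wc.communicate()
print(f"Markdown files seen by the pipeline: {count.decode().strip()}")
```

Each stage can be swapped out or inspected on its own, which is exactly the transparency and iterative improvement described above.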
Book Analysis
AI Editorial Take
This entry is a potent blend of manifesto and history lesson. Its core value lies in providing a strong antidote to the prevalent anxiety about AI obsoleting technical roles. By grounding its argument in the enduring, 50-year-old philosophy of Unix, it makes a compelling case for focusing on fundamentals over fleeting trends. The author’s personal, energetic voice is a major asset. The piece successfully argues that true acceleration comes not from frantic tool-hopping, but from mastering a stable, powerful base. This is a cornerstone chapter for anyone building a technical career today.
Title Brainstorm
- Title Option: The Bedrock of Acceleration: Unix Philosophy in the AI Era
  - Filename: unix-bedrock-acceleration-ai.md
  - Rationale: Directly connects the article’s two central metaphors, ‘bedrock’ for stability and ‘acceleration’ for progress, to the modern context of AI. It’s descriptive and captures the core thesis.
- Title Option: Don’t Fear the Prompt: Foundational Tech in a World of AI
  - Filename: foundational-tech-in-ai-world.md
  - Rationale: Addresses the underlying anxiety about AI making tech skills obsolete and positions the article’s argument as the reassuring answer. It’s engaging and problem-oriented.
- Title Option: The 80/20 Rule for Future-Proofing Your Skills
  - Filename: 80-20-future-proofing.md
  - Rationale: Focuses on the practical heuristic that runs through the entire piece. It appeals to readers looking for an actionable strategy.
- Title Option: Ken Thompson’s Ghost: How Unix Still Shapes Everything
  - Filename: ken-thompsons-ghost-unix.md
  - Rationale: A more narrative and evocative title that leans into the historical anecdotes. It creates intrigue and highlights the enduring legacy of Unix.
Content Potential And Polish
- Core Strengths:
  - The central ‘bedrock’ metaphor is powerful and provides a strong, consistent anchor for the entire argument.
  - The passionate and slightly irreverent voice makes a potentially dry topic engaging and personal.
  - Effectively weaves together high-level philosophy, tech history, and practical advice (e.g., focus on Linux, Python, Rust).
  - The repeated application of the 80/20 rule provides a memorable and useful heuristic for the reader.
- Suggestions For Polish:
  - The historical narrative, while charming, is sprawling. It could be tightened to more clearly and quickly connect the dots between Unix’s origins and today’s AI landscape.
  - The transition from the history of gaming and GPUs to Deming and TQM is a bit abrupt. A smoother bridge could strengthen the connection between hardware acceleration and process optimization.
  - Clarify the final point about the Unix pipeline. Explicitly state that this philosophy of small, interoperable tools is the practical application of the ‘bedrock’ principle and the key to acceleration.
Next Step Prompts
- Expand the final paragraph into a practical tutorial titled ‘Thinking in Pipelines.’ Provide three concrete examples of command-line pipelines that solve a common developer problem, and explain how the same modular, input-output thinking applies to designing modern microservices or data processing workflows.
- Based on the historical anecdotes, write a short, dramatic narrative of the ‘Unix vs. LISP’ timeline divergence, personifying the technologies and key figures to create a compelling ‘What If?’ scenario for the book’s introduction to Part II.