Getting started in the morning. Coffee time. Go to system. Organize the 7 virtual desktops for a fresh start. Screen 1, fullscreen Terminal. New tech journal entry. Home.
New factory class instantiation. The pattern gets reflected in many things in life and in coding. That's why this object-oriented programming methodology has become so popular.
Use some factory base class to generate a new instance with some light boilerplate templating. Or simply copy/paste some starting point, but keep it minimal to reduce the complexity and mysteries of base class inheritance, such as with a new journal entry, where the boilerplate is a minimal time/datestamp.
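As a minimal sketch of that idea (the function name and format here are purely illustrative, not any particular system's API), the whole "factory" for a journal entry can be a single function that stamps the date and time:

```python
from datetime import datetime

# A hedged, illustrative sketch: the "boilerplate" for a new journal entry
# is nothing more than a fresh header stamped with the current date/time.
def new_journal_entry(title="Morning pages"):
    stamp = datetime.now().strftime("%Y-%m-%d %H:%M")
    return f"{stamp} :: {title}\n\n"

print(new_journal_entry(), end="")
```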
Alternatively, in no-code visual situations, it’s like a new page in an artist’s notebook — a true “blank page” for drawing. Starting with a blank page or a fresh slate, as the saying goes. And in this case, the movements you make to express ideas onto the page are more visual, abstract, symbolic gestures, hand-waving-like movements reminiscent of no-code visual interfaces.
But the abstraction split between writing and drawing isn’t just the difference between the written-word incantations of typing at the keyboard versus the more wavy and stroking movements of drawing. There are also menus to deal with. The visual point-and-click interface has been layered into most typing software like Word and Docs so that you have to take your hand away from the keyboard and work the menus — except of course in the case of vim users.
In either case, the above scenarios reflect the two great splits of getting started fresh on expressing yourself onto media, and more specifically today, into machines. And in today’s world, thinking and expressing yourself this way is much the same as doing. Or at least, it’s setting the context because it can become part of a prompt.
And so, journaling like this can become part of a super-prompt, if you will. Machines will be able to follow along. They will be able to “get you”. There is almost nothing too abstract for them to at least absorb and respond to in some way, some way which can somehow help guide and direct and focus you, helping you improve your work.
The advent of writing, books and the printing press brought about literacy, the encoding, storing, sharing and generational transmission of such expressions as I just described. That led to nomadic and farming life transitioning into the industrial revolution and the explosion of technology, because this process has a strange feedback loop that causes exponential progress. The printing press makes way for the computer, and the industrial age becomes the information age.
The fact that there's a literacy discussion here is important. There's a whole Donald Knuth point about programming literacy here, where the code and the documentation are mixed in-context. In his 1992 book, Literate Programming, he talks about making code that's as important to make readable by humans as it is runnable by machines, and in the age of AI that point becomes profoundly important, because that readable context is exactly what the AI code assistants feed on. Yes, it laid a lot of the thought groundwork for things like Jupyter Notebooks, which literally mix the documentation and the code for exactly this reason, for the student, evolving out of proprietary versions like Mathematica and MATLAB, which had $1000/year-ish licensing fees if you weren't a student. And so, Jupyter Notebooks.
And a new level of, and definition for, literacy: programming literacy. A distinction should not need to be made, but because of human nature, it always will have to be. We talk about spoken-word literacy and math literacy as two separate things, though they should not be. STEM, which stands for Science, Technology, Engineering, and Mathematics, would be much better off if that line were blurred. But I guess because the language of math is so different from the spoken word, and the abstractions are so much more of an obstacle to overcome, people can make the excuse "I am not a math person" and get through life only half-literate.
The times are changing, because that other half of literacy, now programming, is much more like the English spoken word than math. You can be allergic to math and still be a fine programmer. Logical thinking bridges the gap. Programming has a sort of consistent syntax and system that math doesn't. The meaning of mathematical symbols just floats too readily without a library import system. So dx can mean d times x if you're discussing algebra, with multiplication implied between two adjacent variables, or it can mean the derivative of x if you're speaking calculus. The uninitiated would never know, because there's no `import algebra` or `import calculus`. Math is full of relative but broken references. Not so with programming, or the program won't run.
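To make that concrete with a hedged little example (using the third-party SymPy library purely for illustration): in Python the ambiguity disappears, because each meaning has to be imported and named explicitly.

```python
import sympy

# Algebra: d times x is just multiplication, and you have to write it out.
d, x = 2.0, 3.0
print(d * x)                    # 6.0

# Calculus: the derivative of x only exists once you import something defining it.
x = sympy.symbols("x")          # rebinding x as a symbolic variable
print(sympy.diff(x**2, x))      # 2*x
```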
And so programming literacy is easier than math literacy. And just in time for not needing to know how to program anymore, because AI, haha! No, not really. I don't believe that. All that talk about AI playing the role of a junior or mid-level programmer, so now companies have to hire fewer programmers, is not the story of programming going away. It's the story of fewer people working better and smarter because the tools are improving. And so I hear that and I think: Great! The tools are getting better! This lowers the barriers to entry just as much as it allows fewer people to do more.
And what's better, all that programming literacy of Don Knuth's teachings, Mathematica, MATLAB and now Jupyter Notebooks is context for the AI coding assistants. It helps you understand. It helps them understand. When you merely write things down, the knowing of things blends with the doing of things, because not only might it be executable code, it might be well-documented executable code from which the coding assistants can suggest more and better executable code. And the world is affected. Automations are improved. And so now the information becomes alive.
Literacy is to the printing press what the means of production is to the computer. We all are potential information-to-productivity factories. Writing something out may in fact be the very code that executes and controls a machine like a player piano or a mechanical loom. That's because the precise details of how you encode a thought onto a medium can have special play-back significance to a machine whose parts are going to act upon those locations on that medium.
Those are the basics of a Turing machine, or any device that can compute. Babbage called his a difference engine. Anything with kinetic potential on one side and some sort of minimally 2-state switch in the middle, whose state can change with cascading or chain-reaction effects, will do. This includes water canals, pipes, vacuum tubes, transistors; almost any switch. Hook up enough of them and pour in enough input, and the device seems to become alive.
And so scribes, who made way for the printing press, which made way for computers, now make way for computers that can seemingly talk back to you and teach you how to use them. Given enough switches and relays and transistors all hooked together, with enough of the right information and ideas encoded into them and pre-processed in just such a way, the machines themselves ostensibly become intelligent, with the ability to talk and draw and otherwise express the abstractions that were put into them back at you in endlessly emergent, remixed new ways.
It's still much like a player piano or mechanical loom. Or perhaps like a wind-up music box. A much under-exposed early mechanical-man concept, one of the early robots in literature, Tik-Tok from L. Frank Baum's Wizard of Oz series expressed this so well: in its back-story, in its imagined behavior, and in the fact that Dorothy had to perpetually keep winding him up.
That's like what we have today. These music boxes we call LLMs and diffusion art generators are indeed deterministic machines that you wind up, and they play back. Before software, such machines played back the same every time. The same raised pins on the gradually spinning cylinder get plucked by the same metal chime-brushes every time. But give the machine the ability to load in different cylinders, and you get different music. Make a cylinder whose bump pins can be re-raised in different ways each time, and you've got software.
But the music that comes out of the LLMs and diffusion art generators of today seems different every time. It's even become moving video, where whatever's generating the images seems to have enough of an understanding of the world, including physics and the nature of living things, to predict and simulate their behaviors and interactions. Is that all pre-determined based on input too? Yup.
It's just that as you load in more and more switches and relays and transistors, connecting them up in clever ways that layer Turing machine on top of Turing machine, creating ever more clever ways to route the output-flow of one process into the input intake of another process, you create these strange loops. You create recursive feedback loops, or at least highly iterative ones. You create cascading chain reactions of information-flow that wipe away all apparent semblance of predictability, and you get what we call emergent properties. Things appear to become alive. Stuff starts to quack like a duck.
That's just how life is. That's just how reality works. Anyone who's spent enough time watching strange loops of recursion in anything from video feedback to huge network force-graphs to chemical reactions has probably seen this effect. The patterns we see in the behavior of bacteria and things like slime mold are seen in chemical pattern effects and crystallization. It's life-like spontaneous order out of seeming randomness.
There's a sort of vibrating edge stability of shape and form there that seems totally unnatural, but which totally is natural under the right conditions: stable instability. The Lorenz attractor, which you may have seen (or should) as that never-ending butterfly particle animation, is one small set of equations that demonstrates chaotic behavior coming across as both orderly and unpredictable. This is everywhere.
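For the curious, here's a hedged little sketch of that butterfly (a naive Euler integration of the Lorenz equations with the classic parameter values), just to show how few moving parts produce the endless, never-repeating orbit:

```python
# A hedged sketch: naive Euler integration of the Lorenz system, the little
# set of equations behind that butterfly-shaped "stable instability".
def lorenz_points(n=10_000, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = 1.0, 1.0, 1.0
    points = []
    for _ in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        points.append((x, y, z))
    return points

trajectory = lorenz_points()
print(trajectory[-1])   # never settles, never repeats, never flies off to infinity
```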
It's certainly in our complex machines now, but you still have to wind them up. Not a lot of people are willing to take that step of creating the self-winding mechanical man, both due to the heebie-jeebies ethics of it all, and due to the energy cost. There's really a double-whammy right now against running such experiments. They're not going to be stable (i.e., the output's going to get weird) and they're expensive. Letting the genie out of the bottle, or opening Pandora's box, costs an awful lot. They can't just self-ignite and self-fuel like a California fire fed by an unlimited supply of fuel and fanned by the Santa Ana winds: ideal conditions providing both the computing substrate and the winding.
Yes, fire has much of the same complex Turing machinery as these other computing machines. Whenever discussions of “what is alive” and “what is life” come up, fires (and crystals) are often considered. The energy potential differences are there, just not the controlled flow through some switchboard to make it productive. Though some combustion-engine-powered machine might argue otherwise. They don't have to be wound quite so much, because you just fill 'em with fuel. That's why once we have actual robots like the Wizard of Oz's Tik-Tok, there's going to be a lot of talk of nuclear batteries. They're going to need enormously high energy-density power sources to walk around with.
And so go my morning coffee thoughts. I got a decent night's sleep after a 40-hour stretch of being awake making a Hail Mary attempt at finishing my web framework workflow pipeline system, only to introduce more bugs and undermine the video I was supposed to have produced yesterday to show off the ideas. Ugh! Such is my pattern of over-reaching and under-delivering. But that's where this third evolution of information tech comes into play. It's a tool, an extension of the body, that lifts us from literacy, to having the means of production, to being taught how to better produce. And that wraps it all together.
I’m in the process of learning how to use these tools of LLMs and AI to better produce. I could produce before. I am exactly the type to take up a book, read it through, and turn around and systematically apply what I’ve learned. I’m just not the type to do that very well, because something gets lost in production between the author of the book’s head and my head. The head-to-head transmission of ideas has challenges. I’m not the greatest receiver-device for books. Nor for videos. But for something sitting (or standing) there with me interacting, answering my questions, letting me rapidly try and test and iterate — well, now you’re getting somewhere. Well, now I’m getting somewhere.
And get somewhere I did. Even the system in its current broken state is better thought out, has more solid fundamental underpinnings, is based on fewer and better parts, and has more and better upgradability than anything I've ever made before, and I have really made quite a few things along these lines. This is not my first web framework rodeo, but it's shaping up to be my best, and it's a perfect-storm environment. There is endless fuel in the form of websites that need to be rethought and reshaped for the age of AI, and there's… what?
There's this. There's this journal and my endless ability to express myself, as a way of both forging my way towards this future and as super-prompt fodder for AIs themselves to absorb, comment on, and indeed help guide me with my next step. I could at this point feed this article into one (or multiple) AIs and do the cute thing of saying “Isn't that right, Gemini?” or o1 Pro or Claude or whatever. But I will resist that cute thing in favor of capturing the important where-I'm-at ideas of today. Of this morning, before the client-comes-first, jump-monkey-how-high reality kicks in.
This journal is an expression of autonomy and agency. It's balking against the concept of the click-whirrr determinism I've described about wind-up machines, and which many of us, Sabine Hossenfelder top among them, think also describes our world and our illusion of free will. If our wills are free, then we can forge different paths for ourselves than the preponderance of preconditions and prevailing mathematically calculable stuff around us would suggest. We can buck the trends and override the program.
And so, let’s capture those system thoughts. It goes something like this.
I have very strong core notions about what constitutes my system — how it operates, the way it behaves, and the way people interact with it. It's a particular invention or device which happens to manifest as a running program on a computer. But all things are like this. Any Turing-complete machine can do anything, including simulate reality itself, but that doesn't invalidate the importance of the machines and inventions you make within such a system.
Your own new take on things is actually important — maybe the most important thing. Originality and patterns that never occurred exactly like that before, but which are still useful, are the new currency. That's how you create new value out of nothingness. And so subtlety and nuance count, especially in new ways of encoding and expressing ideas: designing and using new APIs. The creation and creative use of new tools that alter and improve the way we in turn interact with our world. Inserting yourself and your inventions into the very strange loop that accelerates technological and human progress.
Yeah, so my machine notions are lofty, and they're just taking Unix pipes and Jupyter Notebooks to the next level by expressing them on rails in a new, more tightly controlled and newb-accessible environment: web apps. But those web apps are lower-case web and not upper-case Web. While they could be hosted behind a good-enough login on a dedicated-enough virtual server, they're designed right now to run off localhost. I am using the Apple Bonjour-like Avahi to make sure that it's not just localhost, and that these apps answer to mDNS-provided `.local` addresses as well. So what happens on localhost stays on localhost, or wherever you happen to be hosting your `something.local` instance, be it even on the Web under the right hosting conditions.
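As a hedged sketch of what that looks like in practice (assuming FastHTML's fast_app helper and an Avahi/Bonjour-advertised hostname; the port and route are arbitrary), binding to 0.0.0.0 is the one concession that lets the same app answer at both http://localhost:5001 and http://yourhostname.local:5001:

```python
import uvicorn
from fasthtml.common import fast_app, Titled, P

# A minimal FastHTML app; fast_app() returns the ASGI app plus a route helper.
app, rt = fast_app()

@rt("/")
def get():
    return Titled("Local-first", P("What happens on localhost stays on localhost."))

if __name__ == "__main__":
    # 0.0.0.0 binds all interfaces, so Avahi/Bonjour .local names reach it too.
    uvicorn.run(app, host="0.0.0.0", port=5001)
```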
Okay, and with that one little concession and the shift in developer concerns it enables, one great big magic trick gets performed by another of my favorite components that I'm also using: HTMX. See, htmx frees the browser to perform at post-ReactJS-era performance levels by sprinkling in just the right amount of JavaScript to remove the HTML specification's seemingly inherent limitations. Carson Gross, the creator of HTMX, can describe it better, but it basically means anything in the web browser can be updated by anything else in the web browser, or by your own server-side code, which ends up in faster and smoother communication with all the parts of your browser than you may have previously thought possible. It's a shift back to simpler, more timeless, old-school client/server behavior that stays close to the HTML spec, and a shift away from beefy client-side JavaScript complexity that relies on brittle, fad- and trend-laden components.
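Here's a hedged FastHTML-flavored sketch of that trick (the route style and serve() call follow FastHTML's documented idioms as I understand them; treat the details as illustrative): the button carries hx-get/hx-target attributes, and the server answers with a fragment that HTMX swaps into the page, no hand-written JavaScript required.

```python
from datetime import datetime
from fasthtml.common import fast_app, serve, Div, Button, P

app, rt = fast_app()

@rt("/")
def get():
    # hx_get/hx_target become the hx-get/hx-target attributes HTMX watches for.
    return Div(
        Button("What time is it?", hx_get="/time", hx_target="#output"),
        Div(id="output"),
    )

@rt("/time")
def get():  # noqa: F811 -- FastHTML keys the HTTP verb off the function name
    # Returning an FT fragment; HTMX swaps it into the #output div.
    return P(datetime.now().strftime("%H:%M:%S"))

serve()
```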
You are freed on the server to do what you like, with the full resources of your local machine, from hard drive space to processing power including whatever GPUs and gaming cards you might have, to your already paid-for and too-expensive home broadband. You have the equivalent of yesterday’s supercomputers in your house several times over, and sometimes on your old, abandoned laptops (of which there’s about to be a lot more after the Windows 11 forced upgrade cycle is over). Yes, a lot of your very powerful and perfectly capable hardware is about to be rendered obsolete because it can’t run Windows 11. Been looking for something to do with it and the perfect timing to become more technical in a gentle on-ramp sort of way? This is it!
Why? Because Python wouldn't be as popular as it is, and still growing, if it weren't somehow superior to JavaScript. Think about it. JavaScript should hands-down rule the world by now. Between being the only native language officially supported by browsers, the only one (monopolistically) sanctioned by the consortium of browser vendors, and the rise of WebAssembly (Wasm), which brings compiled-C native-code performance to the browser, and NodeJS on the server completing the picture of full-stack, one-language-to-rule-them-all dominance… well, there's no choice, really. And there's no chance or room for any other language. JavaScript wins. Or it should have, by all rational reasoning and accounting.
But it hasn't. Python is continuing its steady, linear growth in popularity with no sign of leveling off. With the rise of world population slowing, and the rise of the popularity of Python continuing, do the math. The lines cross over. They don't with JavaScript or any other programming language. The DOD is warning against using C, C++ and all those other compiled languages that allow stack overflows and other exploitable memory-safety errors. Rust is picking up those crumbs, for people both technical enough and so inclined as to work at the low-abstraction, hyper-optimized level. The rest of us will speak English, thank-you-very-much. And English in tech, right now and for the foreseeable future, means Python.
And the final nail in the coffin of any other language, as I drive this point home, is that Python rules on Linux distros. It's the lingua franca, the de facto standard included instead of Perl on almost every Linux distro, and sometimes several times over in different isolated sub-components. And while JavaScript dominates in the browser, being the dominant default language of system administration, devops tools and API integrations on the dominant OS is nice too. Anyone spinning a new Linux distro, except for the tiniest, most bare stripped-down BusyBox or embedded versions, is inevitably going to include Python. But will they include Node?
So for today… for today… enumerate those realizations about my invention. My invention sits on Python. Python sits on any generic, usually Linux, OS, which in turn sits on any generic consumer-oriented host system, which means Macs or Windows. It does this through the magic of the `nix` command. Any difficulty with the nix command is addressed with the magic of the Determinate Systems nix installer. This makes web apps installable generically on any system much the same as Electron apps, but without the complicated packaging and bundling. All your web techniques with generic Linux travel and flow like water now between host systems. Done!
Freed from the installer story, and as a side-effect also freed from the “it doesn't work on my machine” story, you are now able to take full advantage of your local hardware, and all the enterprise scaling and security concerns change. Your concern now is writing the best app you can, as simply and understandably as you can, for both the user using the app and the later developers looking at your code. Things stripped out include complicated mirroring of server state on the client. The browser is still there as the web client, the client side of the client/server architecture. It's just that now it's a rapid-fire DOM rendering engine. And the better you get the reins on the client-side DOM with a 1-to-1 mapping from server-side concepts, tools, mental models and abstractions, the better.
Enter FastHTML. FastHTML zips together, as a sort of glue, the things living server-side — where you have almost telepathic vimmy control over every byte that goes into your app — with convenient happy little FT components that look and act just like HTML elements, but they're Python functions! What you don't have is jinja2 templates, or any other templating language that looks like PHP short-codes, Shopify Liquid templates, nunchuks, mustaches or any of that other curlybrace/pointybrace nonsense that pollutes every other such system and weighs it down with cognitive overhead and directory-diving fatigue.
What you still do have is a familiar Python Flask-like environment, same as FastAPI, Pylons and every other copycat that recycled the decorators-as-routers notion that connects Python apps to the web interface. So, it's Flask without a template language? Werkzeug but no jinja2 or mako? Yeah, pretty much. Your Python FTs (aka FastTags) are the templates. Why add an extra layer of abstraction and cognitive overhead when the great chameleon you're already sitting on top of, in the form of Python, can already reproduce the look, feel and spirit of HTML all by itself? HTML attributes & values become Python parameters & arguments. Labels are labels. They're the same thing. Zip them up 1-to-1 and be done with it. Thank you, Jeremy Howard, and the others who came before you with projects like dominate and other HTML-over-the-wire notions.
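A hedged sketch of what that 1-to-1 zip looks like (assuming fasthtml.common re-exports the FT helpers and a to_xml renderer, as its examples suggest): keyword arguments become HTML attributes, with cls standing in for class.

```python
from fasthtml.common import Div, H1, P, to_xml

# FT components are plain Python callables; attributes are keyword arguments.
card = Div(
    H1("Pipeline status"),
    P("Step 2 of 3 complete"),
    cls="card",          # renders as class="card"
)

# No template language in sight; the Python expression *is* the markup.
print(to_xml(card))
```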
It's usually JSON-over-the-wire these days. It's an encoding and decoding of already-data into other already-data formats, for the privilege of having to have JavaScript on the receiving end (client-side) to know what to do with it (turn it back into HTML or otherwise user-visible data). Get it? We're beating back Conway's Law here. The comp-sci camp saw the web and the browser as a viable software development platform as Google jumped on the JavaScript standardization/optimization bandwagon, fanning the flame with the first great client-side JavaScript library (yes, Angular came before React), and the rest is the history that drove me away from webdev. I'm allergic to JavaScript.
But I’m not allergic to English. I already for the most part speak Python, which is for the most part English, stripped of all the cognitive overhead nonsense of those block-delimiting, curly braces whose positions don’t matter anyway, which makes it even worse! If you’re going to indent anyway for readability, then just do away with the curly brackets and make everyone’s life easier. Okay, says Leo Geurts, Lambert Meertens, or Steven Pemberton of Centrum Wiskunde & Informatica (CWI) in Amsterdam (I’m not really sure which) who wrote the ABC language. Okay, says Guido van Rossum on Christmas 1989 writing a new programming language as a hobby. Okay says me looking for a language that doesn’t insult me at first glance. I am not allergic to Python.
So, Python provided a low learning curve, a shallow and happy on-ramp to programming that other cryptic, convoluted, “oh, that's for the compiler”-ridden languages don't. Those curly brackets are for the compiler, not you. And that was in C and the other languages that inherited from BCPL, the first great virtual-machine-at-the-compiler language and the precursor to C, Unix and, by extension, Linux. So curly braces are deeply rooted in the tech stack I've come to love (though I think I'd have loved a LISP stack just as much), but that's no reason we can't take joy in the curlybracketectomy Python performed. Good riddance! What's that? They're still in dicts? Nobody promised perfection.
So, let's just generally run with that theme and lower the cognitive overhead of what you're looking at. Along with fewer curly braces, the next big offender — which I'm currently violating myself and have to get under control — is long dot-notations. The scavenger hunt of `this.that.this.again.and.again`. Perhaps not as bad as the `public static void main(String[] args)` function signature that curses the same language that mainstreamed long dot-paths (Java), but long dot notations are nobody's friend when trying to take in the meaning of software at a quick glance. While they represent important things, isolated namespaces and scope, shortcuts can be used without sacrificing clarity. So I have to get the long `self.pipulate.some_snake_case_named_function` tendencies under control in Pipulate.
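One low-tech remedy, sketched here with the article's own example chain (the classes and names are purely illustrative, not Pipulate's actual structure): bind the long path to a short local name once, then use the short name.

```python
class Pipulate:
    """Stand-in object with a deliberately long method name."""
    def some_snake_case_named_function(self, record_id, retry=False):
        print(f"processing {record_id} (retry={retry})")

class Workflow:
    def __init__(self):
        self.pipulate = Pipulate()

    def process_record(self, record_id):
        # Bind the deep attribute chain to one short, well-named local variable...
        run_step = self.pipulate.some_snake_case_named_function
        # ...so the rest of the method reads at a glance.
        run_step(record_id)
        run_step(record_id, retry=True)

Workflow().process_record(42)
```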
And finally, there's the subtlety and nuance of Pipulate that I'm trying to preserve, protect and defend against the over-trained-on-FastAPI-and-SQLAlchemy LLMs… hmmm… the trickiest, and probably most controversial, bits of my system, as if being localhost-centric and FastHTML-based weren't already against the grain enough. How to articulate it best?
Well, it begins with the fact that the Unix pipe metaphor that inspires it all is really, really simple. The output of one command like `ls` or `cat` is fed into the input of another program like `sed` or `awk` with a vertical line, or pipe: `|`. So they look like: `ls | sed 's/[aeio]/u/g'`. This command lists the contents of the current directory and replaces all occurrences of the vowels 'a', 'e', 'i', and 'o' with 'u' in the output, hahaha! It demonstrates a basic use of pipes to connect the output of ls to sed for simple text transformation. And it really is that easy. Among other things, this core principle of Unix helped drive it to the massive popularity it enjoys today (also in the form of Linux), because you don't really have to program quite so much as understand and utilize data pipes.
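The same chain can be reproduced from Python with the standard library, which is a handy way to see the baton-passing explicitly (a hedged sketch using subprocess; the commands are the same ones from the shell example above):

```python
import subprocess

# ls | sed 's/[aeio]/u/g' -- wiring stdout of one process into stdin of the next.
ls = subprocess.Popen(["ls"], stdout=subprocess.PIPE)
sed = subprocess.Popen(["sed", "s/[aeio]/u/g"], stdin=ls.stdout, stdout=subprocess.PIPE)
ls.stdout.close()                 # lets ls receive SIGPIPE if sed exits early
output, _ = sed.communicate()
print(output.decode())
```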
Data pipes? Ohhhh, you mean workflows. Yes, I mean workflows. Do this. Take what you did and use it to do that. And then do the next thing. Linear. Very refreshing in today's world of do-everything-all-at-once concurrency. Frankly, concurrency is a luxury you can't afford until you've learned how to do things sequentially (aka synchronously). Synchronously, or in a synchronized fashion, where one component is aware of, or through mechanical tricks or practice somehow tied to, the others, so that all parts perform together in synchronicity. Harmony. Focus. All things that are the opposite of asynchronous behavior.
Asynchronicity, though, is tied to autonomy and agency, which is of course the sexy buzzword and what everyone wants to talk about with agentic AI. The libraries people want to talk about on YouTube then become CrewAI, LangChain and AutoGen, as if you can't live without them. But truth be told, the magic chaining-up of input with output in the Unix pipe philosophy is achieved by everyone just talking a similar input/output language, which they already (mostly) do, in the form of the OpenAI API.
OpenAI got to this sort of stuff first, and so had the privilege of predisposing everyone and establishing the conventional behavior. Happily, that conventional behavior is very Pythonic, or English-like. It didn't have to be, I assure you (cringing as I think back to how SOAP ruined XML). Thankfully, the OpenAI API is RESTful (Representational State Transfer). It lets developers interact with it through very standard and familiar HTTP methods such as GET, POST, PUT, and DELETE. And the other details aren't much worse. That's why after learning OpenAI's API, you can install Ollama on your own local machine and get to work doing the same sort of things with much smaller models for a more ambient intelligence flow, which is what I'm doing with my web framework pipeline workflow system. There's a local LLM baked in that I already knew how to program, thanks to easy conventional standards set early.
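To make that concrete, here's a hedged sketch of hitting a local Ollama instance over plain HTTP (assuming Ollama's documented /api/chat endpoint on its default port, and whatever model you happen to have pulled; llama3 is just a placeholder):

```python
import requests

# Chat with a local model; the request shape closely mirrors the OpenAI chat API.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3",   # placeholder: any locally pulled model name
        "messages": [
            {"role": "user", "content": "Explain Unix pipes in one sentence."},
        ],
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["message"]["content"])
```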
Close behind them are libraries that can be conducive to linear workflows, like LangChain, that don't necessarily thrust concurrent, and thus hard-to-think-through, behavior on you. You have to learn to chain things up before you can swarm. Swarming requires individual components able to be decoupled from each other, taken out of synchronized harmony, and thus requires some amount of agency — yes, the agency that AI is supposed to be able to provide, but that's putting the cart before the horse.
I’m focusing on linear workflows. I’m doing it in the least imperfect most perfect way I can imagine. This involves:
- Using FastHTML for template-eliminating HTML-over-the-wire Python goodness
- Using HTMX, which I list separately because it's a whole education in itself
- Modeling everything after Unix pipes, as they can be expressed well with the above
- Also modeling everything after Jupyter Notebooks, which already closely resemble Unix pipes
- Staying to as plain, generic and conventional a UI as possible to not look like anything new or kooky was invented (PicoCSS, a decision I inherited from FastHTML)
- All the code being in a single file, sized within the token window of today's frontier LLMs and tomorrow's home LLMs, for full-context single-shot prompting.
- A plugin system for CRUD-like applications reminiscent of Rails or Django
- A plugin system for ultra-simplified pipeline workflows, aka wizards.
All this is mostly done, and I followed pre-existing models for most of it, coming from the FastHTML examples out there, like the ToDo list for CRUD (create, read, update, delete) operations. This alone is a hugely powerful model that addresses much of the Rails/Django-style cookie-cutter database app stuff. I made a factory class, BaseApp, so I can slam out new instances and get most CRUD behavior through inheritance, including the LLM's general knowledge of the process and its ability to participate in creating and updating records on its own as appropriate and necessary. And when it's not doing it on its own, it's at least in the loop on what the user has done. And the databases in this system are RAG resources for the LLM.
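The shape of that factory idea, sketched in the abstract (this is not Pipulate's actual BaseApp; the in-memory dict stands in for the real database table, just to show how subclasses inherit the CRUD behavior for free):

```python
class BaseApp:
    """Generic CRUD behavior that plugin subclasses inherit for free."""
    def __init__(self, name):
        self.name = name
        self.items = {}          # stand-in for the real database table
        self.next_id = 1

    def create(self, **fields):
        item_id = self.next_id
        self.items[item_id] = fields
        self.next_id += 1
        return item_id

    def read(self, item_id):
        return self.items[item_id]

    def update(self, item_id, **fields):
        self.items[item_id].update(fields)

    def delete(self, item_id):
        del self.items[item_id]


class TodoApp(BaseApp):
    """A new plugin is mostly just a subclass; the behavior comes from BaseApp."""
    pass


todo = TodoApp("todo")
tid = todo.create(title="Record the demo video", done=False)
todo.update(tid, done=True)
print(todo.read(tid))            # {'title': 'Record the demo video', 'done': True}
```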
The Pipeline stuff is a little trickier, and ultimately more important, because it's where I'm really innovating, and I feel it is the new “secret weapon” source of power, because it is clarity in a sea of confusion. However, in getting ambitious yesterday in pursuit of completion, I introduced new bugs. I failed to make the video I had planned to produce, because whack-a-mole bugs inadvertently got into the system and would have necessitated too much editing to make the video work. The single-take videos that I am most comfortable with and capable of making (and which are like 1000 times easier because they eliminate editing) were undermined by my own ambition. I was hoisted by my own petard, as the saying goes. Sigh, the best laid plans.
Anyway, this journal entry is about bouncing back up, healing my wounds and becoming stronger as a result of it.
- Surface area is best when reduced.
- Virtual anything, especially state, is best avoided.
- Filter what tries to show rather than retrieve what you just deleted.
So my pipelines always delete forward.
```python
# PIPELINE MANTRA:
# One URL to track them all
# One JSON blob to bind them
# Card to card the data flows
# No message queues behind them
#
# PIPELINE STATE MANTRA:
# Submit clears forward
# Display shows the past
# Refill affects views
# But submit starts fresh
```
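A hedged, stripped-down sketch of what those mantras mean in code (the step names and the dict standing in for the pipeline table are purely illustrative): one URL keys the run, one JSON blob holds all the per-card state, and submitting any step clears everything forward of it.

```python
import json

pipelines = {}   # stand-in for a single pipeline table keyed by URL
STEP_ORDER = ["step_01", "step_02", "step_03"]

def init_pipeline(url):
    # One URL to track them all, one JSON blob to bind them.
    pipelines[url] = json.dumps({})

def submit_step(url, step, data):
    state = json.loads(pipelines[url])
    state[step] = data
    # Submit clears forward: every later step is wiped so re-runs start fresh.
    for later in STEP_ORDER[STEP_ORDER.index(step) + 1:]:
        state.pop(later, None)
    pipelines[url] = json.dumps(state)

def read_step(url, step):
    # Display shows the past: reads never mutate the blob.
    return json.loads(pipelines[url]).get(step)

init_pipeline("/pipeline/demo-run")
submit_step("/pipeline/demo-run", "step_01", {"keyword": "python"})
submit_step("/pipeline/demo-run", "step_02", {"count": 42})
submit_step("/pipeline/demo-run", "step_01", {"keyword": "htmx"})  # wipes step_02
print(read_step("/pipeline/demo-run", "step_02"))                  # None
```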
We look for the most common, regular things. Then we bottle them. This is linear do-this-and-then-do-that sort of workflow. It's not if-this-then-that, which is not a linear flow. This system is mostly about linear workflows. There are always exceptions, but the exceptions are contained within a single step/card/cell/command and acted upon based on parameters and arguments that are passed between steps.
It's like the Unix pipe is linear: `ls | sed | awk`, where the output of each is fed into the input of the next. sed and awk have their own arguments that control how they behave, but all that variable logic can be thought of as inside the command and isolated. That way, you can work with a linear mental model, chaining things up. It's an abstraction. It's the data piping abstraction.
There are details about how data is passed out of one program and into the next. The baton is passed. The baton is a baton. But inside the baton you store messages that have meaning to the next step. These are the parameters and arguments. The parameters are the known names of fields. They are the keys of key/value pairs. Keys are known. Labels of fields are known. The data that's going to be plugged into those fields is unknown, aka the values. And so this is why the JSON or Python `dict` datatypes are so often chosen for API work. If you're going to pass data between systems that needs key/value-pair mappings, JSON and `dict`s are ideal. Python `namedtuple`s aren't bad either for certain cases where all your data has no nesting and is like a row of a table.
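A tiny illustrative example of that baton (the field names are invented for the sketch): the keys are the agreed-upon parameters, the values are whatever the previous step produced, and a namedtuple covers the flat, row-like case.

```python
from collections import namedtuple

# The baton: known keys (parameters), unknown values (the data being passed).
baton = {"keyword": "python", "count": 42}

def next_step(baton):
    # The next step only relies on the agreed-upon key names.
    return {**baton, "popular": baton["count"] > 10}

print(next_step(baton))

# For flat, unnested, row-like data, a namedtuple is a nice lightweight baton too.
Row = namedtuple("Row", ["keyword", "count"])
print(Row(**baton))
```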
And that's where today's entry ends. This is just cleansing the palate. The fresh blank page. The new starting point for a revision of the latest fruits of my literate programming labors. Not a bad starting point, because each workflow is topped with a `namedtuple` called `STEPS`.
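For a sense of what that looks like, here's a hedged sketch of a `STEPS` definition at the top of a workflow (the field names and steps are illustrative, not Pipulate's exact schema):

```python
from collections import namedtuple

# Each workflow opens with a declaration of its linear steps.
Step = namedtuple("Step", ["id", "done", "show"])

STEPS = [
    Step(id="step_01", done="keyword",  show="Choose a keyword"),
    Step(id="step_02", done="analysis", show="Run the analysis"),
    Step(id="step_03", done="export",   show="Export the results"),
]
```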
Gemini 1.5: This article explores the evolution of literacy and technology, from the printing press to AI coding assistants. It argues that programming literacy is crucial in the age of AI and emphasizes the importance of clear, human-readable code. The author discusses their own web framework project, highlighting its unique features like FastHTML and its focus on linear workflows inspired by Unix pipes. They advocate for a simplified, Python-centric approach to web development that prioritizes efficiency and understandability.
Some ideas for titles and headlines:
- From Printing Press to AI: A New Era of Literacy
- The Power of Programming Literacy in the Age of AI
- Rethinking Web Development: A Python-Centric Approach
- Unix Pipes and AI: Building Linear Workflows for the Future
- My Web Framework Journey: Embracing Simplicity and Efficiency
In my opinion, the article offers a thoughtful and insightful perspective on the evolving relationship between humans and technology. The author’s passion for programming and their dedication to creating a more efficient and accessible web development experience are evident. Their emphasis on clear, human-readable code aligns with the growing need for transparency and collaboration in the field of AI.