The Dead Man's Switch: Orchestrating Browser-Server Resilience
Setting the Stage: Context for the Curious Book Reader
This entry explores an important shift in technical craft: moving from a passive consumer of a framework to an active architect of the environment. Here, I demonstrate how to use a ‘Dead Man’s Switch’ approach—leveraging browser-level WebSocket proxying—to ensure that when a server reboots or fails, the user experience remains a ‘magic show’ rather than a broken page. It’s a look at architectural resilience and the power of knowing exactly where to apply the chisel to get the most leverage.
Technical Journal Entry Begins
Wow, that previous article was quite an adventure. Pipulate is full of Workflow construction helper tools. Over the past couple of days, I used one to quite effectively create a Configuration web app. I used a CLI command that let me insert “blank placeholder” steps into FastHTML web app workflows that are designed to closely resemble Jupyter Notebooks, at least in terms of the top-to-bottom running of cells in a linear, sequential fashion. I rely on a nifty feature of the browser where the existence of an object in the DOM can call something on the server.
And so inserting something into the DOM with HTMX can cause the server to target something in that same DOM which itself can contain a load-trigger so that when it hydrates, it makes the next call. And so on, chain reacting until the conditions are that nothing more can be called. It hits a dead-end: the last place information has been collected from the user. Otherwise, all the way to the end of the workflow if it’s been finished.
The Browser as a Pacing Mechanism
It’s a bit subtle, but it needs to be understood, because it’s one of the colossal differences between Pipulate and other systems. We rely on the browser being there for things to proceed forward. Much of the workflow orchestration is, if not delegated to the client, then reflected off the browser as a sort of verification mirror. It’s almost like a “you may proceed” test. And this works well for Notebook-like things. But to keep that fragile chain reaction going, batons must be passed with uncompromising precision. If that baton isn’t passed just right, the chain reaction stops. Period.
That was a huge aspect of what went wrong with our previous attempts to port old versions of the Link Graph Visualizer and Parameter Buster apps into the new ones. So over the past few days, the Configuration Workflow was a success, but taking advantage of the fact that the Botify API key is now in dotenv was not in the cards. And I’m still awake early. It’s 6:00 AM. The calls start at Noon. And then a second important call is at 4:00 PM. So plenty of time. Long take-off ramps. And I don’t know WTF, yet. But I think almost everything comes down to that Onboarding experience handing off to the Configuration experience, handing off to a list of Profiles (Clients) on one axis and deliverable-types on another axis. For each client, for each deliverable… back to the dashboard concept. This will be easy to color-code red, yellow and green for:
- Not started.
- Started.
- Completed.
This is definitely not an implementation plan request, because working on the dashboard like that is just as much of a red herring: a rabbit-hole side-quest chasing an unnecessary false dependency. But it’s still Morning Pages and discretionary time, so I’m thinking through the higher-level abstractions. It seems we often want to start at that very top level and work our way downwards to the specific. From “the system” to “the client” to “the deliverable for that client” to “the upcoming meeting” with all its agendas, decks, emails and whatever else.
A show. A source of stress. But more and more a Penn and Teller style magic show leading into the Age of AI with panache and style that few others have. I mean look at the stuff I’m doing!
The Penn and Teller Philosophy of Code
Gemini 3.1 Pro: Context verified.
1. The Context & Evidence (The Hard Reality)
When you invoke Penn and Teller, you are tapping into a very specific, historically significant philosophy of performance art. In 1996, Penn and Teller famously performed the ancient “Cups and Balls” illusion using transparent plastic cups. They explicitly showed the audience the sleight of hand—the mechanics, the palming, the precise misdirection. And yet, because the execution was so flawless, the performance was arguably more magical than when the cups were opaque.
In software architecture, this is the exact difference between a modern React SPA (Single Page Application) and a pure HATEOAS (Hypermedia As The Engine Of Application State) implementation. When Roy Fielding published his dissertation in 2000 defining REST, HATEOAS was the crown jewel: the idea that the client doesn’t need to know the state machine; it just follows the links and triggers that the server hands it.
For the last decade, the tech industry built opaque brass cups (massive JavaScript bundles hiding the state). By returning to the browser as the metronome via HTMX, you are performing with clear plastic cups. The client isn’t hiding the logic; it is merely reflecting the server’s state back at it. It is radical transparency as a competitive advantage.
2. The Concrete Definition & Walkthrough
Let’s define exactly what this “Browser Metronome” mechanism is in the Pipulate ecosystem. We call it Hypermedia Orchestration.
Imagine the exact UX flow during a client onboarding session: You hit the page. The server sends down an HTML chunk for Step 1. The client renders it. The user types their name and hits submit.
Here is the magic: the server does not compute Step 2, 3, and 4 in the background. It simply updates its SQLite ledger, and hands the browser a new HTML fragment containing hx-trigger="load" pointing to the URL for Step 2.
The browser receives the fragment, hydrates the DOM, and ping—it instantly fires the request for Step 2. The server receives the request, sees the prerequisite state is met, and hands back the next fragment. It feels like a frictionless cascade of dominos to the user, but under the hood, the browser is acting as a physical, verifiable pacing mechanism. If the server drops the baton (a 500 error, a missing API key), the browser simply stops requesting the next piece. The chain reaction halts exactly at the point of failure, leaving a perfect forensic trail.
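The fragments described above can be sketched in markup. This is illustrative HTMX only; the endpoint URL and element IDs are invented for the example, not Pipulate’s actual ones:

```html
<!-- Fragment the server returns after the user submits Step 1.
     Because hx-trigger="load" fires the moment this div enters
     the DOM, the browser immediately requests Step 2 with no
     user action: the next link in the chain reaction. -->
<div id="step_02"
     hx-get="/workflow/step_02"
     hx-trigger="load"
     hx-swap="outerHTML">
  <p>Loading Step 2...</p>
</div>
```

If the server never answers that hx-get, no new fragment (and no new trigger) ever arrives, which is exactly how the chain halts at the point of failure.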
Hypermedia Orchestration and the Forensic Trail
3. The Shift (Connecting the Dots)
The lightbulb moment here is the transition from “software developer” to “digital illusionist.”
Earlier in your journey, the focus was on the raw mechanics—how to parse the DOM, how to wrangle the Botify API, how to prevent dependency hell with Nix. But as you zoom out from the system to the client to the meeting, the code stops being just a tool and becomes a stage.
The workflow isn’t just generating a deliverable; the execution of the workflow is the deliverable. By utilizing the browser to visually step through the logic, you are proving to the client that the sausage factory isn’t a black box. You are showing them the clear plastic cups. In an Age of AI where everyone claims to have a “magic button” that spits out generic SEO advice, demonstrating a deterministic, sequential, machine-cranked chain reaction proves undeniable, grounded competence.
From Developer to Digital Illusionist
4. The Contrast & The Warning
Let’s contrast how the industry orchestrates workflows versus the Pipulate way.
- The Old Way (The Black Box): You click “Generate Report.” A loading spinner appears. On the backend, a massive asynchronous Celery task or an unconstrained LLM agent thrashes around in the dark for four minutes. If a single API call times out in minute three, the whole process dies silently, and the spinner spins forever.
- The New Way (The Metronome): The workflow is broken into atomic, physical DOM swaps. The browser confirms receipt of Step A before the server is ever asked to process Step B.
The Warning: If you try to orchestrate massive AI/SEO pipelines entirely in the dark without the browser’s heartbeat, you will suffer from catastrophic silent failures. Consider the early days of AutoGPT or unconstrained agentic loops: developers watched their API bills skyrocket as agents got stuck in infinite, invisible error loops because there was no physical client-side forcing function to say, “Stop, this step didn’t render correctly.”
5. The Monday Morning Test
Next Monday, at 12:00 PM, you log into your client Zoom call.
Instead of opening up a static PowerPoint deck of pre-calculated Botify metrics, you share your screen and open the Pipulate interface. You execute the Workflow Genesis. The client watches as the DOM hydrates, step-by-step. They see the exact moment the API key is verified. They see the green checkmarks cascade down the screen as the web logs are queried, the parameters are busted, and the data is merged.
You aren’t saying, “Trust me, my AI did this.” You are showing them the physics of the data extraction in real-time. The bottleneck of “client skepticism” evaporates because they just watched the machine build their strategy right in front of their eyes.
6. The Verdict & Bottom Line
Using the browser as the metronome for workflow state is not a hack; it is a profound architectural stance. It forces your code to be modular, it prevents invisible infinite loops, and it turns the execution of your software into a high-fidelity performance.
The ultimate long-term value of this approach is trust. In the impending flood of AI-generated slop, the consultants who survive will be the ones who can prove how their answers were derived. By building a system where the baton passes are explicitly reflected in the browser’s DOM, you have engineered a machine that automatically documents its own truth. You are ready for the show.
MikeLev.in: Quoth the Gemini:
“If a single API call times out in minute three, the whole process dies silently, and the spinner spins forever.”
Sigh, true. It’s a strength and a weakness simultaneously. Okay, fine. I like it; it’s what I want. Succeed, or don’t let anyone think it’s succeeding while it silently fails in the background. Okay, let’s hopefully throw ourselves a softball. The Pipulate python server.py auto-restarts with Watchdog, but the truth is that in the course of development it could actually not be running. You wouldn’t be able to tell, though, because the page keeps displaying like normal. That doesn’t have to be the case, given all the chatty HTTP traffic going on in the background, especially because of live-reload. Notice how I can already “lock” the UI when reboots are occurring: when you update the system through the flyout, when you upgrade the Python environment, or even when the player piano walks you through a server reboot.
It seems to me that something can always be pinging the server, and if it’s not responding, the user can be gracefully informed in a way that doesn’t panic them and instead gets them to go back and run python server.py (if it was a normal Ctrl+C or syntax-style exit), or nix develop or ./run (if they’re fully out of an already “nix developed” terminal session). You don’t have to put much info on whatever modal spinner “things are locked until the server restarts” message you throw up there. Just the two scenarios they might have to handle to fix it.
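A minimal sketch of that always-pinging idea: a heartbeat with an injectable check function so the online/offline transition logic can be exercised without a live server. All names here (createHeartbeat, onDown, onUp) are hypothetical, not Pipulate’s actual code:

```javascript
// Hypothetical heartbeat sketch: runs checkFn on an interval and fires
// onDown/onUp exactly once per state transition, never repeatedly.
function createHeartbeat({ checkFn, onDown, onUp, intervalMs = 2000 }) {
  let online = true;
  let timer = null;

  async function tick() {
    let ok = false;
    try {
      // In the browser: () => fetch('/', { method: 'HEAD' }).then(r => r.ok)
      ok = await checkFn();
    } catch (e) {
      ok = false; // a network error counts as "server down"
    }
    if (online && !ok) { online = false; onDown(); }
    else if (!online && ok) { online = true; onUp(); }
  }

  return {
    start() { timer = setInterval(tick, intervalMs); },
    stop() { if (timer) clearInterval(timer); },
    tick,                      // exposed so the logic can be stepped manually
    isOnline: () => online,
  };
}
```

In the browser, onDown would drop the modal curtain with the two recovery commands, and onUp would lift it (or trigger a reload).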
I even included the actual DOM of a typical Pipulate page (the homepage), with its saved HTML renamed deleteme.txt for the sake of prompting you. Between all this you should have the big picture of… I don’t even know what to call it. Keeping live server-state acknowledged client-side at all times, such that if the server goes down, the client becomes a nearly immediate signal.
What do you see?
Gemini 3.1 Pro: It is the classic double-edged sword of the metronome: if the server stops swinging the pendulum, the music doesn’t just pause—the whole orchestra freezes in place, staring at their sheet music, waiting for a downbeat that is never coming.
You are absolutely right to want a graceful failure here. If the user is left clicking a dead button on a frozen DOM, the illusion of the “magic show” breaks, and they are yanked violently back to reality. You want to maintain the stagecraft even when the power goes out.
Looking through your deleteme.txt DOM dump and the accompanying JavaScript, here is exactly what I see. You don’t need to build a new heartbeat from scratch; you already have two dead-man’s switches running in the background. You just haven’t wired them to the alarm bell yet.
1. The FastHTML Live-Reload Socket (The Aggressive Poller)
Right at the top of your deleteme.txt (around line 17), FastHTML automatically injects this little self-executing function:
```javascript
(() => {
  let attempts = 0;
  const connect = () => {
    const socket = new WebSocket(`ws://${window.location.host}/live-reload`);
    // ...
    socket.onclose = () => {
      !attempts++ ? connect() : setTimeout(() => { connect() }, 1);
      if (attempts > 1000) window.location.reload();
    }
  };
  connect();
})();
```
What it’s doing: This is a hyper-aggressive heartbeat. The moment server.py dies, this socket closes. It instantly tries to reconnect, then retries on a one-millisecond timer. If it fails 1,000 times (which takes roughly a second), it forces a window.location.reload().
The Fubar Result: If the server is genuinely dead, that forced reload replaces your beautiful Pipulate UI with the dreaded Google Chrome “This site can’t be reached / ERR_CONNECTION_REFUSED” dinosaur page. The stage completely collapses.
2. The Pipulate Chat Socket (The Patient Listener)
In player-piano.js, you have your own dedicated WebSocket (sidebarWs) handling the AI stream and MCP tool calls.
```javascript
sidebarWs.onclose = function(event) {
  console.log('Sidebar WebSocket closed');
  // ... logs some details but doesn't alert the user
};
```
And further down in attemptWebSocketReconnection(), it tries to reconnect, and if it times out after 5 seconds, it just throws an error in the console. The UI remains completely frozen, but visually unchanged.
The Solution: The “Curtain Drop” Overlay
You already possess the exact tool needed to handle this elegantly: triggerFullScreenRestart(message, restartType) located in assets/pipulate.js. It paints that gorgeous, dark, blurred overlay with the PicoCSS spinner across the entire screen, instantly locking out user interaction (pointerEvents = 'none').
Instead of letting FastHTML force-reload into a browser error page, or letting player-piano.js silently fail in the console, you can hijack the socket disconnect events.
When the socket drops and fails its first reconnection attempt, you drop the curtain:
```javascript
triggerFullScreenRestart(
  "Connection lost. 🔌<br><br><small style='font-size: 0.7em; opacity: 0.8;'>To wake the server, run: <code>python server.py</code><br>or <code>nix develop</code> if you need the environment.</small>",
  "SERVER_OFFLINE"
);
```
By doing this, the user isn’t presented with a scary, broken webpage or a frozen interface. They are presented with a deliberate, styled, in-universe “Pause Screen” that tells them exactly how to turn the power back on. When they run python server.py, the WebSockets will reconnect, the overlay will lift, and the show will resume exactly where it left off.
To implement this, we need to intercept the connection loss. Which path do you prefer to hijack: overriding FastHTML’s aggressive live-reload script so it shows the overlay instead of forcing a page refresh, or expanding your existing attemptWebSocketReconnection logic inside player-piano.js to trigger the overlay when the chat socket times out?
MikeLev.in: My own stuff! I don’t want to go around modifying FastHTML. Please and thank you.
Gemini 3.1 Pro: I respect that completely. Never hack the framework when you can outsmart it from the outside. That is the essence of the “Jiu-Jitsu” methodology—we are going to use FastHTML’s own aggressive momentum against itself.
Right now, FastHTML’s injected script is programmed to panic. When the server drops, its WebSocket closes, it retries 1,000 times in about one second, and then it executes a hard window.location.reload(). That programmatic reload is what throws the user into the Chrome dinosaur page.
We don’t need to touch FastHTML’s Python code or its internal Javascript to fix this. We just need to stand in front of that specific command and intercept it. We will implement The Dead Man’s Switch.
The Chisel Strike: The Dead Man’s Switch
By monkey-patching the browser’s native window.location.reload function, we can create a trapdoor. When FastHTML tries to force a reload, our trapdoor will do a micro-second pulse check on the server. If the server is actually dead, we block the reload, drop your styled cinematic curtain, and start a silent “Resuscitation Poller” in the background.
Open assets/pipulate.js and paste this completely self-contained module at the very bottom of the file (around line 730):
```javascript
// ============================================================================
// THE DEAD MAN'S SWITCH (Graceful Degradation for Server Loss)
// ============================================================================
(function initializeDeadMansSwitch() {
  console.log('🛡️ Initializing Dead Man\'s Switch...');

  // Monkey-patch the browser's programmatic reload function.
  // FastHTML's live-reload script forcefully calls window.location.reload()
  // after 1000 failed ping attempts. We intercept that to preserve the stage.
  const originalReload = window.location.reload;

  window.location.reload = async function() {
    try {
      // Quick pulse check: is the server actually responding?
      await fetch('/', { method: 'HEAD', cache: 'no-store' });
      // If it succeeds, allow the normal reload (e.g., for genuine updates)
      originalReload.call(window.location);
    } catch (e) {
      // The server is dead. FastHTML is panicking. Drop the curtain.
      console.log('🛡️ Intercepted FastHTML force-reload. Server is offline.');

      triggerFullScreenRestart(
        "Connection lost. 🔌<br><br><small style='font-size: 0.8em; opacity: 0.9;'>To wake the server, run:<br><code style='background: #222; border: 1px solid #444; padding: 4px 8px; border-radius: 4px; margin: 10px 0; display: inline-block; color: #a7f3d0;'>python server.py</code><br>or <code style='background: #222; border: 1px solid #444; padding: 4px 8px; border-radius: 4px; display: inline-block; color: #a7f3d0;'>nix develop</code></small>",
        "SERVER_OFFLINE"
      );

      // Start listening for the server to wake back up
      startResuscitationPoller(originalReload);
    }
  };

  function startResuscitationPoller(reloadFn) {
    if (window._resuscitationPoller) return;
    console.log('🔄 Server is offline. Starting resuscitation poller...');

    window._resuscitationPoller = setInterval(async () => {
      try {
        // Ping the server blindly
        const res = await fetch('/', { method: 'HEAD', cache: 'no-store' });
        if (res.ok) {
          console.log('✨ Server pulse detected! Raising the curtain...');
          clearInterval(window._resuscitationPoller);
          window._resuscitationPoller = null;
          reloadFn.call(window.location);
        }
      } catch (e) {
        // Still dead. Keep waiting silently.
      }
    }, 2000); // Check every 2 seconds
  }
})();
```
The Magic Show Mechanics
Here is the exact flow of what this accomplishes when you hit Ctrl+C in your terminal:
- The Drop: FastHTML’s socket disconnects. It panics and calls window.location.reload().
- The Intercept: Our Dead Man’s Switch catches the call. It verifies the server is offline, blocks the reload, and fires your triggerFullScreenRestart overlay.
- The Freeze: The screen darkens. The PicoCSS spinner appears. The user reads the exact terminal commands they need to type to fix it. The illusion is maintained.
- The Resurrection: The background poller silently knocks on the server door every two seconds. The moment you type python server.py and the Uvicorn server wakes up, the poller hears the knock, clears itself, and executes the delayed reload. The UI refreshes flawlessly to the active state.
This gives you total client-side acknowledgment of the server state without polluting a single line of Python framework code.
With this safety net deployed, how do you envision this graceful degradation handling the heavier Nix environment rebuilds where the offline window might stretch from seconds into minutes?
MikeLev.in: Okay, I did that:
(nix) pipulate $ git status
On branch main
Your branch is up to date with 'origin/main'.
Untracked files:
(use "git add <file>..." to include in what will be committed)
deleteme.txt
nothing added to commit but untracked files present (use "git add" to track)
(nix) pipulate $ vim assets/pipulate.js
(nix) pipulate $ git --no-pager diff
diff --git a/assets/pipulate.js b/assets/pipulate.js
index b9bdddf9..e10ae3fd 100644
--- a/assets/pipulate.js
+++ b/assets/pipulate.js
@@ -592,3 +592,55 @@ function setupMenuFlashFeedback() {
console.error('🔑 Error setting up global auto-submit:', setupError);
}
})();
+
+// ============================================================================
+// THE DEAD MAN'S SWITCH (Graceful Degradation for Server Loss)
+// ============================================================================
+(function initializeDeadMansSwitch() {
+ console.log('🛡️ Initializing Dead Man\'s Switch...');
+
+ // Monkey-patch the browser's programmatic reload function.
+ // FastHTML's live-reload script forcefully calls window.location.reload()
+ // after 1000 failed ping attempts. We intercept that to preserve the stage.
+ const originalReload = window.location.reload;
+
+ window.location.reload = async function() {
+ try {
+ // Quick pulse check: is the server actually responding?
+ await fetch('/', { method: 'HEAD', cache: 'no-store' });
+ // If it succeeds, allow the normal reload (e.g., for genuine updates)
+ originalReload.call(window.location);
+ } catch (e) {
+ // The server is dead. FastHTML is panicking. Drop the curtain.
+ console.log('🛡️ Intercepted FastHTML force-reload. Server is offline.');
+
+ triggerFullScreenRestart(
+ "Connection lost. 🔌<br><br><small style='font-size: 0.8em; opacity: 0.9;'>To wake the server, run:<br><code style='background: #222; border: 1px solid #444; padding: 4px 8px; border-radius: 4px; margin: 10px 0; display: inline-block; color: #a7f3d0;'>python server.py</code><br>or <code style='background: #222; border: 1px solid #444; padding: 4px 8px; border-radius: 4px; display: inline-block; color: #a7f3d0;'>nix develop</code></small>",
+ "SERVER_OFFLINE"
+ );
+
+ // Start listening for the server to wake back up
+ startResuscitationPoller(originalReload);
+ }
+ };
+
+ function startResuscitationPoller(reloadFn) {
+ if (window._resuscitationPoller) return;
+ console.log('🔄 Server is offline. Starting resuscitation poller...');
+
+ window._resuscitationPoller = setInterval(async () => {
+ try {
+ // Ping the server blindly
+ const res = await fetch('/', { method: 'HEAD', cache: 'no-store' });
+ if (res.ok) {
+ console.log('✨ Server pulse detected! Raising the curtain...');
+ clearInterval(window._resuscitationPoller);
+ window._resuscitationPoller = null;
+ reloadFn.call(window.location);
+ }
+ } catch (e) {
+ // Still dead. Keep waiting silently.
+ }
+ }, 2000); // Check every 2 seconds
+ }
+})();
(nix) pipulate $ git commit -am "Adding a check for whether server is up to draw curtains in browser conditionally"
[main fdcaaac4] Adding a check for whether server is up to draw curtains in browser conditionally
1 file changed, 52 insertions(+)
(nix) pipulate $ git push
Enumerating objects: 7, done.
Counting objects: 100% (7/7), done.
Delta compression using up to 48 threads
Compressing objects: 100% (4/4), done.
Writing objects: 100% (4/4), 1.52 KiB | 1.52 MiB/s, done.
Total 4 (delta 3), reused 0 (delta 0), pack-reused 0 (from 0)
remote: Resolving deltas: 100% (3/3), completed with 3 local objects.
To github.com:pipulate/pipulate.git
bce2f11d..fdcaaac4 main -> main
(nix) pipulate $
The Dead Man’s Switch: A Jiu-Jitsu Implementation
It’s also worth noting that in DevTools in my Chrome I have “Disable cache while DevTools is open” checked, and I do have DevTools open. So after a python server.py run, a refresh in the browser, and then a Ctrl+C (over and over and over, because of Watchdog) in the terminal console where python server.py is running, when it finally breaks, the modal spinner stuff should appear in the browser within 2 seconds. It does not. I’m flexible on whether we intercept my own polling or FastHTML’s. Whatever’s more reliable; wherever we get it to work correctly first.
Gemini 3.1 Pro:
MikeLev.in: Okay, my proof of work. If only I could provide a proof of browser cache reset, haha!
(nix) pipulate $ git status
On branch main
Your branch is up to date with 'origin/main'.
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: foo_files.py
Untracked files:
(use "git add <file>..." to include in what will be committed)
deleteme.txt
no changes added to commit (use "git add" and/or "git commit -a")
(nix) pipulate $ git commit -am "About to make a better monitor for dead server"
[main 6dde4894] About to make a better monitor for dead server
1 file changed, 10 insertions(+), 10 deletions(-)
(nix) pipulate $ git push
Enumerating objects: 5, done.
Counting objects: 100% (5/5), done.
Delta compression using up to 48 threads
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 346 bytes | 346.00 KiB/s, done.
Total 3 (delta 2), reused 0 (delta 0), pack-reused 0 (from 0)
remote: Resolving deltas: 100% (2/2), completed with 2 local objects.
To github.com:pipulate/pipulate.git
fdcaaac4..6dde4894 main -> main
(nix) pipulate $ vim assets/pipulate.js
(nix) pipulate $ git --no-pager diff
diff --git a/assets/pipulate.js b/assets/pipulate.js
index e10ae3fd..9ab5ff50 100644
--- a/assets/pipulate.js
+++ b/assets/pipulate.js
@@ -594,53 +594,68 @@ function setupMenuFlashFeedback() {
})();
// ============================================================================
-// THE DEAD MAN'S SWITCH (Graceful Degradation for Server Loss)
+// THE DEAD MAN'S SWITCH V2 (WebSocket Proxy)
// ============================================================================
(function initializeDeadMansSwitch() {
- console.log('🛡️ Initializing Dead Man\'s Switch...');
+ console.log('🛡️ Initializing Dead Man\'s Switch v2...');
- // Monkey-patch the browser's programmatic reload function.
- // FastHTML's live-reload script forcefully calls window.location.reload()
- // after 1000 failed ping attempts. We intercept that to preserve the stage.
- const originalReload = window.location.reload;
+ // The browser's Location object is unforgeable, so we cannot patch reload().
+ // Instead, we proxy the WebSocket constructor before FastHTML executes.
+ const OriginalWebSocket = window.WebSocket;
- window.location.reload = async function() {
- try {
- // Quick pulse check: is the server actually responding?
- await fetch('/', { method: 'HEAD', cache: 'no-store' });
- // If it succeeds, allow the normal reload (e.g., for genuine updates)
- originalReload.call(window.location);
- } catch (e) {
- // The server is dead. FastHTML is panicking. Drop the curtain.
- console.log('🛡️ Intercepted FastHTML force-reload. Server is offline.');
+ window.WebSocket = function(url, protocols) {
+ const ws = new OriginalWebSocket(url, protocols);
+
+ // Check if this is FastHTML's specific live-reload socket
+ if (url && url.includes('/live-reload')) {
+ console.log('🛡️ Intercepted FastHTML live-reload socket.');
+
+ // Hijack the setter for the onclose event
+ let originalOnClose = null;
- triggerFullScreenRestart(
- "Connection lost. 🔌<br><br><small style='font-size: 0.8em; opacity: 0.9;'>To wake the server, run:<br><code style='background: #222; border: 1px solid #444; padding: 4px 8px; border-radius: 4px; margin: 10px 0; display: inline-block; color: #a7f3d0;'>python server.py</code><br>or <code style='background: #222; border: 1px solid #444; padding: 4px 8px; border-radius: 4px; display: inline-block; color: #a7f3d0;'>nix develop</code></small>",
- "SERVER_OFFLINE"
- );
+ Object.defineProperty(ws, 'onclose', {
+ get: function() { return originalOnClose; },
+ set: function(fn) {
+ originalOnClose = fn;
+
+ // Attach our own event listener instead of FastHTML's
+ ws.addEventListener('close', function(e) {
+ console.log('🛡️ Server disconnected. Dropping the curtain.');
+
+ triggerFullScreenRestart(
+ "Connection lost. 🔌<br><br><small style='font-size: 0.8em; opacity: 0.9;'>To wake the server, run:<br><code style='background: #222; border: 1px solid #444; padding: 4px 8px; border-radius: 4px; margin: 10px 0; display: inline-block; color: #a7f3d0;'>python server.py</code><br>or <code style='background: #222; border: 1px solid #444; padding: 4px 8px; border-radius: 4px; display: inline-block; color: #a7f3d0;'>nix develop</code></small>",
+ "SERVER_OFFLINE"
+ );
- // Start listening for the server to wake back up
- startResuscitationPoller(originalReload);
+ // Start our own resuscitation poller to wait for the server
+ if (!window._resuscitationPoller) {
+ console.log('🔄 Starting resuscitation poller...');
+ window._resuscitationPoller = setInterval(async () => {
+ try {
+ // Blindly ping the server root
+ const res = await fetch('/', { method: 'HEAD', cache: 'no-store' });
+ if (res.ok) {
+ console.log('✨ Server pulse detected! Reloading...');
+ clearInterval(window._resuscitationPoller);
+ window._resuscitationPoller = null;
+ // The server is back, now we execute the hard reload
+ window.location.reload();
+ }
+ } catch (err) {
+ // Still dead. Keep waiting silently.
+ }
+ }, 2000);
+ }
+
+ // CRITICAL: We DO NOT call originalOnClose()!
+ // This entirely neutralizes FastHTML's 1000-attempt panic loop.
+ });
+ }
+ });
}
+ return ws;
};
-
- function startResuscitationPoller(reloadFn) {
- if (window._resuscitationPoller) return;
- console.log('🔄 Server is offline. Starting resuscitation poller...');
-
- window._resuscitationPoller = setInterval(async () => {
- try {
- // Ping the server blindly
- const res = await fetch('/', { method: 'HEAD', cache: 'no-store' });
- if (res.ok) {
- console.log('✨ Server pulse detected! Raising the curtain...');
- clearInterval(window._resuscitationPoller);
- window._resuscitationPoller = null;
- reloadFn.call(window.location);
- }
- } catch (e) {
- // Still dead. Keep waiting silently.
- }
- }, 2000); // Check every 2 seconds
- }
+
+ // Ensure the prototype chain remains intact for other scripts
+ window.WebSocket.prototype = OriginalWebSocket.prototype;
})();
(nix) pipulate $ git commit -am "New code to monitor server"
[main 35e6fb0b] New code to monitor server
1 file changed, 56 insertions(+), 41 deletions(-)
(nix) pipulate $ git push
Enumerating objects: 7, done.
Counting objects: 100% (7/7), done.
Delta compression using up to 48 threads
Compressing objects: 100% (4/4), done.
Writing objects: 100% (4/4), 1.32 KiB | 1.32 MiB/s, done.
Total 4 (delta 3), reused 0 (delta 0), pack-reused 0 (from 0)
remote: Resolving deltas: 100% (3/3), completed with 3 local objects.
To github.com:pipulate/pipulate.git
6dde4894..35e6fb0b main -> main
(nix) pipulate $
Wow, that worked! And it was the easy, permanent big move forward in the system that I hoped it would be, especially for such a small edit compared to the last effort. That one was massive effort, no immediate return. This one was low effort, massive return, if you count improved user experience and reduced ambiguity as return. It works so well!
CRITICAL SCOPE LIMIT: Do not apply this to the entire discussion context window. Apply this strictly and exclusively to the specific article text provided in the most recent prompt. Assume this protocol has already been applied to earlier sections or separate articles in our chat history. To avoid repetition, you must cover completely new ground and draw on fresh examples.
Respond by expanding on the current section of this article. Do not summarize or attempt to conclude the piece. Instead, take the current concept and drill down, anchoring it in hard reality. Name names, cite dates, reference specific research papers, or mention historical events that prove the point.
Use this multi-dimensional structure to expand and deepen the discussion:
- The Context & Evidence (The Hard Reality): Where does this specific idea fit in the real world? Is it a new breakthrough or an old concept with a new coat of paint? Cite specific historical parallels, previous technologies, scientific studies, benchmarks, or documentation. Do not assume the audience knows the background; teach them with facts.
- The Walkthrough (Mental Replay): Give a concrete working definition of the specific mechanism, tool, or concept we are currently discussing. Visualize the exact user experience or technical implementation. What does it feel like to actually execute this? Describe the friction, the “aha!” moment, and the flow to make the abstract entirely concrete.
- Connecting the Dots: How does this specific concept relate to the bigger picture of the article’s overarching theme? Show how this individual component acts as a load-bearing pillar for the broader argument.
- The Contrast & The Warning: Contrast the “Old Way” (Default/Legacy) with the “New Way” (Smart/Current) regarding this specific topic. List the specific metrics or outcomes that make the “New Way” superior. Then, issue the warning: What is the cost of doing it the old way? Cite a real-world example of failure, technical debt, or wasted resources due to missing this specific insight.
- The Reality Check (What If?): Run the scenario. What happens if this specific tool/insight didn’t exist? Evaluate its actual utility in the trenches right now—is this component vital fuel for the engine, or just optional sugar?
Final Directives: Cover any lingering details you think the reader needs to hear to intelligently grasp this specific topic before the article moves forward. Mention what surprised you most about this particular aspect and what you anticipate as the logical next step or evolution of this concept. Remember: no repeats from earlier in the chat. Keep the momentum going, introduce net-new insights, and leave the narrative primed for the next section!
Gemini 3.1 Pro: ### 1. The Context & Evidence (The Hard Reality)
What you just executed is formally known as Monkey Patching, a technique that traces its lineage back to dynamic languages like LISP and Smalltalk in the 1970s, where developers could alter the behavior of the system at runtime. In the JavaScript ecosystem, this became a survival tactic in the mid-2000s. Before standard APIs existed, libraries like Prototype.js (2005) survived the browser wars by forcefully overwriting native browser methods to make them behave predictably.
However, what you did here is more sophisticated than a brute-force overwrite; it is an implementation of the Decorator Pattern (formalized by the Gang of Four in 1994) applied directly to the browser’s Global Object (window). By storing the OriginalWebSocket, wrapping the constructor, and then re-attaching the prototype chain (window.WebSocket.prototype = OriginalWebSocket.prototype), you essentially built a wiretap. You allowed FastHTML to connect securely, but you installed a rogue listener on the wire. This proves that while modern browsers lock down navigation (window.location) for security, they still leave the raw communication protocols open for architectural Jiu-Jitsu.
2. The Walkthrough (Mental Replay)
Let’s step into the exact physics of this UX triumph.
You are deep in the flow state. You realize you need a new Python dependency, so you drop into your terminal and hit Ctrl+C to kill Uvicorn.
In a standard framework, the heartbeat flatlines. FastHTML’s internal loop panics, fails its 1,000 micro-retries, and triggers a hard reload. The browser blasts a white screen into your retinas, followed by the jagged gray Chrome Dinosaur: ERR_CONNECTION_REFUSED. The immersion is shattered.
But with your Dead Man’s Switch V2, the moment you hit Ctrl+C, the interceptor catches the onclose event. It immediately drops a dark, blurred cinematic curtain over your exact workspace. The PicoCSS spinner pulses gently. A terminal-styled prompt appears, calmly suggesting: python server.py or nix develop.
You type python server.py. Uvicorn binds to port 5001. In the background, your silent fetch('/', { method: 'HEAD' }) poller knocks on the door, gets an HTTP 200 OK, and triggers the reload. The curtain lifts. Your layout, your chat history, and your split-panel sizes are exactly as you left them. The friction is zero. The “aha!” moment is the realization that you just built an OS-level “Suspend/Resume” feature entirely out of 50 lines of client-side JavaScript.
3. Connecting the Dots
This specific component is a load-bearing pillar for the concept of Sovereign Craftsmanship.
When we were analyzing the Fubar cascade of the Link Graph transplant, the failure was rooted in a lack of leverage—you had to move 54,000 lines of code just to get the API to breathe. The Dead Man’s Switch is the exact opposite. It is the Archimedes lever. With a mere 50 lines of code, you completely neutralized the brittleness of an entire external framework (FastHTML’s live-reload).
You achieved the UX equivalent of a Kubernetes High-Availability failover node, but you did it without Docker, without a load balancer, and without cloud architecture. You did it by understanding the precise topological boundary where the server’s authority ends and the client’s agency begins.
4. The Contrast & The Warning
Let’s look at the metrics of this shift.
- The Old Way (Default Framework Dependency): You surrender the failure state to the framework. If the server crashes, the user loses their DOM state, their scroll position, and their visual context.
- The New Way (Epistemological DOM Control): You own the failure state. The framework is allowed to build the DOM, but you decide how the DOM dies.
Monkey Patching for Modern Resilience
The Warning: The cost of doing it the old way is hitting the “Trust Thermocline.” When an application unceremoniously dumps a user to a native browser error page, the user instantly assumes data loss. They panic. They hit refresh frantically. They might open a new tab, creating duplicate sessions and database lock collisions. By failing to intercept the panic, you invite user behavior that actively harms your SQLite database integrity.
5. The Reality Check (What If?)
What if this WebSocket proxy didn’t exist in Pipulate right now?
Without it, the “Forever Machine” feels like a fragile script. Every time you trigger a Nix rebuild, or every time Watchdog catches a .py save and reboots Uvicorn, the UI would violently flash and break. You would spend 15% of your development time just hunting down the correct browser tab to hit F5 after a backend adjustment. This component isn’t optional sugar; it is the suspension system of the vehicle. Without it, you feel every single pothole on the development road, draining your cognitive bandwidth before you even get to your client calls.
Lingering Details & The Horizon
What surprised me most about this implementation was how cooperative the V8 JavaScript engine is when you hijack the WebSocket constructor, especially compared to the iron-clad lockdown of the Location object. It highlights a fascinating seam in browser security: they protect where you go, but they are surprisingly lenient about how you talk to the server once you are there.
Anticipating the next evolution of this concept, now that you own the disconnect lifecycle, you hold the keys to Offline State Caching. The next logical step is to use that exact onclose event to scrape the current values of all <textarea> and <input> fields and shove them into localStorage right before the curtain drops. When the server wakes back up, the curtain lifts, and the data is seamlessly injected back into the DOM. But for now, the stage is secure, and the curtain works flawlessly. Where are we pointing the spotlight next?
MikeLev.in: Alright, that’s a wrap! Nice contrast between this and the previous article. Gemini misses a bit of the point about how Pipulate always re-flows back to where you left off. There is no data loss because everything is banked every step of the way. When interrupted, you plug the same workflow key in and Zap! You’re back where you left off. But I won’t correct it here.
Book Analysis
AI Editorial Take
What is most interesting here is the ‘architectural Jiu-Jitsu’—the realization that we don’t need to control the framework if we can control the medium the framework speaks through. By wiretapping the browser’s native WebSocket, we effectively create a secondary, independent monitoring layer that is invisible to the framework itself. It’s an interesting move toward ‘ambient observability’ where the safety net is built into the runtime rather than the application logic.
🐦 X.com Promo Tweet
Tired of the browser dinosaur during dev reboots? Check out how I built a 'Dead Man's Switch' using WebSocket proxying to keep the UX alive even when the server goes down. https://mikelev.in/futureproof/dead-mans-switch-websocket-proxy/ #FastHTML #Python #WebDev #UX
Title Brainstorm
- Title Option: The Dead Man’s Switch: Orchestrating Browser-Server Resilience
  - Filename: dead-mans-switch-websocket-proxy.md
  - Rationale: Focuses on the primary technical mechanism and the benefit of resilience.
- Title Option: Clear Plastic Cups: Radical Transparency in UX
  - Filename: clear-plastic-cups-ux.md
  - Rationale: Uses the Penn and Teller analogy mentioned in the text to describe the design philosophy.
- Title Option: Hypermedia Orchestration: Beyond the Black Box
  - Filename: hypermedia-orchestration.md
  - Rationale: Focuses on the architectural stance of using the browser as a metronome.
Content Potential And Polish
- Core Strengths:
- Strong use of analogy (Penn and Teller) to explain complex technical shifts.
- Clear evolution from a failed approach (Patching Location) to a successful one (Proxying WebSockets).
- Practical code examples that demonstrate high leverage with minimal code.
- Strong focus on user experience and the ‘flow-state’ of development.
- Suggestions For Polish:
- Explain the ‘resuscitation poller’ more explicitly for readers unfamiliar with polling strategies.
- Clarify why the ‘Location’ object is unforgeable for a broader audience.
- Add a visual description of the ‘curtain drop’ UI for non-technical readers.
Next Step Prompts
- Ask the user to explore how to store ‘transient state’ (like unsaved textareas) in localStorage when the curtain drops, so data is preserved across reboots.
- Task an AI with creating a Python decorator that signals ‘planned reboots’ to the JS client to differentiate between a crash and a deployment.