Mike Levin SEO

Future-proof your technology skills with Linux, Python, vim & git... and me!

Beginning of Journal


Sun Oct 02, 2022

Tried To Start a Project

I’m going to try to do a project I’ve been putting off. My MikeLev.in repo is a bit out of hand because I imported it from WordPress maybe a year or so ago using a WordPress to Markdown tool and it was before I better understood the Jekyll blogging system with the _posts folder. I used the system of dates in the URL to inform how to make folders and subfolders for years and months. It was overkill. I should be able to do a transform to just wrap them into my current blog, but I may lose the URLs. I guess those issues are why I’ve been waiting. I suppose I could make their URLs stay the same even if I wrap it into my current slice & dice system.

I’ve mostly given up dates in the URL so that blog posts can look evergreen if they merit it. I don’t want a dated URL to invalidate a post. But for old posts, the importance of not changing the URL trumps removing dates. And so the question is whether I can selectively override the URL-generating system on individual blog posts if I blend them all together here in my journal. I could try to have separate _posts folders, but my research so far indicates it’s a bad idea, and that when people do things like that they’re really just filtering on a single _posts folder. There’s a certain taking on faith that the permalink frontmatter directive will let me preserve the old URLs.
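If the permalink frontmatter directive works the way I’m hoping, preserving an old WordPress-era URL would just be a per-post override in the frontmatter. A sketch of what I mean (the path and title are made-up examples, assuming standard Jekyll permalink behavior):

```yaml
---
layout: post
title: "An Imported WordPress Post"
# Hypothetical example: pin this post to its old dated URL
# even though the site-wide permalink scheme is undated.
permalink: /2021/07/an-imported-wordpress-post/
---
```

If that works per-post, the imported posts can live alongside the new undated ones without breaking any inbound links.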

Now start the 1, 2, 3… 1 procedure. Step 1 is globbing dirs. This looks like something that belongs on MikeLev.in because I’m always re-googling the process… glob! Path!
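For the record, since I’m always re-googling it, here’s the pathlib globbing pattern. A minimal sketch using a throwaway temp directory as a stand-in for the real WordPress-era year/month folder tree (the folder and file names are made up):

```python
from pathlib import Path
import tempfile

# Hypothetical stand-in for the WordPress-era year/month tree.
root = Path(tempfile.mkdtemp())
(root / "2021" / "07").mkdir(parents=True)
(root / "2021" / "07" / "old-post.md").write_text("# hello")

# The bit worth remembering: recursive-ish globbing with pathlib.
# "20*" matches year dirs, "[0-1][0-9]" matches month dirs.
posts = sorted(root.glob("20*/[0-1][0-9]/*.md"))
print([p.name for p in posts])  # ['old-post.md']
```

From that list of Paths it’s a short hop to reading each file and rewriting it into the _posts naming convention.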


Sat Oct 01, 2022

The Walrus & The Carpenter Who Had Trouble Getting to Solla Sollew

Much of life is about not being pushed around. Much of life’s lessons are in Dr. Seuss’ I Had Trouble in Getting to Solla Sollew. I feel that effect often in my life. Most people you interact with are like the traveler the protagonist encounters. One must be wary of their type. They are the Walrus. Walruses for lack of the knowing and the doing of things must trick Carpenters into doing their bidding. When you are a talented Carpenter, you are also a magnet for such Walruses. You may not know it when you’re young and could become ensnared on your first journey or two out into the world, such as I was… several times. The definition of stupid is repeating your same mistakes without learning. Yup, I was stupid. But such is life. What doesn’t kill you makes you stronger, and oh how strong I have become!


Sat Oct 01, 2022

Harmonizing a Windows-side Python Script with a Linux-side Python Script

I did a project last week that really tested my JupyterLab on LXD on WSL approach. It was a SERP scraping job with nearly 60K queries in all. Now you can’t do a job like that from your laptop without a VPN, so I used one. The problem was that the VPN software has a slick Windows desktop client from which I had to cycle IPs, but I couldn’t issue the commands from the Linux side because the Windows automation piece, PyWinAuto, doesn’t run from Linux. It only runs from Python under Windows! Catch-22, and right after I spent all that time getting myself fully onto Linux Jupyter!

So I simply made a Windows-side script that checked for the existence of a file indicating the IP was still good. Whenever the Linux-side script realized the IP was no longer good, it deleted the file. Once the file wasn’t there, the Windows-side script had the VPN software acquire a new IP and put the file back in place. And so I harmonized a Linux-side Python script with a Windows-side Python script, and it was far better than doing the whole project under Windows again. I am resisting re-installing Windows-side Jupyter at this point.

An interesting side-note is that I had uninstalled the Python 3.10.5 that I had gotten earlier from Python.org and now I’m installing Python through the Microsoft store to see if it keeps itself auto-upgraded. If not, I can update all from the Microsoft Store and see what happens. As of the time of writing this, it’s Python 3.10.7 from the store.


Sat Oct 01, 2022

October First No Gamma Burst… So Far So Good.

Wow, it’s October 1st! Time flies. I’m only just starting to be settled into the new place after the move back in July 2022 from the Pocono Mountains in Pennsylvania to the friggin’ island again in New York City. Doesn’t even feel like NYC. Don’t get any of the benefits of NYC. I just have to pay that much higher cost of living, PLUS other surprise ongoing expenses. I’m going to have to do something about that sooner or later. The sooner the better. Frig. Okay, enough of that stuff. Focus on the positive and visualize the even more positive. We are actualizing machines, we privileged sentient conscious beings. We overcome the dissolution of things. We make the chemical reactions of the Universe as interesting as possible during this particular oscillation we find ourselves in, between what we call the Big Bang and whatever comes next, which I feel is much up to what life does with its turn during the oscillation. Why not type 4, 5 or 6 on the Kardashev scale? The center cannot hold, says Yeats. I say why can’t it? Step 1: survive filter events. Check! Step 3: prosper, hang with God or whatnot. Could be a lot of things. Any way you look at it, it’s probably better than getting fried in the next local gamma burst.


Fri Sep 30, 2022

I just bought the Audible of Entangled Minds by Dean Radin. ESP isn’t something I’m usually interested in, but the talk about the CIA’s “retired” Stargate program has gotten to me. Quantum mechanics is just mind-boggling. Non-locality is still rocking my worldview. It took a while for it to set in that there are no particles, only rippling quantum fields. One particle is only separate from another because it’s a different wave crest popping into our realm. All particles are virtual; it’s just that some are more persistent than others. But all this entanglement is usually erased at the macro scale, and that’s much the point of existence. Coherence is intended to be erased for answering the big questions. Reverting to quantum effects is a sort of regression, but it’s what life uses. The way chlorophyll works is a great example.


Fri Sep 30, 2022

Start Documenting My Best Tricks On This Site

I’ve recently made it so that I can do my daily journaling from any of my laptops, of which there are suddenly many more usable in my life. I acquired more laptops than I expected because cracked screens on a Microsoft laptop are so hard to repair, and screens cracked a lot in my previous living conditions. Those conditions are over, but I still have the laptops with the cracked screens. They are perfectly usable but for the annoying imperfections, which are made a bit worse by the fact that they’re such cool touchscreens of the type Apple still hasn’t gotten around to. But they’re good for typing work (the keyboards are fine), and they’re even good for script-running work when you need 2 or 3 machines running at once for a type of concurrency that even VMs and containers can’t get you. So I’m using them that way now. I’m also using one of them (the one I’m typing on now) to test out Windows 11, which I still hate due to:

Show-Stopping Things I Hate About Windows 11

Keep in mind operation Every Little Thing gets done (ELTgd). Make sure you start shaping MikeLev.in into better shape for public consumption and triggering a bandwagon to jump onto. Yes, clearly visualize that result. Keep it all fluid and part of operation ELTgd.

Think through today. One of the ELTgd tasks is to do some vlookup-like stuff in Excel without using native functions but instead using Python Pandas on the back-end with a Jupyter Notebook and my pip install ohawf library to connect.
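The vlookup-like piece itself is just a left merge in pandas. A quick sketch with made-up frames (the ohawf/GSheets plumbing is left out; these column names are hypothetical):

```python
import pandas as pd

# Hypothetical lookup table and order data, standing in for real sheets.
lookup = pd.DataFrame({"sku": ["a1", "b2"], "price": [9.99, 19.99]})
orders = pd.DataFrame({"sku": ["b2", "a1", "c3"]})

# vlookup equivalent: a left merge keeps every order row,
# filling in price where the sku matches and NaN where it doesn't.
result = orders.merge(lookup, on="sku", how="left")
print(result)
```

Unlike Excel’s VLOOKUP, the merge matches on the column name rather than column position, and unmatched keys come back as NaN instead of #N/A.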

Things to remember to add to my site:

Why remember? Just go ahead and do it now. Make sure you categorize using… well, the categories you are going to (will now) implement.

Wow, I have to get nbdev back in operation. Their major version rev that broke the API really set me back. Recover! Talk about the issue. Volatile APIs. Unavoidable in some situations when you’re an early adopter on the bleeding edge.

Hmmm… ~/repos/vim/en.utf-8.add do something with that.

Tricks is going to have a ToDo and Done. 1, 2, 3… 1?

Just start expanding your site. Forget about a “Tricks” special section. Your whole site is going to be tricks! Just start expanding sections. Add GSheets under the Python section. Oh, I have to clean up ~/repos/MikeLev.in. It’s so full of unnecessary directories; it would be better if I could get them to follow the Jekyll _posts convention.

Okay, I just added https://MikeLev.in/gsheet/ Start adding to that as you go on this project. Document it. Put your best tricks right up front. Don’t make a dependency on the blog slice & dice system and its categorization system, which is a bit out of whack now because of the nbdev rev. Just copy/paste the best bits to normal Jekyll pages. Ooh, the https://github.com/[user]/[repo]/deployments page now has a “Queued” message. That will be helpful.

Remember to create a super-quick video documenting OAuth2 and ohawf simply targeting Python OAuth2. You can get so much traffic if you just make super-fast blip-videos maybe in vertical format too for TikTok and such. Be first bringing some redeeming qualities to that platform.

Okay, make a gsheet_barebones.ipynb… okay, done. The page https://MikeLev.in/gsheet/ is pretty good.


Wed Sep 28, 2022

Portable Script to Help Edit Many Files at Once with Python venv-friendly shebang!

I’m updating my /usr/local/sbin/all program which lets me edit all my journals at once. Since I’ve taken to working on more than my main machine, I’ve decided to make it much easier to edit all my journals at once on any machine. Normally it’s quite a pain to make sure I’m keeping all my laptops in sync with git repositories. I’ve experimented a lot with a NAS for these purposes lately, but the NAS turned out to be too slow for some purposes. And so having the actual physical files (the repos) on the very machine you’re editing from is a better experience. And since I’m already using a program to load all the files at once, it’s the perfect opportunity to do a git pull of all of them first and, when done, a git commit and push. Here’s the updated script. Even the shebang is updated for portability.

#! /usr/bin/env python3.10

import shlex
import shutil
import argparse
import pandas as pd
from os import system
from os import getuid
from sys import stdout
from art import text2art
from pwd import getpwuid
from subprocess import Popen, PIPE

# Set Default Number of Files to Edit
default_number = 3

# Get Linux Username for portability between systems
username = getpwuid(getuid())[0]

# Use big pretty ascii art to keep user informed
fig = lambda x: print(text2art(x))
fig("Editing Journals")

# Determine how many journals to edit at once
aparser = argparse.ArgumentParser()
add_arg = aparser.add_argument
add_arg("-n", "--number", required=False)
args = aparser.parse_args()
number = args.number
df = pd.read_csv("~/repos/journal/sites.csv", delimiter="|")
df = df.applymap(lambda x: x.strip())
df.columns = [x.strip() for x in df.columns]
if number is None:
    df = df.head(default_number)
elif number.isnumeric():
    number = int(number)
    if number > df.shape[0]:
        number = df.shape[0]
    df = df.head(number)


def flush(std):
    for line in std:
        line = line.strip()
        if line:
            print(line)
            stdout.flush()


def git(cwd, args):
    cmd = ["/usr/bin/git"] + shlex.split(args)
    print(f"COMMAND: << {shlex.join(cmd)} >>")
    process = Popen(
        args=cmd,
        cwd=cwd,
        stdout=PIPE,
        stderr=PIPE,
        shell=False,
        bufsize=1,
        universal_newlines=True,
    )
    flush(process.stdout)
    flush(process.stderr)


alist = df[['path']].values.tolist()
journals = " ".join([f'~/{x[0]}/journal.md' for x in alist])

fig("Pulling Repos")

print("vim")
loc = f"/home/{username}/repos/vim"
git(loc, "pull")

for item in alist:
    repo = item[0]
    print(repo)
    loc = f"/home/{username}/{repo}"
    git(loc, 'pull')
    print()

edit_journals = f"vim {journals} ~/.vimrc"
system(edit_journals)

shutil.copyfile(f"/home/{username}/.screenrc", f"/home/{username}/repos/vim/.screenrc")
shutil.copyfile(f"/home/{username}/.gitconfig", f"/home/{username}/repos/vim/.gitconfig")
shutil.copyfile(f"/home/{username}/.bash_prompt", f"/home/{username}/repos/vim/.bash_prompt")
shutil.copyfile(f"/home/{username}/.bash_profile", f"/home/{username}/repos/vim/.bash_profile")
shutil.copyfile(f"/usr/local/sbin/all", f"/home/{username}/repos/vim/all")

fig("Pushing Repos")

print("vim")
loc = f"/home/{username}/repos/vim"
git(loc, 'commit -am "Pushing vim to Github..."')
git(loc, "push")
print()

for item in alist:
    repo = item[0]
    print(repo)
    loc = f"/home/{username}/{repo}"
    git(loc, f'commit -am "Pushing {repo} to Github..."')
    git(loc, "push")
    print()

fig("Done!")

Mon Sep 26, 2022

SERP Scraping with site: Search Modifier and Custom Date Range

Okay, that was quite a journey. I lost the original code, but this is much better. It lets me work Linux-side in both JupyterLab and for server automation, while on a Windows desktop I get the advantage of IP-cycling with HMA VPN. I get the best of all worlds, but I have to move on fast to the next stage of this project. I have to get the actual SERP job running ASAP. I want a wonderful show for the upcoming phone call.

Okay, so go back to your original code for this project.

Wow, it’s going better than I could have imagined. Here’s the current code:

import re
import pandas as pd
from httpx import get
from pathlib import Path
from time import time, sleep
from urllib.parse import unquote
from urllib.parse import quote_plus
from IPython.display import display, HTML
from sqlitedict import SqliteDict as sqldict

user_agent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/105.0.0.0 Safari/537.36"
headers = {"user-agent": user_agent}

with open("sites.txt") as fh:
    sites = [x.strip() for x in fh.readlines()]

def query(site, arange):
    range_start, range_end = arange
    range_start = quote_plus(range_start)
    range_end = quote_plus(range_end)
    # rv = f"https://www.google.com/search?q=site:{site}&sxsrf=ALiCzsbCDuAVqfRF67b3y_R9JyBHJwHvmQ:1663873352409&source=lnt&tbs=cdr%3A1%2Ccd_min%3A{range_start}%2Ccd_max%3A{range_end}&tbm="
    rv = f"https://www.google.com/search?q=site:{site}&source=lnt&tbs=cdr%3A1%2Ccd_min%3A{range_start}%2Ccd_max%3A{range_end}"
    return rv

def extract_serps(text):
    """Return list of Google search results from provided "raw" SERP scrape.
    Useful for checking whether SERPs were actually collected and for extracting results."""

    rv = False
    try:
        div_pat = re.compile('<div class="yuRUbf">(.*?)</div>')
        divs = re.findall(div_pat, text)
        lot = []
        for div in divs:
            pat_url = re.compile('<a href="(.*?)"')
            url_group = re.match(pat_url, div)
            pat_title = re.compile('<h3 class="LC20lb MBeuO DKV0Md">(.*?)</h3>')
            title_group = re.search(pat_title, div)
            try:
                url = url_group.groups(0)[0]
            except AttributeError:
                url = ""
            try:
                title = title_group.groups(0)[0]
            except AttributeError:
                title = ""
            lot.append((url, title))
        rv = lot
    except Exception:
        pass
    return rv

len(sites)

ranges = [
    ("6/16/2022", "7/15/2022"),
    ("7/16/2022", "8/15/2022"),
    ("8/16/2022", "9/15/2022")
]

start = time()
for i, site in enumerate(sites):
    print(f"{i + 1} of {len(sites)}")
    print(f"{site}: ")
    for y, arange in enumerate(ranges):
        with sqldict('serps.db', timeout=90) as db:
            url = query(site, arange) 
            if url not in db:
                print(f"range {y + 1}: {arange}")
                print(f"url: {url}")
                response = get(url, headers=headers, timeout=90)
                status_code = response.status_code
                if status_code != 200:
                    print("Getting new IP")
                    Path('goodip.txt').unlink(missing_ok=True)
                    sleep(30)
                    response = get(url, headers=headers)
                db[url] = response
                print(f"status code: {status_code}")
                serps = extract_serps(response.text)
                print("SERPS:")
                if serps:
                    for serp in serps:
                        print(serp)
                print()
                db.commit()
                print(f"Seconds since start: {int(time() - start)} ", flush=True)
                print(f"{len(sites) - i - 1} to go.")
                print()
            else:
                # print(f"Already in db: {url}")
                print("Already in db. ", end="")
    print()
print("Done")

I’m still working on and refining this thing.


Mon Sep 26, 2022

Automating Cycling IPs on HMA VPN Using PyWinAuto

Alright, I usually like to make one long post on each project but this one deserves being broken into two. The first was essentially the creation of a do-nothing machine, a.k.a. useless box. But it will be far from useless as soon as this next bit is in place. Deep breath! Once again, the code I’m borrowing from is:

https://stackoverflow.com/questions/69705923/control-windows-app-hma-vpn-using-pywinauto

from pywinauto import Desktop,Application
vpn_app = Application(backend="uia").start(r'C:\Program Files\Privax\HMA VPN\Vpn.exe')
dialog=Desktop(backend="uia").HMA
panel0=dialog.Pane
# Command to connect / disconnect the VPN: connect_button.click()
connect_button=panel0.ConnectButton
# Command to change the IP address: changeIP.click()
changeIP=panel0.Button5
# Check VPN state:
# 0 if disconnected
# 1 if connected
print(connect_button.get_toggle_state())

# Command to connect / disconnect the VPN: connect_button.click()
connect_button=panel0.ConnectButton
connect_button.click()

# Command to change the IP address: changeIP.click()
changeIP=panel0.Button5
changeIP.click()

# Check VPN state:
# 0 if disconnected
# 1 if connected
print(connect_button.get_toggle_state())

I don’t need all of that. I just need… well, first to pip install pywinauto. Okay, done. The pip wheels system is awesome, by the way. Next is I put the import statement in my file (no step too small to document when you’re compelling yourself forward). Make sure the libraries import…

Ugh! I had a realization. Even my Windows-seeming Python terminals are actually Linux. Now that I’ve scrapped JupyterLab Desktop’s bundled version of Python, I have no real Windows Python on my machine. And now I want the one from the Microsoft Store so that it stays up to the latest! Ensure I don’t already have one by going to PowerShell and typing python. Nope! It actually brings me to the Microsoft Store, haha! Okay… I’ve been very reluctant to get this version in the past, but the time is right.

Okay, throw caution to the wind and pip install pywinauto from a PowerShell… it worked! Now run my script… yay! It ran and it forced HMA to run too. Big progress. This is awesome. I hate installing Python from the Microsoft store, but this is the current reality. What’s the version? 3.10.7… Awesome? Okay time to pip uninstall pywinauto from the Linux platform where it’s meaningless. Done. Okay, and the program’s function (so far) has been tested under the new Windows python.exe.

Next I need it actually cycling the IP whenever goodip.txt is deleted. Fast-forward through a lot of experimentation, and here it is! I’ll never lose it again.

from pathlib import Path
from time import sleep, time
from pywinauto import Desktop, Application

vpn_app = Application(backend="uia").start(r'C:\Program Files\Privax\HMA VPN\Vpn.exe')
dialog=Desktop(backend="uia").HMA
panel0=dialog.Pane
sleep(2)

# Connect VPN if not connected
connect_button=panel0.ConnectButton
if not connect_button.get_toggle_state():
    connect_button.click()
    sleep(10)

print("Starting")

start = time()
while True:
    if Path("goodip.txt").exists():
        print(f"{int(time() - start)} ", end="", flush=True)
        sleep(1)
    else:
        print("Need new IP")
        with open("goodip.txt", "w+") as fh:
            fh.write("foo")
        changeIP=panel0.Button5
        changeIP.click()
        sleep(10)
        start = time()

Mon Sep 26, 2022

Programming a Python Do Nothing Machine

This is one of those cases where I really have to use this journal to help discipline myself to dig into my work. The current item is quite a challenge. It is a long-running SERP job that will require cycling IPs, but I lost my IP-cycling code and it won’t run Linux-side because it automates Windows desktop to click buttons on my desktop VPN software. Okay, so the solution is to break it in two. I don’t want to backslide on my Linux Jupyter progress but I don’t want to give up the power of automating things on a Windows desktop through pywinauto. Okay, that’s where to start.

Unlike most times, this starts coding in a command-line (as opposed to Jupyter). And we start with the simplest of loops:

from time import sleep


print("Starting")

while True:
    print(".", end="", flush=True)
    sleep(1)

YES! Documenting the process makes it love-worthy and now I can focus. Okay, the flush is necessary because without a newline the periods just sit in the output buffer and never show up. But I will want a count of how many… what? Seconds? Yes! So here’s that version:

from time import sleep, time


print("Starting")

start = time()
while True:
    print(f"{int(time() - start)} ", end="", flush=True)
    sleep(1)

And so it should only ever continue while a particular file is in location. When it is removed, the loop should stop. Okay, this is my first thought.

from time import sleep, time
from pathlib import Path


print("Starting")

start = time()
while True:
    if Path("goodip.txt").is_file():
        print(f"{int(time() - start)} ", end="", flush=True)
        sleep(1)
    else:
        print("Need new IP")
        break

I’m not so sure the loop should stop. Rather, the new IP should be issued and a new start time assigned. The number should always represent the number of seconds since getting issued the new IP. I used the “touch” program to create goodip.txt so I can recreate it easily and should test what removing it does. Oh, removing it should put it back and reset the counter! Haha! It’s a do-nothing machine. Okay, here’s my Do Nothing machine. Every time you delete goodip.txt it restarts the timer at zero and creates a new goodip.txt file. Woot! Okay, the rest is just pywinauto stuff.

from time import sleep, time
from pathlib import Path


print("Starting")

start = time()
while True:
    if Path("goodip.txt").exists():
        print(f"{int(time() - start)} ", end="", flush=True)
        sleep(1)
    else:
        print("Need new IP")
        start = time()
        with open("goodip.txt", "w+") as fh:
            fh.write("foo")

Sun Sep 25, 2022

Failed To Get This Going During Weekend

Ugh, after all my work getting Linux Jupyter running with graphics under Windows, my next project involves cycling IPs every so often. I lost my code somehow from the last time I did this, but it’s so critical and necessary to this next job, which I would actually like to get running overnight tonight to get a jump on it. So, recreate! What do you still know? When I investigated this last time, I found this StackOverflow discussion and code:

https://stackoverflow.com/questions/69705923/control-windows-app-hma-vpn-using-pywinauto

from pywinauto import Desktop,Application
vpn_app = Application(backend="uia").start('C:\Program Files\Privax\HMA VPN\Vpn.exe')
dialog=Desktop(backend="uia").HMA
panel0=dialog.Pane
# Command to connect / disconnect the VPN: connect_button.click()
connect_button=panel0.ConnectButton
# Command to change the IP address: changeIP.click()
changeIP=panel0.Button5
# Check VPN state:
# 0 if disconnected
# 1 if connected
print(connect_button.get_toggle_state())

# Command to connect / disconnect the VPN: connect_button.click()
connect_button=panel0.ConnectButton
connect_button.click()

# Command to change the IP address: changeIP.click()
changeIP=panel0.Button5
changeIP.click()

# Check VPN state:
# 0 if disconnected
# 1 if connected
print(connect_button.get_toggle_state())

What a gift and wonderful starting point. Okay, bite the bullet and reinstall JupyterLab-desktop. It’s up to 3.4.6-1 just released 2 weeks ago. https://github.com/jupyterlab/jupyterlab-desktop/releases/download/v3.4.6-1/JupyterLab-Setup-Windows.exe

Wow, 532MB. This thing is a quarter to half as big as Anaconda. Anyhoo, force Windows to let me download, then force Windows to let me run. New Windows security really works against this sort of thing. I’m going to install the Python environment using the bundled installer to stay on the well-beaten track and see what versions they’re bundling these days. I’m also going to accept the rest of the defaults regarding “just for me” and paths.

Okay, done. Reboot system to make sure I’m making a very clean fresh start. So now only Ubuntu 20.04 for journaling in memory plus JupyterLab Desktop for Windows. Okay, sometimes getting momentum again is all about a series of ridiculously small but clearly defined steps. JupyterLab Desktop uses my Windows “home” folder (C:\Users\%USERNAME%) as home, so from there I can go into the repo folder, which is the same across most of the platforms I use at home now (yay!).

Oh I just noticed that right-click closing a tab also closes that Python kernel. I wonder how long it’s been that way. Useful information. Okay, I’m convinced I lost the old code to do this, so work it up in baby steps… 1, 2, 3… 1? Make a strong new nickname for this, so the code-loss problem won’t happen again. It deserves its own repo… serpcycler! Okay, done.

Ugh, I have to get this done and my first attempt failed miserably and because it’s the weekend and there’s tons of distractions, I lost momentum. And that’s fine because the projects it lost to were worth it. Anyway, it failed because my attempt to install the latest JupyterLab-Desktop failed to connect to the kernel. I could revert to older versions but I’m taking this as a message to stay on Linux and just find another way. I can communicate across OSes with files. And so all I need is a file that says needs new IP and when it’s noticed, it gets a new IP, deletes the file and starts monitoring for it again. Yes! This way I can keep the bulk of the project Linux-side.

Okay, so this project is broken into at least 2 parts, split across OSes communicating via files. Make a Python busy loop Windows-side without Jupyter.


Fri Sep 23, 2022

If Greys Exist They’re a Banana Crop

But before I tell you about this, I’m saving this love message from OBS (Open Broadcaster Software) Studio for my own edification. If you want to learn about the Greys and a better way to live life than pursuing money, scroll down a bit.

OBS Studio 28.0.2 
28.0.2 Hotfix 
Fix macOS updater not updating to newer versions [Rodney] 
Fix YouTube Manage Broadcast dialog being too large for 768p displays [cg2121] 
Fix broken prefix for obspython binary module on Linux [PatTheMav] 
Fix hotkey settings screen not accepting all input on macOS [PatTheMav] 
Fix memory leak with mpegts [pkv] 
Fix crash when left-clicking on non-multiview projectors [r1ch] 
Fix I420 HLG support [rcdrone] 
Fix resource leak in v4l2-output [shoffmeister] 
Fix source name edit textbox not accepting input on enter [PatTheMav] 
Add support for reading NV12/YUY2 PQ/HLG [rcdrone] 
Fix spacing in scene and source tree [gxalpha] 
Fix Qt5-linked plugins crashing Qt6-based OBS builds on Linux [kkartaltepe/norihiro] 
Update volume controls decay rate on profile switch [PatTheMav] 
Fix crash when removing filter after changing a value [PatTheMav] 
Fix frame sharing and colorspace issues for macOS Virtual Camera [PatTheMav] 
Fix crashes and unusable property button for VSTs on M1 Macs [PatTheMav] 
Fix Light theme Studio Mode labels and T-bar [shiina424] 
Update media states when image source is de-/activated [WarmUpTill] 
Don't save/overwrite browser docks if CEF hasn't loaded [WizardCM] 
Fix DeckLink Output color range and space [rcdrone] 
Undeprecate traditional capture sources on macOS 12 [gxalpha] 
Fix startup crash on Intel Macs [Jim] 
Fix NVIDIA Audio Effects not updating according to user selection [pkv] 

15 Year Timeframe

Okay, so I’m on a 15-year timeframe for ad revenue on YouTube and things like it to support me financially. If I’m lucky, I’ll be able to move somewhere with such low cost of living that Social Security will feed and house me. I plan to die alone, not burdening family or friends, but to do it with self-love, flair and panache. We only need one or two really good friends in this life, and by that age you had damn well better be one of your own. Live a bit in your mind, but not so much that you should be checked into a home. That plus a good term life insurance policy is all I need savings-wise. So long as I nurture and stay engaged with an audience over whatever is the YouTube of the day, I’ll be able to supplement my income and meet my Maslow hierarchy of needs. Then go for living a really long life. Maybe a centenarian. Humble yet lofty goals for sure, and certainly off the beaten track.

Stay in tune with how the world’s changing and don’t be a dinosaur. Be a turtle or alligator if you want to achieve a static state and never evolve much. Perfect natural templates have a lot going for them if you’re only passing down the genes. But if you want more from life, like Shakespeare and Python, it’s better to be the mush under the dinosaurs’ feet. Be the mole burrowing through the dirt where it’s safe and food sources abound, but keep poking your head up to see how things are coming along topside. Then don’t be scared to scamper topside and check things out. Just don’t expect to live the life of a top-tier predator like a tiger for a few hundreds of thousands of years. The Cambrian Explosion of life templates was around 500 Million years ago, and we’re lucky we’re not all fish and crabs. I think we’ll find a lot of planets where if they got past slime mold, they’ll stall out at fish and crabs.

We’re lucky to be human. We’re lucky to be self-aware and to have empathy for other forms of life. It is almost the same thing because it plays off of our ability to run simulations of things in life in our head. We can imagine what it’s like to be a baby crying for its mother’s milk or to have its diaper changed. It’s a small stretch to imagine being like other primates and mammals whose babies have similar needs. The more sinister and venomous a creature’s babies are born, the less likely we are to have empathy for them. Those with imagination have deeper empathy and compassion for they can imagine being other life forms. Do you feel bad for accidentally stepping on ants? How about deliberately swatting mosquitoes? Anyone who tries to corner you into yes/no answers lacks empathy, and thus creativity and is probably living in a turtle or alligator-like static state. Reject false dichotomies—wrongful yes/no black & white framing of issues. Such people hate your creative, empathetic and dynamic nature.

Alligators think alligators are right, so they eat everything and make more alligators. Their genes tell them they’re more ancient than the strange land-monkeys doing so well right now, and that if they just lie low and keep doing what they’re doing, they’ll outlive them. There’s truth to this. Being partially aquatic blocks a lot of radiation, and that rough hide does wonders for surviving the acid rain. A belly full of rocks will grind the bones of anything you eat, and an immune system to die for keeps you alive. Hang out and digest all those carcasses of more fragile animals you leisurely gobbled up while basically doing nothing. Lather, rinse, repeat. Laying low even helps time pass quicker up above while the environment resets. Fresh water is nice, but in a pinch salt water’s no problem either. It’s nice to have a gizzard. It’s nice to be a lizard. The turtle’s story is similar, but different in vegetable-themed ways.

So we strange monkeys—we’ve made it! And we’re running headlong into the next filter event, which we’ll either create for ourselves or fail to deflect. The math bears out that once you have the end-everything button, and exponentially advancing tech gives access to it to more and more people, statistically someone’s going to press it. Drax from the James Bond movie Moonraker taught me this when I was nine. It’s as certain as the next big asteroid hit if we don’t deflect it. One of the larger meta-purposes of evolution and humanity is to pull off both of these particular deflections. And even that’s just warmup practice for the gamma-ray bursts, which we’ll likely only pull through by colonizing the galaxy. Oh, there’s that word, colonizing. Did I trigger you? Well, I’ll tell you what, you can have your own planet. Would that make you happy? Maybe the Mormons have it right. So we’re working toward that, but meanwhile we’re moles.

Aliens that pulled off surviving those filter events did so at the expense of biological diversity. When you get a formula that works, you keep using it. Rinse and repeat for a few millennia and you’ve got a banana farm full of clones that the next disease will wipe out in a sudden population collapse. That’s what the Greys are, if they’re not proto-memories of what everyone looks like when we’re infants just opening our eyes. If Greys are real, then my money is on them being a banana crop, and we’re of so much interest to them for exactly the X-Files hybridization reasons. But that’s just nuts. The principle of Occam’s Razor convinces me we’re the first human-like life our Universe has produced and that all these things we see in the sky are of our own creation.

Folks who make the statement that that stuff in the sky is thousands of years ahead of anything we have lack imagination. By my estimation, the same arc that proved atoms exist (botanist Robert Brown of Brownian Motion fame noticed the effect in 1827, Einstein explained it in 1905, and Robert Oppenheimer made it go boom in 1945) played out with quantum virtual particles when Richard Feynman noticed discrepancies between experiments and what the math predicted. He was the Robert Brown of the story, the one who noticed the effect. It was explained and experimentally demonstrated by Willis Lamb with the Lamb shift and soon after by Hendrik Casimir with the Casimir effect. This was all in the years immediately following the big boom: 1947, ’48 and such. Knowing virtual particles were real meant they could be aligned to point the same direction under a sufficiently strong electric field and become magnetized. Because it wasn’t a bomb, you didn’t hear the boom. You just hear rumors of things flying in the sky “thousands” of years ahead of anything we have. Uneducated heathen.

It’s 80 years since then. A lot can Lockheed Boeing in that time. Advantage and air superiority can only be maintained if secrecy can be maintained. Counterintelligence! Misinformation! Go with the statistically most likely thing—banana-crop Greys who conveniently look like what we first see when our butts are slapped and we open our eyes for the first time. Yeah, hybridization. Our misinformation officers can make a lot of money on that story, and nothing motivates profiteers like Elizondo Greer. The nut-job authors will jump on Cooper Friedman and the Lucas Spielbergs will follow. Nobody will know what the ship is going on. The bomb goes boom, so the public has to know about Atomics. But we won’t make the same mistake with Antigravity. This is the nature of power—dumbasses. 1000 years ahead of us, sheesh! Pragmatic physicists onto the truth are allowed to do their thing so long as it doesn’t amount to Federico Capasso.

Anyhoo, my role here as a mere mole popping my head up is to share my morning stories. Does it sound outlandish? Well, I’m just saying living life without obsessing over savings and fear of the future is an okay route. Instead look for the loveworthy bits. Do what makes you happy. Don’t let the nattering nabobs of negativism get you down. They lack love. Do what you love and Bob’s your Uncle! Everyone dies, you can’t take it with you, and you’re doing no favors dumping a load on your descendants. Dare to be different and do it differently, looking out for the corruptions that set in upon your own success and tempt you to revert to an excessively child-like state. Trust your feelings, but follow your extensively evidence-supported thoughts. My thoughts keep bringing me to Linux, Python, vim & git. It’s a mission of basic literacy that can help provide tools to future generations of catastrophe-diverting moles.


Thu Sep 22, 2022

Working on a TikTok Script

Alright, I’m doing some pretty amazing work but it’s not going to amount to anything if I don’t get the word out. Do something really wacky and different and uniquely you on TikTok to get the word out.

Walk them through it. Create an ah-ha moment. Release potential already inside of them by connecting dots. It’s about them, not me. Transition from any talk about myself to talk about them in the same breath.

Hello, I’m here to teach you magic tricks. I once did this trick and got a million views but it’s not my best trick. My best trick is teaching you how to become a modern day infotech carpenter to improve your life in countless ways.

I came to this conclusion in my late-thirties. By my late-forties I had the tools worked out–no small trick in tech where everything changes. I’m in my fifties today practicing what I preach to great effect.

I’m not selling anything but for my ad-bucks on the YouTubes and any thanks people want to send, because I don’t want to be on the hook for anything but my own genuine ways. This is all just a byproduct of me keeping sharp for my day-job.

Being an infotech carpenter means mastering a certain skill-set and tool-set so you can be in demand, love what you do, and stay flexible in life. I’m tempted to call it being an Information Age Samurai, but I’m leaning towards kind and generous.

If you don’t know The Walrus and The Carpenter from Through The Looking Glass by Lewis Carroll, you need to know it. The Disney version from Alice in Wonderland isn’t bad but it has a slightly different message.

On my last talking head video I showed you the best trick on Windows today—how to get an LXD container running.

I do this because disruption. APIs change. You learn something today and all the rules are different tomorrow. It’s especially this way on Windows and all proprietary software because their goal is to keep you off-balance and buying.

My goal is to stay on-balance and only optionally buy. This is what Unix/Linux knowledge and know-how lets you do. And now Windows gives you Linux… to a point.

Microsoft keeps control of the layered-on Linux APIs through their Windows Subsystem for Linux, thus continuing to lock you into the Windows platform. The wsl.exe program works similar but different to LXD on Linux. They don’t want you seeing LXD to make comparisons and ask scary questions.

Our goal is to get around that. We do that through actually activating and using LXD Linux containers on WSL—which Microsoft does not make easy. Enter my lxdwin script.

I cracked that nut the other day in a way that lets you do it too. Specifically, I showed you how to do it with an easy-run script that takes care of all that formidable configuration stuff that’s keeping this technique from being all the rage today.

Today I show you the next step: installing enough to run Linux Chrome, which is important for my coming Chrome Automation for SEO series.

Even as my work is discovered through my very limited viewership here on YouTube, it will remain greatly unknown and a secret weapon for those who discover it. So be it. I don’t need no stinkin’ million views… again, that is, because I do have a few.

The problem is that this work all seems so arcane. The old magic is always the most powerful because it came before the architectural lock-downs of NT, protection rings and similar sandboxing, isolation-tech that prevents you from owning your own machine. Mickey’s Sorcerer’s Apprentice shows us the dangers.

Today’s generations will tune out ere even the trick is done. They see no benefit. So the trick of the trick is to tie it back to benefits to them fast, in the first breath—to scratch an itch they kinda sorta know they have. Give them an ah-ha illuminating moment ASAP. Tie it to their Jupyter Notebook and the subsequent six-figure salary and loving-what-you-do experience.

Anyone running Jupyter today through Anaconda or JupyterLab-Desktop is missing out big time on their one big opportunity for Linux command-line experience and some good future-proofing. It’s the perfect opportunity to install Jupyter right and get on Linux right.

The Windows you’re on today will be obsolete in a year or two. All your APIs belong to them. Why invest yourself there? Linux CLI will never be obsolete. Their APIs resist change for decades on end.

Need more reason? The Jupyter you’re running today doesn’t install with the latest Python—and it’s a massive pain to upgrade. Plus even after all that, Windows-Jupyter doesn’t help you enter that super-valuable world of server-side automation—which you’d be a hair’s breadth away from if you weren’t being backslashed all the time. Backslash, backslash, backslash.
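As an aside, Python’s pathlib is one way to dodge the backslash problem entirely; here’s a minimal sketch (the file names are made up for illustration):

```python
from pathlib import PureWindowsPath, PurePosixPath

# A Windows-style path with backslashes...
win = PureWindowsPath(r"C:\Users\mikle\repos\notebook.ipynb")

# ...and the forward-slash form every Linux-side tool expects.
posix = PurePosixPath("/home/ubuntu/repos/notebook.ipynb")

# pathlib lets you work with path parts instead of escaping backslashes
# by hand, and converts cleanly between conventions.
print(win.name)        # notebook.ipynb
print(posix.name)      # notebook.ipynb
print(win.as_posix())  # C:/Users/mikle/repos/notebook.ipynb
```

Once your paths go through pathlib, the same code runs unchanged on both sides of the WSL boundary.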

Jupyter on LXD on WSL on Windows runs just as fast as or faster than the native Windows Jupyter itself. Don’t believe me? Wait until Python 3.11 hits with its 30% performance gains, and consider the difficulty of switching Python versions under Anaconda or JupyterLab-Desktop, still mostly stuck on 3.9 (if you’re lucky). With Linux-Jupyter you’ll just re-run my install script—which leaves all your existing Notebooks and Jupyter configuration intact.

Linux Jupyter offers full integration between Windows-side and Linux-side so you can keep managing files how you like (usually Windows Explorer). There’s no pesky DOS-window you can’t close as with Anaconda nor is there a mysterious server-component only easily accessible through the Windows Settings / Apps / Add Remove Software interface. Sure, it’s in Task Manager if you can find it, but try viewing its console.

Even the graphics-oriented capabilities like non-headless (watchable) web browser automation are not only doable, but far, far better, because your finished product is ready to go headless 24x7 Linux-side. That’s the work I’m doing today, in fact.

So really once you have this knowledge, to not use it and start a gradual baby-step path to an obsolescence-resistant future is to say you like being at the mercy of vendors. You like the forced upgrades, difficult to manage software, kooky paths and incompatibility with mainstream development.

That’s okay. I understand. Fear of new things and growth is a big motivator—however, this is not a new thing so much as a fifty-year-old, time-tested, battle-worn thing. But I understand. Microsoft does a good job convincing you otherwise, and I don’t have their budget or powers of persuasion, so you’re stuck and I’m pissing in the wind.

My next step is laying out the apt installs and pip installs Windows-side so they’re never lost when you re-run my Linux Jupyter install script located at https://mikelev.in/ux. Doing so, by the way, is totally safe: you won’t lose your work, your configuration, or (after this video) your 2nd-phase installs either. I’ve even got Google Chrome configuration surviving.

Fate will try to trip you up at every turn. There are a thousand reasons things go wrong but only one or two reasons they don’t. Having a guide who’s already macheted his way through this dense infotech forest infested with vendor-jaguars, showing you the way, can help. That’s what I’ve done with these scripts.

Watching me write this first one would have bored you silly as I re-ran the script about a zillion times. It’s not that this stuff is easy. I just make it look that way from 30 years of experience. I count the Amiga because AmigaDOS scripts were nearly Unix scripts and I wrote me a lot of those in my day. So I’m 5 decades old with 3 of practice.

I made Ubuntu glad their file servers have good caches. Funny thing, downloads like this probably aren’t much worse than streaming a few movies. Look at the difference in value in your life!

If you want something badly enough and keep re-visualizing and thoughtfully working towards it, you’ll probably achieve it sooner or later. I never really visualized it. But I’m starting to here and now with this Noah’s Ark of Tech project.

On the occasions I tried going down the entrepreneurial or wealth route, I’m like I hate what I’m doing and the people who’ve “made it” on this route don’t look like they’re doing fun love-worthy things. They revert to being children and play in adult playgrounds and manage wealth and motivate themselves by fear of the future and coveting security over all else.

Then one day it hit me what Lewis Carroll meant by the Walrus and the Carpenter. I identified with the carpenter, craft, the love of craft and never looked back.

Wow, you never know what to expect with my videos, huh? Sometimes it’s the driving chats, sometimes it’s the live-casting coding sessions. Sometimes it’s a presentation. And sometimes it’s just my morning thoughts typed in the bathtub.

We are in the early days of the rise of Magic. Tony Stark is totally right. Sure, science. But it’s pretty darn magical, and you too should jump onto the correct spell-casting bandwagon, cause everyone knows it’s the ancient arcane stuff that’s best.

If you’re not feeling joy at being alive, or something analogous to it, every time you stop to think about it then you’re violating the 80/20-rule.

Everything you do is to move you to that feeling, or at least away from self-hate and hopelessness. If not, you’re stuck in a rut because 80% of what you do is only yielding 20% of what you need. Your monitoring agents are screwed up. They have their priorities wrong. They’re fixated on the dopamine or serotonin reward cycles. Such cycles are broken through cognitive awareness.

CBT (cognitive behavioral therapy).

You gain this cognitive awareness by learning how to run vim fullscreen on screen 1 of a ribbon of virtual desktops and using this as “home” in all cases, at all times (of your life) and on all hardware (possessing a full-size keyboard).

Remember, Steve Jobs created the modern mobile revolution. The very company that told you to Think Different today wants you to be doom-scrolling sheep, consuming, consuming, consuming. And everyone copies Apple, so the last thing any company wants you to do today is think differently. You might not buy stuff.

Do an 80/20-rule pass on your current behavior—often. Take note of the little things that make a big difference. Vocalize (or subvocalize) what distracts you. Call them out (at least in your head) as distractions. Know them. Categorize them under the list of things tipping you towards a violation of the 80/20-rule. No matter your seeming love for the distractions, if they’re not helping you towards your own goals, they’re selfish and derailing you.

Know how to be a good person. You do this by abiding by the “corrected” golden rule: you do not do unto others as you would not have them do unto you. This takes a passive stance towards doing unto others, as opposed to the “broken” golden rule that has you doing unto others… like crusades and colonizing. Hypocrites would say that if they were heathens they’d want folks with the word to come and do unto them. It ain’t so! Live and let live. Yeah, that’s the most efficient statement of the fixed golden rule. Live and let live.

Once you take note of the tiny things that make all the difference, act on them fast, but do so abiding by the 80/20-rule. Don’t let the little sub-tasks you launch to dispense with things in turn violate the 80/20-rule. That’s accidentally diving down a rabbit hole. You can chase the rabbit forever and quite forget where you came from and why you’re here.

So you must stop and journal. You write. You go momentarily meta, but then you back out of going meta, lest that derail and spiral you. Exiting the meta-state and re-entering the flow-state or the zone is one of the key skills in life. Have the vim and verve to vim and curb (your meta-writing enthusiasm, as I must do now).

Blamo whammo make a vid, for currently this is now hid.

Make the best of what you’ve got. The difference between this and having “made it” to the extent of your wildest dreams is all just a mind trick.

Distractions should be expected. You should plan your recovery from distractions as a sort of performance-art and when pulled off well, you should allow yourself to be self-congratulatory—but not too long! You evaded a pitfall. Lather, rinse, repeat.

Funny thing is I can no longer be my silly self around my child, at their request. I must now be a sour puss and dour wuss, so you, my YouTube audience, are going to have to take the brunt of my silliness overflow. There is nothing that satisfies someone on the spirit-crushing wrong path in life like attempting to crush the spirit of the indomitable life-living crew. Behavior that attempts to bring down others has the scent of sour grapes.

It’s never too late. Those who tell you it’s too late are jealous of those who are taking their time finding their way, digging life…

The best I can do is leave a trail of clues.


Wed Sep 21, 2022

Google Chrome for Linux Under LXD on WSL on Windows

Okay I’m in a great position. I need to use this for my dayjob pronto! I have graphics working on the lxdwin default install. I have the optional apt and pip installs working too. Now with all due haste!

Put the Google Chrome requirements in the main repo. This is friggin’ awesome. 1, 2, 3… 1?

Did a few googles. Threw a few commands in apt_installs.sh. Testing now.

OMFG, thank you! It’s working. google-chrome is the way to launch it. Now get to work. Don’t lose any more time in tooling. When you get a new dependency, throw it into requirements.txt or apt_installs.sh in the lxdwin repo. Wow, this is friggin’ coming together.
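For the record, the post doesn’t show the exact commands that went into apt_installs.sh; one common way to get real Google Chrome (not Chromium) onto Ubuntu, sketched here as an assumption rather than the repo’s actual contents, is Google’s own .deb package:

```shell
# Fetch Google's official .deb and let apt resolve its dependencies.
# (A sketch of one common approach -- not necessarily the exact commands
# in the lxdwin repo's apt_installs.sh.)
wget -q https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
sudo apt install -y ./google-chrome-stable_current_amd64.deb
google-chrome --version   # confirms the launch command mentioned above
```

The .deb route also registers Google’s apt repository, so later `apt upgrade` runs keep Chrome current.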

Oh, just do one more with most of the requirements thrown in as are obvious from looking at… at which repo? It’s mostly now from seokata. Okay, focus!

Got everything done today that I needed to do, albeit pushing midnight. Still counts, right?

Anyhow, on the lxdwin front, I got Google Chrome installing and its configuration settings surviving between reinstalls.


Wed Sep 21, 2022

WSL2LXD is now lxdwin, Apt & Pip Installs Done. Graphics Implemented.

Okay, it’s 7:00 AM. I’ve been having these enormously productive days on multiple fronts. If you want something done, ask a busy person to do it. If you want ungodly amounts done, get rid of the daily commute and other profoundly intense distractions in life. POOF! It takes a while to get back into that productive mode. You have to essentially fall in love with the process again. I’ve known such zone-full flow-states in life before. It was probably my great twenty-eights at Scala Multimedia. Since then, it’s one version or another of trying to recapture the magic. The thing is, if you keep an open mind and take a craftsperson’s approach to tool selection and use, you can rekindle the fire. You can have a second and third great twenty-eight. I think I’m on my third. I did a second when I switched to Linux, Python, vim & git.

Now get to it. Do one more wsl2lxd install.bat run to assure yourself you’re at a good starting-point. And then:

It all tests out well. I made a few tiny refinements to the install scripts. Administrator is not actually required for install.bat if you launch it under PowerShell, or probably even CMD, without doing it from the desktop icon, LOL! There’s a statement there. Okay, onto the above two implementations, which should be easy, breezy, 1, 2, 3esy… 1?

Find the closest example of doing this in the install scripts.

All done. Wow, this is excellent. Going to rename repo (again), but this time to lxdwin.

Wow, just spent time debugging the script, and it was spaces after the equals sign where a bash environment variable was being assigned.
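The bug from that debugging session is worth a two-line illustration (generic bash, not the actual lxdwin script):

```shell
# Bash assignments must not have spaces around the equals sign.
GREETING="hello"      # correct: VAR=value, no spaces
echo "$GREETING"

# With spaces, bash instead tries to RUN a command named GREETING with
# the arguments '=' and '"hello"', which fails with "command not found":
# GREETING = "hello"
```

It fails silently in scripts that don’t `set -e`, which is exactly why it eats debugging time.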

Wow, I think I have it. Doing full reboot.

I think I nailed it.


Tue Sep 20, 2022

It’s really nice when what I want to do for the public is what I have to do for myself. Starting up the Linux Jupyter service after a Windows restart was the perfect example. I restart my machine from scratch more often than I really have to just to clear my mind. Clear computer, clear mind.

Anyhoo, the next thing is to change my shared data drive on my NAS to a shared repos drive (instead). Oh, a firmware update. ALWAYS do your NAS firmware updates. I owned my NAS for maybe a good 2 full years before I finally set it up and as it turns out I’m happy I didn’t have it on for those 2 years. It was really hostile out there. It still is, but QNAP has stepped up and with such high profile NAS hacking, I can lock things down and harden it right away with no Internet connectivity.

So I’m going to be breaking my “D” drive (for data) on my Windows machines, and switching it to an “R” drive (for repos). And while that’s going on, refresh my memory regarding the mounts.

Mounting a drive across systems is not the same as creating a symbolic link within a system. Symbolic links are easier. There are fewer moving parts. Mounting may touch heterogeneous systems across weird boundaries, as is the case here. Another difference is that symbolic links don’t need their targets to exist in location, while mounted drives do. Think! Does this have any ramifications for the WSL2LXD project? It shouldn’t. Oh wait, what was once just a directory Windows-side becomes the R-drive. Can a shortcut make it look like a directory again? Yes, it looks like subst r: C:\Folder\Example
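That point about symlinks not needing their targets in place is easy to demonstrate from Python (a throwaway sketch using a temp directory):

```python
import os
import tempfile
from pathlib import Path

# A symlink is just a filesystem entry pointing at another path; unlike a
# mount point, nothing has to exist at the target beforehand -- the target
# can be created later and the link starts resolving.
with tempfile.TemporaryDirectory() as tmp:
    target = Path(tmp) / "real_repos"
    link = Path(tmp) / "repos"
    os.symlink(target, link)     # link created before the target exists
    print(link.is_symlink())     # True
    print(link.exists())         # False -- dangling until target exists
    target.mkdir()
    print(link.exists())         # True -- now it resolves
```

A mount, by contrast, needs its mount-point directory present before `mount` will even run.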

Okay, on the NAS web interface, change the name and path of the data share to repos. Okay, go into Hybrid Backup Sync (HBS3) and update the jobs so they’re backing up the location with the new name. Done. Okay, take a look at what happened to my “D” letter drive. An error occurred while reconnecting D to the NAS address. Microsoft Windows Network: The local device name is already in use. This connection has not been restored.

Strange message. Anyway, it turns out that I’m going to keep the actual path of the shared folder on the NAS as /data and just rename it as repos. This has the desired effect without having to move a whole bunch of huge files around.

The command I will be using now is:

subst r: C:\Users\mikle\repos
NOTE: THIS DID NOT WORK!

…except that I have a repos folder there already. Copy its contents into the new location. No, Move the files. I’m going to be deleting them from here anyway. This is big for me, moving my main repos off my local machine and onto the NAS. About 4GB of data which is going to take about 2.5 hours to finish. But it is totally worth it because now any machine I sit down at in my house will be equally up-to-date for journal entries without having to deal with git.

In the meanwhile, I want to get some work done. Specifically, it’s stuff that’s going to take graphics capabilities on the Linux side.

Oh, it sped up immensely. Only 6 minutes left.

Still, think! Graphics on the Linux side is really a matter of doing a series of apt installs. I want genuine Google Chrome and not just the Chromium that installs automatically with pyppeteer. And I think I’d like to keep the pyppeteer issues separate from the Chrome issues. So focus on the Chrome install first. And by that I mean focus on getting apt_installs.sh working. 1, 2, 3… 1?

Thought-work. This is something that should happen once, just when the install.bat script is run. It should be preserved though, just in case. And so it should use whatever is in the same folder as install.bat because that’s the only reasonable decision there. It will have access to the entire ~/repos location and so it should move the finished one into ~/repos/temp which is used for other purposes during the install. That way even if you blow away the file you created, there’ll be a residual copy left over.

apt_installs.sh won’t be there for a lot of people doing my process because they’ll have just created a batch file Windows-side the copy/paste way. I’ll make sure they know they can create these files, but most of the time they won’t be there. So I need logic similar to the one that checks for the existence of the repos folder before it creates one, but checking the existence of a file.
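That existence check can be sketched in a few lines of bash (the function name and example path are my own illustration, not the actual lxdwin code):

```shell
# Run an optional install script only if the user actually created it --
# most people following the copy/paste route won't have these files.
run_if_present() {
    if [ -f "$1" ]; then
        bash "$1"
    else
        echo "skipping $1 (not found)"
    fi
}

# e.g. run_if_present ~/repos/lxdwin/apt_installs.sh
```

The `[ -f "$1" ]` test mirrors the existing check for the repos folder, just with `-f` (regular file) instead of `-d` (directory).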

Okay, the entire repos folder is moved over to my NAS. I have “R” mapped to \\EchidNAS\repos. I made a new empty repos folder in my Windows home folder. And now to mount it from under WSL. Okay, now my WSL instance is ahead of my Windows, haha!

sudo mount -t drvfs '\\EchidNAS\repos' /home/healus/repos

Let’s get a ~/repos folder working in Windows, which in PowerShell would really be a $env:USERPROFILE\repos folder working. Okay, got it working. You need to run PowerShell as Administrator (of course).

New-Item -ItemType SymbolicLink -Path "C:\Users\mikle\repos" -Target "\\EchidNAS\repos"
NOTE: THIS WAS NOT LINKABLE IN-TURN FROM WSL

Ugh, I’m going to have to make good note of this because I’m going to have to do that on all my Windows machines.

And finally do a complete (and true) restart to see if it sticks.

shutdown -f -r -t 10

Oops, my WSL Ubuntu 20.04 lost repos (of course). I mean the folder’s still there, but it’s not the network drive silly. I need to re-mount it:

sudo mount -t drvfs '\\EchidNAS\repos' /home/healus/repos

…and then make it persistent updating the line in my /etc/fstab from:

\\EchidNAS\data /home/healus/repos drvfs defaults 0 0

to:

\\EchidNAS\repos /home/healus/repos drvfs defaults 0 0

And another restart!

After quite a bit of work, I’ve managed to get things pretty much the way they’re supposed to be. That convoluted PowerShell command up there is much more easily accomplished in a CMD window with Admin rights using this command:

mklink /d %USERPROFILE%\repos R:

However, there’s a problem here that’s going to be unique to me, because my repos directory is a Windows-style symlink to a letter-drive which is assigned to an SMB-style network shared location.

Okay, so this is imperfect world stuff. Run the script to ensure that it will work for most people correctly. A repos folder is made in their Windows home and that’s used cross-operating system, and most people can start from there. For me, my repos directory appears to be there because of a Windows-style symbolic link, but the cross-OS stuff won’t work. Instead I’ll have an “empty” repos folder Linux-side like everyone else and I’ll have to do the mount and /etc/fstab edit to get my real repos folder. So be it.

Do a full install now (run the script) and ensure that I do have a blank repos folder so the video demos are solid.

Okay, so it does work solid for everyone else, LOL! They get ~/repos and ~/repos/transfer. But for me, I’m going to have to do the following commands logged into WSL but not LXD:

sudo mount -t drvfs '\\EchidNAS\repos' /home/ubuntu/repos
lxc config device add jupyter repos disk source=~/repos path=/home/ubuntu/repos

And then I have to sudo vim /etc/fstab to append this because I can’t echo-append it directly in. I’ll have to investigate that.

\\EchidNAS\repos /home/ubuntu/repos drvfs defaults 0 0
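For the record, the reason a plain `sudo echo '…' >> /etc/fstab` fails is that the `>>` redirection is performed by the unprivileged user’s shell before sudo ever runs; piping into `sudo tee -a` moves the append inside the privileged process. Demonstrated here against a scratch file rather than the real fstab:

```shell
# printf %s avoids echo's backslash-escape ambiguity with UNC-style paths.
printf '%s\n' '\\EchidNAS\repos /home/ubuntu/repos drvfs defaults 0 0' \
    | tee -a /tmp/fstab.demo
# (against the real file you would pipe into: sudo tee -a /etc/fstab)
```

`tee -a` also echoes the line back to the terminal, which is a handy confirmation of exactly what got appended.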

But now that I’ve done those things, I should be able to reboot my system and have my NAS-hosted repos folder there. Try it!

PERFORMANCE DROPPED TOO MUCH WITH REPOS ON NAS

Well I really outsmarted myself there. Moving my repos folder onto the NAS was a complete disaster, but it was for the best. I basically undid all my work and went back to a separate data folder on the NAS (home cloud) and a local repos folder. I’m basically reversing most of the stuff from the last few posts and hoping I can get to sleep early enough to hit the work I urgently need to deliver for work.


Tue Sep 20, 2022

Created Windows Shortcut To Start Linux Jupyter on LXD

Okay, I added LXD.lnk to the WSL2LXD repo. It’s very simple:

wt PowerShell -c "wsl -d Ubuntu-18.04"

This works because when the Ubuntu 18.04 .bash_login script is:

jupyterlogin

And this is a program in /usr/local/sbin on WSL:

until
        lxc jupyterstart 2>/dev/null    # the alias fails until LXD is up
do
        sleep 1
done
lxc exec jupyter -- su --login ubuntu   # then log into the container

And this is actually using the alias feature of lxc, which you can see when logged into just WSL with lxc alias list. Anyway, jupyterstart is an alias for:

exec jupyter -- su --login ubuntu -c /usr/local/sbin/jupyterstart

And now there’s one more /usr/local/sbin file in the picture, but this one is under LXD and just starts the Jupyter server:

cd ~/repos
screen -wipe >/dev/null 2>&1    # clear out any dead screen sessions
if ! screen -list | grep -q "jupyter"; then
    # start JupyterLab detached in a screen session named "jupyter"
    screen -dmS jupyter /home/ubuntu/py310/bin/jupyter lab --ip 0.0.0.0 --port 8888 --no-browser --ServerApp.password='[long hash deleted]'
fi
echo "Visit http://localhost:8888"

Pshwew! There’s a lot of getting around API limitations going on here by use of sbin commands. There’s a temptation to make bash one-liners and feed them as command-line parameters to wsl and lxc, but that generally doesn’t work. If it’s not permission problems then it’s semicolon parsing. It turns out that dropping magic little sbin commands into place and calling them on the command-line seems to always work.
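The semicolon-parsing problem generalizes: every wrapper in the chain (wt, PowerShell, wsl, lxc, su) re-parses its argument string. A generic two-layer `sh -c` demo (nothing WSL-specific here) shows the failure mode:

```shell
# A command string that is two commands joined by a semicolon:
oneliner='echo one; echo two'

# Passed intact as a single argument, the inner shell parses the
# semicolon and both commands run:
sh -c "$oneliner"          # prints "one" then "two"

# Unquoted, the outer shell word-splits the variable first, so the inner
# shell receives just "echo" as its command string and the rest (including
# the literal "one;") as positional parameters -- nothing useful runs:
sh -c $oneliner            # prints only a blank line
```

Add two or three more wrapper layers, each with its own quoting rules, and a script file in /usr/local/sbin starts looking like the only sane option.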


Sun Sep 18, 2022

The Two Maxes of Light / You Can’t Take It With You

You can’t take it with you. And leaving it all to your children interferes with their journey. You and they are different beings and you should focus on your journey and help them while you’re here and your experiences overlap. When you’re gone what you should have left them is your learned wisdom. Maybe they can take that with them—because it’s information, both internal and possibly eternal. No bank account or real estate holding can compare.

When you ask yourself “what’s important?” in the most reductionist way possible, it seems to me you’re asking what’s more important, the arrangement of the atoms or the atoms themselves? Is it matter that really matters, like material possessions and the particular running instances of life-infused machines and creatures, or all the information that it would take to re-instantiate exactly such machines and creatures?

The bits representing wealth in your bank account are nonsense, as is the gold hidden in your mattress. When you die, none of that’s going with you. Matter most definitely stays behind, gets reallocated, recycled and re-experienced by… by… well, that’s the other thing. I’m not going all the way to souls with this, but certainly the arrangement of the atoms… that particular sequencing that allowed an interconnectedness and distinct subjective experience in this material world… that is something. Call this animate information what you will, but THAT is what’s important.

If you can take it with you, this is where the magic happens. Such information that’s somehow behind the material existence may exist elsewhere. These things elude science… for now. But just like Tony Stark talking to Thor, I’m pretty confident that such mysteries will be understood and moved from the realm of magic into that of science some day. Big Dogs will throw the alphabet soup of particle names in the Standard Model at you to try to convince you everything is known. Throw back… hmm, what is it? What still eludes science?

I’d start with non-locality, of course. Spooky action at a distance. Entanglement and coherence. Faster-than-light travel of information. Yeah, that’s a pretty good start. Even Einstein was driven mad over this. Big Dogs trying to brush off the unknown have to answer the non-locality question first. Are separate particles at a distance truly connected in some physical sense? If so, how? If not, then how do they affect each other? This is our second big clue that existence isn’t what it seems.

Our first big clue was two-slits, or the double-slit experiment performed by Thomas Young in 1802 that established the particle/wave duality of light. Over time it was established with other particles getting progressively bigger, through electrons, atoms, and large molecules (specifically the soccer-ball-like carbon-60 buckyballs), to the point where it’s pretty clear that all matter has particle/wave duality and we’re all made from rippling fields and not distinct particles at all. There is perhaps less to explain here than with non-locality, but still, every highfalutin word out of the Standard Model you can quote is just snobbish erudition in the face of the fact that we’re all just rippling fields in a way that occasionally looks like distinct particles, but truly is still not understood.

There are two Maxes of light. The first was physicist James Clerk Maxwell, who worked with the visionary experimentalist and distinctly non-physicist Michael Faraday in the 1800s to put together the field theory by which we understand electromagnetism and light, with equations as relevant right up to this day as they were back then. Besides Sir Isaac Newton (from the 1600s and 1700s), this was the biggest thing going in science. Now the science of information theory didn’t start until maybe 100 years after Maxwell, but he was quoted as saying: “The idea of dissipation of energy depends on the extent of our knowledge”, which gets at another thing still eluding science: where does the information go when things burn? Is it totally gone? Some might insist we know, but we don’t. Unlike gravity, which will imminently be incorporated into the Standard Model as the evidence for Loop Quantum Gravity or something like it falls into place, information theory still eludes. It’s very related to the second law of thermodynamics, which states that natural processes tend to go only one way: toward less usable energy and more disorder. Once things are done, they can’t be undone. Information goes away in a non-reversible way. Explain that, Big Dog.

Oh yeah, the second Max of light. Max Planck came quite a bit later (the 1800s and 1900s), and with him you basically combine the Michael Faraday experimentalist and the James Clerk Maxwell physicist and mathematician into one person. Like how Michael Faraday “saw” the invisible lines of magnetic force that compelled him to investigate (and eventually team up with Maxwell to do the math), Planck “saw” that something mysterious must be at play to explain the strange “quantized” way very dark objects (you might call them black bodies) cool down once heated up. Imagine metal heated white-hot, then cooling down to a glowing orange and finally back to its normal color. It’s still radiating heat, however. And so on down to, say, absolute zero if the metal were perfectly black and in an environment with no ambient energy. In its own way this was as mysterious and confounding as two-slits and magnetic fields. But unlike Faraday, Planck did his own math… over and over, combining it with the work of others to avoid the Infrared and Ultraviolet Catastrophes (ever wonder why the visual light spectrum evolved the way it did?).

In the end, Max Planck discovered (although he didn’t know it) that the smooth wavy continuum of Max-Light #1 (James Clerk Maxwell) was actually divvied up into chunky little bits. Ultimately, he thought it was a trick of numbers and not indicative of the nature of the real world, but what he discovered was that when a quantum field ripples into our space/time, at some scale (we call it 1 Planck unit), it’s either all-in or all-out. It’s where the math for an infinitesimally small black hole stops working and it would cease to exist / collapse in on itself. Uh, yeah, so anyway, the unexplainable thing here is the quantum nature of our universe. We don’t really know what these quantum fields are, where they come from, or most every other thing about them; they exist outside the very framework within which we, and the brains we think about them with, exist. They (or “it”, if it ends up being just one) are outside the system.

I could write on forever, but this is just clearing my head of thoughts for the day, and not a formal article as such. If you found it, good for you. That there were two ground-breaking Maxes of Light always confused me. I got it straight mostly for myself, and perhaps one day for my kid when they stop having the somewhere-learned and self-imposed “I’m not a math person” allergic reaction when I mention such things. I have to frame it better.


Sat Sep 17, 2022

Running Ubuntu 18.04 and Ubuntu 20.04 Simultaneously Under WSL 2

With this latest round of wsl2lxd updates, things have stabilized. My Windows taskbar can once again look like this: a BASH shell, Jupyter and a web browser.

Windows Taskbar Shell Jupyter Web

This is not quite religion to me now, but it is in the realm of the obsessive-compulsive. More accurately, I’m disciplining myself for it to become compulsive so that my muscle memory works well for me. This journal is always on screen one. That got disrupted momentarily as I worked out whether I was going to try to keep only one copy of Linux running under WSL: the Ubuntu 18.04 one that I need for running LXD and Jupyter, or the Ubuntu 20.04 that I used to run before this project and could easily bring back and run side-by-side with 18.04, because WSL easily works that way.

Remember, each ubuntuxx04.exe is a “slot” allowing one of those versions. So running 18.04 and 20.04 side by side under WSL is not a problem. That’s what I decided to do, and now I keep this journal open on virtual desktop one all the time with no fear of my WSL / LXD install script development interfering. There was nothing worse than coming back here and finding this shell locked up, requiring a Ctrl+Shift+W to close and follow-up cleanup of the .journal.md.swp file left behind. I was worried about there being a performance issue, but there is none.


Sat Sep 17, 2022

Preserving ~/.jupyter config files between WSL 2 LXD installs

I have tons of work to do this weekend on an entirely other front than either my kid or this new… uh… wsl2lxd… work. It still doesn’t roll off the tongue, but I want it aligned to what people are going to search on, and that’s definitely lxd and wsl. LXD4WSL? The other way around sounds too much like you’re modifying lxd for wsl. You do start out in wsl and go to lxd, so wsl2lxd is legit, and I’ll keep it. Well, so long as I stay with this naming approach I’ll have such issues, because I want a number between them (and not a hyphen or underscore).

Okay, next? Get one or two tiny things in today before the day gets underway proper. I also have one of those long drives to the Poconos this weekend to pick up and drop off stuff. Ugh! Four plus hours and all the related energy and momentum lost on just that. But I’ll cross that bridge. Use your good morning energy right now wisely.

Okay, realization: Jupyter uses environment variables that have been set in the shell from which it is run. This provides a way to control things about Jupyter without having to feed everything on the command line that runs it. I’m shoving a lot on that command right now, especially the hash that keeps the password “foo”. It’s one of the idiosyncrasies of Jupyter on LXD under WSL that’s hard to get around. The server must be “secured”, and if you’re not passing tokens (even harder to do in this context) then you’re using a password. Oh, foo. Okay… environment variables to change where the config files go, so you only ever need to set the theme to black once. And there’s that “reset kernel and clear all” keyboard shortcut I never want to set again. 1, 2, 3… 1?

Identify the default location Jupyter stores config files. Okay, per this page: https://docs.jupyter.org/en/latest/use/jupyter-directories.html

Config files are stored by default in the ~/.jupyter directory.

Not bad. So the first order of business is to allow repos to contain hidden config files. I already do that with the display.sh file for passing the WSL host’s DNS-server IP as the display address for the VcXsrv X-Windows server. I never thought to make it invisible. Do that… done. Okay, in doing so I realize I already have a transfer directory in repos, ~/repos/transfer, and this is completely appropriate to contain .jupyter. It will be nice to keep all the hidden stuff (for transfer) grouped under repos. Okay… make the install scripts create that location… hmmm, I really love the symmetry of install.bat and install.sh because of how it implies the two halves of the process, what they’re doing and where they take place, although it’s annoying in command-line completion.

Okay, making the transfer directory belongs Windows-side, in install.bat, and that’s done. Wow, this is nice. That’s also going to be the mechanism to move apt_installs.sh and requirements.txt down into LXD. Okay, now simply add:

# Allow all your Jupyter configuration (dark mode) to survive reinstalls.
export JUPYTER_CONFIG_DIR=~/repos/transfer/.jupyter

…to .bash_config. Wow, this is too easy! My practice on such things while making Levinux is going a long way. Why hasn’t anyone done this before? All those poor people running Jupyter directly under Whinedoze when they don’t have to! If you’re micro soft then you whine doze. This is all too suddenly easy. It’s no wonder Microsoft didn’t include systemd on wsl. Some hurdles had to be put in place before this discovery was made and made popular. I’m in a race. Gotta get this published pronto. 1, 2, 3… 1? Okay, the environment variable is activated. The directory exists. This should just work now. Do a full install test. Process?
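A minimal sketch of the moving parts, assuming the ~/repos/transfer location from my setup:

```shell
# Relocate Jupyter's config dir to a folder that survives container rebuilds.
# The ~/repos/transfer path is specific to my setup; adjust to taste.
export JUPYTER_CONFIG_DIR="$HOME/repos/transfer/.jupyter"
mkdir -p "$JUPYTER_CONFIG_DIR"

# Jupyter reads this variable at startup, so any server launched from a
# shell that sourced this export (e.g. via .bash_profile) uses it.
echo "$JUPYTER_CONFIG_DIR"
```

Theme choices and keyboard shortcuts written there will then outlive a full WSL/LXD reinstall.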

Okay, all done. That confirms the Jupyter configs are currently working and persist between server restarts. Now do a reinstall of the whole WSL instance, LXD instance and JupyterLab server instance (install.bat):

Watch the bunny go down the rabbit hole… Wow, if this works it’s yet another benefit of running Jupyter this way when you’re on Windows. It provides an easy way for your Jupyter configuration to survive upgrading Jupyter. You can upgrade Jupyter in place with an apt upgrade, or you can just reinstall the whole thing with no real harm done. It’ll wipe out your apt and pip installs, but I’m working on that too. After they’re run, they’ll be kept in the transfer directory, always accessible during reinstalls. Wow!

Okay, do a refresh on localhost:8888… back to password… hit Enter… back to white! Good, this was an old-fashioned ~/.jupyter config store location. Now put the JupyterLab dark theme back on and run the install script again…

Do a refresh on localhost:8888, which, for the one person who’s following along here, I’m actually doing from an installed Edge app, which is much better than a localhost:8888 tab in a browser. Bingo! JupyterLab dark theme! The config has survived between re-installs. Great success.

Think through next steps regarding apt and pip installs. This very much clears the way. I’ll drop those files directly into ~/repos/transfer. Avoid subdirectory-hell. You’re already two deep.


Fri Sep 16, 2022

ubuntu1804.exe, ubuntu2004.exe and ubuntu2204.exe Oh My!

Okay now get back to my normal workflow. I have this (my full-screen vim journal) on screen 1, albeit admittedly no longer fullscreen. I have enough reasons for the tabs of Microsoft Terminal to show enough these days to just allow it to show. It’s improving fast. There’s a “focus mode” that’s not full screen that I have to explore, and you can finally remove Azure from the profiles. Oh, and add Ubuntu 18.04 to the profiles… done. Let’s see if that survives between reinstalls. Anyhoo, back to my main work. 1, 2, 3… 1?

I’ll let the folks here have a bit of insight into my day-job. Just sanitize as I go. I keep a separate browser for certain types of work because Google login aliases under GSuite are extremely stubborn. It’s dashboard work, which in this case means an automatically updated Google Sheet. I know dashboards are so much more these days with Google Data Studio and other products, but sometimes, based on the nature of the data, it’s just best to plop it into a spreadsheet with some good color-coding, and that’s the case with these dashboards.

Okay, done. Didn’t really document as much here as I thought. But it does show that one of the many, many things I’ll be using this new wsl2lxd project for will be working on spreadsheet-format things. Also, I’m done with my work early enough on a Friday to crank out my weekly report without it running into the weekend. I’ve been meaning to use one of these journals for that, but only with mixed success.

I can tell already that reinstalling these Ubuntu-18.04 instances for LXD hosting under WSL is something I’m going to be doing often. Many of the basics of not getting screwed by this process I have already worked out. I need to remount my data drive in that persistent way:

Mount the data drive:

sudo mount -t drvfs '\\EchidNAS\data' /home/wsluser/data

vim /etc/fstab

\\EchidNAS\data /home/wsluser/data drvfs defaults 0 0

Okay, when I go over this in a video, I have to remember the following salient points:

WSL has an odd approach by which there is an exe for each Ubuntu version, found in this strange location:

C:\Users\%USERNAME%\AppData\Local\Microsoft\WindowsApps

Methinks this is something like a Web Store sandboxing location. Anyway in here we find:

…along with winget, some pythons, a MicrosoftEdge and wt. Go figure. Anyway, my speculation is that you get “one slot” for each exe in WSL, so you can’t be running two Ubuntu 18.04s and such. It feels like this is Microsoft’s way of letting you get experience with each version of Linux without running too many at once and giving both Microsoft and Linux a bad name. This may factor somewhat into them not (easily) allowing a Linux container system on WSL.


Fri Sep 16, 2022

Install Script and Start Script for Jupyter on LXD on WSL on Windows

The WSL 2 LXD install script is done and now I just need a way to get Jupyter started (before I do daemons which I’ll do a video on) after a reboot.

The WSL Ubuntu-18.04 instance needs this script in its /usr/local/sbin:

until
        lxc jupyterstart 2>/dev/null
do
        sleep 1
done
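As a side note, the retry-until-success shape of that loop can be sanity-checked with a stand-in for the lxc call (flaky_start below is purely hypothetical):

```shell
# Same until/do/done shape, with a stand-in command that fails twice
# before succeeding, the way `lxc jupyterstart` fails until LXD is up.
attempts=0
flaky_start() {
        attempts=$((attempts + 1))
        [ "$attempts" -ge 3 ]   # pretend the third try succeeds
}
until
        flaky_start 2>/dev/null
do
        sleep 1
done
echo "started after $attempts attempts"   # started after 3 attempts
```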

Wow, it’s working. I just silenced jupyterstart with 2>/dev/null. Just pushed that one. It’s nice to see the visit-localhost message on an LXD host login. Speaking of which, the magic command to start Jupyter in the background is reduced to:

wsl -d Ubuntu-18.04 -e bash -c "exit"

You don’t even need to do anything in the session, so you can logout immediately. It just needs to be a bash session so the .bash_profile runs, which in turn runs the /usr/local/sbin/j program, which uses that above program to use an lxd alias… pshwew!

You know the expression that the ancient magic is often the most powerful? Well, that’s poo-poo’d by the modern Microsoft crowd, who like being forced into proprietary power-tools just to get simple tasks done. But for everyone else, there’s Unix, brought to you today in the flavor of Linux, and it’s some powerful ancient magic. I just coded up my greatest spell since Levinux, some 10 years ago. And it’s a better spell-casting and sharing environment than it was back then, so the time is ripe.

Okay, .bat files! Visit https://raw.githubusercontent.com/miklevin/wsl2lxd/main/install.bat

Select all and copy.

Go to your desktop, right-click and select New / Text Document.

Rename that text document install.bat. Be sure you actually give it the file extension. The default mode is to hide known file extensions, which will result in you naming it install.bat.txt, which will not work. You may have to go to an Explorer window (Win+E), go to View / Options, click the “View” tab and uncheck “Hide extensions for known file types”.

Right-click install.bat and select Edit. Launch with Notepad.

Click in the document, paste and Ctrl+S to save it. You can then close the document. If it warns you, then you didn’t save. Use File / Save from the dropdown menu if needed.

Now just double-click the icon and hit Enter on your keyboard to allow the script to start running. Sit back and wait awhile.

There is no more interactivity until the script is done, at which time it waits for you to hit any key to release the script. The black screen will disappear.

Hit any key to dismiss the install window.

Go to Edge (because we’ll be making an App of it) and visit http://localhost:8888.

Enter the password “foo” and click the Login button.

Click … / Apps / Install this site as an app

Okay, this takes care of the install and first run. But on subsequent runs there needs to be a way to start the server.

Right-click on your desktop and select New / Text Document. Rename it to start.bat and put this inside:

wsl -d Ubuntu-18.04 -e bash -lic "exit"

Now after any fresh restart of your machine or accidental closing of the Jupyter server, you can get it running again just by double-clicking this icon.

I’m still making sure that’s solid. If it doesn’t work, open a CMD or PowerShell window and type:

wsl -d Ubuntu-18.04

…which will log you into the WSL “outer” Linux whose .bash_profile should automatically start the Jupyter server in the LXD “inner” Linux.

Okay, wow this has been epic. Many echoes of Levinux. Walk away from it for a little bit. I’ve got some other work to do. Test the install process one more time just to make sure its latest status is proven working and percolating in the background for next steps… done!


Thu Sep 15, 2022

Already Renamed Jupyme to WSL2LXD

Pulling out all the stops. My first version of the WSL 2 LXD install script is already pretty solid. I renamed the jupyme repo to wsl2lxd, which is a lot easier to remember, more searched on and probably of greater widespread value than just another way of getting Jupyter running. Act on important ideas quickly. This is all a soul-cleansing experience for me. I’m learning how to be thorough again. Nothing quite so heady since I made Levinux. Good! I need to get back into that state. A prolonged 2 or 3 years in that state, and wait and see what I can do! I was derailed (by my own hand) more than once in life recently–recently being the past 15 or 20 years. Haha, recently for me is a lifetime for others.

Okay, think! Leverage your vim spell-checking! Putting my en.utf-8.add file in my vim repo. Time to make it public again. And I’ll never need to rebuild my personal word dictionary again.

Another thing I did was activate an Ubuntu 20.04 under WSL to be separate from all my LXD work which is all centered around 18.04.

Now I can really run the wsl2lxd.bat file with no fear of losing the ability to edit my journals here and otherwise have a stable instance of Linux always running.


Wed Sep 14, 2022

Writing Automated Install Script for LXD & Jupyter on WSL

Wow, wow, wow! In an attempt to force myself to use JupyterLab under LXC, I uninstalled Jupyter-Desktop through Windows Settings / Apps per their upgrade instructions. It’s like disabling the arrow keys in vim to force yourself to use hjkl. Find the love in things. I initially had concerns over performance and there is a wee bit of overhead lag on LXD vs. just WSL, but I’m willing to live with it for the Noah’s Ark benefit. Speaking of which, look at this great ASCII art I found. Right? Right! I’m so working that in. http://www.ascii-art.de/ascii/ab/ark_of_noah.txt

  (\    _\_(`\_ `/`                     _
 `/,-'=/`                   _,'|`._
 /'  `/`                 ,-' |||._ `-._        i_i
                     _,-' ,-' ||  `__  `-._ "=(. .)="
                 _,-' _,-'  --||- (..=`/._ `-/#\ (    i_i
               ,\ _,-'     |  )|_|-==` \_)`-/  /v  "=(. .)="
             ,' ,|/      \ |__||\ //|\\ /  / #/    ,/ \ (
           ,' ,' ((_.--._))   || )(/|\)'  /#  ;  ,/ #,/v
         ,' ,'   |`-    -'    ||/  (/\\  /  #; ,/# ,/
       ,'  ' /   |  \  */"-._ |/ _____,-''"/_)/__ /
     ,',---|((_.--._(__/    _.--""_____)-//_______"-.
      (,-.)| `-     -(  _,"_.--""    |||((   __ "-.:
    ___,/ ;|   \  */ _\'_,"  (\__/)  |||\\\ |__`,()() . \
   (,_.) (_|   (__/,'_,' /_/)=\.\. = |||||| | `( ` ``\|\)\
     ( ;.__|  _,-'_,'  =//. ==> _7)< |||||| |`` , ` *  "")
   ___\  _,--'_,-'  //_(7__/) ////\  |||||| |``` \___.--'
    \_"""_,--' <*)_//'""    )/_/-"""":|||||,""""(("-._/
    | """" ) ( _(-' _.---"\___,----. |||||| |  ,' "`._ ,((
    |  ) ( \_/'   ,'    _""   "_    `.||||| |,"\\'--._)   "._
     \ \_/<. .>""(      ( .   .)      )|||| |\\ \/,"\\ /`--._)
      <. .>|_/\|  \/     ) \,-(     \(||||| | \\)"\\ \)
      ||_/( (  |   |\/   /, \  \ )\\(:|||||,()""""-.:|
      |  \ `-\ |   |__\/,  :`/-`._____,-""_,'    ctr:|
      |"""\___,""""""""""""""\(_,( (__,-"||---"""""";
       \---""""---------""""""""````/////))----""""/
     ~~~\                         ~///////~      ~/~~~~~
         \ ~~~~~      ~~~~~~~      /~~~~/      ~~/    ~~~
     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Wow, every last thing becomes love-worthy.

Be sure to make it so that re-running ~/repos/jupyme/install.sh has little to no downside. You’ll lose your pip installs and Jupyter configuration, but I can address all that. Also, apt installs.
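A sketch of the preserve-and-replay idea for the pip installs, assuming hypothetical file locations (mine would land in ~/repos/transfer):

```shell
# Capture the current environment's pip packages before wiping the container.
# /tmp/requirements.txt stands in for ~/repos/transfer/requirements.txt here.
python3 -m pip freeze > /tmp/requirements.txt
echo "captured $(wc -l < /tmp/requirements.txt) pinned packages"

# After a rebuild, replay them inside the fresh container (commented out
# because it assumes the new container already exists):
# python3 -m pip install -r /tmp/requirements.txt
```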

I want to make the graphics build an optional bit.

First, make the conduit between the host machine and the LXD instance. The important thing is to simulate the tftp (trivial file transfer protocol) tricks that I’ve done in Levinux. Moving my Levinux tricks over to LXD is potentially a profoundly big win… wow, releasing such potential. As simple as possible!

Hmmm, maybe even make your WSL default instance


Tue Sep 13, 2022

Switching From WSL to LXD Containers Permanently

Wow, I did some absolutely amazing work yesterday with the Jupyter install script and changing my github centralized location into repos. This will probably be my last journal entry directly on WSL (Windows subsystem for Linux) and instead all following activity will be from within an LXD container on Windows. It’s time to practice what I preach. I only have a few mappings to make on the new jupyter container by running these lxc commands from the Ubuntu 18.04 WSL instance:

lxc config device add jupyter repos disk source=/mnt/c/Users/mikle/repos path=/home/ubuntu/repos
lxc config device add jupyter ssh disk source=/home/healus/.ssh/ path=/home/ubuntu/.ssh/
lxc config device add jupyter data disk source=/home/healus/data/ path=/home/ubuntu/data/

Wow, okay this is huge for my workflow. I’m literally taking WSL out of the picture for all intents and purposes, because once you’re on a real LXD instance, you might as well be on any generic Linux. It is less compromised than WSL. All I have to do is put this at the bottom of my .bash_profile in wsl:

until
        lxc exec jupyter -- su --login ubuntu 2>/dev/null
do
        sleep 1
done

Okay, this is a test. I am continuing editing this from inside the LXD container. No more WSL… woot!

I have a huge round of work to do tonight, but if I play my cards right, I can move it all over to the Container. Push hard! 1, 2, 3… 1?

Mount my WSL locations on the LXD container. Here’s the trick to finding your Windows home location:

WIN_HOME="$(printenv | grep -o '/mnt/c/Users/[a-zA-Z]*/')"
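That grep trick can be sanity-checked against a canned line (the PATH value below is hypothetical, shaped like what printenv emits under WSL):

```shell
# Extract the Windows home mount from a printenv-style dump using the
# same pattern as above, but fed a canned sample line.
sample='PATH=/usr/bin:/mnt/c/Users/mikle/AppData/Local/Microsoft/WindowsApps'
WIN_HOME="$(printf '%s\n' "$sample" | grep -o '/mnt/c/Users/[a-zA-Z]*/')"
echo "$WIN_HOME"   # /mnt/c/Users/mikle/
```

The character class can’t match `/`, so the match stops right after the username, which is exactly the piece you want.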

Mon Sep 12, 2022

Going Down the WSL to LXD Install Script Rabbit Hole

Okay, I set my journal startup to go right to my MikeLev.in blog because it makes for better videos, and that rabbit! It’s really good branding. It differentiates me quite nicely. Nobody has that rabbit facing that direction because I reversed the original artwork.

Hmmm, there’s really so much potential of what to do. What NOT to do. Okay, think it through clearly. One way to smash this thing is to movement-atize all my latest work, and those few install scripts (I’ve already begun) can go a long way.

Following onto an idea I haven’t even published yet is the concept that, with the changing expectations brought about by Docker, WSL and containers in general, no single Linux instance should be thought of as all that important. You can get it back from either a backup of the image or from a relatively simple server build script.

You don’t want to start working on preserving any particular container image until you have the server-build scripts solid. That’s where the specialness really resides. The server-build scripts are the interesting magic spells for people like me who don’t want to go deeper into the arcana or higher into the dockersphere.

So currently there’s really nothing special about my jupyter lxd instance. Let’s make a bash install script that will just create a new instance of a Jupyter server fully running and working on localhost:8888 on the Windows machine. Wow, nailed it! This gets installed from an Ubuntu 18.04 instance with LXD running. I should make a PowerShell script to do that next.

#!/usr/bin/env bash

CONTAINERS="$(lxc ls -c "n" --format csv | grep jupyter)"

for container in $CONTAINERS; do
  echo "Stopping $container..."
  lxc stop $container
done

JUPYTER_EXISTS="$(lxc ls -c "n" --format csv | grep jupyter)"

if [ "$JUPYTER_EXISTS" == "jupyter" ]; then
  lxc delete jupyter --force
  echo "Old Jupyter container deleted."
else
  echo "jupyter container doesn't exist"
fi

lxc launch ubuntu:18.04 jupyter
lxc config device add jupyter jupyme proxy listen=tcp:0.0.0.0:8888 connect=tcp:127.0.0.1:8888

until [ ! -z "$(lxc ls jupyter -c '4' --format csv)" ]
do
  sleep 2
done

lxc exec jupyter -- apt update
lxc exec jupyter -- apt upgrade -y
lxc exec jupyter -- mkdir /home/ubuntu/repos
lxc exec jupyter -- add-apt-repository ppa:deadsnakes/ppa -y
lxc exec jupyter -- apt install python3.10 -y
lxc exec jupyter -- apt install python3.10-venv -y
lxc exec jupyter -- python3.10 -m venv /home/ubuntu/py310
lxc exec jupyter -- curl -L -o /home/ubuntu/.bash_profile "https://raw.githubusercontent.com/miklevin/dotconfigs/main/.bash_profile"
lxc exec jupyter -- curl -L -o /home/ubuntu/.bash_prompt "https://raw.githubusercontent.com/miklevin/dotconfigs/main/.bash_prompt"
lxc exec jupyter -- /home/ubuntu/py310/bin/python3.10 -m pip install jupyterlab
lxc exec jupyter -- chmod -R 777 /home/ubuntu/py310
lxc exec jupyter -- chmod -R 777 /home/ubuntu/repos
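For what it’s worth, the until loop in the middle of that script is just asking whether the IPv4 column of `lxc ls jupyter -c '4' --format csv` is non-empty yet. A minimal sketch of that readiness test with canned column values (the address is made up):

```shell
# Readiness test: a container is "up" once lxc reports a non-empty
# IPv4 column. Canned values simulate before/after address assignment.
is_ready() { [ -n "$1" ]; }
csv_booting=""                  # container launched, no address yet
csv_up="10.105.7.2 (eth0)"      # hypothetical address once networking is up
is_ready "$csv_booting" && echo "ready" || echo "waiting"
is_ready "$csv_up" && echo "ready" || echo "waiting"
```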

Simply logging into this machine with:

lxc exec jupyter -- su --login ubuntu

…will cause the Jupyter server to be running. I don’t think my .bash files belong in that dotconfig repo anymore. I think I should round everything up into a repo named, oh, maybe jupyn! That’s my first instinct.

Now is the time to change my big master ~/github folder to ~/repos. Just go for it. 1, 2, 3… 1? Okay, I changed a whole bunch of references from github to repos. Now go into repos and make a jupyme folder… or is that JupWyn? Wow, I want to write jupyme. It flows better. Check the googles for which is less taken. Yeah, change of decision. We’re definitely going with jupyme. It just rolls off the tongue better.

Okay, wow, I got my first version of the Jupyter install script on WSL pretty well worked out. You need to already have LXD on WSL and maybe I’ll do a script for that next.


Mon Sep 12, 2022

Developing Priceless and Timeless Tech Habits

I did a great video in the car yesterday about my theme and niche being the gradual internalization of the tools of our age, the way our ancestors internalized mitochondria and calcium. Today, we’re internalizing the tools of tech.

You can either be an active participant in this process or a low-impact bystander. Now, there’s nothing wrong with being a passive spectator, but for the spiritual money I spent on my seat in this game, I’m gonna play.

Mine is not the story of playing a strong hand from the start, or even of compensating for disadvantage while young. No, mine is the story of a slow start, false starts, more false starts, then realizing late in life what a good starting point that is.

It’s easy to stall out, and there are unlimited excuses to stay stalled out. Nobody’s coming but you to get you started again—so it’s critical to internalize your own effective starters; and I’m going to share mine with you.

Complexity of the kickstart process is your enemy. Demoralization when you turn that ignition key and it doesn’t turn over is, by definition, a nonstarter. This will happen, because the car itself changes.

Of all the things “done to you”, this is one of the easiest to do something about. Start crushing those variables that change the kickstart process. Develop habits that are timeless and priceless.

I propose to show how to be comfortable in a meaningful long-term muscle memory way on any hardware that provides Virtual Desktops and Full Screen mode. This is your vehicle, and I will show you the ignition.

Have a fullscreen text editor in command-line interface mode on virtual screen one. In this window have a single text file that you keep for life. Update it frequently. Back it up well. Organize it to help.

Practice achieving flow state and entering the zone. Write and write some more and write for hours. You will not believe what you think and know—and know you need—if you only let yourself process the thoughts.

As you settle in, know that while this feels like home, it’s a home on wheels. You can carry it to Mac, Windows, GNOME, KDE or any other desktop computing environment in the future.

View the desktop environment of your OS as an empty shim with nothing but a Net connection, virtual screens for fullscreen apps and a taskbar for switching between them—little more.


Sun Sep 11, 2022

Recovering Old Laptops For Video Production Purposes

It’s okay to let a few days go by without pushing out a video or two. Today I will try to, however, even if it’s just for myself, getting to a good dev point. Okay, the interest in my video about how to upgrade WSL 1 is so strong that I should do a more formal “all-the-way” video about healing Windows.

Healing Windows to heal the world.

Maybe I’ll even show how to do a factory reset from Windows 11 to Windows 10. But for now, just reset Windows 10 on one of my machines and then don’t do the flurry of updates. Make sure you’re on a version lower than 18917 so that you have WSL 1. Then do a Linux install to make sure you have a WSL1 on your machine. Wow, early adopters really got fubar’d. THEY are your likely crowd because they had the initiative to try wsl back in those early days.

Okay, work on a sort of script. It’s not going to be a multiple take sort of thing because it takes too long to reset windows. Learn a bit more to say some interesting things during the video. Oh! The Wikipedia page has some interesting tidbits https://en.wikipedia.org/wiki/Windows_Subsystem_for_Linux

WSL leverages the architecture of NT to run Linux programs as special isolated minimal processes they call “pico processes” attached to kernel mode which they explain on a protection ring page https://en.wikipedia.org/wiki/Protection_ring

Okay, I’m starting out on Windows Win10_1809Oct_v2_English_x64.iso, gotten through Windows-ISO-Downloader.exe.

For the longest time I resisted the concept of multiple laptops over “integration” issues. One laptop to rule them all, and all that. But the blindspot it created was what happens on a new laptop install. What happens with different versions? Different environments? Different install presumptions? By ignoring those questions you let a nasty blindspot and set of presumptions manifest. Nothing is that important but what you deliberately protect. And what you deliberately protect should be the corest of core and nothing more. Do not protect any presumptions of operating systems or hardware, or you’ve enslaved yourself to some vendor or other.

That being said, the videos I’m trying to make for YouTube are right on the vibrating edge of WSL (Windows Subsystem for Linux) being available. I’m working on a new machine whose version is below 18917, but with every System Update I apply, it’s creeping toward that point. I’m currently installing 20H2, and I’m worried that goes past the WSL 2 boundary and will ruin the video I’m trying to produce. I can always roll back, but the lost time… effit!


Sat Sep 10, 2022

Let People Know How You Feel

Okay, pushed out a public post on this topic. Good.

A key thing is to not get caught up in the drama of the stories that are being told about me. Lead by example. Let the reality of the situation shine through. Do the cool things for yourself, and if need be, by yourself. Let them look upon you as having so much to learn, and let them come to you. Pushing anything upon them is likely going to be rejected given the evidence.

Just be your super-cool self. And in fact, pump-up the super-coolness.

This is done primarily through the YouTube videos.

Any single video can do the job. Plan the message (title) and the thumbnail beforehand.


Sat Sep 10, 2022

The Coddling Of The American Mind. Yup. Someone’s a Big Fragile.

I am reading The Coddling of The American Mind by Greg Lukianoff and Jonathan Haidt. Wow, this explains a lot. I can’t believe some of the behavioral patterns I see in my own child. I had attributed much of it to the home-schooling by my previous wife, but I’m now seeing that a lot of it is a product of the times. It’s a timely read.

I just finished chapter one, The Untruth of Fragility: What Doesn’t Kill You Makes You Weaker. It’s so appropriate now with my child freaking out at me for being myself, rather silly. Silliness is fine. What’s not fine is freaking out in anger to try to modify the behavior of your father, no matter how many YouTubers riled you up into righteous indignation and a false sense of entitlement.

Chapter one winds up with the following conclusions which I will do well to internalize through repeated reading.

Children, like many complex adaptive systems, are anti-fragile.

Their brains require a wide range of inputs from their environment in order to configure themselves for those environments.

Like the immune system, children must be exposed to challenges and stressors, within limits and in age-appropriate ways, or they will fail to mature into strong and capable adults able to engage productively with people and ideas that challenge their beliefs and moral convictions.

Concepts sometimes creep.

Concepts like trauma and safety have expanded so far since the 1980s that they are often employed in ways that are no longer grounded in legitimate psychological research.

Grossly expanded concepts of trauma and safety are now used to justify the overprotection of children of all ages, even college students, who are sometimes said to need safe spaces and trigger warnings lest words and ideas “put them in danger”.

Safetyism is the cult of safety: an obsession with eliminating threats, both real and imagined, to the point where people are unable to make the reasonable trade-offs demanded by other practical and moral concerns.

Safetyism deprives young people of the experiences that their anti-fragile minds need, thereby making them more fragile, anxious and prone to seeing themselves as victims.

I own this in Audible format, but I think I’ll buy the Kindle version so I can truly internalize the teachings. I can tell I’m going to have to go toe-to-toe with coddlers in the future who are telling my child they have no control over their cognitive tigers. Wow, that’s exactly the chapter 1 lesson. There is a strong belief that if you are intellectually challenged, you are being done harm. I need the counter-arguments that are very tied to having an immune system that can deal with allergens.

It is my moral duty and obligation to ensure that my child is not super-fragile and unprepared for the future. You can’t keep yourself sheltered forever, not if you want to engage with the world and life.

I just fed the kids Cthulhu stew (octopus in vegetable broth with potatoes and seasoning). Hope they like it. Even this goes a long way towards resiliency. Openness to new things, and perchance new ideas. Open minds. Don’t knock it ’til you try it. Put a little on your plate. Don’t make a face until you taste it. Some kids think it’s great!

Get some more thoughts together in this regard. The fragile-brigade loves to go on the attack in groups. If enough people with false beliefs can be rallied into a witch hunt, they’re going to kill themselves some witches.

Be smarter and more aware than to let yourself get burned at the stake, LOL! It’s exactly this sort of joking they object to. Yes, it’s horrible that people really got burned at the stake (and probably still do here and there throughout the world, statistically speaking), but you’re allowed to use those expressions. People going on the attack against you for using such expressions are hollow, empty people with too much time on their hands. Be better than that. Live better than that.

Rule #1: Treat Yourself Like Effing King

Life is good and much of what matters is how we frame it in our own minds. What’s important is what we decide is important. Society and group norms play some role, but don’t let group-think ruin you. You’re here on this world to get out of it something of your deciding, and once you decide what that is, pump it up to eleven. That’s your reward for being alive. Eat grapes. Cuddle kitties. Do whatever makes you rejoice to be alive.

Rule #2: Don’t Let ‘Em Getcha Down

They’re going to tell stories about you. They can say anything they want, but that doesn’t make it true. One of the easiest things in the world is to make stories that make you feel better about yourself at the expense of others, vilify others, and find comfort and happiness in playing victim and finding excuses. Be better than that. Do your thing. Be your cool self, and your self is so cool that you attract knocker-downers. They can’t help it. They’d like that genuineness in themselves, but in the end they’re snarky leeches without an ounce of the stuff they covet in you. That’s why they’re shit-talking you. Know that and take it as a source of validation rather than being bothered over wanting to be liked or some stupid herd-mentality group-think.

Alternative Rule #2 paragraph:

The key thing you’ll encounter as a lover-of-life who’s met the wrong person is their attempt to end your loving-of-life. They’re jealous, and a seething hatred they’ll keep secret drives them to monopolize your time and energies, sucking you dry until you are a shell of your former self, and those rats still won’t be happy. They’ll be satisfied as a hunter is at bringing down prey, but their conquest won’t feed the village. If you find yourself in that situation, I recommend honking their nose like a clown when they think you should be deadly serious and deadly drained. Let them know their best ain’t enough to take you down, then learn from your experience with them and move on.


Fri Sep 09, 2022

Writing First Version of Install Scripts for Jupyter on LXD on WSL

Pshwew! That was one for the history book. I have so much I want/need to do. All this retooling goes directly to how effective I am at my day-job. This moving to containers is a big one. If all goes well, this is the last journal entry that I’m writing directly on a WSL-instance of Linux. After that, it will all be on an LXD container instance. To that end, I have a few small touches to do. I really like how distrod works with curl to drop the install.sh file into location.

Okay, so clean up my public github repos. Turn everything to private that you can, and even start thinking about deleting the private ones that are just clutter and distraction. Do a quick pass. Definitely 80/20-rule here. Think about the experience people will have when they come in checking me out.

Okay, I trimmed back everything public on my Github.com repos that shouldn’t be there. But now I actually want to PUT a new repo there that has all my good configuration stuff. I should really only have to have a few curl statements to pull down:

It looks like I should give it some directory structure so file renaming is not necessary.

Ooh, I also want to change all my github folders into repo folders. Sheesh! Maybe that’s a violation of the 80/20-rule for now.

dotconfigs

Okay, I put the repo in location.

https://github.com/miklevin/dotconfigs

And here’s the way I’ve been getting the install file from distrod:

curl -L -O "https://raw.githubusercontent.com/nullpo-head/wsl-distrod/main/install.sh"

And so it stands to reason, I should have the URLs:

curl -L -O "https://raw.githubusercontent.com/miklevin/dotconfigs/main/.bash_profile"
curl -L -O "https://raw.githubusercontent.com/miklevin/dotconfigs/main/.bash_prompt"
curl -L -O "https://raw.githubusercontent.com/miklevin/dotconfigs/main/.screenrc"
curl -L -O "https://raw.githubusercontent.com/miklevin/dotconfigs/main/.vimrc"

And I made an install script that can be pulled down with:

From WSL:

curl -L -O "https://raw.githubusercontent.com/miklevin/dotconfigs/main/install.sh"

From LXD:

curl -L -O "https://raw.githubusercontent.com/miklevin/dotconfigs/main/lxd/install.sh"
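As a sketch of what such an install.sh could boil down to, here’s a tiny Python loop that just prints the curl commands for the dotfiles listed above (the repo and file names are the real ones from above; printing rather than running is deliberate, for illustration):

```python
# Sketch: generate the curl commands an install.sh would run to pull
# each dotfile from the dotconfigs repo root. Adding a file to the
# repo then only means adding a name to this list.
BASE = "https://raw.githubusercontent.com/miklevin/dotconfigs/main"
DOTFILES = [".bash_profile", ".bash_prompt", ".screenrc", ".vimrc"]

for name in DOTFILES:
    print(f'curl -L -O "{BASE}/{name}"')
```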

Interesting! How far do I go with the install script? If I go too far, it might as well be a Docker image, and that’s stupid. So much of the point here is to help people walk this path and have increased capabilities at the end; not to do just another app store, snap, consumer-oriented thing. I’m giving people their forward-slashes so they can use them.

Okay, think! 1, 2, 3… 1?

Oh, of course!

So you’re on Windows on your main, daily working machine. Sure, you can install Linux on some other machine or do some multi-boot thing on your main machine for a little Linux experience, but the point is to integrate the Linux terminal (the important part) into your daily life enough to start affecting your habits, achieving new normals, and ultimately making Windows or any other particular desktop environment interchangeable and optional. Once your main home is the Unix-like Linux command-line, you are impervious to disruption in the tech industry.

Yes, that’s a big claim, and you will still need to know a few other things that will be hosted through the Linux terminal, such as Python, vim and git, but these things too will be with you for your entire life, free of the feature and user-interface churn that so plagues desktop software in desktop environments. You can still use desktop software, but try focusing on stuff that’s free across any desktop, such as the Photoshop replacement Gimp, the Illustrator replacement Inkscape, the Maya replacement Blender, and the Microsoft Office 365 replacement LibreOffice. The main products are burdened with feature-creep to justify keeping you spending.

I started out on Adobe Photoshop 3, the first version to introduce layers. Today, the Adobe Creative Cloud is over $50/mo for the privilege of what amounts to being in the graphics arts guild. If you don’t pay the Adobe tax, you’re out, and if you’re just a casual graphics software user like me, it’s not worth it. It feels like Adobe drove me away with a stick. And that’s just the graphics software. Similar techniques are used by the vendors all the way down to the OS to keep you spending and part of their quarterly projections. That’s the main thing for them. It’s the wrong spirit to have infused into your tools.

The Free and Open Source Software movement possesses the right spirit. Learn once, run anywhere, never pay. The main argument to keep you on the proprietary stuff, in my estimation, comes down to convincing you of the next must-have proprietary feature, like automatic background removal in Photoshop. However, the truth is these features will eventually come to FOSS software too if they really are must-haves, but most aren’t. They’re fringe-features that are fun for a time but whose main purpose is to instill in you the fear of missing out as your competitors and peers enjoy these features. It’s lack of confidence in one’s own ability they prey on, and it’s group-think they rely upon.

Ugh, that’s all too deep of background.

I need to appeal to people in general and they lack the attention span to go deep early. Jump right into it. Don’t let anything sound boring ever.

I’ve got a new machine set up.

Okay, let’s see how this should go.

The main thing is to massage your public website into the correct message. Plow everything into MikeLev.in and then let it trickle out to other sites like LPvg.org. Basically become your own competition, but don’t get hung up.

Cut this journal entry here. :w


Thu Sep 08, 2022

Run Jupyter from LXD Linux Container on Windows

This is not for the faint of heart, but I’ve made it as easy as possible and it will change your life.

We’re going to start this video from the point of having LXD working under WSL2.

LXD is the Linux container system that’s better than Docker for this sort of thing.

WSL is the Windows Subsystem for Linux, to which all roads lead these days—even Docker.

We go this route to make Windows fully compatible with the modern developer world.

We don’t simply use Docker because we need a full Linux system, exactly what LXD is for.

The abilities you learn in this process are timeless and will future-proof your career.

First we create an LXD container. These get automatically started. Make sure any other containers you may have running are stopped.

lxc launch ubuntu:20.04 kingcoyote

If you’re not at LXD working under WSL, refer to my videos to get here.

Add a proxy device to the container that forwards localhost:8888 requests on WSL to the container.

lxc config device add kingcoyote jupyme proxy listen=tcp:0.0.0.0:8888 connect=tcp:127.0.0.1:8888

Log into container and sudo apt update and sudo apt upgrade.

lxc exec kingcoyote -- su --login ubuntu
sudo apt update
sudo apt upgrade -y

Add the deadsnakes personal package archive (PPA) to your Ubuntu repo system.

apt install Python 3.10 and its virtual environment manager, venv.

Use venv to create a virtual environment in your home folder.

sudo add-apt-repository ppa:deadsnakes/ppa

sudo apt install python3.10 -y
sudo apt install python3.10-venv -y
python3.10 -m venv ~/py310

Create and edit your .bash_profile to activate the venv every time you log in.

vim ~/.bash_profile
i
source ~/py310/bin/activate
[Esc]:wq[Enter]

Log out and into the container to ensure your profile-edit and venv are working properly.

exit
[Up-arrow+Enter]

Your prompt should look like:

(py310) ubuntu@kingcoyote:~$

Install JupyterLab with pip, generate a config file, set a password, and make a repo directory.

pip install jupyterlab 
jupyter server --generate-config
jupyter server password
mkdir ~/repos

Change directory to /usr/local/sbin and sudo vim jn, then put this script inside to run Jupyter in a way that the proxy can reach. In other words, the command forces Jupyter Server to run using [IP]:8888 instead of localhost:8888. It also wipes dead screen sessions and prevents double-running.

cd /usr/local/sbin
sudo vim jn

Copy/paste this script into jn:

#!/usr/bin/env bash
# Put the following lines in /usr/local/sbin/jn
cd ~/repos
screen -wipe >/dev/null 2>&1
if ! screen -list | grep -q "jupyter"; then
    screen -dmS jupyter /home/ubuntu/py310/bin/jupyter lab --ip 0.0.0.0 --port 8888 --no-browser
fi

Add this code to jn, save and exit. You must now make this script executable.

sudo chmod +x jn

Edit this line into your .bash_profile

jn

Log out of container and back in to test your edits and ensure jupyter’s running in a screen. Try the following command:

exit
[Up-arrow+Enter]

screen -ls

If it shows as a screen then you can log into it with the following command:

screen -r jupyter

Press Ctrl + A and, while still holding Ctrl, tap D to detach from the GNU screen session.

All good? Okay, go to Edge in Windows and visit http://localhost:8888

Enter password. Allow browser to remember password.

Switch theme to dark mode.

Go to Edge menu and turn page into app. Allow it to add to Start Menu.

Quit browser and run from Start Menu. No password!

Switch app to full screen mode.

Check Terminal for forward-slashes.

You can try making a notebook and running:

import sys

sys.version

sys.executable

import os

os.name

os.getcwd()

Thu Sep 08, 2022

Thinking Through Jupyter Video

Okay, I want to do this Jupyter video right. I want to do it in a way people can follow along step-by-step. That means taking the /data folder out of the process, because most people won’t have the NAS network share, and replacing it with a more traditional symbolic link. Don’t use github because that has a special meaning. Microsoft itself does similar tricks using a wsl network location on Windows 11, but that’s branding. Go generic. Perhaps it’s time for a ~/transfer location. Yes! It’s the same as the “t” from FTP and TCP/IP. People will instantly get it.

I know I have a big slide to educate all about Jupyter, but I may make that another video. It helped clarify my thoughts and even figure out how to do it, but… but… oh, context is so important. Maybe I shorten it or just fly through it. Dazzle them. Blipvert an education into their minds that they need even if they don’t retain it all at first.

There are so many potential paths here. It’s confusing enough. Power through it with the straightest line. Introduce as few new concepts as possible. Make a new repo that stuff can be curled from. Use the same trick that you’re using to get the distrod scripts to get:

Let’s make some strong nicknames to frame your current round of work and set yourself up for spinning successful sequence/playlist combos that’ll work well in both social and search. Each sequence/playlist gets a very strong nickname. Try to go with single words. I know I’ll never penetrate search on single-word stuff, but it’s snappy and memorable and will make for good navigational sections on my website. Map each sequence to a section. Have a limited number of sections. Let the cream rise to the top.

Yeah, that’s the ticket. Keep all the main and secondary navigation links on your website flexible. Swap things in and out. Never change URLs. Lean towards short, single-word directory names. Expose no file extensions. Start working your site itself into your videos. Keep tweaking things towards the righteous feedback loop. Tap deep long and hard-won knowledge and know-how I carry around. Cut the catapult ropes. Release potential. Shock people with instilling a “where has this guy been?” feeling.

Okay, so this upcoming video. I don’t think I’m going to be shooting it today. I want to get my single-take practice in, and finish the script for that matter. Avoid all the distractions and tangential paths that exist on this route. It’s a trap! Make this your break-out video. Make this a must-watch video for everyone who uses Jupyter.


Wed Sep 07, 2022

Getting Ready to Teach Installing Jupyter Right

I’ve done quite a run of videos lately. I’ve even deleted a few super-short posts here so as to not litter this blog (even more) with disjointed and incomplete thoughts. All my focus is going over on those vids and I only come over here to type when I have to think things through. First, I feel I should preserve this most magical incantation, which is becoming the .bash_profile of the new LXC Linux containers under WSL that I’m creating more frequently for testing.

# EXECUTE THESE FROM WSL TO GET COMMON LOCATIONS
# lxc config device add containername data disk source=/home/healus/data/ path=/home/ubuntu/data/
# lxc config device add containername dotssh disk source=/home/healus/.ssh/ path=/home/ubuntu/.ssh/

# EXECUTE THIS TO LOGIN TO CONTAINER
# lxc exec containername -- su --login ubuntu

# EXECUTE THESE FROM INSIDE CONTAINER
# sudo apt update
# sudo apt upgrade
# cp ~/data/home/.screenrc ~/
# cp ~/data/home/.bash_prompt ~/

# sudo add-apt-repository ppa:deadsnakes/ppa
# sudo apt install python3.10
# sudo apt install python3.10-venv
# python3.10 -m venv ~/py310

# Log out and into container (to execute new .bash_profile)

# pip install jupyterlab

export SCREENDIR=$HOME/.screen
source ~/py310/bin/activate
source ~/data/display.sh
source ~/.bash_prompt

This gets things right up to the point where I apt install x11-apps, Firefox or things like it. It’s clearly in prep for Jupyter, which is the really big win that awaits. That plus nbdev and I just may spark a revolution. It’s the best way to install Jupyter on Windows. All roads lead to WSL2 (Docker in particular), so I figure why not?

For the Jupyter install, so far I figured out that I can get it to work with the Linux Firefox browser under either WSL or LXC under WSL. But only in the WSL-only case can I get it to come up under localhost:8888 on the native Windows side under Edge, and an Edge (fullscreen) app in particular where it both looks and performs best. That’s my goal, but with it being served from LXD under WSL. There are snafus.

The place it appears to break down is serving it from LXD. There’s some special Microsoft WSL magic going on making localhost on WSL also localhost on Windows. That’s a gift. Take it! Stick with the defaults and lean into what Microsoft’s trying to make easy. Nobody’s trying to make hosting under LXD under WSL easy. That’s in my court. Stick close to the defaults and sprinkle in as little magic pixie dust as is actually required to get the job done.

I have to understand the networking context of both better, but especially of the LXD default, which is to use that lxdbr0 bridge. Think! What do you need to know? I need the big picture, like the way I understand QEMU as a virtual LAN with its own internal IPs and DHCP server. Which of these is doing that, and with what variations on the theme? Okay, let’s list what we know. For the sake of clarity, focus on IPv4. Take note of the IPv6’s, but write things out based on v4 observations, because it’s easier to visually recognize subnets and such.

My machines all have a public IP, which is actually the IP of my ISP network drop and router. It is also an Internet gateway, making all machines on the inside have that IP for public request purposes. The WiFi router itself does the LAN trick, so all machines on WiFi as physical hardware get assigned an internal IP. Internal IPs can be seen by going into PowerShell and typing ipconfig.

Okay, typing ipconfig from PowerShell shows me my main internal IP subnet, which is the same one used by my NAS, phones and anything else connecting to the Internet through the router. They’re all dynamically assigned an IP on the main outer network in my house. But there’s also another IP shown here, also an internal number, labeled Ethernet adapter vEthernet (WSL). However, it ends in a .1 for its last octet, which is very suspicious. Windows-side stuff is going to act as the router (and Internet gateway) for WSL-side stuff. So we’re a bloodhound on the trail of clues.

I fire up WSL and type ifconfig from in there. Yes, it’s ipconfig on Windows and ifconfig on Linux. Go configure. Now ifconfig shows two interfaces as well, the first being a private IP that shares only the first two octets with the second private IP from the Windows side. Curious! The second interface listed is just the traditional lo: loopback at 127.0.0.1, there’s no place like home. So that doesn’t tell us anything more about LXD yet.
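To keep the subnet sleuthing straight, Python’s stdlib ipaddress module can classify addresses for you. A quick illustrative check (these addresses are stand-ins for whatever ipconfig/ifconfig actually report on your machine):

```python
import ipaddress

# Classify a few sample addresses: RFC 1918 private ranges vs. the loopback.
for addr in ["192.168.1.10", "172.17.112.1", "10.36.5.171", "127.0.0.1"]:
    ip = ipaddress.ip_address(addr)
    print(addr, "private:", ip.is_private, "loopback:", ip.is_loopback)
```

The 10.x.x.x, 172.16–31.x.x and 192.168.x.x ranges are all private, so seeing a completely different first octet at each layer is a strong hint that each layer is running its own isolated virtual LAN.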

Okay, I boot and log into the LXC container and see that ifconfig isn’t there. So I sudo apt install net-tools and check again. Ooh, a completely different internal IP different even from the first octet. This thing is deeply isolated. No Microsoft “make it easy” tricks here. I’m going to have to bridge, nat, proxy or whatever on my own to get this thing accessible WSL-side, and thus Windows-side. My gut tells me it’s going to be lxc config commands that do this. Stay away from editing routing tables. That stuff is moving targets, nftables vs. iptables and all that. Yuck. I’m not a netadmin.

Okay, deep breath. Think! What are people searching on that I should be searching on. What’s the winning concept here? Hmmm. I believe it must be general hosting from an LXD container. Normally that’d be a security risk I’d avoid, preferring and leaning into the localhost loopback modes, but in this case even if you expose it to the outer network, it’s still isolated by being part of the WSL inner network. There’s no harm in it! Wow, okay. So it’s exposing LXD to the world for hosting purposes I’m interested in. And in doing so, we may just be leaving localhost behind as a Jupyter address. We’ll see.

Hmm, go look at the lxc command with lxc --help. Well, this is illuminating. There’s a lot of stuff in there, even aliases. Okay, think! Try lxc info… wow! Yeah, I’ll be able to do the Jupyter video tomorrow. There’s a light touch waiting here. It may be setting public to true. There are a few other leads like proxy.

Ah ha! A key thing to do is:

lxc config show quadling

Interesting! Okay, now research how to run Jupyter from an IP instead of localhost.

jupyter server --generate-config
jupyter server password

And now it’s accessible from Linux Firefox with the IP:

http://10.36.5.171:8888/

And now we have the basis for an lxc config proxy:

lxc config device add quadling jupyme proxy listen=tcp:0.0.0.0:8888 connect=tcp:127.0.0.1:8888

Success!

It is just at the price of having to set a password on Jupyter.

Okay, document these steps in your video notes then copy the unified video notes back here and use it as your starting point.

# EXECUTE THESE FROM WSL TO GET COMMON LOCATIONS
# lxc config device add containername data disk source=/home/healus/data/ path=/home/ubuntu/data/
# lxc config device add containername dotssh disk source=/home/healus/.ssh/ path=/home/ubuntu/.ssh/
# lxc config device add quadling port8888 proxy listen=tcp:localhost:8888 connect=tcp:localhost:8888

# EXECUTE THIS TO LOGIN TO CONTAINER
# lxc exec containername -- su --login ubuntu

# EXECUTE THESE FROM INSIDE CONTAINER
# sudo apt update
# sudo apt upgrade
# cp ~/data/home/.screenrc ~/
# cp ~/data/home/.bash_prompt ~/

# sudo add-apt-repository ppa:deadsnakes/ppa
# sudo apt install python3.10
# sudo apt install python3.10-venv
# python3.10 -m venv ~/py310

# Log out and into container (to execute new .bash_profile)

# pip install jupyterlab
# jupyter server --generate-config
# jupyter server password
# jupyter lab --ip 0.0.0.0 --port 8888 --no-browser


export SCREENDIR=$HOME/.screen
source ~/py310/bin/activate
source ~/data/display.sh
source ~/.bash_prompt

Fri Sep 02, 2022

From Zero (Windows) to Linux In 8 Steps

category: windows

Okay, plan out this video series.

Dive right in with removing OneDrive. Done. These blip-videos are in the perfect aspect ratio for YouTube. I save directly from OBS to mp4, which means I can upload directly to YouTube without any editing.

In my journaling / blogging / vlogging / jekyll / github pages system, I have now added the category “windows” which will represent this series. This first page in the series will both be my planning and the optimized page on this first very important step in the series: removing OneDrive.

Next?

Go down Settings / Personalization

It’s time to plunge into the Settings program, adjusting the Taskbar personalization, among other things. Marketing-oriented stuff is cleverly hidden under such settings as Startup. Make sure to nix those intrusive effers.

Okay, putting it all together:

  1. Uninstall OneDrive
  2. Unpin everything from Taskbar
  3. Delete all Tiles from Start menu
  4. Change Settings / Personalization
  5. Turn off Weather, Cortana, Search & other Taskbar clutter
  6. Bring back app icons you want to Start Menu & give shortcuts
  7. Use Virtual Desktops
  8. Install Linux

Okay, I made this series of videos. Embed them all here with their headlines, and use that as the starting point for your new sequence-generator under your content management system. This is key to organization, thus story-telling and thus breaking out and blowing up on YouTube and other outlets.

Fight the Chaos of Windows 10 : Use Virtual Desktops & Linux

Uninstall OneDrive From Windows 10

Unpin Everything From Your Windows 10 Taskbar

Delete All Tiles From Windows 10 Start Menu

Windows 10 Settings Personalization Taskbar Changes to Focus

Turn off Weather, Cortana and Notifications in Windows 10

Launching Windows 10 Apps With Keyboard Shortcuts And No Special Software

Use Windows Virtual Desktops and Show All Icons on Taskbar

Make Windows Allow Focus Get In The Zone Achieve Flow

What Does Installing Linux on Your Windows 10 Computer Really Look Like?


Fri Sep 02, 2022

Fight the Chaos of Windows 10 : Use Virtual Desktops & Linux

category: linux

This is the beginning of a reboot of my “Make Windows 10 Usable” series, now with an emphasis on fighting the chaos. I’m trying to make my videos short and to the point now, consistent with shrinking attention spans and the need to contest shrinking attention spans. Eliminate distractions and get into the zone. Use Virtual Desktops and the Windows Subsystem for Linux (WSL). Strip out the tiles from the Start menu. Turn off the clock. Remove all Taskbar icons. If you’re a Jupyter Notebook or JupyterLab user, run the server Linux-side under GNU screen and get all the advantages of forward slashes and Linux FOSS repos under Jupyter, setting the stage for true automation.


Fri Sep 02, 2022

Your current video card driver does not support this NVENC version, please update your drivers.

I was getting ready to start my reboot of settling into a new Windows system yesterday on a laptop I reset to Windows 10 factory default and decided to go with OBS for screen capture, which I was able to get from the Microsoft store. But upon trying to start a capture, I was immediately hit with the message:

Your current video card driver does not support this NVENC version, please update your drivers.

Now whenever I get such a precise error message, I’m going to shoot a video to show how to fix it:

Unfortunately, I had to install Camtasia Studio again for screen capture because of a chicken-and-egg problem: I couldn’t use Open Broadcaster Software until this driver update was complete. Happily, I was able to download the video driver from the NVIDIA site. The hardest part was figuring out which driver I needed for my laptop, which turned out to be the NVIDIA GeForce GTX 1060, which I could tell from going to Windows Device Manager and expanding Display adapters.


Thu Sep 01, 2022

OpenPyXL for creating a formatted Excel file from Python

I need to focus on some day-job stuff. I’ve got some openpyxl work to do. Do it by hand, but use your by-hand process to plan the automated version.

1, 2, 3… 1? Create the “blank” file with openpyxl.

First we do the example from https://openpyxl.readthedocs.io/en/stable/

from openpyxl import Workbook
import datetime

wb = Workbook()
ws = wb.active
ws["A1"] = 42
ws.append([1, 2, 3])
ws["A2"] = datetime.datetime.now()
wb.save("sample.xlsx")
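One quick way to confirm the save worked is to load the file right back with load_workbook. Here’s a pared-down sketch of the same sample (minus the datetime cell, which would otherwise overwrite A2):

```python
from openpyxl import Workbook, load_workbook

# Build a pared-down sample file, then read it back to confirm the values landed.
wb = Workbook()
ws = wb.active
ws["A1"] = 42
ws.append([1, 2, 3])  # append() writes to the next empty row: A2=1, B2=2, C2=3
wb.save("sample.xlsx")

ws2 = load_workbook("sample.xlsx").active
print(ws2["A1"].value, ws2["B2"].value)
```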

Okay, confirmed working. For the next steps, we go over to the docs at https://openpyxl.readthedocs.io/en/stable/

We’ve got pandas support so inserting dataframes into a tab is no problem https://openpyxl.readthedocs.io/en/stable/pandas.html

Okay, now I’m stepping through each item in a loop and outputting a different tab into the spreadsheet:

from pathlib import Path

from openpyxl import Workbook
from openpyxl.utils.dataframe import dataframe_to_rows


wb = Workbook()
list_of_df_variations = []
for variation in variations:
    variation_path = Path(f"{prefix}{variation}")
    Path(variation_path).mkdir(parents=True, exist_ok=True)
    df_filtered = df_unfiltered[df_unfiltered["Variations"].str.contains(variation)]
    raw_pivot = Path(f"{variation_path}/raw_pivot{data_extension}")
    print(raw_pivot)
    list_of_df_variations.append(df_filtered)
    df_filtered.to_excel(raw_pivot, index=False, float_format="%.2f")
    ws = wb.create_sheet(variation)
    for r in dataframe_to_rows(df_filtered, index=True, header=True):
        ws.append(r)
xlfile = Path(f"{prefix}{name_space}-ranklayer.xlsx")
wb.save(xlfile)

The dataframes are quite big so this takes a while to run. The documentation says this “streams” the data into the file, so there should not be exploding memory issues, except insofar as the dataframe itself may be large.

And if you’re actually looking at this code, it won’t run without all the setup work. There’s a pre-existing dataframe called df_unfiltered which is a super-set of all the variations. It has a Variations column, which is a comma separated field stuffed with variations, which you can think of as tags or categories. The loop simply makes a smaller df for each variation or tag in the (also pre-existing) variations Python list.
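In miniature, that filtering looks like this (toy data; the Variations column is as described above, everything else is made up for illustration):

```python
import pandas as pd

# Toy stand-in for df_unfiltered: a comma-separated "Variations" tag column.
df_unfiltered = pd.DataFrame(
    {"keyword": ["red shoes", "blue shoes", "red hats"],
     "Variations": ["red,shoes", "blue,shoes", "red,hats"]}
)

variations = ["red", "shoes"]
for variation in variations:
    df_filtered = df_unfiltered[df_unfiltered["Variations"].str.contains(variation)]
    print(variation, len(df_filtered))  # "red" matches 2 rows, "shoes" matches 2 rows
```

One caveat: str.contains treats its pattern as a regex by default, so if variation tags ever contain regex metacharacters, pass regex=False.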

After that’s done I’m going to delete Sheet1 in a second pass:

# Delete the default sheet (named "Sheet")
from openpyxl import load_workbook

wb = load_workbook(xlfile)
sheet = wb["Sheet"]
wb.remove(sheet)
wb.save(xlfile)

Ugh! This takes so friggin’ long! My files are like 70MB. It’s much better to just put this tab delete in the loop:

from pathlib import Path

from openpyxl import Workbook
from openpyxl.utils.dataframe import dataframe_to_rows


wb = Workbook()
list_of_df_variations = []
for variation in variations:
    variation_path = Path(f"{prefix}{variation}")
    variation_path.mkdir(parents=True, exist_ok=True)
    df_filtered = df_unfiltered[df_unfiltered["Variations"].str.contains(variation)]
    raw_pivot = Path(f"{variation_path}/raw_pivot{data_extension}")
    print(raw_pivot)
    list_of_df_variations.append(df_filtered)
    df_filtered.to_excel(raw_pivot, index=False, float_format="%.1f")
    ws = wb.create_sheet(variation)
    for r in dataframe_to_rows(df_filtered, index=False, header=True):
        ws.append(r)
xlfile = Path(f"{prefix}{name_space}-ranklayer.xlsx")
wb.remove(wb["Sheet"])
wb.save(xlfile)

Okay, one giant step towards automating one of my bigger SEO deliverables.
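A possible speed-up for those 70MB saves, though not something used above: openpyxl also has a write-only mode that streams appended rows to disk with roughly constant memory. As a bonus, write-only workbooks are created without the default “Sheet” tab, so the delete step disappears entirely. A minimal sketch, with illustrative sheet and file names:

```python
from openpyxl import Workbook, load_workbook

# write_only=True streams appended rows instead of keeping every cell
# in memory; such workbooks start with no sheets at all.
wb = Workbook(write_only=True)
ws = wb.create_sheet("shoes")  # illustrative tab name
for row in ([1, 2, 3], [4, 5, 6]):
    ws.append(row)
wb.save("streamed.xlsx")  # illustrative file name

# Read it back: only the one tab we created exists.
tabs = load_workbook("streamed.xlsx").sheetnames
```

The trade-off is that write-only sheets are append-only: you can’t go back and set individual cells like ws["A1"].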


Thu Sep 01, 2022

Planning Multiple Blogs Within One Github Pages Site

OMG, Sept 1st! Knock off one of your “Every Little Things” that must get done. Test secondary sequences within Github Pages. The best instructions on how to do this appear to be https://www.garron.me/en/blog/multi-blog-site-jekyll.html

Yeow! Not as elegant as I had hoped, and it has serious ramifications on my category approach to making sub-blogs. It’s the same thing! Ugh. Okay, so let me think. This means that all posts are sort of thrown into one big vat and “default” prev/next arrows string them all together in one big sequence. Okay, this’ll be fine for bringing back my old content. I guess there’s no harm in allowing URLs to change now that my old content is effectively dead on the Internet with Google’s many algorithm changes over the years that have attacked the long-tail. Only “verbose” searches will turn them up.

This being the case, it’s probably best to “clean up my site” and the git repo that contains it by getting rid of all the separate date-named folders and index.md files that constitute my old blog after my WordPress export from HostMonster, an old-fashioned cPanel web host I used before my switch to Github Pages. I pay my $100/year Microsoft tax for unlimited private repos and collaboration, and as a thrown-in feature, I also get unlimited hosting of static websites. This is effectively how I used everything I hosted on HostMonster, so I figure why not? So I sucked over all the data and transformed it into markdown arranged into files just-so to reproduce my original URLs. Yeah, that’s the kind of stuff I just do for fun.

I did this before I understood the Jekyll _posts directory to create blogs and had I known, I’d have just used the blogpost file naming convention and had a much more tidy directory structure within the repo. As it was, I created a series of year/mo/blog-title/index.md locations. I mean, tons! And I can undo that. That’ll be my morning project before work gets underway, and the result will be you seeing the list of posts on my /vlog/ page expand… into the past woooooo!

But first, I’ll clearly be thinking a lot more about blog categories to get these lists under control. Always, I’ll be using this one long journal.md file as my master list of blog posts. So I’ll be pulling all the content from those directory structures into this file. It’s a giant concatenation, or consolidation of content from the past, if you will. I even have a much older Webmaster Journal from my Active Server Pages days that I may do this with, but I’m not sure how far back I want to go yet.

There’s a lot of tiny issues to work out, but I don’t have to work them out all at once. I can have confidence it’s going to all come together and have the flexibility to be gradually refined over time. Some issues that come to mind are:

Organization is key to so many things. If you can’t even keep yourself organized digitally, how can you expect to do so in real-life where things are that much more difficult? At least with digital, you have just one place to look (maybe 2 if you consider laptop and phone). But within laptop and phone, no matter how many remote, cloud and other locations you keep and organize things, because you access them through laptop and phone, they all count as being in a single place (your digital portal).

Alright, let’s discuss your digital portal. What’s a digital portal? It’s the platform, such as laptop, phone or tablet, that you use to access the digital world, a world which, although it exists within the real world, operates as something of a separate sub-world, a nested reality, if you will.

Wisdom of the ages! That’s what I want to come across on my website. Freedom of thought. Immunity from group-think. Seeing things closer to objectively than comes naturally.


Wed Aug 31, 2022

AI Will Do SEO Better Than Any Human Ever Did

I’ve got to use my public website more and to greater effect.

The power to prevent pander… LOL! I’m an SEO taking the diametrically opposite moral position from the field of SEO, in which we pander to what the user wants so we can step into their search path. Blech! It almost makes me want to vomit. I can use my lifetime career in SEO to teach people how the first wave of AI is going to be used against them.

AI panders to get you to stay on Facebook or whatever site and drive up the money they’re making off of you.

AI is the ultimate SEO.


Wed Aug 31, 2022

This Is Where I Reversed The ASCII Rabbit

First I used vim block select with Ctrl+v and then j & l to highlight a block of rabbit. I then used ![Space]rev to reverse the characters. I then changed the direction of all my slashes and parentheses. And hocus pocus alakazam! The rabbit’s running in the right direction. No more going backwards for me.
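The same mirror trick can be sketched in Python, for anyone who’d rather script it than do it interactively in vim. This is my illustration, not part of the original workflow: reverse each line, then swap the direction-sensitive characters.

```python
# Hypothetical helper mirroring the vim trick: reverse each line of
# ASCII art, then flip the characters that have a mirrored twin.
MIRROR = str.maketrans(r"/\()<>", r"\/)(><")

def flip(art: str) -> str:
    """Return the ASCII art facing the other direction."""
    return "\n".join(line[::-1].translate(MIRROR) for line in art.splitlines())
```

Applying flip twice gives back the original, since both the line reversal and the character swap are involutions.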

<!--
             /)   (\             
      /)\_ _//     \\_ _/(\      
  ___(/_ 0 0         0 0 _\)___  
*(     =(_T_)=     =(_T_)=     )*
  \  )   \"\         /"/   (  /  
   |__>-\_>_>       <_<_/-<__|   
-->

This was to update my public journal banner to this:

<!--                                                                  /)
      __  __ _ _          _               _          __        /)\_ _//   
     |  \/  (_) | _____  | |    _____   _(_)_ __    / /    ___(/_ 0 0     
     | |\/| | | |/ / _ \ | |   / _ \ \ / / | '_ \  / /   *(     =(_T_)=   
     | |  | | |   <  __/ | |__|  __/\ V /| | | | |/ /      \  )   \"\     
     |_|  |_|_|_|\_\___| |_____\___| \_(_)_|_| |_/_/        |__>-\_>_>    
                                                        
     Chase the rabbit to the wonderland of Linux, Python, vim & git -->

Sat Aug 27, 2022

What’cha Gonna Do When You’re Fifty Two?

Happy Birthday to me! I offered to take Adi to Great Adventure tomorrow but they said they’d prefer a Michael’s crafting weekend. It almost brought tears to my eyes. In a distant second was learning how Princess Leia knows Obi-Wan Kenobi. After having cancelled almost all my online subscriptions over money, Verizon offered me 6 months of Disney+ free for signing up for broadband after my move. I watched the 1st 2 episodes of She-Hulk and then Parts 1 & 2 of Obi-Wan. Many tears. I can’t believe the world has turned this way. All my childhood loves have turned mainstream. People being in love with their phones is a lot like Amiga computers being in everyone’s lives too. I can’t believe how the world has turned to geeks. Wow. The greatest birthday present ever. And now I’m going to go to sleep listening to particle physics seminars on Audible. Wow, I was going to start my little video series, but this has been a nice birthday present to myself. I also stocked the house with food really well, including the biggest New York strip steak I’ve ever seen. Dinner tomorrow night. Mmmmm.

Okay, birthday day thoughts:

What’cha Gonna Do When You’re Fifty-Two?

Nahhh, more like:

What’cha Gonna Do @ Fifty-Two?

I need to push forward operation ELTgd. That’s “Every Little Thing gets done”. It’s all those little things in life that would so permanently move your life forward if only you buckled down and did them. Excuses, artificial dependencies, and lunging at neurochemically addictive distractions are the enemy of operation ELTgd. Deal with them. Talk it out loud, that’s such a good step. It’s a “POWER GOOD” step, which is another principle percolating around in my mind:

POWER GOOD

Yes, to do good takes exerting some power, and then results in more power. That’s the essence of a virtuous feedback loop. Power the good. Channel goodness to be powerful. Use that power to do good. Good for yourself, good for those in your immediate life, and good for the world.

Don’t engage in pissing matches with those who do not even merit your time. You can identify them because they are the inverse of power-good. They’re power-bad. They have a downward-spiraling, corrosive and corruptive feedback loop. You are the average of the top 5 people you let be a part of your life, you say. Ugh, that explains a lot. You got some fixin’ to do, Mr. Michael. I’m not sayin, I’m just sayin.

1, 2, 3… 1? Don’t be objectionable. Oh well, everybody’s going to be objectionable to someone eventually, and oftentimes that means you’re on the right track. So let me correct that. Be unobjectionable (albeit still somewhat controversial) to the right people, and be completely objectionable in a righteous, anyone-examining-it-would-realize-I’m-doing-the-right-thing sort of way.

Yup.

Happy Birthday, Mike. These are the sort of realizations that are very long in coming. I totally have to organize this blog. That would be such a good birthday gift to myself. I still have to do quite a bit of writing for my boss for my weekly report, and it’s going to be a good one both for him and for me. It’s going to be part of my birthday gift to myself. The quiet, competent, insightful and foresightful company-man. Watching out for myself, my career and my skills, while still doing what’s best for the people nice enough to keep me on the payroll and appreciate my work.

Maslow’s hierarchy of needs is greatly met by this 100-year-old publisher that was once-upon-a-time a competitor to Asimov’s and Analog magazines. They’re not anymore, but some of that spirit continues. The fax holding company that bought it all up at least rebranded themselves back to the name of their greatest acquisition, that 100-year-old (paper) publisher. Now if only we could get back that property a competitor bought that bears our name.

Refine things, and let things get refined, over time. You can’t get it all right on one pass and trying to do so is consistently demoralizing, sucking the love out of otherwise love-worthy things. So don’t bite off more than you can chew.

Go make yourself a Bagel-All-The-Way for your Birthday. A Birthday Bagel! Use the belly-lox on your birthday bagel. Lots and lots of capers and red onion. Layer on the Philly cream cheese. Mmmmm. And of course, on a big toasted Everything Bagel.

Never not be dispensing wisdom and working on your website. Your website is now where you organize your wisdom, because markdown, Jekyll, Github Pages and the like. I have Python worked into it too because of the blogging system that pushes this out. Don’t forget to refine your category system… today! THAT’S your real birthday gift to yourself. So much follows a rockin’ solid category system.

Also finish fleshing out the subnavigation of my site.

Publish my vim custom dictionary, LOL!

Give the different things you’re focusing on good, strong nicknames.

Categorinator vim dictionator


Fri Aug 26, 2022

Plot The Awakening / Rig For YouTube Success

A micro-blip 1-min video every day. That’ll give you time to edit. Also make it vertical format friendly for TikTok and such.

Assume each video is the first time anyone ever saw you or is being exposed to your message. Idiot-proof each vid to let folks start at the beginning.

Output the master sequence once for each big platform, with the page-embeds. DON’T CREATE DUPLICATE CONTENT.

Connect the dots between the righteous feedback loop projects.

Hold yourself to a rigid schedule of a video per day.

OTHER TOPICS TO INTERJECT: PyPI 2FA Fiasco

Create a path that even the biggest idiot can follow.

VIDEO PER BLIP-MESSAGE

You are the subject of a mind control battle between Apple, Google, Microsoft and Amazon. In my day it was between General Electric, Westinghouse and Disney, there being only 3 TV channels back then. It’s been a joy to watch the shift.

That we live in a world where Disney owns Marvel and Star Wars is chillingly cool. That the mind control folks compete to bring ever more powerful magic into our lives is nothing less than, well, miraculous. The phones in our pockets are magic wands.

Mind control starts with putting hardware in your life whose startup procedure and operation is under the control of someone who had something to sell. You have been charmed and now the priority of those who’ve done it is to keep you enthralled.

The Silicon Valley kids have wrested the baton from the military industrial complex. Control of the hardware and OS makes Apple and Microsoft heirs apparent. Both Google and Amazon are surprises due to their adept control of the cloud and surprise hardware.

Microsoft was almost squeezed out of the game, but then started making mice. After that, game consoles and laptops. Now their product designs are more innovative than Apple’s. To top it off, they bought Github to remain in the lives of nearly all developers.

Because hardware is expensive and requires economy of scale to compete, there are no small players or dark horse heroes on your side. Oh, except Richard Stallman, Eben Upton and the few who have brought us the free and open source movement, no longer just software, because Raspberry Pi.

The British ARM architecture of our mobile phones has clobbered Intel, but is still proprietary. We need free and open source processors printable in home fab kits to secure our freedom in the digital age. We also need better mesh networking and mobile grid tech.

If we don’t destroy ourselves, we’ll get there. But like in H.G. Wells’ The Time Machine, society will split into the mind-controlled Eloi sheep and the savage cord-cutting off-gridder Morlocks. The trick is to peacefully coexist and for us Morlocks to present better than our leader, RMS.

Windows 10 is still the reality for most folks. But let’s start you on your way to freedom and open source (or FOSS). We can do this while keeping you on whatever desktop environment you’re most comfortable.

This is the first in a series of quick videos in which I’ll teach you how to make the command line interface (or CLI) your new home. It’s achievable. It’s a new normal. I’m going to start you off with the best modern habits for focus and productivity.

Modern desktop OSes have gone to war against your ability to focus. Windows 11 has taken away the ability to remove the clock and a few alert icons from your taskbar. You can auto-hide the taskbar but that too interrupts focus.

I recommend you stay on Windows 10. It lets you get down to a blank desktop like this and desktop transitions like this. The icons are for keyboard shortcuts I’ll say more about later. If you’ve already upgraded, don’t worry.

The end game is to have your first desktop running a full-screen terminal with vim. But don’t worry, it’s not terminal. It will only feel that way at first. Virtual desktops are a feature built into all modern desktop OSes, and that’s our next step.

Stop using Alt+Tab or the taskbar to switch apps. Make apps full screen and use one per virtual desktop. Add more like this. Use Win+Left & Right to switch. Use Win+Tab to see all your desktops. Move apps between desktops like this.

Next delete absolutely everything from your start menu. We’ll put things back as you need them. Delete all icons from desktop. Go into settings and [get instructions]. Hide all alerts like this, and finally turn off your clock.

All good? Good, you’re ready to focus. What we’re doing is fighting chaos. Agents of chaos want to reduce your ability to focus, go into the zone and be effective. They lack this ability themselves, and you developing it scares them.

Those who push back against you taking this next step, internal or external, are not your friends. They fear your change from a static to a dynamic personality. Use it to identify those who fear your growth.


Fri Aug 26, 2022

Knowing Protection & Emotional Protection

Not everything you think you’ve fallen in love with has to survive, be featured in your life, get shown on your website, live on in your prayers or whatever.

It’s not an easy pill to swallow, but reinventing yourself as you see fit can sometimes only happen when you paint over the canvas, and that’s okay. Take a snapshot of the prior paintings if you need, but then throw them in the archive for the occasional rosy reminiscence.

It’s fine to reminisce if it’s only for fun and you keep it in perspective. Such activity literally lets them back into your lives. Consider for a moment the possibility of the eternal soul. Entertain the notion that we may all be different versions of the same being. Are you still okay with what you’re doing? Okay, then we’re good.

We’re born not knowing shit—but maybe perhaps some little inkling based on the trait-bundles of our 23 chromosome pairs inherited randomly from one or the other of our parents. There’s that vibe, then there’s the vibe in the womb and the vibe of the circumstances you’re popped out into. Still, for the most part, you don’t know shit.

And so sometimes you become a little shit. You might not think so now, but absolutely horrid behavior is unsustainable. It will catch up with you and wrack your soul. From 25 to 28 years old, sensitive artistic types who’ve been dishonest with themselves clock out, no matter how much material success they’ve enjoyed. It’s the great twenty-eight. I could quote musicians out the wazoo, but for me it’s Reddit co-founder Aaron Swartz. It’s not merely mental imbalance. It’s a snap-back effect from the realization of life-decision behavior that’s too crushing for sensitive types.

Survive. We all go through some variety of the great twenty-eights. Be deliberately callous and insensitive at times. That doesn’t make you bad. It makes you a survivor, and life is good and worth living, and so protecting yourself emotionally actually makes you good too. Grey stone on, my friend!

Anyone accusing you of being an a-hole for protecting yourself during such times is the true a-hole. It’s called projection. They’ve found certain behavior to work well for themselves, evoking sympathy and attention. Seeing others exhibit such behavior, or pose an immunity to it, pisses them off to no end. It’s an immediate “trigger” and gets their accusations flying. Listen closely. It’s actually them telling you what they hate most about themselves. They live inside their heads, in denial, and quickly protect their fragile egos by projecting outwards onto others any feelings that strike a chord and get close to the heart of their own issues.

Once upon a time you had more reason to be satisfied with the lot you were given in life. You were born somewhere into the social hierarchy of a tribal society of probably around 100 creatures, not too dissimilar from chimps.


Fri Aug 26, 2022

Get the LPvg Message On-Point with a series of 1-minute videos

I did a pretty awesome job planning a sequence of videos that will help me blow up here on YouTube, which is now finally worth doing because for the first time, I have a mission and purpose that’s worthwhile.

Previously, my floundering led me through the field of search engine optimization (SEO), but I ascended to the top of my field. I had a Web 2.0 app (back in those days) that helped bloggers figure out what to write about to grow their natural search. HitTail had a 15-year run. I was a vice president of the PR firm that launched Amazon. I was at 360i, the hottest agency in NYC when they invented real-time marketing with Oreo’s “You can still dunk in the dark,” before they sold out to Dentsu. I helped Fortune 500s with their natural search woes from the backroom. I resisted the rock-star thing because I’m a craftsperson and the love of the work has made me shun the spotlight. But still, from those backrooms, I was at the top of the field.

But now I’ve got something better. SEO is a shallow, hollow, invisible-hand manipulative field. I think I would have hated almost anything in the fields of sales or marketing for the same reason. We’re here on this Earth to love our experience and to love others. In the fields of technical endeavor, at least now for me, that means free and open source software (FOSS).

That discovery has been a long time coming. I had a very false start and red herring in the form of the Amiga computer, and Commodore in general. Misplaced love, my friend. Misplaced loyalty. The Amiga was easy to love, and Commodore as the creation of a Holocaust survivor was easy to embrace. Neither was worthy of the piece of my life I gave them, but it helped make me who I am, and I’m happy with who I am, so I’m grateful to both and all the people involved, no matter how colossal of a-holes some of them turned out to be (a lot of it rubbing off on me, I’m told, hahaha!)

Beware the wolf in sheep’s clothing. And because of projection, you know that’s precisely what I fear most in myself. And as an SEO, you’d have had good reason. It’s real invisible hand stuff.

Invisible Hands Seo Liquid Television

Redemption, redemption, redemption!

Linux, Python, vim & git!

Yup, that about sums it up. If I were to let myself expand on that, I’d be writing for hours, and I don’t have time because I want to get that video series started, and there’s my day job.

Suffice to say, it is a form of modern literacy that’s very much tied to tools, and the internalization of tools into our very bodies in the way our ancestors did with morphogenesis to make cell walls and calcium to make bones. Language later and code currently!

And so… and so… most people will never sit through my long videos.

Most people will never sit through my short videos.

My most popular video ever was 1-minute long! I didn’t even talk… I guess that says a lot, LOL! Oops, best not “out loud”… laughing silently to myself.

Okay, so this video series is going to be a sequence of 1-minute videos. Each video, a minute or less with no talking. Each one a magic trick. Each one heavily edited ‘cause there’s no way I’m going to be able to do them all in single takes.

Okay, so I’m going to break down that list into a series of 1-minute videos, each one carefully optimized, keyword-selected, title-written, description-crafted, yadda yadda blah blah. Guess I’m still an SEO.

This is why I went through these past 25 years or so as an SEO (yes, I’ve been doing it since AltaVista, as the Inktomi moderator on Jim Wilson’s searchengineforum.com). Long in the tooth, you might say. I’ve even been making home-grown static site generators (SSGs), in the spirit of today’s Jekyll and Hugo, since those days in the late 90s. Wow, if only I had formalized and promoted any of my innovations from the past… but that’s the past. Learn from the past! The time wasn’t right back then. Nobody was going to write XSLT transformations to accomplish the same thing they’re getting with markdown and canned templates through Github today. The stars have to be in correct alignment and all the detailed little nuances and subtleties have to be just right.

So the trick is to glean what the stars are in alignment for, to get all the subtlety and nuance correct, and to promote it well. 1, 2, 3. Okay, let’s see.

Linux, Python, vim & git as a single platform-like thing for development and delivery, and as an alternative to vendor schlock, is it. The time for LPvg has come. It’s a legit movement identity, acronym, yadda yadda.


Fri Aug 26, 2022

Sub-Linux Sub-Love So-Learn

Back on June 30th, 2006, when I was thirty-five years old, I joined YouTube and made this:

The first ever YouTube video was uploaded on April 23, 2005. So I was on the YouTube platform just over a year after their first video upload, and have been more or less on it ever since. So yeah, I’ve been around the block a few times and see the trends before they’re big pretty consistently. I just don’t go all-in because… well, 2 roads diverged in the woods and in the end, I do what I enjoy without pandering to the path others covet.

Today, I’m jumping on the Linux-via-Windows bandwagon. I’ve had a difficult time figuring out what to call Linux under Windows, but it finally struck me. It’s sublinux. I’m always quite a bit ahead of the trends. I’ll do it again. And so it’s time to do my schtick again, but this time it’s for myself, and this time I think my mission and purpose are quite a bit more worthwhile. I just might go all-in helping people escape Windows, Mac and proprietary software in general, jumping on the free and open source software (FOSS) movement, the first thing since the Amiga computer that’s love-worthy. And I’m coming to realize that the Amiga wasn’t all that: it was unwise love for proprietary hardware and graphics. FOSS is far more love-worthy than the Amiga ever was, and it perturbs a little piece of my soul to say so. Atari hardware engineers were just the best (tear to my eye).

Okay, so enough living in the past. Nobody cares about the past. The road to happiness is living in the moment and caring deeply about something with a care and love that is not misplaced. Ah, misplaced love ‘tis the story of my life.

Beware falling in love with the wrong things. Beware infatuation and soul-swallowing love being interpreted by your higher-function brain as real love. Your higher-function brain should realize that real love is 2-way. If the platform (or whatever) is only getting something out of you after the initial infatuation fades, then you’re being taken advantage of by the vendor (or whatever), and you don’t owe them anything for that “first dose free” love-bombing they laid down on you with their smooth-scrolling bitplane graphics, which were destined to go obsolete for lack of ability or willingness to grow. Is it so much to implement retargetable graphics, or install the layers that provide some flexibility over time?

This is my daily writing.

This is my thought-processing.

I talk a lot about a daily journal in vim so you can get in your vim practice without having to worry about coding. This is it.

Well, we technically are coding. We are encoding our thoughts. We’re doing it light and easy, breezy, with the occasional asterisk for markdown formatting. No HTML. We get in the flow here in vim. No time to slow down for editing. That’s exactly why I’m reticent to edit video. Life’s too short to edit video. I’m living life for me and editing is for others.

What’s that you say? There are others out there following along who care?

Oh, hello Calvin Irby. Your comments are the best and fuel my motivation. Bash on, my friend.

Zoranoa01, keep the comments coming and I’ll keep making videos just for you. I wish I could reach everyone while they’re going through the learning-phase you’re at.

antikoerper256 I’m glad I lit that spark. Let’s fan the flame.

Jens-Ole Frimann, always glad to hear from you. I think vim was actually first released in 1991, so it may have a year or two on you. I know you’re younger than vi.

DesktopChronicles, I always appreciate your feedback and perspective. Nice to have folks around who go back to my Levinux days circa 2013, which although 9 years ago is still only my half-way point here on YouTube. Chronicle away!

Limitless 1… you’re welcome.

Scott M, a Bell Labs Research veteran amongst my viewers? I am darn honored. I hope I contribute something of value to someone who’s lived on the true mothership. When you talk, I listen.

The list goes on, but you get the idea. I’ve switched from listening primarily to search hits on the website (now “not provided”) and vomiting content onto the web to pander to visitors while teaching others how to do the same, to talking primarily for myself on the YouTubes, to realizing there are real human beings out there. The communication feedback loops are wonderfully better than they were back in those early days, so Maslow hierarchy-of-needs stuff is possible.

You are developing into my community, and it’s time I embraced it.

I’ll still be myself, vomiting up content onto the web, because that’s a big part of who I am. I journal in vim, and so long as the information is not excessively proprietary or personal, I figure hey, why not? A little proprietary and a little personal is okay I think, because hey, there’s got to be something you get out of it, right? Why whitewash everything?

So, maybe a little bit of SERP-scraping techniques here, and a little bit of life learnings resulting from the hurt of betrayal there. But always with a positive spin! We live in the present, not the past. We learn from the past to improve our future. Rafiki is perhaps one of the most important Disney characters to ever appear, and I’ve been taking his advice to heart a lot lately.

Mike Levin Learn From The Past Rafiki


Thu Aug 25, 2022

Planning a Video Series That Drives Message Home

I’ve done 3 videos in a row using Google Sheets. Feedback has been good. I can distill my message down very effectively in that format, and I’m sure I reach a broader audience than with my long rambling format. I still like not editing video. Even in this format, I generally do one take. Life’s too short to edit video, unless you’re sincerely trying to make a living off it. I’m not. I’m in it for the higher levels on the Maslow pyramid of needs, like community and stuff.

The long unedited and unscripted videos still let me just do my work and make videos out of it. But with a little planning, vision and revision, I can reach more people more effectively. The next step is to try getting the best of both worlds, and to push myself forward with a series people can follow. Each video will open with a script laying out what I’m trying to do and why. Then I’ll do it. It’ll be a good tutorial series for the most important thing in tech: plotting your way off of both Microsoft and Apple systems.

I’ll use YouTube playlists and pages on MikeLev.in to compel people into the correct linear sequence. Yes, think in terms of linear sequences of videos! Think in terms of how to keep people on those linear sequences. Drive up view-times per video and subsequent follow-up views of related videos. Lock ‘em in a chute. Microsoft does it at a massive profiteering level. Why can’t I for some social good? Make a page on the site that lays this all out, both to help me to keep on track and for those in my audience who might be interested.

It will mostly be about settling into Windows 10 (again) and giving it a Linux CLI and getting on vim for daily use with virtual desktops. It’ll start with that PC where I reset it to Windows 10 factory default. Then I’ll move along to a full WSL2 install with fullscreen Terminal (again). We generally have vim loaded on that full-screen for a daily journal. And yes, that’s the main message. I think it seems to be emerging as the main thing, but I’ll have “aside” videos where I:

Other topics


Fri Aug 19, 2022

I’m Mike Levin helping you fight obsolescence

Woo wee! I got another video slammed out using a script that’s been running through my head. I think doing the overhaul of MikeLev.in using simplecss.org has really got my creative juices flowing. It’s a catalyst. I’ve got to keep that momentum going now. Every main page of my site will have a video which I’ll probably write a script for.

Hi, I’m Mike Levin, helping you fight obsolescence in your personal technical skills by helping you see what tools to take up and why. Linux and vim are the way to get started, and you can start on your Windows system today simply by changing your built-in default command line from MS-DOS and Powershell to Linux by opening a Powershell and typing wsl --install.


Thu Aug 18, 2022

Internalizing Tools Through Epigenetics

An article like this will start out here in my journal. It was actually copy/pasted out of the SimpleNote app that lets text flow astoundingly smoothly between mobile and desktop. Capture your thoughts any way possible. The most convenient and love-worthy way possible. Just capture your thoughts. Then copy/paste them into your idea-processing systems. Then if anything is destined to come of it, continue the idea on its journey to its ultimate destination. Here I believe it will be as non-blog “main” content on my freshly facelifted website. This stuff shouldn’t have to be searched for in Google. Rather it should be stumbled upon by anyone stumbling upon my website. Hmmm, remember StumbleUpon?

I’ve placed my chips and it’s time to shit or get off the pot. SEOs never met a metaphor they wouldn’t mix. While I’ve been an SEO for most of my adult life, it wasn’t my first choice. Science and being a scientist was, but apparently I lacked the passion and commitment to pursue it. Listen to this other Mike Levin the microbiologist, something of a hero of mine and someone I might never have discovered had we not shared the same name. He topped the search engine standings for a while until the congressman from California buried us both. Sigh. Well, who’s searching on my name anyway? It’s time to grow through a fine-tuning of my mission in life and what I’m here on this world to accomplish.

First off, I’m a dad and having brought another life into this world, helping my child with their journey is always my number one priority. But everyone’s journey is ultimately their own and I cannot let theirs subsume mine. That’s been one of my biggest stumbling blocks, because I’ve allowed not my child, but other adults to rule me for the better part of my adult life. I was still a man-boy when I came to New York in my late twenties and found myself hunted and gold-dug. It was not my first time falling victim to shortcut-seekers in life and narcissists trying to turn their men into self-objects. I was just still unable to recognize it for what it was and I fell into one of the big pitfall traps in life. Don’t lose yourself to someone else.

We’re all born with a hardware vibe. It comes from a sequence of information encoded into molecules that builds you within a system of profoundly complex relationships and interdependencies. No one is an island, but we seem to be born that way as a single cell borrowing 23 trait-bundles from your mom and 23 from your dad. Just add the material world. By 2 or 3 you’re beginning to walk and talk. You couldn’t even open your eyes or pick up your head at first. Then you’re riding a bicycle and driving. Humans incorporate tools into their bodies and redefine their own being, leaving that hardware vibe they’re born with in the dust, a mere echo of who they are today. Every day, you evolve.

That aforementioned Mike Levin covers it well. DNA isn’t the whole story. Nor is your human-like consciousness that’s reading this article right now. Every cell in your body is its own distinct organism with its own sort of brain. We’re all just a big cooperation of those critters under the guidance of our executive function that lifts us just a bit above other animals in our capacity for self-determination—but not much. Our inner lower-form animals from which we’re built, the worm and the fish and the frog and the lizard, are still much more in control than we like to give them credit for. Our executive function’s main job sometimes seems to be apologizing and rationalizing for all their shenanigans.

But humans are able to grab themselves by their own bootstraps and lift themselves above the dog-eat-dog animal fray. We have the ability to flip-off the autopilot switch and lift our consciousness a wee bit out of the now moment to think about past, present and future as if they exist and matter. Killer cats hunting your tribe in the night got you down? Your watch guard falling asleep on the job tearing apart the tribe’s trust, cohesion and the gene pool? Maybe those wolves that eat at the trash pile aren’t so bad. They bark like a mofo at the slightest disturbance. Let’s raise one of their pups and see if they can’t make for a better nighttime alarm than Joe the deceased. Ye ol’ technology.

Tools are like that. They get added to the tapestry of complex interdependence that makes us us. Dogs aren’t in our DNA, but they might as well be, like bones and eyeballs and fingers and all those other tools we actually have incorporated into our physical bodies over time. We didn’t have rigid inside parts as the single-celled blobs we started out as. At some point it tasted silicon and said yuck! I don’t want to replace my flagellum with that. It would shatter like glass. But the chalky foamy metal calcium which is also available in slightly less abundance? Yum! I’ll make me some shields and swords and stilts out of that stuff. Random mutations? I think not. Intelligent design? Of course, but not from some deity. From the organism and a system of non-genetic inheritance that allows it—called epigenetics.

We evolve ourselves through force of free will. Random circumstance might lead to an organism having some proto-intelligence insight, but that calcium gets put to use as some quite literally internalized tool. It’s like an amoeba sucking in a hammer floating around in the primordial goo. The amoeba’s like, now I have a hammer, Ho Ho Ho. Watch out, paramecium. The same thing happened with organisms internalizing whole other organisms, making animal-B a permanent ongoing part of animal-A, as is the case with the mitochondria found by the thousands floating around as little energy-giving powerhouses in each and every one of the cells that we’re built out of. We don’t need chlorophyll for photosynthesis like plants because one of our ancestors gobbled up an aerobic little critter instead, and it got passed down without being encoded in its host’s genes.

Scale up to human size, like today in your life. Same thing. Sure, you’re born with your 23andme profile. Luck of the dice, as it were. The first hand you’re dealt. Factor in where, when and to what circumstances you’re born and you’ve got one nutty game of poker in the game of life. Games all just channel aspects of life, so it’s no big surprise eternally popular ones like poker are channeling some of the big important points. We’re all dealt some hand, but how we decide to play that hand is up to us. Blame is for losers. You’re in the game, aren’t you? You want to play, don’t you? Maybe you’re just here as a curiosity seeker and your heart’s really not in it. Well then, die young. Maybe you’re really into it but what happens early in the game was so awful and unlike your expectations that you… well, you again, die young. The great twenty-eights of deeply emoting passionate artists and all that. Winehouse and Ledger and Hendrix, oh my!

Like any game, there’s ups and downs. The risk and the thrill and the discovery are what make it fun. It’s not that we all need to be dopamine-seeking extreme sports nuts. No, maybe serotonin is your drug. Adrenaline junkie or warm and fuzzy cozy cuddler, we each find things in the game we value, our own answers to questions we didn’t even know (in our human consciousness) that we had. Yeah, yeah, this is all right on the edge of spiritualism and that eternal soul stuff. But who knows? Certainly not us, at least not for sure no matter what any religion or scientist tells you. That’s what makes the game so great. Ultimate answers are withheld until the end. Maybe. Maybe forever and there’s just oblivion. Who knows? But don’t let that get you down. You’re alive now and I’d call that nothing short of a miracle. The odds were against it and every little experience you have as a conscious being is gravy. So show a little gratitude because, you know, just in case.

I’m over fifty and just getting started. I’ve scavenged my environment and found my tools. I’ve tasted glass and I’ve tasted solidified foam. I’ll build my bones from foam, thank-you-very-much. If you’re using VSCode, you’re building your house from glass. Or would that be straw? Either way, beware the big bad wolf of time and vulnerability to change. Gonna incorporate VSCode’s source into your DNA? You’re gonna also need a platform that supports a web browser there in your DNA too. Oh, and NodeJS running in the background as a server. A pointer device, don’t forget that. And not the “main” VSCode, because the license won’t allow it, so you need the free and open source version. And once VSCode is internalized, are you going to use it to keep a journal? A to-do list? You know, all those things you can do with text files in your life that are so valuable? What’s that? The user interface is made mostly just for coding, so you’ll need something else too?

Well, why not just start with the eternal and well-licensed vim as your text editor so you can build your house from bricks? Little choices like which text editor you use manifest in big ways in your life, consequences cascading down to all the little things in ways you’re hardly aware—like a fish not understanding water. All that stuff that vaguely bothers you today might be because you never took up vim as your text editor. It’s “good enough” and ubiquitous (always there) because its smaller predecessor vi is part of the Unix standard. Oh, you should take up Unix too—or Linux rather, because Linux has broader hardware support today, better free and open source software repositories you can immediately tap into and start using, and a system for scheduling and automation called systemd built-in that’s accessible without too much effort.

All these tools I’m advocating are free and open source and of the eternal type. The Unix philosophy https://en.m.wikipedia.org/wiki/Unix_philosophy states much of it clearly. What’s not clear is further stated well by the Free Software Foundation https://www.fsf.org/about/. Tools like software become a part of you, a bit of your evolving vibe and who you are as you learn and master it. One shouldn’t internalize tools and things built from them that will shatter like glass inside your body when under any stress, like the glass blowers needing to make money off of you, or the ravages of fashion, fads, trends and time. Unix, and thus Linux, is greatly impervious to the ravages of time. Or more accurately, it is aging well, delivering on its Noah’s Ark-like promise of portability and survival. Python’s not as fundamental yet, but Perl missed its chance as the default language of Linux distributions, so Python is on its way. Lastly, git. Always lastly, git.

Nothing’s perfect, least of all git. git reset --hard HEAD^^^ …which is read while pantomiming a head-smash on each caret. The same guy who made the Unix-clone Linux made git. He named Linux after himself (Linus Torvalds) and in one of the greatest acts of self-deprecation, self-awareness and redemption for that massively megalomaniacal act (he didn’t know it would become so popular), he also named git after himself. Linus is an enormous asshole. He suggests it’s surprising dumb people are still alive. His gentle Finnish demeanor belies the raging opinionated git beneath. He wouldn’t still be in the game as the leader of his team if he wasn’t. Linux wouldn’t have had the massive uptake if he wasn’t. Linus isn’t perfect, but when good-enough goes free and open source mainstream with longevity, you’ve got bones. Lean into it.

Unix was perfected by the same Bell Labs people who made it, in the form of the Plan 9 OS. Who uses Plan 9? What’s best does not always win. Even stuff with the best marketing does not always win. What wins is what’s good-enough, is released early enough so that it becomes adopted by a critical mass of people, and which becomes too important to fail. If the original providers of the tech disappear or don’t license it well enough, it’s reverse engineered like Linux or picked up by someone else and continued, like vim. The original vi text editor program on which vim is based was written by Bill Joy, one of the no longer active grandparents of Unix. But Dutch programmer Bram Moolenaar picked up the pieces circa 1988 on the Amiga computer and the world is a different and better place today because of it.

Vi is not as clumsy or random as VSCode; an elegant weapon for a more civilized age. Bill Joy actually said he would have written vi differently because we are not as resource strapped as we were back when vi was written—the age of timesharing, expensive computer time, and painfully slow dial-up connections. Every keystroke and every bit mattered. Vi was thus constructed as an efficient “language” to control text from afar. j is down and k is up. Sure, we might be able to stream movies forever today, but what if you want to edit that textfile on Mars or that device you managed to connect to in order to change a config file? Vi and its descendants like vim and nvi are relevant in every use-case, including making your daily common text-editing work easier. It’s got a steep learning curve, but once you get into the zone with vim, it’s like a superpower and nobody and nothing can take that capability away from you.

Python? Oh, Python. How Guido van Rossum misnamed you. He’s Dutch and not Scandinavian, but he shares the Viking proclivity to name the icy place Greenland and the green place Iceland. Python is not the constricting alpha-male macho bro language the name implies. Python is soft and gentle; once embracing you in its coils it will adopt you and raise you as its own, making you in many ways as powerful as a C programmer, but without a lifetime of commitment. If you want C for ultimate power, you can have it. It’s built-in. But if you only want to kick the tires of coding here and there, or just get your work done while learning as little as possible, it’s got you covered there too. Under Jupyter Notebook, the “Hello World” program is just that: ‘hello world’. You don’t even have to press the Shift key. And perhaps the greatest argument in its favor that it’s something you should maybe consider trying is that it is still as popular as it is today and only growing, even in the face of JavaScript, the one language to rule them all.


Thu Aug 18, 2022

Monitoring Github Pages github.io System For Errors

Okay, continuing operation ELTgd. Put that framework in place on my website for a unique and awesome experience. Make sure it serves the user as much as me. It’s a big part of building my tribe and fanbase and securing my retirement gig. A perpetual ability to earn and a practical alternative to savings. Evolve the CMS quick. 1, 2, 3… 1?

Make slice.py and skite.py easy and fast to advance. Less stuck in concrete and more flexible and fluid. It’s all about the MikeLev.in site right now. Allow the other sites to break for a bit. You can bring them back to life easily under the new system. I believe it was only June that I made the Prev/Next link CMS sub-system for the Github Pages github.io Jekyll Liquid Ruby static site generator. Copy/paste skite.py and slice.py. Yeah, git, but sometimes you need multiple versions in the same repo. I should think about branches, but for now…

This also sets the stage for pages that are pseudo-dynamic. They’re really static, but something on a scheduler can always update a page.
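As a sketch of what that could look like, here’s a tiny script a scheduler (cron, a systemd timer, or the Linux daemon approach from my videos) could run to rewrite an include; the file name and content are hypothetical, just for illustration:

```python
from datetime import datetime
from pathlib import Path

# Hypothetical sketch: a scheduled job rewrites a Jekyll include so the
# static page that includes it changes on its own ("pseudo-dynamic").
include = Path("_includes/updated.md")  # made-up include name
include.parent.mkdir(exist_ok=True)
include.write_text(f"Last updated: {datetime.now():%a %b %d, %Y}\n")
```

Commit and push after the write and Github Pages re-renders the page with fresh content, no server-side code involved.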

Okay, the args passing can be simplified, but don’t touch it for now. That’s the easiest place to break things. Instead find where blog.md is output and simply comment it out. Output the same file to _includes/posts-main instead. That way you can have a system:

And then replace main with “old” for the old blog and such. I can eventually move the old blog over to this system too. That way all the post includes sort together alphabetically and I can have a different one for every category.

Okay, edit out the git commands so I can rapidly iterate right here releasing this blog.

Okay now edit out the inserting of the frontmatter

Okay, now create a new blog.md that uses that as an include.

Yup. Now edit the git commands back in and add one for _includes/*

Wow, okay that was remarkably easy. ELTgd!

Say a few words about that!

Hello World! You’ve found a site like no other. I have gone to great lengths to improve my technical abilities in ways that will be forever relevant and of value in just about any industry, any era and yes, even any life.

Life changes and you can’t predict everything. So your tools should be of the sort that serve you in that sort of situation, meaning any sort of situation. Can you imagine using VSCode for your journal, resume, todo lists and the like?

Okay, the wait is agonizing for these updates. I at least found the Github deployment link: https://github.com/miklevin/MikeLev.in/deployments When it reads “7 hours ago” I know the latest changes have not been pushed out. I know I can look at the published page itself, but this is more reassuring in case an error occurs or something. I used to just watch my email for the Github Pages error messages. This is apparently a big change because it’s taking forever to render.

Okay, think through next steps. This is an amazing approach. It means that anything I want more-or-less dynamic in nature, I can have macros in vim like I do for this blogging system, or I can put on a Linux daemon scheduler per all my recent videos on the topic. It’ll be a real connecting of the dots for my YouTube audience, LOL! They didn’t see that coming. Hot topics.

Oh, no wonder it’s taking so long! The page build was not successful. Hmmm, can I not use markdown in an include?

Follow the links! Make sure you can see the errors.

Drill down on Repo.

Find the red-X on the hash. Click it.

Find the red-X AGAIN. Expand it and drill-down on details.

Okay, it was the html comment tags I was using to keep old liquid layout logic around for reference in my default.html. Not worth it due to rendering errors. Deleted them. Re-rendering.

Okay, many fewer errors which made it easier to find this one:

Liquid Exception: Could not locate the included file 'posts-main.html' in any
of ["/github/workspace/_includes",
"/usr/local/bundle/gems/jekyll-theme-primer-0.6.0/_includes"]. Ensure it exists
in one of those directories and is not a symlink as those are not allowed in
safe mode. in blog.md

Oh! I named it posts-main.md and not posts-main.html. Fix that and re-render.

Okay, I think there are 2 locations that need to be monitored on Github after a Pages push. The repo’s main page will show the red x and the /deployments page from that repo will show the deployment status (7 hours ago, etc.). So all you do is go to the repo page and make sure there’s no red x, then slap /deployments at the end of the URL and you can see whether and how long ago it was deployed. If it’s not something like 1 min ago, then go back and check for the red x. I can smooth this workflow out soon, but for now… ugh… for now.

Success!

Game changer, for sure.

Output markdown into the Github Pages Jekyll Liquid markdown _includes folder from applications on your desktop or from a Linux system daemon scheduled service on a Linux server. You’re already doing the former with this blogging system, and the sooner you get to the latter the better.

The site’s starting to actually look good. Put a “git button” in. It looks incomplete without it. And that’s what responsive design is for anyway. All those big buttons disappear as it gets smaller, but remain mobile-friendly clickable links. So expanding main nav… oh that’s in _layouts/default.html, but I see now it should be externalized into _includes/mainnav.html.

Nice. Now I can cd into _includes and type:

vim *nav.html

Okay, now actually put a git.md in place, LOL!


Wed Aug 17, 2022

Reworking Site

Wow, another day for the history books. Yesterday I tried simplecss and it went well. I have a whole new look for my website and some advantage of a css framework without the enormous chore of taking up Bootstrap or Foundation. Keep it light, my friend! All you want to invest yourself into now is Linux, Python, vim and git. However, there’s a little bit of Jekyll skill you need, which is mostly Liquid templates. I’d like to be using Hugo, but lean into Github. Microsoft owns Github and Github favors Ruby for their static page generator system. Lean into it.

The biggest downside right now seems to be how long things take to generate. That’s a royal pain, but it’s the least of various evils at the moment. Don’t bother to install Ruby and Jekyll locally just yet. That rabbit hole goes deep. Get the success-assured experiments done.

I’m testing the Jekyll include system to create secondary navigation buttons. If this works, I will be able to put all the secondary navigation logic into one file. I may consider externalizing primary nav from _layouts/default.html as well. I think I may make the primary Nav:

MLSEO Linux Python vim

At least everyone will “get it”. I could go with the made up words of Levinux and Pipulate, but I’m trying to step into the path of traffic here.

Hmm, looking at the Simple CSS styles, I see that there is a flex layout if I wrap things in ul’s or ol’s. I’m not sure I want to now. Thinking I didn’t have that put me on this other route. It’s the path less travelled, but let’s see where it takes me. I’m testing using two headers. I’ve had enough of hidden menu items. Let’s just put it all out there and see what happens.

I’m really digging this approach. Try some groupings:

Interesting! I should use the lowercase versions of these as my categories in my blogging content management system. If I get the groups I’m labeling in the markdown frontmatter in sync with site hierarchy, things become much easier. Single words! All lowercase. If I have to expand it, I can always do it later with code. Okay, think!

I’ve got the nav and subnav that I want for the “Home” section, all but the blog page which will require some reworking of my blogging system. Wow, but it could be pretty easy. I can do stuff like this now without rabbit holes.

But I’m too tired to do it now and I may have a big day ahead of me tomorrow with the kid coming over.


Wed Aug 17, 2022

A Day In The Life of An SEO: SERPs, Python Pandas & SEMRush

Okay, part of making Linux, Python, vim & git love-worthy is making even just your day-job, the things you do to pay for it all, love-worthy as well. You can’t let what you “have to do” make you lose your love for what you “want to do” just because they employ and tap the same tools and capacity in you. Make what you have to do into what you want to do through Jedi mind tricks. This is the exercise you’re looking for. Okay, 1, 2, 3… 1?

Fixed locations for Virtual Desktops: Journal, Jupyter, Web. Yup. Okay, always nice to close everything and make a fresh start as if a system reboot after closing everything so things don’t auto-re-open. Then launch each screen’s item. I’m on screen 1 with the journal here now. 1, 2, 3… screen 2! That’s Jupyter.

Hmmm, there’s always a scavenger hunt at this point. What did you use the last time you did this project? Are you going to reuse the old code or just do it from scratch? If reusing the code, are you going to do it in-location where the project was last time, or are you going to start fresh in some new location? If you start fresh in a new location, are you going to copy/paste bits of your old code or import it like a Python package for reusable components? You always know something better if you write it from scratch. The question is exactly how much from scratch? We don’t burn our computers from sand, so we’re always choosing our shortcuts.

I find one of the strongest approaches is to start fresh every time. Give the project a strong nickname. If that nickname already exists in your repos, rethink. If it doesn’t, get started in a new folder, directory, repo or whatever name you want to use. It should probably be child to a ~/github folder so that you can make good evaluations regarding name collisions. If you’re searching through complex hierarchies for past work, you’re making a mistake. Keep it all flat under /home/[username]/github/. If you’re on Windows, install WSL so you have such a location.

Create the new repo folder directory. Give it a good nickname. Create the new .ipynb Notebook. Rename notebook from Untitled.ipynb and give it a good nickname too, maybe the same as the folder. That’s always a more important step than it would seem.

Move the data you’ll be working with into that folder.

Okay, it was like 5GB of data on the network and the Windows-based copy crapped out on me so I just restarted Windows. Okay, what’s the easiest way in Python to step through a set of files in the folder you’re in?

from os import listdir
for file in listdir('.'):
    print(file)

But we only want those that end with .db so this list comprehension:

[x for x in listdir('.') if x[-3:] == '.db']

And so the above becomes:

from os import listdir
for file in [x for x in listdir('.') if x[-3:] == '.db']:
    print(file)

And now that we know that, we can really just reduce it to this line to give us a list containing all the database file names:

dbfiles = [x for x in listdir('.') if x[-3:] == '.db']
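As an aside, pathlib can express the same filter with a glob pattern instead of string slicing; this is just an equivalent sketch, not what I used:

```python
from pathlib import Path

# Same .db filter, letting glob do the extension matching.
dbfiles = [p.name for p in Path('.').glob('*.db')]
```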

Each of these is a sqlite database that used pip install sqlitedict to create a key-value style database. And to load them back in, we need that import. We can do the import and test that we can load each database with this. They’re big files, so I need to put in that timeout:

from sqlitedict import SqliteDict as sqldict
for dbfile in dbfiles:
    with sqldict(dbfile, timeout=10) as db:
        for key in db:
            data = db[key]

This will also give you some idea of how long it’s going to take to spin through the data because it actually loads each record into memory.

Okay, now given that the data is the entire Python response object of scraped search results and you have an extract_serps function on-hand such as this one:

import re

def extract_serps(text):
    rv = False
    try:
        div_pat = re.compile('<div class="yuRUbf">(.*?)</div>')
        divs = re.findall(div_pat, text)
        lot = []
        for div in divs:
            pat_url = re.compile('<a href="(.*?)"')
            url_group = re.match(pat_url, div)
            pat_title = re.compile('<h3 class="LC20lb MBeuO DKV0Md">(.*?)</h3>')
            title_group = re.search(pat_title, div)
            try:
                url = url_group.groups(0)[0]
            except:
                url = ""
            try:
                title = title_group.groups(0)[0]
            except:
                title = ""
            lot.append((url, title))
        rv = lot
    except:
        pass
    return rv

…then you can make sure you can extract the serps from the data and that it’s coming out properly.

from sqlitedict import SqliteDict as sqldict
for dbfile in dbfiles:
    with sqldict(dbfile, timeout=10) as db:
        for key in db:
            data = db[key]
            if data:
                serps = extract_serps(data.text)
            raise SystemExit()

Yeah, that’s the stuff. This is friggin’ performance art. I can feel the dopamine rush kicking in. If you’re someone in life that feels the impostor syndrome in what you do, take up Python. Get to know it and learn to think in Python. Become freely expressive as you would be in normal writing. Become freely expressive (like this) in normal writing, for that matter. It’s all the same thing. Okay, next step? Well, the data only contains a list with links and titles, and we need to know what domain the SERPs came from in the final transformed data format, and also the SERP positions. On any given iteration of the above loop, the position in the serps list actually is the search engine position. Given Python’s zero-based indexes, you just have to add 1 to know the SERP position. Oh, the database key which is actually the key in the key-value pairs also needs to be moved into the transformed output data. It’s essentially going into Excel format where there are no database keys, so the key becomes just another column.

We get the domain name from the name of the database file on each outer-loop iteration. We could extract it from the link each time using urlparse, but that’s way too much overhead. We can just use slicing to do that. We’re sitting on the dbfiles list of database filenames, so we can do this to test our thinking:

dbfiles[0].split("-")[1][:-3]

Bingo! Okay, so in the loop we just replace dbfiles[0] with dbfile:

from sqlitedict import SqliteDict as sqldict
for dbfile in dbfiles:
    domain = dbfile.split("-")[1][:-3]
    with sqldict(dbfile, timeout=10) as db:
        for key in db:
            data = db[key]
            if data:
                serps = extract_serps(data.text)
            raise SystemExit()

Now there’s a tricky piece to wrap your mind around. We’ve got to add a third loop because we have to transform the shape of each row in the serps list. We can do that in a way where we can inspect the first row we encounter:

from sqlitedict import SqliteDict as sqldict
for dbfile in dbfiles:
    domain = dbfile.split("-")[1][:-3]
    with sqldict(dbfile, timeout=10) as db:
        for key in db:
            data = db[key]
            serps = extract_serps(data.text)
            for serp in serps:
                row = serp
                if row:
                    raise SystemExit()

But the row needs some data appended to it, so we add the domain, keyword and serp position to each row. If you’re doing this in Jupyter, you can just inspect the row value afterwards to see the first row you’re going to encounter. But all these rows need to be thrown into a table. Okay. Notice I check to make sure data is there and how I turn the row into a tuple on the table append to be kind to my laptop’s memory. I’m also moving all my import statements to the top of the Notebook, so you won’t be seeing them in the following code snippets anymore unless there’s a new one.

table = []
for dbfile in dbfiles:
    domain = dbfile.split("-")[1][:-3]
    with sqldict(dbfile, timeout=10) as db:
        for key in db:
            data = db[key]
            if data:
                serps = extract_serps(data.text)
                for i, serp in enumerate(serps):
                    row = [domain, key, i + 1] + list(serp)
                    table.append(tuple(row))

I had to convert the serp row into a list. It was previously a tuple, which is great for memory, but you can’t concatenate a tuple onto a list to append row elements. You can with lists, so I do. And guess what? We’re sitting on top of a table that’s all ready to be transformed into a pandas dataframe. And so there will be a new import!

import pandas as pd
df = pd.DataFrame(table, columns=["domain", "keyword", "position", "url", "title"])
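As a quick aside on why the serp tuple had to become a list first: Python refuses to concatenate a list and a tuple directly, which this little demonstration shows:

```python
# Concatenating a list with a tuple raises a TypeError...
try:
    row = ["example.com", "keyword", 1] + ("url", "title")
except TypeError:
    # ...so convert the tuple to a list first, then concatenate.
    row = ["example.com", "keyword", 1] + list(("url", "title"))

print(row)  # ['example.com', 'keyword', 1, 'url', 'title']
```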

Do you see why doing a project nearly from scratch every time is often the right way to go? It forces you to think through the project and have a deep familiarity with the data and the process. It’s not big codebases or millions of lines of code like this stuff used to be in the past. It’s just easy breezy conversation.

Now if you have Excel, you can load the CSV directly into Excel to see how we did:

df.to_csv("serps.csv", index=False)

And there’s 2 things I notice. First, some html is slipping into the title column and second, there’s some gobbledygook that’s clearly indicative of an encoding error. So as usual we grab for utf-16 on the encoding:

df.to_csv("serps.csv", encoding="utf-16", index=False)

Well, that worked but we lost our columns when bringing it into Excel. This means that the CSV file-format handling got confused. You know the best way to deal with that? Ditch CSV, especially now that Pandas supports exporting to native Excel xlsx file format directly, though you do need to pip install openpyxl if you haven’t already. The new file-save looks like this. You should delete the old serps.csv.

df.to_excel("serps.xlsx", index=False)

And that did it! But now to deal with the html slipping into the titles, which is still a problem. Now there are more efficient ways to do this, like compiling the regex pattern and maybe making a strip_html function, but I’m in a hurry.

table = []
for dbfile in dbfiles:
    domain = dbfile.split("-")[1][:-3]
    with sqldict(dbfile, timeout=10) as db:
        for key in db:
            data = db[key]
            if data:
                serps = extract_serps(data.text)
                for i, serp in enumerate(serps):
                    url, title = serp
                    title = re.sub('<[^<]+?>', '', title)
                    row = [domain, key, i + 1] + [url, title]
                    table.append(tuple(row))

Wow okay yeah, that did it. And it was all on a dopamine rush of the thrill of being able to do this “from scratch” without scrambling for this and that. This is such a big thing with tech today, everyone wants to reuse code at too high an abstraction-level, the abstraction level of easy breezy communication. I would not have liked to have rewritten the extract_serps function, so that’s the bit I reused. Nothing else.
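For completeness, the more efficient variant I waved at above, with the pattern compiled once and wrapped in a function, might look something like this (strip_html is just a hypothetical name; it wasn’t in the project):

```python
import re

# Compile once instead of re-parsing the pattern on every title.
TAG_PATTERN = re.compile(r"<[^<]+?>")

def strip_html(text):
    """Remove anything that looks like an HTML tag."""
    return TAG_PATTERN.sub("", text)

print(strip_html("<b>Hello</b> <em>World</em>"))  # Hello World
```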

Okay, there’s another caveat to this project. I have another site.db file that I’ve been hiding, because not all the results are from the same domain as they were with the previous databases (restrained by a site: Google search modifier as they were… gee, did I forget to mention that?). Okay, so here’s a more typical situation, which I also happen to have to handle on this project: the domain in the search results may be anything. Now normally you can just use urlparse from the standard library for this sort of thing, but I prefer pip install tldextract, because registered domains, or “apex” domains, are actually rather tricky given all the country double-TLDs like .co.uk. For example:

from tldextract import extract
extract("blah.foo.co.uk")
# Results in:
ExtractResult(subdomain='blah', domain='foo', suffix='co.uk')
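For contrast, here’s roughly why plain urlparse isn’t enough on its own (standard library only; the URL is made up): it hands back the whole hostname, and a naive “last two labels” split gets country suffixes wrong:

```python
from urllib.parse import urlparse

url = "https://blah.foo.co.uk/some/page"
hostname = urlparse(url).netloc
print(hostname)  # blah.foo.co.uk -- the full host, not the registered domain

# A naive "take the last two labels" guess breaks on double TLDs:
naive = ".".join(hostname.split(".")[-2:])
print(naive)  # co.uk -- not foo.co.uk, which is what tldextract gets right
```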

And so we modify the upper code. This is where those allergic to retyping code or copy/pasting blocks instead of writing functions (the disciples of the DRY school, for “don’t repeat yourself”) will freak out–especially on that RegEx bit which I totally could externalize into a function. But to them, I say I’m not DRY. I’m WET. We enjoy typing. Let me repeat myself: we enjoy typing! Nyah nyah nya nyah nya!

dbfile = "./site/serps.db"
table = []
with sqldict(dbfile, timeout=10) as db:
    for key in db:
        data = db[key]
        if data:
            serps = extract_serps(data.text)
            for i, serp in enumerate(serps):
                url, title = serp
                parts = extract(url)
                domain = f"{parts.domain}.{parts.suffix}"
                title = re.sub("<[^<]+?>", "", title)
                row = [domain, key, i + 1] + [url, title]
                table.append(tuple(row))

Alright, all the data is transformed. Now, don’t lose your momentum. I’m coming down from the dopamine rush, but that’s a problem in the final mile. This kind of work isn’t done until it’s done. And just because the part that you now think is the only sexy, exciting part of the project is over doesn’t mean that the next step can’t seem just as sexy and exciting to you with the right mental tricks.

Now is a good time to pat yourself on the back and give yourself some big attaboys. Close all windows on all your various virtual desktops except for the first 3, which contain journal, jupyter and browser, respectively. Think! Getting down to just those 3 screens with those 3 apps is a great way to evaluate next steps… of course!

Open Google Sheets. Close all browser tabs except for tab 1 containing Google Sheets. Find the old deliverable that was like this one I’m working on.

There’s a 500-keyword list that this project is based on. I need to pull metrics like search volume against all those keywords, and this is a case where there’s no reason to hit an API. The Web user interface is plenty. SEMRush has a bulk keyword tool that takes 100 keywords at a time, entered comma-separated into a field that detects the commas and changes the UI to a cool numbered keyword-per-line web form. Nice responsiveness. Anyhow, I need to shake &amp; bake the keyword list into groups of 100. Here’s the code I used for that:

from os import listdir
import pandas as pd
import warnings

warnings.simplefilter(action="ignore", category=UserWarning)

# Use the SEMRush bulk keyword analysis tool at
# https://www.semrush.com/analytics/keywordoverview/bulk/?db=us
# pasting in 100 keywords at a time. Watch the trailing comma.

with open("keywords.txt") as fh:
    keywords = [x.strip() for x in fh.readlines()]

for i, keyword in enumerate(keywords):
    keyword = keyword.replace(",", " ")
    if not i % 100:
        print()
        print()
    print(keyword, end=",")

semrushfiles = [x for x in listdir(".") if "_bulk_" in x]

dflist = []
for file in semrushfiles:
    df = pd.read_excel(file, engine="openpyxl")
    dflist.append(df)

df = pd.concat(dflist)
df.to_excel("keywords.xlsx", index=False)
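A variation that would sidestep the trailing comma the code comments warn about: join each batch instead of printing keyword by keyword. A sketch (chunk_keywords is a made-up helper, not part of the project):

```python
def chunk_keywords(keywords, size=100):
    """Yield comma-joined batches ready to paste into a bulk keyword form."""
    for start in range(0, len(keywords), size):
        chunk = keywords[start:start + size]
        # Commas inside a keyword would split it in the web form, so space them out.
        yield ",".join(kw.replace(",", " ") for kw in chunk)

batches = list(chunk_keywords(["foo", "bar baz", "qux"], size=2))
print(batches)  # ['foo,bar baz', 'qux']
```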

Pshwew! Okay, it’s hours later during the day. Busy, busy, busy!

It’s time to move all this data into Google Sheets and turn it into a great big pivot table. Pivot tables in Google Sheets are pretty bleeding edge.
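Before it ever reaches Sheets, the same kind of pivot can be roughed out in pandas as a sanity check. A minimal sketch with toy data (the column names just mirror the serps table built earlier, not any real schema):

```python
import pandas as pd

# Toy rows in the same shape as the serps table: domain, keyword, position.
df = pd.DataFrame(
    [
        ("foo.com", "blue widgets", 1),
        ("foo.com", "red widgets", 3),
        ("bar.com", "blue widgets", 2),
    ],
    columns=["domain", "keyword", "position"],
)

# Keywords down the side, domains across the top, best (lowest) position in the cells.
pivot = df.pivot_table(index="keyword", columns="domain", values="position", aggfunc="min")
print(pivot)
```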


Wed Aug 17, 2022

Bring Pipulate & Levinux Back Into The Picture

See how quickly you can launch a new page on the new MikeLev.in site and restructure navigation.

Okay, switch your “About” link on your main nav to Levinux. Make a levinux.md page and put it in place. Move the about link to footer.

Okay, that went super-well. Do the same for pipulate.

I see now that the best thing to do is to develop pipulate.com and levinux.com both a little bit.

The pipulate.com site is terrible. But I’ve got to start differentiating my site with made-up words because I need to go the reputation route over the generic search term route. Google has just changed that much and my GSC data shows me that.

Focus on your day-job intensely as soon as you have your success here. I may even roll it back to the old navigation after the satisfaction of just seeing these 2 buttons ready to go. Use that flexibility! Move this over to the public journal. This stuff is of interest. Rapid manipulation of a website thanks to a new minimal CSS framework.

Okay, but the Pipulate and Levinux pages aren’t ready. I can’t keep them on the main nav just yet. Move those links down to the footer to keep me aware of the urgent need. I need a clearer vision of where I’m going with all this.

The world is just changing so much! I’m super-glad I made my shift to Linux, Python, vim & git. That’s going to keep me relevant long into the future. But I have to use those things to develop Levinux and Pipulate into more mainstream things. And I need to get back my PyPI account access. 2FA really screwed me there.

The nbdev upgrade to version 2 really screwed me too, especially on my CMS system. I’m going to basically rebase it in a brand new repo. No, I was able to delete the contents of .gitattributes and I could git commit and push again. Okay, the old CMS skite repo is safe, but I have to revisit very soon.


Tue Aug 16, 2022

Why Pages Aren’t Usable On Mobile, Fixing!

Okay, after yesterday’s only so-so venture into classless (a.k.a. class-less) CSS frameworks, it’s pretty obvious there are many and they’re not really classless. When you give up all the messiness of defining HTML element classes, what you get back is a framework opinion which in many cases you’re going to want to immediately override, thus having to employ classes again, which seem to be there in the classless frameworks anyway. Haha! It’s the same old framework dilemma. You don’t get something for nothing unless the something you want is exactly what the designer of the framework offers as their default.

Okay. So it’s once again one of those least-worst-of-several-evils decisions. What do I really want in my classless frameworks? What are the priorities of the “opinions” I want to see expressed as the classless default? Hmmm.

First, I want a fluid layout, also known as a liquid layout. I want a header for brand identity and links across the top. I want those links across the top to become hamburger links when on mobile, with the brand identity of the banner still displaying. I want an accordion menu down the side next to the main content on desktop, but that accordion should become “stacked” as some sort of expandable options immediately below the now-hamburger menu next to the brand identity when on mobile. I think it’s a pretty common layout and desire.

I can’t spend too much time on this right now, because day-job and life. But what I do want to do is to bank an irrevocable baby-step forward. I need to document something here as a better starting point for my next go-around. Put the style of code you’d like to use here.

I’m actually rethinking already. All that fluid liquid layout stuff seems to be a tripping point. That’s where so much opinion is inserted. If I just give up a left-rail that collapses away on mobile, things become much easier, and I can give up class-full frameworks masquerading as class-less ones, like picocss, which I tested yesterday. Simplecss is looking simpler.

Okay this is where we become intrepid. I’ve experimented with the frameworks without a webserver using the file:///C:/[path] protocol and a meta refresh element to keep reloading the page. The simple bare-bones minimal classless css template that I used to test and assure that simplecss wasn’t terrible is:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <link rel="stylesheet" href="https://cdn.simplecss.org/simple.min.css">
    <meta http-equiv="refresh" content="1" />
    <title>Future-proof your skills with Linux, Python, vim & git.</title>
</head>
<body>
  <header>
    <nav>
      <a href="#">Linux</a>
      <a href="#">Python</a>
      <a href="#">vim</a>
      <a href="#">git</a>
    </nav>
    <h1>LPvg.org</h1>
    <p>Future-proof your skills with Linux, Python, vim &amp; git.</p>
  </header>

  <main>
    <p>Resist obsolescence.</p>
  </main>

  <footer>
    <p>LPvg.org</p>
  </footer>
</body>
</html>

And then I slammed that into MikeLev.in’s Jekyll _layouts/default.html file, making minimal adjustments to keep all my site’s content exposed. I don’t have Ruby Jekyll installed locally, so I have to actually push the files up with a git commit and push every time I test the real thing, as opposed to the file:/// approach where I just mock up a local html file. This isn’t so easy when your site is in markdown and transformed through the github.io Github Pages static site generator system. You don’t have the html versions of the files locally for testing in a browser, so it’s more of a push-and-pray approach, which you can do when you’re not a big player. One issue is that because MikeLev.in is so big (so many individual markdown .md files), it takes a few minutes to generate, which slows down the rapid iteration cycle. So be it. Rapidly iterate on experimental files locally and transpose what you learn into the live site.

Okay, I’m going in there all intrepid. I’d use the bull in a china shop expression were it not for Myth Busters. Wow did that make an impact (unlike the bull who walked around all gentle). There’s not really much from the old design that I’m determined to keep. Alright, I’m really enjoying this. My last go at all this to get my reversible logo to rotate was cool and a good experience, but misguided for SEO. I have to stick to the mainstream to make Google reward me in SEO. What I did actually really killed my SEO rankings. Maybe stick a screenshot here. Mobile design readability does have a huge impact on SEO.

Mobile Design Readability Huge Impact On SEO

That was the trend and Google outright tells you why pages aren’t usable on mobile.

Why Pages Aren't Usable On Mobile

That really is the impetus for this entire endeavor of switching over from my spinning-logo experiment to plain vanilla CSS with big clickable links.

The nav font sizes have been a problem with simple.css. Different sizes are used in the minimal css file and the readable one, so I’m switching to the readable one. I’m going for big fat readable fonts now. I also see a font size unit I’m not used to, rem. It’s “root em”: sized relative to the root element’s font-size rather than the parent’s, so sizes don’t compound as elements nest and they respect the user’s browser settings. Looks like a good idea. Fonts get bigger and SEO should improve.
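A tiny illustration of the rem-versus-em difference (a sketch, not simple.css’s actual rules; the selectors are made up): em compounds with nesting, while rem always measures against the root font-size:

```css
html { font-size: 16px; }      /* the root that rem measures against */
nav a { font-size: 1.25rem; }  /* always 1.25 × 16px = 20px, however deeply nested */
li li { font-size: 1.25em; }   /* em compounds: each nested list grows another 25% */
```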

Had to remove poem (will it work here?):

I got a 🎤.
I like to talk.
I love to talk into the 🎤.
I love the 🎤. I’m 🎤 lovin’
I’ve got some tales.
So, let’s begin!

—Mike Levin, 2022

Just forge on! Focus and think.

It’s worth noting that you’re going for an ultra-simple design not merely to make a bigger, more mobile-friendly design that fixes your SEO error in Google Search Console, but also to make the overall site more flexible to organize. Don’t change URLs, but do change click-paths, especially from the homepage. This project is in support of my new ELTgd initiative. It feels good. It felt good. Every little thing gets done.

I got some pretty amazing work done with my Github Pages Jekyll sub-CMS system last month? A few months ago? It’s all a blur with the move. But I do know it’s got an edge of desperation: get the most important projects done soonest and most up-front so they can work for me, producing compound interest. But that won’t happen with small unclickable mobile links that Google doesn’t like. Weakest link in the chain principle.

Next weak link in the chain? Better categories! Better rapid adjustment of categories. Better exposure of categories from the homepage. Better tagging of categories in this file.

I have to be careful when git mv’ing files. If I move index.md to index-back.md and cp index-back.md index.md in order to preserve git histories on old files, I have to remember to git add the new index.md back, otherwise it disappears from the site! Ugh.

I do very little like the mainstream except when I need to in order to get the same benefits the mainstream is getting, especially if it doesn’t corrupt and compromise my special habits that give me the competitive edge. Wow, I should really develop some of these journal posts more into featured blog posts. I should have the concept of featured blog posts versus the long-tail archive. I should re-read some of my old journal-style posts to “extract” these ideas. I should do the list processing from long lists to shorter lists in that distillation process that I’ve advocated for years. I should hittail.

Okay, the quote tag doesn’t work well with the new css framework, so try a paragraph tag with the “notice” class, the way simplecss.org does on their homepage. Follow their lead until I have to diverge. I want the accent color on the navigation elements, so I’m adding simplecss.org’s /assets/style.css to my own. Interesting! Getting there.

Look at the Jekyll plugins available to me. These are turned on by default:

jekyll-coffeescript
jekyll-default-layout
jekyll-gist
jekyll-github-metadata
jekyll-optional-front-matter
jekyll-paginate
jekyll-readme-index
jekyll-titles-from-headings
jekyll-relative-links

The page https://docs.github.com/en/pages/setting-up-a-github-pages-site-with-jekyll/about-github-pages-and-jekyll says this:

You can enable additional plugins by adding the plugin’s gem to the plugins setting in your _config.yml file. For more information, see “Configuration” in the Jekyll documentation.

For a list of supported plugins, see “Dependency versions” on the GitHub Pages site. For usage information for a specific plugin, see the plugin’s documentation.

I didn’t get the accent color working, but I’ve got a fabulous start in the switch over to a simple css framework to bring back my site’s SEO performance. At least a little bit. I have to work on click-paths!

Okay, this is coming along really well. The call just started.

I am experimentally removing these from my CSS in case they interfere with the accent color:

a:link {
  color: lightblue;
}
a:visited {
  color: darkgrey;
}
a:hover {
  color: white;
}
a:active {
  color: white;
}

You’re going to have to switch to your day-job stuff real soon now. Focus! Get this to a place you’re happy with, but don’t be a purist about it.

I’m not selling you anything but for the ad bucks I make on my YouTube, which isn’t much. Advice I give is because I only engage in work that feeds my soul and makes me happy. I find joy in sharing that with you.

I’m not selling you anything but for the ad bucks I make on YouTube. Advice I give here is because the work I do feeds my soul and I find joy in sharing that with you.

Beware, this blog is not for you. It’s every little thing I do. The content here is not prepared, so turn back now and please be spared.

Wow, I need a way to insert a message at the top of my blog index page. That’s under my CMS system. Think!

Look on my macro what happens when I hit @p.

Ugh! Okay, I fixed it in the .py files but nbdev is no longer generating the .py files from the Python Notebook. I’m going to have to be very careful to not lose these updates.


Mon Aug 15, 2022

Choosing Best CSS Framework For Github Pages and Jekyll

Let’s push and push and push on that YouTube front. Wrap in Reddit. Twitter, Facebook and all that is nothing compared to the YouTube/Reddit front. I’ll maybe announce my videos on Twitter and keep the old Levinux crowd in the loop with the MikeLevinux page, but that’s out of consistency, loyalty to those audiences and helping search a wee little bit.

Oh, search. Yes, I’m always reaching out to people who search and not surfers. It’s seekers I’m seeking and not slaves to the YouTube algorithm. The YouTube algorithm has rarely been kind to me and I should assume it never will. It’s for those who predict and pander. Ha ha, and that’s from an SEO who makes a career out of showing others how to do exactly that. Perhaps I should practice more of what I preach just to bank a little more online success and keep my skills sharp. Hmm, yes maybe I will. Why am I pushing on the YouTube front if not for that? Oh yeah, it’s more for attaining levels on the Maslow hierarchy of needs. I think I’m seeking “my people”.

Here’s a journal entry that started out on the private side and I moved over to the public side, because this is precisely the sort of process-sharing I want living here. There’s plenty of weakest-link-in-the-chain analysis to do for my YouTube, Web and social media presence. The main thing is that I’m working on my tooling and thought-processing and general skills. I’m NOT working on actually boosting my public-facing presence. The two probably should go together more than I’ve liked to believe recently, and so to have credibility on the tooling, thought-processing, skills front I need more credibility on the public-facing front.

Sighhh, okay so be it.

1, 2, 3… 1?

I need even better organization on the public front that performs better in Google Search and reflects the current actual state of my thinking. It needs more flexibility and less fringe design wackiness. And if I’m going mainstream, I should go as mainstream-but-timeless as I can. That means not Google’s Material UI nor any of the cool-kids modern JavaScript frameworks. Instead, I should probably stick to those that will give me a modern look, be most search-friendly, have been around the longest, and perhaps allow me to produce the most clean, vanilla HTML code. Specifically Bootstrap, the original Twitter library that kicked off the movement, comes to mind. Does it meet the criteria? Will it force me to use sass pre-rendering tools? If it does, will Jekyll do the heavy lifting for me (it’s pre-rendering anyway)?

Quick googling shows Bootstrap vs. Foundation, Bulma, Semantic and UIkit. Look at which ones don’t propagate classes all through the HTML and which are friendly with Github Pages and Jekyll. I want as simple, flexible and future-proof as possible. Ugh, the tutorials on the topic are ugly. All this stuff is ugly. I’m not thrilled with how the Web works with HTML, the DOM, JavaScript and CSS. But that’s the downfall of every tool: evolved, reality-based and compromised. Idiomatic approaches are only clear in hindsight, and by that time the pee is in the pool and isn’t coming out. Okay, so live with it. Which is the solution I can live with most happily over the years? Chosen tools become a part of you, so choose carefully.

The best approach is to throw a throwaway domain to the cause. Put something out with the least code. Use elements to do the styling. Anything inside a nav element gets styled as the nav I define. I shouldn’t have to litter everything up with btn classes, yuck! Less code. More abstraction. Fewer classes, more purposeful elements.

Foundation! Ugh, it only took me minutes looking at Bootstrap and all those btn-this and btn-that’s to become fatigued, before I even downloaded the includes. Do not fall into the btn-trap! The word is semantic markup. Stick with the most clean, semantic, plain-vanilla HTML you can. That’s future-proofing. Bootstrap is the opposite of future-proofing, mixing all its special implementation code in with your main content.

Beware rabbit holes! If any of these CSS frameworks lead you down a rabbit hole, back off and rethink!

Okay, follow the love. I’ve got a few competing projects right now, including ones I have to get done for work. Is there a really quick one-off I can do? I need an experimental site where I can experiment with CSS frameworks under Jekyll without impacting my main site. I should also point out why this is on my mind.

Let’s develop the lpvg.org site. It’s one I need to develop soon/fast to lead this movement anyway. Do the first baby-step of this project then back off and do different work. 1, 2, 3… 1? Go visit site and see if there’s anything there. No, not even my CMS system. Perfect, so it’s a fresh start. 1, 2, 3… 1? See if I have a repo yet. The folder is there, but it’s not a repo yet. Make it a “Hello World” Github Page.

Think about organizing your recently deployed categories feature of your blogging software. Life is a lot of little journeys. Here are some of mine you might be interested in. That’s not for this project, but that’s definitely something to keep in mind. Drop it into your todo list at the bottom of this document.

Ugh, okay, do the baby steps. I activated lpvg.org on Github pages. I put a _layouts folder in place and a default.html file in location. I stripped it down to the bare essentials.


Sat Aug 13, 2022

Why Linux Fights Obsolescence and LXD is Key

I just pushed out a video on LXD being a sort of Noah’s Ark. My stuff’s never going to naturally light the YouTube algorithm on fire. I’m too niche. My content and message, even if it may someday have some mainstream appeal, doesn’t today, but I can’t let that stop me. I really don’t want to go the whole entrepreneurial route or anything overly committal just to have supplementary income. It’s got to be a natural side effect of what I do anyway. This publishing as a side-effect of just plain old life is definitely the way to go. I just have to make those videos more successful.

Okay, weakest link in the chain principle. I could do better research on the keywords and phrases before I push out new videos, tweaking the titles and descriptions this way and that to pander and appeal. It’s the same as my main field of SEO in which you step into the path of the search traffic that’s already there. Search patterns are search patterns and as a publisher you can either choose to walk into the path of those search patterns or not. And there’s my admission that even on YouTube, it’s mostly still about search for me and not auto-suggest.

Okay, so let’s do one of those weakest link in the chain analysis just like you would have done with clients back in the day before you became an in-house SEO. Hmmm. Okay, there’s bigger search than YouTube and it’s called Google. And I could have greater connection between my YouTube and Web content, reinforcing each other in any way that they could reinforce each other. And of course one of my greatest strengths is that I just naturally produce content when I think things out because I think in vim. In other words, I type directly into a text-editor like I’m talking to myself because it helps me think. But that’s not the weak link. The weak link is not tagging, organizing and publishing that content well, so as to step into the path of searchers. I can fix that.

The world is also considerably different than it was during the days I got into SEO (late 90s!) and I should give things a fresh think. Lots more people are searching, lots more people are searching in Google, and lots more people are searching in things other than Google. There are quite a lot of search streams, so there’s more places to be found, more outlets to have to target, and more competition in each of those channels. Many paths lead to spinning your wheels, just like it’s always been in SEO.

One must choose the work carefully in order to have a good chance of having an effect. There’s also the question of doing more work on fewer attempts (higher quality) or less work on more attempts (lower quality, but better odds). I should endeavor to do both (especially more of the quality stuff) without letting the frequent attempts “pull down” the quality stuff. And when I say “quality” I really just mean more highly produced to appeal to the attention-challenged mainstream. My niche crowd recognizes the quality in the higher frequency content. There’s also a certain charm to the genuine experiences and the rough cuts.

The wheel-spinning is sort of like the more frequent publishing with a lower resistance to publishing. There’s lots of analogies here. The one-arm bandits with frequent play and smaller payouts versus blackjack or poker with longer play-time but bigger payouts. Either strategy can and should be able to work, but you have to be engaged in both to have good data coming back for feedback loops. Yes, feedback loops. I have to establish more addictive (both to me and my audience) feedback loops. Engage in lots of small, interesting projects which complement each other and over time create a snowballing effect.

Another interesting point is that I’m in the middle of the weekend now and I have a couple of projects for work that I still wish to slam out. In fact, I really need to because the day job pays for everything else. Another way of looking at it is that my weekend discretionary projects keep me qualified for my day job, but still. It’s good to please the boss and I need to please the boss a little bit more. We’ve got a really awesome team that’s getting most of the attaboys these days. I’m not that worried, but it’s always a good time to put more tension in the machinery. Get the “easy one” done, which really could still help the boss even this weekend.

Other projects to think about are using the 24x7 scheduler I just created for a series of interesting little projects. Create strong nicknames, imagery and visuals around… oh my, the next iteration of Pipulate, of course! Wow, I need to get my login back on PyPI. I got cut-off because of their sudden 2FA implementation. I was careless and lost the app that had my 2FA in it. Not for any other services, but only PyPI, the only one without some sort of automated backup system tapping email or mobile. Ugh! Everything’s an interesting story, even this. You should make a video out of your PyPI experience.


Fri Aug 12, 2022

Run LXD Linux Containers on Windows WSL 2 Ubuntu 20.04 and 22.04

There’s no place like home. Yet, we lose our “home” a lot in life. Settling quickly and happily back into a new home is one of the great skills in life, be it in the physical world or the digital one.

Re-establishing all the familiarities and getting things running smoothly again is challenging in either case, but it is a skill one can benefit from improving. To that end, I’ve prepared this little guide.

GUIs Are Syntactic Sugar (Not The Real OS)

What’s your digital home? Windows? MacOS? Point-and-clicking? Drag-and-dropping? Locations of things changing on you with every version upgrade? Not me. I’m using stuff that hasn’t changed much since the 1970s and 1990s and is still going strong. The locations of the keys on the keyboard don’t change (much) and the software interfaces are superbly static, so we can be the dynamic ones. Fixed locations are better than floating ones when it comes to improving over time, and that’s what the type-in user interface provides.

Better still, the type-in OSes are generally free and open source, certainly contributing something towards their eternal timelessness. While proprietary Windows and MacOS go hopelessly obsolete, the Linux Terminal only gets better with age.

Windows and Mac desktops are not really even OSes. They are syntactic sugar sprinkled on top of what really matters, the text-based OS underneath. Apple proved this in 2001 when they switched their proprietary OS 9 to the UNIX-based OS X and hardly anyone noticed. But today most Mac users interact only with the Cocoa puff and not the Darwinian evolution below.

Apple prisoners stay prisoners because they can live only in the graphical user interface, where they’re shaken down for money every few years. Steve Jobs hated the command-line interface so much he used to pry the arrow-keys off of keyboards. Separation from the “real OS” is very intentional, and you can fix that.

Particular Instances of Hardware Are The Enemy

Even those who have made the break from GUIs still have problems because of particular instances of hardware, virtual machines or even containers. The crash of a piece of hardware in your life can be devastating, a lifetime of work or memories lost. And the “cloud” isn’t the entire answer, either. Just having a little more control is.

My feeling is that you’re not free from the tyranny of vendors until you can quickly rebuild your home from parts, and the home you build is mostly a text-based OS in which I guarantee you can learn to be comfortable. Like anything else, it takes practice.

Developers Know This. Microsoft Knows This. The Time is Ripe.

Developers know this and demand their Linux tools like gcc, git, apt, pip, npm and all the other things that frustrate Mac and Windows users. They’re all going to Linux anyway, and Microsoft knows this, so it has been working on its Windows Subsystem for Linux (WSL) for quite some time.

Windows Subsystem for Linux is finally ready for prime time, and so ironically Microsoft itself is giving us all our best hope of finding our freedom and independence from Microsoft, if only we can figure out how to use the tools properly and form the right habits. Build your house from bricks and don’t let the Big Bad Wolf blow it over again, obsoleting your old GUI-based software.

Vendors, Vendors Everywhere; Can’t Escape Them For Now

To achieve this lofty goal, there are unfortunately vendor components we’re just not going to be able to get away from today. We will ditch Windows and Github in the future and I’ll take you there. But for right now, today there are a few vendor-provided pieces we’re going to need, such as Github or Bitbucket for private git repos in a remote repository.

Unless you’re ready to make the switch cold-turkey to Linux, we’ll also be using Windows for a little while longer. It gives us our warm-and-fuzzies, games, drivers, readily available working systems, compatibility with the office or whatever. There’s a thousand tiny reasons starting out on Windows reduces the pain. And it doesn’t have to be multi-boot anymore. You get Windows and Linux at the same time–the best of both worlds.

Sprinkling On The Magic Fairy Dust

Fast-forward in your mind to where you’re familiar and comfortable with the Linux Terminal text-based interface to computers. Even then you won’t be able to sit down at just any Linux Terminal and be 100% comfortable. There will be a few bits of customization to sprinkle in: a mapped drive here, a text-editor configuration file there. These are mine. They are the magic fairy dust I sprinkle into a plain vanilla Linux system to make it home.

The list of essentials may be different for you, and it evolves over time. Nonetheless, these are the parts that get recombined into whatever you need. It can be onto bare-metal hardware, a virtual machine or a container. Once you have the skills to recombine them, it doesn’t matter. Just keep your ingredients safe (and up-to-date) somewhere. Git repos are generally good for everything except the .ssh keys and other login credentials.

To Map Locations Or To Copy Files, That Is The Question

Three of these are file locations while the rest are individual little files that just get dropped into the ~/ (home) folder of any Linux system. The dot in front of the filenames means the OS hides them (invisible files) by default. Config files are usually like that.
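As a quick illustration (a stdlib-only sketch using a throwaway temp directory rather than your real home folder, with hypothetical filenames), pathlib’s glob shows which names count as hidden dotfiles:

```python
from pathlib import Path
import tempfile

# Sketch: names starting with a dot are the "invisible" config files.
# A temp directory stands in for the real home folder in this demo.
with tempfile.TemporaryDirectory() as d:
    home = Path(d)
    (home / ".bash_profile").touch()
    (home / ".vimrc").touch()
    (home / "notes.txt").touch()
    # ".*" matches only the dot-prefixed names; notes.txt is skipped
    dotfiles = sorted(p.name for p in home.glob(".*"))
    print(dotfiles)  # → ['.bash_profile', '.vimrc']
```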

The three file locations are each handled differently. If there’s a “host” machine of any sort involved, I like to reuse the host’s .ssh and github folders so I don’t have to make multiple copies of keys all around, and so that I don’t have to keep git cloning out of Github. But either one can be produced from scratch rather easily.

A Tale Of Two Hosts: Windows & Linux

I’m using Windows machines as the host these days. This is because Windows hardware is cheap and abundant, pre-installed, has lots of drivers and other support, keeps me compatible with the office, and a thousand other little reasons that just make this today’s reality. It’ll change in the future, but for now I use the Windows Subsystem for Linux (WSL 2) to host Linux containers. These Linux containers get the magic fairy dust sprinkled onto them to make them home.

The next concept is a bit of a leap, but hang with me. Windows should not directly be your host, because it’s not timeless and eternal enough. Linux should be your host, so our first priority is to get a sort of Linux host running side-by-side with Windows under Windows Subsystem for Linux (WSL 2). We will then use this WSL2 Linux instance to host subsequent Linux containers onto which we will sprinkle the magic fairy-dust making them home.

I’ve documented plenty of times how to get WSL2 running in the first place. I may add those instructions or hyperlink them here later, but for now, once you have a WSL2 Linux instance running, you can map locations from the Windows host over to the Linux host like this from within the WSL2 Linux Terminal, changing usernames to your case:

ln -s /mnt/c/Users/winuser/github/ /home/wsluser/github
ln -s /mnt/c/Users/winuser/.ssh/ /home/wsluser/.ssh

These are the easy ones because those locations really reside on the real hardware of the host machine. The ln command makes Unix-style symbolic links. They’re one of the million-dollar tricks of Unix/Linux and you should google them to really grok them. It’s a gift that this works in this context. Network drives aren’t so easy.

Persistent Mounting of Microsoft Network SAMBA/CIFS Shared Drives

Network drives reside somewhere else on your network than your actual laptop. Why would you do this? Your laptop should not be your single point-of-failure. You should start getting yourself used to working on different physical hardware since your home’s about to float now anyway. Don’t go through all this trouble and then make any single laptop all that precious. Use a NAS and back up its storage to 2 additional locations. And it’s easier to get a shared network drive in your home than you think with such products as QNAP or Synology NAS (network attached storage).

I like to use a location called ~/data. If I haven’t explained it yet, ~/ always means “home”. When you see ~/, think /home/username/. The tilde is just a shortcut. And to map a network drive to such a location, here’s what we do.

cd ~/
mkdir data
sudo mount -t drvfs '\\EchidNAS\data' /home/wsluser/data
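As an aside, you can watch the ~/ shorthand expand from Python (a small stdlib-only sketch):

```python
from pathlib import Path

# Path.expanduser() swaps the leading ~ for the $HOME directory
# (e.g. /home/<username>), exactly what the shell does before any
# command ever sees the path.
data_dir = Path("~/data").expanduser()
print(data_dir)
```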

There’s some special Microsoft magic going on here with that drvfs filesystem type it’s specifying. The page https://docs.microsoft.com/en-us/windows/wsl/wsl-config says:

DrvFs is a filesystem plugin to WSL that was designed to support interop between WSL and the Windows filesystem. DrvFs enables WSL to mount drives with supported file systems under /mnt, such as /mnt/c, /mnt/d, etc.

Suffice it to say it works and is the key to moving files around easily between the two hosts, and in the future between containers and other physical computers on your network. Of course, adjust your paths and usernames to match.

Unfortunately, that network mount will be lost on every reboot unless you edit the /etc/fstab file to make it persistent. So this line must be added to /etc/fstab. It needs admin privileges to edit, so I sudo vim /etc/fstab.

\\EchidNAS\data /home/wsluser/data drvfs defaults 0 0

At this point, you can exit your WSL login, restart WSL and go back in to make sure you still have your ~/data drive:

wsl --shutdown
wsl -d Ubuntu-20.04

Sprinkle Magic Fairy Dust From ~/data/home To ~/

Now that you’ve mounted your network drive where you keep all those precious components you can build your house from and made sure the link will survive a reboot, you can with confidence copy stuff from it home:

cd ~/
# .??* matches the dotfiles while skipping the . and .. entries a plain .* would grab
cp ~/data/home/.??* .

You can exit and log back into this new Linux host to make sure your environment is fully activated. In particular, that will run your .bash_profile. If your .bash_profile references stuff that doesn’t exist (yet), you’ll get errors. I typically activate a Python venv (virtual environment) from in there, so I have to comment that line out.

Now I’ve Got A Linux Host… Ho Ho Ho

This is a nice, solid Linux host machine for your Linux containers. This is not your new actual home. This is a Linux host under which your LXD Linux containers will be created and treated as home. There’s a subtle point to get here. It’s all about the application program interfaces (APIs). APIs are the real tools here. You learn an API and internalize it as a part of yourself like learning a new language. You don’t want it going obsolete on you all the time and be left speaking a dead language… like Windows. You want to be speaking Linux to create and manage other Linux.

Many will suggest at this point (and Microsoft certainly would) that you should just use this WSL Linux instance directly as your new Linux home. The problem is that you’ll continue nurturing your Windows dependencies. There are tons of subtleties and nuances to using a Windows machine as a Linux host directly. You’ll be building your house from straw, because down this path you’ll be becoming expert at the Microsoft WSL API and not the LXD API. What you really want to do is learn the LXD (Linux container host) API, which I expect will eventually stabilize and be mainstreamed into Linux the way the Kernel-based Virtual Machine (KVM) did. LXD is from Canonical, the people who make Ubuntu. It still has to officially win the battle for the kernel and ubiquity in other distros, but it’s well on its way.

But What About Docker Vs. LXD?

Most people will be advocating Docker over LXD at this point because of its enormous uptake over the past decade. And yes, Docker is awesome and won’t be going away. But Docker is for Apps. LXD is for complete Linux systems. Or in other words, Docker is for the “possessions” inside your home but not for the entire home itself. LXD is for your entire home.

Docker can fake it, but you feel the faking. Once a particular home is “perfected”, a docker image can be turned into a wonderful distributable plug-in image so anyone can plug it in and enjoy that home. So docker is great for app store ecosystems. But if you want to start customizing that home and making it yours (like a developer), it’s done with a series of layered-on or composited overlays. Docker makes a sort of nonvolatile or immutable core image, with your changes stacked in layers on top of it.

Once you’re in an LXD container, by comparison, it’s exactly like being in a regular Linux system. Just google LXD vs Docker and you’ll get the gist.

Installing LXD Containers on a Linux Running Under Windows WSL 2

This is where it gets tricky. I’ve got quite a bit to expand here in this article related to mapping in drives from the Linux host to the containers.


Thu Aug 11, 2022

Use Python to Cycle Your IP with HMA VPN Software and Windows Automation

Wow wee! This is not a nut I thought I’d crack. Actually, someone else cracked the nut, and it was October of 2021. I wish I had been staying on top of this topic to spot it sooner, but automating stuff on the Windows desktop isn’t too hard using pywinauto. I actually try to avoid it because it’s a lot like web browser automation, but without the cross-platform advantage and the ability to run it headlessly on a server. But sometimes you just have to automate Windows… in order to get a new IP from your HMA VPN software, LOL! Read into that whatever you want, but I am an SEO (search engine optimizer) after all. The solution to this problem is here:

https://stackoverflow.com/questions/69705923/control-windows-app-hma-vpn-using-pywinauto

…and the actual code that he posted is:

from pywinauto import Desktop, Application

# Raw string keeps the backslashes in the Windows path literal
vpn_app = Application(backend="uia").start(r'C:\Program Files\Privax\HMA VPN\Vpn.exe')
dialog = Desktop(backend="uia").HMA
panel0 = dialog.Pane
# Command to connect / disconnect the VPN: connect_button.click()
connect_button = panel0.ConnectButton
# Command to change the IP address: changeIP.click()
changeIP = panel0.Button5
# Check VPN state:
# 0 if disconnected
# 1 if connected
print(connect_button.get_toggle_state())

# Command to connect / disconnect the VPN
connect_button = panel0.ConnectButton
connect_button.click()

# Command to change the IP address
changeIP = panel0.Button5
changeIP.click()

# Check VPN state:
# 0 if disconnected
# 1 if connected
print(connect_button.get_toggle_state())

Tue Aug 09, 2022

Switching a Python Scheduler to Huey Task Queue Using Crontab API

Here’s a video I didn’t think I was going to do. I was going to switch over from one Python scheduler to another, dealing with the changing “API-language” this entails: going from the very Pythonic “for humans” API of pip install schedule to the much less Pythonic crontab method of Huey. Huey is a scheduler which is also really a data-pipeline task-queue system, like the much more popular Python Celery package but with far fewer dependencies, so you can run it without all that devops stuff (tech liability).

Over my last several videos, I used the pip install schedule package from PyPI, which has an API that beautifully reads like this:

schedule.every(10).seconds.do(job)
schedule.every(10).minutes.do(job)
schedule.every().hour.do(job)
schedule.every().day.at("10:30").do(job)
schedule.every(5).to(10).minutes.do(job)
schedule.every().monday.do(job)
schedule.every().wednesday.at("13:15").do(job)
schedule.every().minute.at(":17").do(job)

# Or alternatively, use Python decorators:

@repeat(every(10).minutes)
def job():
    print("I am a scheduled job")

Now I’m switching to a task queue (for reliability): pip install huey, whose scheduling portion of the API looks like this. Huey is patterned after the much more popular Celery API.

@huey.periodic_task(crontab(minute='0', hour='3'))
def nightly_backup():
    sync_all_data()

@huey.periodic_task(crontab(minute='*/3'))
def every_three_minutes():
    print('This task runs every three minutes')

So you see, Schedule allows per-line standalone scheduling instructions with a well designed “API-language” plus the convenience of Python decorators where those “for humans” language conventions carry over.

Huey on the other hand only allows function decorators, which in and of itself is not a bad thing, but the API follows the long-standing Unix conventions of the cron program. Now while it’s generally good to follow the Unix/Linux API-standards, scheduling is one of the more difficult places to do so.
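To make cron’s conventions concrete, here is a toy stdlib-only matcher for the minute field (my own sketch, not Huey’s or cron’s actual implementation), covering just `*`, `*/n`, comma lists and plain numbers:

```python
def matches_minute(spec: str, minute: int) -> bool:
    """Return True if a cron-style minute field matches the given minute (0-59).

    A simplified sketch handling '*', '*/n', 'a,b,c' and plain numbers only.
    """
    for part in spec.split(","):
        if part == "*":
            return True          # wildcard: fires every minute
        if part.startswith("*/"):
            if minute % int(part[2:]) == 0:
                return True      # step value: every n minutes
        elif int(part) == minute:
            return True          # exact minute
    return False

print(matches_minute("*/3", 9))   # → True  (9 is a multiple of 3)
print(matches_minute("0", 30))    # → False (only fires at minute 0)
```

So `crontab(minute='*/3')` in Huey means “whenever the current minute is divisible by 3”, which is exactly why it reads less naturally than schedule’s `every(3).minutes`.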

Switching my Munchkin container that currently uses the Python PyPI pip install Schedule package over to Huey is the subject of this video.

Just get to it and make it happen. Do a series of quick, small passes even if you’re producing videos to document it and help people at large dealing with these same issues.

How do you test that your scheduler is running when all you have is periodic scheduling and you want a scheduled event to happen moments after you start the scheduler?
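One stdlib-only way to think about it (my own sketch, not the schedule or Huey API) is to compute a deadline a moment out and fire the job exactly once when it passes:

```python
import time

def run_once_after(secs, job):
    # Poll until the deadline passes, run the job exactly once, return its result.
    deadline = time.monotonic() + secs
    while time.monotonic() < deadline:
        time.sleep(0.05)
    return job()

print(run_once_after(0.2, lambda: "scheduler is alive"))  # → scheduler is alive
```

The same idea, expressed in schedule’s API, is a daily job set for a time a few seconds from now that returns CancelJob so it never fires again.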

There were stumbling blocks having to do with “exact scheduling” vs. periodic scheduling, but we got it to work.

Don’t forget to subscribe!


Tue Aug 09, 2022

TechSmith Camtasia Took Away Screen Recorder Hide Taskbar Icon Feature

How to record your day-to-day work honestly and without much editing.

Can you imagine the video editing just on this sequence if I weren’t just showing you everything? 2x longer? 10x longer?

This allows me to go at a very fast pace.

I’m uncompromising on certain things (like minimal editing)

I’m compromising on certain things, like dealing with quirks, feature roll-backs (come on TechSmith! Don’t do that) to get a type of software that isn’t (yet) better in the Free & Open Source Software world.

I may learn Blender to see if it has these TechSmith Camtasia screen recorder & editing features.

I just recorded a video that explains why I won’t edit my videos just to hide the Taskbar Icon for the Screen Recorder.

This was my first video recording with the Camtasia 2020 recorder, while doing the editing with Camtasia 2022. Why? Because only Screen Recorder 2020 or earlier has this feature, an important reason why I use(d) it, and only the newer editors make the user interface fonts large enough for my poor old eyes to read.


Tue Aug 09, 2022

Windows Terminal Stuck Full-Screen, No Tabs & process exited with code 1

Press Ctrl+Shift+W to force the window to close. This replaces the ancient Alt+F4. Why Microsoft changed a forced-close to Ctrl+Shift+W while they kept the fullscreen keyboard shortcut as Function+F11 is a mystery. I’m guessing that locked-up Windows Terminals happen so often that the Microsoft programmer in charge of it decided to give it an easy-to-press keyboard shortcut.


Sat Aug 06, 2022

Turning Python Huey Data Pipeline into Schedule Repo Alternative

My previous videos walked through setting up a scheduler and a proper data-pipeline task queue on two different Linux containers on my Windows system. I kept these processes separate because I understood:

pip install schedule

…but I didn’t understand:

pip install huey

In so many ways they are the same.

In the case of Schedule though, you simply feed the file that imports schedule and defines the scheduling to the Python interpreter:

python scheduler.py

…however, in the case of Huey, they have a huey_consumer.py file that is magically in the command-line path after a pip install huey. That’s a bit freaky to me, but nbdev does it as well, which I’ve been using for a while now. However, nbdev eliminates the .py file-extension and makes it feel like a native OS command (across OS’s, which is an awesome trick). Huey doesn’t hide the .py extension. My ~/py310/bin directory currently contains:

(py310) ubuntu@Huey:~/py310/bin$ ls -la
total 56
drwxrwxr-x 1 ubuntu ubuntu  222 Aug  1 19:25 .
drwxrwxr-x 1 ubuntu ubuntu   56 Aug  1 19:23 ..
-rw-r--r-- 1 ubuntu ubuntu 1987 Aug  1 19:23 activate
-rw-r--r-- 1 ubuntu ubuntu  913 Aug  1 19:23 activate.csh
-rw-r--r-- 1 ubuntu ubuntu 2055 Aug  1 19:23 activate.fish
-rw-r--r-- 1 ubuntu ubuntu 9033 Aug  1 19:23 Activate.ps1
-rwxrwxr-x 1 ubuntu ubuntu  978 Aug  1 19:25 huey_consumer
-rwxrwxr-x 1 ubuntu ubuntu 1555 Aug  1 19:25 huey_consumer.py
-rwxrwxr-x 1 ubuntu ubuntu  238 Aug  1 19:25 pip
-rwxrwxr-x 1 ubuntu ubuntu  238 Aug  1 19:25 pip3
-rwxrwxr-x 1 ubuntu ubuntu  238 Aug  1 19:25 pip3.10
lrwxrwxrwx 1 ubuntu ubuntu   10 Aug  1 19:23 python -> python3.10
lrwxrwxrwx 1 ubuntu ubuntu   10 Aug  1 19:23 python3 -> python3.10
lrwxrwxrwx 1 ubuntu ubuntu   19 Aug  1 19:23 python3.10 -> /usr/bin/python3.10
(py310) ubuntu@Huey:~/py310/bin$

Interesting! There’s also a huey_consumer file without the .py extension, but it’s still Python. It’s different from the .py file, too.

Huey is an alternative to Celery and Luigi. I wonder if those are this screwy. Huey is screwy. The naming conventions boggle. But I have to wrap my mind around it. Forget the huey_consumer that has no extension for now.

The demo.py file that I made per the minimal configuration in the documentation looks like this:

# demo.py
from huey import SqliteHuey
from huey import crontab

huey = SqliteHuey(filename='/tmp/demo.db')

@huey.task()
def add(a, b):
    return a + b

@huey.periodic_task(crontab(minute='*/3'))
def every_three_minutes():
    print('This task runs every three minutes')

And this is turned into a service with a file in /etc/systemd/system/ named huey.service which looks like this:

[Unit]
Description=Run Python script to handle scheduling

[Service]
Type=forking
Restart=always
RestartSec=5
User=ubuntu
Group=ubuntu
WorkingDirectory=/home/ubuntu/github/demo/
ExecStart=/usr/bin/screen -dmS huey /home/ubuntu/py310/bin/huey_consumer.py demo.huey
StandardOutput=syslog
StandardError=syslog

[Install]
WantedBy=multi-user.target

And so moving this work over to my other container that has only primitive scheduling is just a matter of doing a pip install huey and putting those two files in place. But instead of two scheduler services, I’ll replace the old one with Huey. I wish Huey had as good an API for setting schedules as pip install schedule.


Fri Aug 05, 2022

How to Share an SSH Key with Linux Container to Eliminate Passwords

Nut asks: exist this code on github?

Well, no. But that’s a good idea. In fact, it’s good inspiration for a video, because nothing is straightforward. In this case, developing on a freshly created Linux container that’s not been fully set up as one of my development systems, trying to get the code up to Github, the challenge is password challenges… Github won’t know who I am when I connect from the container. Normally I’m not challenged for Github passwords because I’m sitting on my keys found in my .ssh directory.

Oh, I have the keys still. It’s just that they’re on the “host” system and not the container off of which I’m working. But how to put the keys where I need them… oh, should I just copy/paste code off the container to where keys are set up already? That would be so easy, just to copy the text over. Even without shared drives, you could copy/paste the text of the key-files over using the Operating System’s buffer, copying from Notepad and pasting to vim. But no! Show the people proper sharing of keys through folder mapping and re-use. People have to know how to share their keys from Windows or Linux host systems to their Linux Containers.

Get your thoughts and notes together. Make a nice video. Okay, here it is:

Currently, you’re running on a server called Huey. Huey has your pip install huey work. The question was asked from the video about sending emails from a Linux service, so that’s the code running on the Munchkin container. Sharing the host machine’s .ssh location with the container is the desirable solution here, because you don’t want to copy your keys all over the place. Endeavor to keep your keys in one place with a good secure backup somewhere or other. But then just map that .ssh location into wherever else needs it. Avoid key duplication.

Okay, so how do we configure an lxd container to share the host’s .ssh folder?

lxc config device add Munchkin dotssh disk source=/mnt/c/Users/mikle/.ssh/ path=/home/ubuntu/.ssh/

Okay, so I can create the repo right on the container where the deployed code exists. That’s a great way to get the latest authoritative known-working code, but what about the Jupyter Notebook from the Windows-side? Well, once I clone the thing back down from Github Windows-side, I can certainly add it. I can maybe even use nbdev_clean_nbs to strip metadata out of the Notebook if nbdev lets me do that on non nbdev-init’ed locations. Try and find out. Video-time? For sure.

Yes, Nut. The code is on github.

This was an extremely important exercise for me to go through, because I’m advocating moving to Linux Containers, and when you do important work on that container, you need a way for it to migrate onto other containers, your host, etc.

I have a github directory that starts out very Windows-centric, because its “native” location is:

C:\Users\[username]\github

But which from Linux systems running under Windows Subsystem for Linux (WSL) is located:

/mnt/c/Users/mikle/github

Not all one-off containers need my entire github folder accessible, so that’s one mapping I don’t enforce (the way I do ~/data). I need to move my data around by reference more than I need to move my github repos around by reference. It’s okay to copy (especially via Github.com itself) git repos around.

So let me pull the scheduler repo I just made on Github back down on my Linux host machine.


Thu Aug 04, 2022

Lightweight Python Data Pipelining With Huey (to Replace Scheduler)

Time to finally do the Huey work!

Get to know Huey, from scratch. Set it up from documentation. https://huey.readthedocs.io/en/latest/guide.html

This is in addition to the PyPI/Github documentation with the repo. https://pypi.org/project/huey/

These documents each give slightly different insights into a system like this. While it has many similarities to the scheduling system I just set up over on the Munchkin container, there are differences.

Foremost, there can be tasks that are not necessarily scheduled that are just sort of sitting there awaiting further instructions. The example they give is this:

# demo.py
from huey import SqliteHuey

huey = SqliteHuey(filename='/tmp/demo.db')

@huey.task()
def add(a, b):
    return a + b

Okay, so the file is named demo.py. There is something called the Huey Consumer, whose file is huey_consumer.py. And it apparently gets invoked as:

$ huey_consumer.py demo.huey

I have to try that. That makes me wonder. I guess huey is installed in such a way that the script file can be invoked as a program. I know such tricks exist, as used all over nbdev, but usually they’re sbin-like and don’t use file extensions. Either way, it has to be tried to be understood. Does the command-line lock up until a Ctrl+C? This looks like an ideal candidate to be called through systemd. Such things should be services. Perfect.

Okay, so the next step is clear.

Start a video. Capture the experience. Follow the Huey tutorial.

periodic_task(validate_datetime, retries=0, retry_delay=0, priority=None,
context=False, name=None, expires=None, **kwargs)

huey_consumer.py demo.huey

Notice this is not called the same way as scheduler.py. Huey gets called through huey_consumer.py, and in the example they don’t give a path leading to it. This is typical when importing pip installed packages from within Python code. But when it’s on the command-line, it means stuff has been added to the path or put in the existing path.

To figure out what’s really going on here, we have to list the contents of the path variable in the Linux container’s environment variables with printenv. Oh! It’s important to note that we get a different path variable based on whether we have our Python virtualenv activated or not. Here is our default path without source ~/py310/bin/activate:

PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin

And here it is with my py310 venv activated:

PATH=/home/ubuntu/py310/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin

And then we simply replace colons with line returns, and here they are:

/home/ubuntu/py310/bin
/usr/local/sbin
/usr/local/bin
/usr/sbin
/usr/bin
/sbin
/bin
/usr/games
/usr/local/games
/snap/bin
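The colon-to-newline trick is a one-liner from Python too (the same idea as piping printenv through tr ':' '\n' in the shell). Shown here with a sample string so the output is predictable:

```python
# Split a PATH-style string on colons, one directory per line.
sample = "/home/ubuntu/py310/bin:/usr/local/sbin:/usr/local/bin"
print("\n".join(sample.split(":")))
```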

Okay, that won’t take long to figure out. Not many choices. And since it was something pip installed with venv isolation, the money is on it being in the first entry. I guess it’s also worth noting that once we had the Python venv virtual environment activated, we could have also used the whereis command to get the path to the executable:

(py310) ubuntu@Huey:/etc/systemd/system$ whereis huey_consumer.py
huey_consumer: /home/ubuntu/py310/bin/huey_consumer.py /home/ubuntu/py310/bin/huey_consumer

But it’s good to know what your complete path is, so printenv is a bit more important than whereis, but knowing both is great in situations like this.
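Python’s stdlib has the same lookup built in, by the way; shutil.which walks PATH just like the shell does when it resolves a bare command name:

```python
import shutil

# Returns the full path of the first matching executable on PATH, or None.
# "sh" is used here since it exists on virtually any Linux system.
print(shutil.which("sh"))
```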

Okay, so the path is tested. I can launch huey_consumer.py demo.huey from the command-line and even funnel it through the GNU screen terminal server, and so success is assured in turning Huey into a monitor-able Linux daemon .service file. And here it is:

[Unit]
Description=Run Python script to handle scheduling

[Service]
Type=forking
Restart=always
RestartSec=5
User=ubuntu
Group=ubuntu
WorkingDirectory=/home/ubuntu/github/demo/
ExecStart=/usr/bin/screen -dmS huey /home/ubuntu/py310/bin/huey_consumer.py demo.huey
StandardOutput=syslog
StandardError=syslog

[Install]
WantedBy=multi-user.target

And in fact it worked!


Thu Aug 04, 2022

Use Perceptual Image Hash as Database Primary Key for Cats

Next step? Switch pip install scheduler to pip install huey!

But first! I started showing Python persistent dictionaries yesterday using sqlitedict and sqlite3. I alluded to the fact that:

And it’s done. I captured the session for your YouTube viewing pleasure. Go through this video and you’ll see how we:


Wed Aug 03, 2022

Use Python Decorators For Linux Service Scheduling

We’re almost at data pipelines and using Huey which I already installed from PyPI with pip install huey on one of my Linux containers. Before I get there, I need to take advantage of one more feature of PyPI’s schedule module that I’ve been using for the past few videos.

Also, solve that annoying problem of having to wait a full minute for every test. I should be able to schedule a test (like emailing myself pictures of cats) x-seconds after I run the scheduler, and it should run only once. Get that x-seconds wait and the run-once logic done.

Then switch the scheduling to decorators, setting things up for the Huey switch-over, which exclusively uses decorators.

Get ready for the livestream. On every one, try to:

For Joseph, here are the two files that pull this off on a Linux system. The first goes under /etc/systemd/system/scheduler.service:

[Unit]
Description=Run Python script to handle scheduling

[Service]
Type=forking
Restart=always
RestartSec=5
User=ubuntu
Group=ubuntu
WorkingDirectory=/home/ubuntu/github/scheduler/
ExecStart=/usr/bin/screen -dmS scheduler /home/ubuntu/py310/bin/python3.10 /home/ubuntu/github/scheduler/scheduler.py
StandardOutput=syslog
StandardError=syslog

[Install]
WantedBy=multi-user.target

The other file is the scheduler.py script referred to by the ExecStart line.

import shlex
# import schedule
from time import sleep
from os import environ
from sys import stdout
from subprocess import Popen, PIPE
from datetime import datetime, date, timedelta
from schedule import every, repeat, run_pending, CancelJob


pulse_count = 0
print("The pulse service has started.")


def run(command, cwd=None):
    process = Popen(
        shlex.split(command),
        stdout=PIPE,
        cwd=cwd,
        bufsize=1,
        universal_newlines=True,
        shell=False,
    )
    for line in process.stdout:
        line = line.rstrip()
        print(line)
        stdout.flush()


def seconds_from_now(secs):
    today = date.today()
    atime = datetime.now().time()
    asoon = datetime.combine(today, atime) + timedelta(seconds=secs)
    return asoon


@repeat(every(10).seconds)
def hello():
    print("Hello World")


@repeat(every(5).seconds)
def pulse():
    global pulse_count
    pulse_count += 1
    anow = f"{pulse_count} - {datetime.now()}"
    with open('/tmp/scheduler.txt', 'a') as fh:
        print(f"{anow} is written to /tmp/scheduler.txt")
        fh.write((anow) + '\n')


@repeat(every().day.at(seconds_from_now(11).strftime("%H:%M:%S")))
def sendmail():
    print("Sending email")
    pyx = "/home/ubuntu/py310/bin/python3.10"
    cwd = "/home/ubuntu/github/scheduler/"
    cmd = f"{pyx} {cwd}sendcats.py"
    run(cmd, cwd=cwd)
    return CancelJob


while True:
    run_pending()
    sleep(1)

What I’m showing here is generic automation tech. It’s the new stuff sweeping across all Linux distros because of systemd. Before systemd, it was easier to do WebDev on Linux than SysAdmin work on Linux because of how painful the precursor to systemd was. It’s called SysV. It’s an “init” system, meaning it controls what happens after a hard reboot. In other words, it’s your startup procedure. Learning systemd lets you control your startup procedure with the language of your choice. I’m using a Python scheduler. Let cronjobs be a thing of the past. Arcane knowledge not necessary! I.e., you don’t need to learn Bash scripting! And you still get the service enable/disable, start/stop/restart API of any Linux service. It’s awesome.

I am almost 52 y/o. I am not a software developer. I have no developer-related degree. I feel this coding stuff is just basic literacy. I code like I write. I love to write. I love to express myself. What I code is for myself, even when I code for my job. There’s a lot about this that “internalizes” like a martial arts skill. You never stop learning and getting better with a small tool-set. If you’re forever improving on a small tool-set, you don’t get frustrated as much as someone who has to keep relearning just to keep up with the youngun’s.

There’s always something new & love-worthy to learn under Linux, Python, vim & git. Opportunity arises when things like systemd land and nobody knows it.

I LIKE THESE

THERE ARE OTHER THINGS LIKE THESE, BUT NOT FOR ME

THESE MAY NOT BE PEE IN THE POOL OF TECH

This video is very much mission accomplished. Next?

Technology that you don’t think will disappoint you will.

Some people need to master their tools. Some don’t. I do. The tech field is hard for me. It’s not so hard for the “multi-lingual” types. I find multiple languages difficult. I’ve tried. My different languages would be:


Tue Aug 02, 2022

Ubuntu 18.04 vs Ubuntu 20.04 for LXD Under WSL2, Wizard Defaults

Regarding the need for Ubuntu 18.04 for an easy LXD install, let’s do a test. It should be easy, because WSL2 & distrod & containers, am I right?

In this blog post I have to satisfy my curiosity as to whether the distrod trick is possible on Ubuntu 20.04. I know so much more about checking and activating systemd now than in prior videos, this one is worthwhile just to target that particular subject. Is systemd easy to install on WSL2 instances of default Linux Ubuntu 20.04, and once it is can you apt install lxd or should you use the snap store, and if you do will it work? The idea is to make the lxd init process go smoothly through the wizard process. If not, there’s too many technical decisions to make setting up the container host.

sudo apt update
sudo apt install lxd
lxd init
[Enter]
[Enter]
[Enter]
[Enter]
[Enter]
[Enter]
[Enter]
[Enter]... through the wizard!

Yep, I just tested it. With Ubuntu 20.04, which comes pre-installed with lxd, the init process doesn’t let you just hit Enter for the defaults through the wizard and have it work. There are magic numbers to figure out to answer the wizard under Ubuntu 20.04. Yet it works with all the defaults perfectly well under Ubuntu 18.04.

And THAT’S why I advocate sticking with 18.04 for now to get started with LXD right away. Yes, that’s it. Get started with Linux Containers right away, through LXD, which I believe is destined to be more popular than Docker. While yes, I quite understand how easy Docker images are to host and run, can I have root? May I drop .service files into /etc/systemd/system? As Mickey, can I instantiate a broom and take control of its actions?

I strive here and now to show you how to do this, for I need to do it for myself, and I would like to begin leaving a trail that could be followed by my child and others needing a way into being among the privileged class of the information age.

Just know how to read and write general tech, the parts which together do enough to get by and are likely to be around for a good, long while.

Regardless of what the YouTube comments say, I am not Gilfoyle. I am just a determined user of average ability who over time managed to wrangle enough comfort-level in vim to make the switch to mostly-Terminal. Add determination and stick-to-itiveness to average ability, and I believe most people could master enough Linux command line to get by. Terminal is less terminal than you may think. *nix survives.

Microsoft’s now on the Linux bandwagon. If that’s not a sign you should be too, I don’t know what is. But the message is a little different, and more beautifully, Microsoftishly embrace-and-displace than many, I think, were expecting. Linux Desktops? Who needs ‘em? Let’s be honest here, folks: a visual desktop, anything with a Graphical User Interface (GUI), is going to be wired to an application programming interface (API) that could be anything in the back-end. We saw Apple make the switch from their proprietary Mac OS 9 to OS X, which was Unix, back in 2001, and hardly anyone noticed. Microsoft could totally rewire Windows to a Linux-based back-end, replacing Windows such as it were. Windows would be to Linux what Apple’s Cocoa is to Unix.

The world’s response would be a big yawn. By then, the debate over Windows being dead would be over. You use what you use as a graphical windowing and virtual-desktop managing environment. Who cares how things get done, so long as they provide for you a standard Linux Terminal? Let it be Windows 10, Windows 11 or whatever’s out there, if it gets you good hardware and general support today, and genuine Linux today too.

What I’m advocating comes rather easily with practice, just like anything else. What I am not is any sort of super-hacker or industrial-strength tech guy. Take up the stuff today that’s setting in as “standard”. Okay, so Linux won. It takes a while to admit it, but now with systemd falling into place, I can show people some neat tricks they didn’t know were so easily and readily available to use, if only you knew the secret magical incantations.

I advocate figuring out those incantations and the general rules surrounding them, to instantiate instances of generic Linux computers, upon which all resources are isolated. Then there’s some habits to form so that you can “live and work” inside those containerized computer units.

I further advocate doing this natively and directly on your laptop. I advocate not using cloud resources or cloud providers whenever possible. Instead, use Cloud APIs (application programming interfaces), but let them be wired to local services whenever possible, either right on your laptop or on a home server running behind your home router; something like a QNAP or Synology NAS (network-attached storage). Learn on laptop, push to “local” cloud.

Learn about and appreciate the whole plethora of web services available from Amazon, Google, Microsoft, and endless smaller service providers, but try to avoid building dependencies on them into your software or systems. Of course, with Microsoft having bought GitHub.com and npm, the company behind the free default NodeJS repo, it’s pretty hard to avoid Microsoft dependencies.

I have to admit that with the world turning decidedly Apple and Android because of mobile, and with Microsoft missing the boat, the days of Microsoft seemed numbered. Surely Linux Desktops would set in and make everything Microsoft did consumer-facing look unnecessary. Netscape posed a similar threat, evolving the browser to replace the desktop too fast, and was so snuffed. It took Google with ChromeOS and Android devices getting larger to make Microsoft jump. Even Steve Ballmer, unenlightened predecessor to Satya Nadella, couldn’t stop it.

You should be able to infuse text with power in a generic spell-casting sort of way.

Anybody should be able to learn and be taught this generic tech system through the popularly available means that the era and conditions allow. In general, it should be some sort of screen and keyboard at the very minimum. Making it a touchscreen would be nice, and stylus support for drawing would be a big plus. I like the stylus as a machine-interface device a lot. Simulating drawing and buttons seems a natural way to interact with machines.

So I prefer Microsoft hardware over most because of the quality of host hardware for the money, the sort of guarantee it would work. I like a baseline level of product quality as nice as the Apple experience, to which I had become accustomed. I loved the early Macbook Airs, circa 2010, and still have them in use as quite capable laptops. I don’t have time to be a hardware hacker or deal with driver issues, so Apple appeals to me.

But ironically enough, despite Apple making the move to Unix back in 2001, one of the disadvantages of Apple is it not being Linux. Linux has the big official software repos, Debian-based, RedHat-based, and seemingly now a distro-independent snap version. There’s contending “App Store” tech for Linux to get rid of distro-dependent installers like apt, yum and pacman (Debian, RedHat and Arch, respectively) and just have one “snap install” version that works across any of them.

And that works. And that’s Docker container-driven tech. And it’s a small miracle in and of itself, but it’s not what I’m interested in. I’m interested in making things easy, not harder. I don’t think we should have to be of some supernatural tech-power like Gilfoyle to be everyday effective. No, we should be able to be of rather average intelligence and ability, and still be able to automate-away.

Gilfoyle Quote

“What do I do? System Architecture. Networking and Security. No one in this house can touch me on that. But does anyone appreciate that? While you were busy minoring in gender studies and singing acapella at Sarah Lawrence, I was getting root access to NSA servers. I was a click away from starting a second Iranian revolution. I prevent cross site scripting, I monitor for DDoS attacks, emergency database rollbacks, and faulty transaction handlings. The internet, heard of it? Transfers half a petabyte of data a minute, do you have any idea how that happens? All of those YouPorn ones and zeros streaming directly to your shitty little smart phone day after day. Every dipshit who shits his pants if he can’t get the new dubstep Skrillex remix in under 12 seconds. It’s not magic, it’s talent and sweat. People like me ensuring your packets get delivered unsniffed. So what do I do? I make sure that one bad config on one key component doesn’t bankrupt the entire fucking company. That’s what the fuck I do.”

―Gilfoyle, “The Cap Table”

That being said, I like to get my spell-incantations down and enjoy my spell-casting platform. Like it or not, one of the best starting points for spell-casting platforms is still Windows. Just go buy a laptop. What are you going to get? Just go buy a modern game. What type of computer is it for? Windows won a long time ago, and we’re living the repercussions. And those repercussions are mostly about drivers and economy of scale for game developers. You need to target a money-spending market that exists for commercial products. Sorry, that’s just the way the world works.

Printer drivers, Bluetooth support, whatever: Windows has it working out of the box, and most Linux distros will leave you device-less. You’ll be lucky if you get your audio working. Leave it to the proprietary folks to sort it all out. That’s what you’re paying for, and who cares if that part is Windows, Linux or Apple. So long as, when it comes to speaking the standard languages of tech, there’s something there to listen to you and carry out your instructions. So let Windows be a host to its Linux subsystem, so long as that subsystem is there and not full of show-stopping gotchas.

As opposed to Gilfoyle or Mr. Robot, who live the glamorous life of the super-hacker, I myself just struggle to get by as a blue-collar mechanic of tech. I try to be creative, but it’s my mechanical skills that come in handy when carrying out the vision of others. Product-visions are a dime a dozen. Everyone can have a good idea. Having the technical skills to carry it out (and in some ways, to think it out properly) is more valuable. Those are practice-over-time craft skills.

And those have all, to a great degree, just fallen into place because of Linux systemd. There’s a story here that needs to be told and perchance become a little more mainstream. It’s not as sexy as web development and NodeJS, but it is sexy in its own way: taking control of that Frankenstein moment of giving power to hardware and making it come alive… but under your instructions… because you listened to me here and now and started learning systemd.

It’s not hacking. It’s Mickey’s Sorcerer’s Apprentice 101. Take control of the hardware as it’s being powered on and has not acquired any pre-existing state, except for some immutable nonvolatile boot process that hands control over to parts you directly control. How is that not Tech 101? How is it that life begins at some such-and-such paid-for (eventually) cloud service provider these days instead of your local and usually quite formidable hardware resources? No reason, but for lack of knowledge that such a path exists.

Let me show you that path. This is just basic literacy these days. What I’m showing you having to do with text-based Linux is your birthright as a human being to the free and open source software licensed specifically so you can freely use it. Jump on this evolutionary bandwagon in which you run code that can keep running, maybe forever with some systems we can build here together.

That’s not enterprise-scaling network-hacking tech I’m talking about here. It’s technique and capability of another sort… a basic sort. One that allows us to automate like Mickey animating the brooms in Sorcerer’s Apprentice, but done correctly. We address the issues systematically and in the appropriate sequence and context to allow folks to jump onboard the spell-casting bandwagon, and do it right.

That path begins with the certain acceptance of cheap, capable hardware, brought to you in part by unfair trade practices: pressuring original equipment manufacturers (OEMs) like Gateway and Dell to include only Microsoft software as the pre-installed operating system on all computers they sold (i.e. no Linux-based laptops), or else face disadvantageous pricing on Microsoft licenses versus other OEM competitors that yielded, ensuring an industry lock on the pre-installed OS.

That gave us the Microsoft-dominant world we have today, like it or not. Get on the mainstream or suffer the consequences. We’re all jumping on the Linux bandwagon together here: you of the pure-Linux dogma religions; those of the Microsoft Windows Subsystem for Linux (WSL) path (like me of late); those on Macintosh, who have a pure Unix Terminal and can at least Homebrew it from a community-driven FOSS repo; those on ChromeOS, who have a Debian Terminal and the same apt system as Ubuntu. There are even non-Mac Unix camps out there based on FreeBSD, derived from ye olde Bell Labs UNIX of Ken Thompson, reworked at Berkeley by Bill Joy, the same guy who wrote vi, the precursor to my beloved vim.

You know where much of my need for hardware-love has been displaced to? Python and vim. The vim/Python combination is awesome, because that’s sort of all you really need… up to a point. If you already really know what you’re doing, you don’t need the experimental “feeling around until it works” mode of operating that JupyterLab provides. The path I advocate takes advantage of the host environment, be it Windows, Mac or pure Linux desktop, by calling for JupyterLab-Desktop as a way to gain your Python legs. Even now, while capable of doing Python development work in vim, I still prefer JupyterLab due to the wacky-awesome browser-based REPL (read, eval, print loop) time-freezing environment that is the IPython project for humans.

Did I lose you? Sorry, things got awesome with JupyterLab-Desktop and the world hardly knows. It’s the on-ramp to the Linux, Python, vim & git future-proofing I advocate, without the steep initial learning curve of vim. Still learn vim. Just learn Python through JupyterLab, and then keep it always-open as a sort of calculator with wonderful copy/paste capabilities. Copy/paste your working Python code into a .py-file. In the future, learn how to do that automatically from a magic fairydust-sprinkled Jupyter Notebook, using the pip-installable nbdev framework. Yuck, frameworks. Okay, I’ll take the nbdev one. The benefits are just too great.
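
That copy/paste step can even be scripted without nbdev, since a .ipynb file is just JSON. Here is a minimal stdlib-only sketch of the idea (not nbdev’s actual mechanism, and the sample notebook contents are made up):

```python
import json

def extract_code(notebook_json):
    """Collect the source of every code cell from a .ipynb file's JSON."""
    nb = json.loads(notebook_json)
    return '\n\n'.join(
        ''.join(cell['source'])          # each cell's source is a list of lines
        for cell in nb['cells']
        if cell['cell_type'] == 'code'   # skip markdown cells
    )

# A tiny stand-in notebook (real .ipynb files use this same 'cells' layout):
nb_json = json.dumps({'cells': [
    {'cell_type': 'markdown', 'source': ['# scratch notes']},
    {'cell_type': 'code', 'source': ["x = 2 + 2\n", "print(x)"]},
]})
print(extract_code(nb_json))
```

Pipe that into a .py file and you have the notebook-to-script workflow in miniature; nbdev does the same job with far more polish (tests, docs, exports).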


Wed Aug 03, 2022

You Get Blinded By The Hardware

Beware. Your thinking is easily predisposed when you take up new information-soaked subject matter. Tech is very much one of those things. It’s rife with religion and dogma and loyalties and resistance to change.

And why not? Once you’ve bought or learned or deployed some tech, you shouldn’t have to worry about whether it will still be in style 10 or 50 years from now to continue to be supported and kept running. Or should you?

Is tech supposed to last forever to get the most out of your investments, or should tech due to its rapidly changing nature always be considered disposable? Maybe some tech is intended for a 2-year run so it doesn’t matter what you used because everything will get scrapped and rebuilt every few years anyway.

What about the employees and volunteers who went into making that platform-dependent, short-lived tech? What about those individuals who got into a field dominated by tools destined to change in 10-year cycles, like Adobe Flash? Or developers for a particular era’s phone or game console? Are such folks supposed to retrain, relearn and adapt?

If entire tool-sets get replaced every 10 years, and it takes about 10 years to master a master-able skill, then you master it just in time to be obsolete. Will seasoned individuals on their way to mastery forever be knocked back to equal footing with the latest crop of ambitious youth hopping on the new bandwagon?

The answer lies in the details of which tech skill sets you took up. It also depends on whether you fall in love with tools and enjoy the process of them fading into the background while your personal powers expand. Some people gravitate toward particular tools and cumulative mastery, while others can take up any tool, use it a little while, move on to the next one later, and be equally happy. These are 2 different vibes, and they respond differently as platforms change.

Did you jump onto the bandwagon of some sexy new hardware and get excessively attached to those attributes of it bound to go away? Did you feel betrayed? Are you slow and weak on the uptake but strong in the long-haul like a D&D wizard? Do all the rules suddenly changing piss you off? And is it even worse because it doesn’t seem to bother anyone else? Maybe it was platform games to 3D. Maybe it was DOS to Windows. It’s always something and it’s always tied to evolving hardware tech trends.

That’s what happened to me with the Amiga computer. I fell in love with it. With her. It was an exquisite hardware platform and easy to fall in love with. The mythology of its creators is beautiful. The ties to the Atari hardware lineage. The development. The coprocessors had girls’ names and made the thing run smooth, far more smooth and slick than anything else of the day. The moderate success. Its doom-sealed long-term fate by the very things that gave it a short-term boost. When it went away, I was crushed.

I am not the only one. Closet and not-so-closet Amiga freaks exist out there in great number, even to this day, around 35 years since its invention, keeping it alive in the shadows and on keychains. UNIX on the other hand is older, at over 50, and still big: very much mainstream and not in the shadows. Together with its progeny Linux, it’s the winning tech platform. It’s the one not driven by hardware. It’s the one with the “virtual machine”-like C programming language, descended from the BCPL programming language, that made the particular hardware it runs on of much less consequence.

Bandwagons are often driven by hardware. Hardware will always improve. The move from vacuum tubes to transistors to integrated circuits continues. Hardware is what gets dropped into our hands and enables us—the smartphone, the laptop, the tablet, the game console. Our personal relationships easily get formed around particular clusters of atoms. We love our hardware years at a time. Interaction with particular hardware platforms defines phases of our lives. The Nintendo years. The Windows laptop years. And so on.

But these are the very things destined to change dramatically generation to generation, driven by Moore’s Law and other pressures. So must any technical craft, such as using a text editor or a particular language, just change because the hardware changes? Should everyone with ambitions aligned to technical mastery of a craft have to be exactly as dynamic as the competition- and profit-driven industry they got involved in?

What if musicians had to keep learning new instruments? What if athletes had to keep learning new games and equipment? It seems ridiculous, but that’s much the situation we have in tech with LAMP yielding to ASP yielding to ROR yielding to NODE. And that’s just Web Development. Are we in tech resigned to eternally have to be switching tools, techniques, habits, and junk any deep mastery we acquired over the years, denied the satisfaction of long term hardware-connected mastery enjoyed by athletes and musicians? Generally, yes.

Eventually hardware breaks down so badly it has to be replaced. But what if that old tech is just no longer available? What about all those processes that were running on the old hardware? Can you just never get them running again? That situation must be recognized beforehand and mitigated by decoupling the processes and data from particular hardware. Don’t fall in love with hardware. Hardware will forever let you down.

Maybe your code should be running in 50 years. Who’s to say not? Do you really even have much control over what code and where your code is running now, today? Do you even have code that you might want to run at this moment?

If not, let’s get you coding!

Either way, let’s predispose your thinking properly regarding particular instances of hardware and how you can make them less important to you than they tend to become. And if that sounds anthropomorphically cruel to you, let me remind you that laptop is not your baby. It’s factory-made and replaceable in every sense, except for the information that resides on it. It’s the information, and how to make it come alive again, that’s important, and not the particular laptop.


Tue Aug 02, 2022

Send Email With File Attachment From Python

I did the work of expanding the sendemail.py program to:

I didn’t show a video of this because it was basically just a bunch of tedious code wrangling, but now I’m up to the point of including a zipped file attachment containing multiple items.

I’ve done the setup work for that in Jupyter and am ready to move it into the LXD Linux container.

I need to expand the sendmail.py file a little bit in order to attach the file. I can do that for the video.

Get the bit of code that’s not in the Jupyter Notebook yet and put it here.

from email import encoders
from email.mime.base import MIMEBase

filename = 'cats.zip'
mimecats = MIMEBase('application', 'octet-stream')
with open(filename, 'rb') as zfh:
    mimecats.set_payload(zfh.read())
encoders.encode_base64(mimecats)  # binary payloads must be base64-encoded
mimecats.add_header('Content-Disposition', f'attachment; filename={filename}')
msgdict.attach(mimecats)
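
For completeness, the cats.zip archive itself can be produced in the same script with the stdlib zipfile module. A small sketch; the file names here are hypothetical stand-ins for the real “cats”:

```python
import zipfile
from pathlib import Path

def zip_items(zip_path, items):
    """Bundle several files into one compressed archive for attaching."""
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for item in items:
            # arcname keeps the archive flat (no parent directories)
            zf.write(item, arcname=Path(item).name)

# Hypothetical example files standing in for the real attachments:
Path('cat1.txt').write_text('meow')
Path('cat2.txt').write_text('purr')
zip_items('cats.zip', ['cat1.txt', 'cat2.txt'])
print(zipfile.ZipFile('cats.zip').namelist())  # ['cat1.txt', 'cat2.txt']
```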

I completed this project and here is the code:

import shlex
import schedule
from time import sleep
from os import environ
from sys import stdout
from datetime import datetime
from subprocess import Popen, PIPE


pulse_count = 0

def run(command, cwd=None):
    process = Popen(
        shlex.split(command),
        stdout=PIPE,
        cwd=cwd,
        bufsize=1,
        universal_newlines=True,
        shell=False,
    )
    for line in process.stdout:
        line = line.rstrip()
        print(line)
        stdout.flush()


def hello():
    print("Hello World")


def pulse():
    global pulse_count
    pulse_count += 1
    anow = f"{pulse_count} - {datetime.now()}"
    with open('/tmp/scheduler.txt', 'a') as fh:
        print(f"{anow} is written to /tmp/scheduler.txt")
        fh.write((anow) + '\n')


def sendmail():
    print("Sending email")
    pyx = "/home/ubuntu/py310/bin/python3.10"
    cwd = "/home/ubuntu/github/scheduler/"
    cmd = f"{pyx} {cwd}sendcats.py"
    run(cmd, cwd=cwd)


print("The pulse service has started.")
schedule.every(10).seconds.do(hello)
schedule.every(5).seconds.do(pulse)
schedule.every(1).minute.do(sendmail)

while True:
    schedule.run_pending()
    sleep(1)
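
As a quick sanity check that the `run()` helper really streams a child’s stdout, here is a self-contained version that uses the current Python interpreter as a guaranteed-present child command (so the example works anywhere Python does):

```python
import shlex
from sys import executable, stdout
from subprocess import Popen, PIPE

def run(command, cwd=None):
    # Same streaming pattern as the scheduler's run(): print child stdout live.
    process = Popen(shlex.split(command), stdout=PIPE, cwd=cwd,
                    bufsize=1, universal_newlines=True, shell=False)
    for line in process.stdout:
        print(line.rstrip())
        stdout.flush()
    process.wait()

# Spawn a trivial child process and watch its output stream through:
run(f'{executable} -c "print(6 * 7)"')  # prints 42
```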

Tue Aug 02, 2022

Sending an HTML Email with Embedded Image From Python

What a great success that last round of work was, getting a Linux daemon (service) to send emails under a Python scheduler. Now it’s time to send an attachment.

That’s necessary for your next round of “day job” work, and you have to get the pattern into this current “Linux daemon pulse” project as a sort of copy/paste template location. I imagine this project could work as a template for about a zillion different projects that people want/need to do around the world. It’s kind of funny such a great approach to solving these problems is found in my humble little channel here. I don’t find this stuff anywhere else. Am I the only one doing it and blogging and vlogging about it? Maybe.

Hmmm. But what about HTML email? That’s going to be necessary too, but which should I do first, attachments or HTML email? Start with HTML email and see how easy it is to add on the attachments. Get the stuff ready you’ll need.

The sendmail program is currently this:

import smtplib

msg = '''Dear You,

Let's make this message multi-line, but still short and sweet so that
the entire example can fit on screen.

Mike Levin'''

with open('mail_from.txt') as fh:
    email, paswd = [x.strip() for x in fh.readlines()]

with open('mail_to.txt') as fh:
    mail_to = [x.strip() for x in fh.readlines()]

server = smtplib.SMTP('smtp.gmail.com', 587)
server.ehlo()
server.starttls()
server.login(email, paswd)

BODY = '\r\n'.join([f'To: {", ".join(mail_to)}',
                    'From: %s' % email,
                    'Subject: testing',
                    '\r\n', msg])

try:
    server.sendmail(email, mail_to, BODY)
    print ('email sent')
except:
    print ('error sending mail')

server.quit()

I love the simplicity of the thing. I almost fear introducing HTML email and file attachments, throwing off the beauty. But the modifications to get the supporting libraries changes it like this:

import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

# Create the plain-text email
msg_text = '''Dear You,

This is a plain text email where the line returns are preserved
like this.

Mike Levin'''

# Create the HTML email
msg_html = '''<html><head></head><body><h1>Dear You,</h1>
<p>This is an HTML email where we don't have to worry about line
returns because html will make its own line wrap decisions.</p>
<h3>Mike Levin</h3></body></html>'''

with open('mail_from.txt') as fh:
    email, paswd = [x.strip() for x in fh.readlines()]

with open('mail_to.txt') as fh:
    mail_to = [x.strip() for x in fh.readlines()]

server = smtplib.SMTP('smtp.gmail.com', 587)
server.ehlo()
server.starttls()
server.login(email, paswd)

msgdict = MIMEMultipart()
msgdict.preamble = 'This is a multi-part message in MIME format.'
msgdict['From'] = email
msgdict['To'] = ', '.join([x for x in mail_to])
msgdict['Subject'] = "HTML Test Email"

# Create plain text and HTML alternatives
msg_alts = MIMEMultipart('alternative')
msgdict.attach(msg_alts)
msg_alts.attach(MIMEText(msg_text))
msg_alts.attach(MIMEText(msg_html, 'html'))

try:
    server.sendmail(email, mail_to, msgdict.as_string())
    print ('email sent')
except:
    print ('error sending mail')

server.quit()

So far so good. But what’s the point of an HTML email if you don’t embed images? So let’s modify the above to include an image. Assuming the existence of the image file in the code directory, here’s the sample code to do an HTML-style embedding of an image into the email:

import smtplib
from email.mime.text import MIMEText
from email.mime.image import MIMEImage
from email.mime.multipart import MIMEMultipart

# Create the plain-text email
msg_text = '''Dear You,

This is a plain text email where the line returns are preserved
like this.

Mike Levin'''

# Create the HTML email
msg_html = '''<html><head></head><body><h1>Dear You,</h1>
<p>This is an HTML email where we don't have to worry about line
returns because html will make its own line wrap decisions.</p>
<h3>Mike Levin</h3><img src="cid:image" /></body></html>'''

with open('mail_from.txt') as fh:
    email, paswd = [x.strip() for x in fh.readlines()]

with open('mail_to.txt') as fh:
    mail_to = [x.strip() for x in fh.readlines()]

server = smtplib.SMTP('smtp.gmail.com', 587)
server.ehlo()
server.starttls()
server.login(email, paswd)

msgdict = MIMEMultipart()
msgdict.preamble = 'This is a multi-part message in MIME format.'
msgdict['From'] = email
msgdict['To'] = ', '.join([x for x in mail_to])
msgdict['Subject'] = "HTML Test Email"

# Create plain text and HTML alternatives
msg_alts = MIMEMultipart('alternative')
msgdict.attach(msg_alts)
msg_alts.attach(MIMEText(msg_text))
msg_alts.attach(MIMEText(msg_html, 'html'))

with open("mike-levin-logo.png", 'rb') as fh:
    msg_img = MIMEImage(fh.read())
msg_img.add_header('Content-Disposition', 'inline', filename='Mike Levin')
msg_img.add_header('Content-ID', '<image>')  # must match cid:image in msg_html
msgdict.attach(msg_img)

try:
    server.sendmail(email, mail_to, msgdict.as_string())
    print ('email sent')
except:
    print ('error sending mail')

server.quit()

Phwew! Almost there. The final step is to send a zipped attachment. I think that may be a little too much for this post. I’ll cut it here and make a zipped attachment into the next post.

Here it is in final form. This code probably better belongs on the next blog post but I put it here for continuity.

import shlex
import schedule
from time import sleep
from os import environ
from sys import stdout
from datetime import datetime
from subprocess import Popen, PIPE


pulse_count = 0

def run(command, cwd=None):
    process = Popen(
        shlex.split(command),
        stdout=PIPE,
        cwd=cwd,
        bufsize=1,
        universal_newlines=True,
        shell=False,
    )
    for line in process.stdout:
        line = line.rstrip()
        print(line)
        stdout.flush()


def hello():
    print("Hello World")


def pulse():
    global pulse_count
    pulse_count += 1
    anow = f"{pulse_count} - {datetime.now()}"
    with open('/tmp/scheduler.txt', 'a') as fh:
        print(f"{anow} is written to /tmp/scheduler.txt")
        fh.write((anow) + '\n')


def sendmail():
    print("Sending email")
    pyx = "/home/ubuntu/py310/bin/python3.10"
    cwd = "/home/ubuntu/github/scheduler/"
    cmd = f"{pyx} {cwd}sendcats.py"
    run(cmd, cwd=cwd)


print("The pulse service has started.")
schedule.every(10).seconds.do(hello)
schedule.every(5).seconds.do(pulse)
schedule.every(1).minute.do(sendmail)

while True:
    schedule.run_pending()
    sleep(1)

Tue Aug 02, 2022

From Sending Email in Jupyter to Sending Email in Linux Service

Ever need a running process to reach out and let you know how things are going? Per Zawinski’s Law, the time has come, the Walrus said, to make our Python Linux daemon service able to send emails using Python’s standard SMTP library and Gmail’s servers. So long as we send from a Gmail account, we have reliable and sufficiently permissive infrastructure to tap. Jumping into WebDev already? Take a moment to learn Linux basics with me, especially the world-changing Linux service management system called systemd.


Mon Aug 01, 2022

Write a Linux Scheduler Service in Python

These are the notes for a more involved video soon to come that takes this scheduling stuff to the next step with scheduling external files and then data pipelining.

Have you heard of Data Pipelining?

I’m not talking about operating system data pipes that use those vertical bars (|) in Unix, Linux and AmigaDOS to funnel data out of one program into another, although there are similarities. Rather, the data pipeline process is adopting workflow conventions so that everyone in an organization getting data from point A to point B is doing it properly.

Honestly, it’s the sort of stuff that stinks of extra moving parts and technical liability to me unless it’s really, really called for. Probably the most popular data pipeline system is Apache Airflow. In the Python world, Luigi is pretty popular. I would rather not touch either with a 100-foot pole. I prefer a lightweight approach. Luckily, there are lightweight data pipelining tools, and one of them is Huey, which you saw me install just after creating a new Linux container and installing Python 3.10 in the last video.
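
Stripped of frameworks, a “pipeline” is just stages chained so each one’s output feeds the next. A minimal stdlib-only sketch of that idea (the stage functions are hypothetical toys, not Huey or Airflow API):

```python
from functools import reduce

def pipeline(*steps):
    """Compose steps left to right: pipeline(f, g)(x) == g(f(x))."""
    return lambda data: reduce(lambda acc, step: step(acc), steps, data)

# Toy stages standing in for real extract/transform/load code:
extract = lambda path: ['1', '2', '3']        # pretend to read point A
transform = lambda rows: [int(r) * 10 for r in rows]
load = lambda rows: sum(rows)                 # pretend to write point B

result = pipeline(extract, transform, load)('input.csv')
print(result)  # 60
```

Everything a real pipelining framework adds on top of this (retries, scheduling, task queues, dependency graphs) is why they have so many moving parts.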

In this video I’m going to make a precursor to the data pipelining video based on a more basic concept of scheduling.

I like to:

pip install schedule

This gives me “scheduling for humans” in Python. You can read about it at [https://pypi.org/project/schedule/](https://pypi.org/project/schedule/). Their example is this:

import schedule
import time

def job():
    print("I'm working...")

schedule.every(10).seconds.do(job)
schedule.every(10).minutes.do(job)
schedule.every().hour.do(job)
schedule.every().day.at("10:30").do(job)
schedule.every(5).to(10).minutes.do(job)
schedule.every().monday.do(job)
schedule.every().wednesday.at("13:15").do(job)
schedule.every().minute.at(":17").do(job)

while True:
    schedule.run_pending()
    time.sleep(1)

I’ve used this sort of scheduling to great effect for many years, and when data pipelining came along, I resisted because this gave me most of what I wanted with the least amount of effort. Scheduled items in the above scenario are defined Python functions.

End of story! But this does not provide a lot of flexibility in terms of managing files and keeping schedule-able tasks as separate folders (git/GitHub repos). Processes that need to get scheduled, in my way of thinking, usually turn out to be their own scripts, usually in their own repos (folder or directory). There’s a certain “tied together-ness” implied here.

All scheduled items would have to be in that file or importable as Python modules.

import shlex
from os import environ
from sys import stdout
from datetime import datetime
from subprocess import Popen, PIPE


environ["PYTHONUNBUFFERED"] = "1"


# Through standard Python
def pulse():
    anow = f"{datetime.now()}"
    print(anow)
    with open("/tmp/pulse.txt", 'a') as fh:
        fh.write(anow + '\n')


def run(command, cwd=None):
    process = Popen(
        shlex.split(command),
        stdout=PIPE,
        cwd=cwd,
        bufsize=1,
        universal_newlines=True,
        shell=False,
    )
    for line in process.stdout:
        line = line.rstrip()
        print(line)
        # Put logging here
        stdout.flush()


def onepulse():
    pyx = "/home/ubuntu/py310/bin/python3.10"
    cwd = "/home/ubuntu/github/pulse/"
    cmd = f"{pyx} {cwd}onepulse.py"
    run(cmd, cwd=cwd)


Mon Aug 01, 2022

Build Linux container on Windows and Install Python 3.10

It’s time to start using all the recent LXD knowledge and know-how I built. No single container should be that special. You should be able to slam them out like it’s no big deal, creating entirely new development environments and even server production environments like it’s no big deal. And it’s not! Behold:


Sun Jul 31, 2022

Share Folder Between Windows, WSL Linux, Container and Home Cloud

Share Folder Windows Linux Wsl Container Cloud

For my next trick, I really have to nail down and document how I get that ~/data directory to be in common across:

In situations like this when you want to share files freely between systems there are lots of choices. Increasingly copying stuff up to and down from Github is becoming the easiest choice, but that’s stupid because of all the extra delay and overhead. We have the perfect interface for moving files around, and that’s just copying files between drives. We should continue to enjoy that simple interface even with all these hosts, containers, cloud servers and whatnot.

In other words, all 4 of these locations that I currently have in play will have a ~/data location (except for Windows that uses letter-drives).

There’s a broader issue to address here. To make a ~/data folder accessible “everywhere”, its primary location shouldn’t really be on your laptop, because when your laptop goes offline, anything else using it loses contact. So it’s time to talk about the QNAP NAS a bit more. It’s a good setup for letting the nice people know they need a 24x7 place to run code that isn’t their laptop and isn’t the cloud. That’s such a huge rite of passage with tech capability. Some would call it a home server. Whatever. It’s just a place to run your containers that isn’t your laptop and isn’t the cloud. Call it a home server if you will.

Show the diagram from yesterday.

Let’s go through them one at a time.

Add shared folder on the main windows system (on laptop)

Add shared folder on a WSL2 instance of Linux (on laptop)

Add shared folder on LXD Linux Container hosted under WSL2 (on laptop)

lxc config device add Munchkin data disk source=/mnt/data path=/home/ubuntu/data

And after that command is done from the Linux host, the following command has to be done from inside a container Terminal session.

ln -s /mnt/data /home/ubuntu/data

Add shared folder LXD Container hosted on physically separate machine (qnap nas)


Sun Jul 31, 2022

Do You Still Really Need Windows? Switching To Linux in 2022

Livecast from last night moved to this morning because my kid showed interest in doing stuff together, yay! Back to the livestreaming… what a great point I left off at. Stay humble getting livestreaming going again. Start simply with .screenrc and see where that goes.

Today we talk about “places to run code” and all the stupid labels of what our setup (on our laptop) really is.

When you install WSL, you get the unified Windows Terminal installed, I believe automatically, when you run wsl --install. This functionally eliminates the need for 3rd-party terminals. The “Unix-like Terminal” is now just a standard part of Windows. This is a huge new reality. This shows us probably:

What I am not (who I am not targeting):

What I am (who I am targeting):

What?

Yes. Since Unix-like OSes won, tech like this is general literacy. Just as knowing English means knowing how to read and write, knowing *nix means knowing the generic plumbing of tech, for much broader reasons (a better life) than becoming a classic developer, sysadmin or devops engineer.

Everyone

Who shouldn’t know “generic” tech? There is generic tech now that the proprietary stuff is on its way out.

You’re going to hear a lot about Docker. Hype.

Why not “just use docker” for this? And why not Docker from Windows for even more simplicity?


Sat Jul 30, 2022

You Won’t Be Using Windows in 5 Years

If you really want to blow up here on YouTube (speaking to myself) without compromising yourself as one of those bombastic hyperbole algorithm chasers, you need to do some better communicating about what’s coming. The “elevator pitch” such as it were.

FORGET VMS


Sat Jul 30, 2022

Fixing Broken Jekyll Rouge Code Color Coding in Github Pages Theme

A few months ago I severed the connection between this website (MikeLev.in) and the GitHub Pages Jekyll default Hacker theme that I started out with. While the Hacker theme was nice, there were enough little customizations I made that it no longer made sense to tie the two together. So I went in brute-force and undid all the SCSS (Syntactically Awesome Style Sheets) magic that’s built into the default themes like Architect, Cayman, Minima and the others. I was never much of a Web Development person, and it just seems like unnecessary complexity if your CSS is simple.

However when I replaced the styles.scss with just a plain old .css file, thus removing all the chained-dependencies, I somehow broke the imports that color-code programming code like Python, JavaScript and BASH files. I know enough about HTML/CSS to know that something isn’t being loaded, probably a .css file through an include, so I went on a hunt and discovered that during the configuration process on the server-side with Ruby, there’s a step:

rougify style github > assets/css/syntax.css

…and when you do this it outputs the file into that location. So I could do away with the whole server-side rendering bit if I just found one of those files on some other site and grabbed it. I found one [here](https://github.com/danielsaidi/website_danielsaidi/blob/master/assets/css/syntax.css) and just put it in place on my site. Then I added the line that loads it into my:

~/github/MikeLev.in/_layouts/default.html

…file. It needs just one line edited in:

<link rel="stylesheet" href="/assets/css/syntax.css" />

…and that’s pretty much it. The fancy color-coding came back, whether I used GitHub Pages code-fencing style or the embedded Ruby code style. I did have to make one additional entry in my normal style.css file. I may have to modify it if I find too much stuff indented:

.highlight, .highlight .w {
    margin-left: 1em;
}

Sat Jul 30, 2022

Knowing what GNU screen you’re on with .screenrc

Now that I’m introducing the nice people to GNU screen, make it a little more usable with a .screenrc. This file should be put into your ~/ (home) directory named .screenrc. This is just like .vimrc in that it’s an invisible text file serving as a configuration file in your *nix home folder.

# ~/.screenrc
#
# Guy Who Tweaks This File and Adds Foul Language Comments:
#  -- Miles Z. Sterrett <miles.sterrett@gmail.com>
#
# Original Author:         Aaron Schaefer <aaron@elasticdog.com>
# Created:        Sat 05 Aug 2006 06:38:47 PM EDT
#
# Settings used to initialize screen sessions
term screen-256color
# Change default escape sequence from C-a to a backslash
#  escape ``                            # default ^Aa

# Do not display the copyright page
  startup_message off                  # default: on

# Change the number of scrollback lines
  defscrollback 10000                   # default: 100

# Ensure the default shell is the same as the $SHELL environment variable
  shell -$SHELL

# Make navigating between regions easier
  bind s split
  bind j focus down
  bind k focus up

# Make resizing regions easier
  bind = resize =
  bind + resize +1
  bind - resize -1

# Configure status bar at the bottom of the terminal
  hardstatus alwayslastline
  hardstatus string "%{= kb}[ %=%{w}%?%-Lw%?%{C}(%{W}%n*%f %t%?(%u)%?%{C})%{w}%?%+Lw%?%?%= %{b}][%{C} %Y.%m.%d %{W}%0c %{b}]"

# Turn off the visual bell
  vbell off

# End of file

Fri Jul 29, 2022

Sending Emails With Python Through SMTP

These last few videos were quite wondrous. I doubt they’ll get any traction on YouTube because I have not made it fun or palatable enough to absorb the importance, but I’ll keep forging ahead, leaving a very clear path for folks to follow, which I’ll clean up and make more appealing later. But the raw work keeps getting pushed out.

Let’s get this thing emailing them! Let’s get some barebones stuff in place.

import smtplib

msg = '''Dear intrepid explorer,

We can write it across multiple lines, however because we're using
smtp protocol which is archaic and IBM-PC-biased, we need to include
carriage returns and linefeeds between lines. So after we compose
our message, we can do that with a Python list comprehension.

Do you comprehend?

Sincerely,

Mike Levin'''

msg = '\r\n'.join([line for line in msg.split('\n')])

with open('mail_from.txt') as fh:
    email, paswd = [x.strip() for x in fh.readlines()]

with open('mail_to.txt') as fh:
    mail_to = [x.strip() for x in fh.readlines()]

server = smtplib.SMTP('smtp.gmail.com', 587)
server.ehlo()
server.starttls()
server.login(email, paswd)

BODY = '\r\n'.join([f'To: {", ".join(mail_to)}',
                    'From: %s' % email,
                    'Subject: testing',
                    '\r\n', msg])

try:
    server.sendmail(email, mail_to, BODY)
    print('email sent')
except smtplib.SMTPException:
    print('error sending mail')

server.quit()
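The CRLF conversion the message body describes can be isolated into a tiny sketch (the sample text here is made up):

```python
# SMTP wants CRLF line endings; normalize a \n-separated string.
msg = "Dear reader,\nDo you comprehend?"
crlf = '\r\n'.join(msg.split('\n'))
print(repr(crlf))  # → 'Dear reader,\r\nDo you comprehend?'
```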

Fri Jul 29, 2022

Using GNU Screen to Monitor Linux System Daemon Service

This is a pretty epic blog entry and video in which I get Linux services running under GNU Screen under an LXD Container under Ubuntu 18.04 Linux under the Windows Subsystem for Linux.

What should my pitch be to the world? The iron is hot! Moving now puts you about 5 years ahead of the coming WebDev implosion. NodeJS, ReactJS and the like will only get you so far in tech. Being a Web Developer is a lot more pigeonholed than people think. It’s just that the Web is still the big, sexy thing right now, and it’s quite suitable for human interface design. But not everything’s a Website or an App. Not everything requires that deep dive into UI-stuff. That’s the silent majority of tech. That’s the world of everything-but Web Tech. And that’s the world I’m inviting you into here.

How do we begin?

Well, as so many things do, it starts with Windows. You probably own a Windows laptop of some sort. If it’s fairly modern, it’s running Windows 10 or 11. And if so, you’re one step from Linux if on Windows 11, and 4 steps if on 10.

This seems like a super-nerdy thing, but generic Linux tech is a door opener to so many things in life.

Generic Linux tech means getting most Microsoft and other profit-driven proprietary vendor tech out of the picture.

Stay on Windows for a few years, but condition yourself to get off of it by mastering the text-based Linux terminal.

Even if everything gets taken away from you, what remains in your head will be enough to just sit down and reboot your life.

The particular power you’ll acquire is the ability to generically automate things in the world of information tech. That’s valuable.

You’ll be able to keep doing stuff while you walk away from your computer, multiplying your power in this world.

What precisely did we do in the last few videos? We made Linux services on Linux containers residing on standard Windows 10 (or Windows 11) systems using little more than what Microsoft provided by default.

This makes available to you the bizarro alternative world of tech that has come to dominate servers and all consumer electronics under the covers. Even Microsoft itself is coming around, which is what makes this timing so important.

You’re about 5 years ahead of the world at large recognizing this and recalibrating around the new reality. Tech trends will change. Which skills are valuable in the workplace will change. The very concept of modern literacy will change.

Following this path now gives you access to the “generic tech” of Linux, Python, vim & git. You can start working on projects that make use of them right now today. You don’t have to buy anything. You don’t have to make yourself dependent on any one company.

You just start at the beginning with some simple magic incantations, and have many new life path-choices open up to you. This is not hype. Just follow me.

I present to you a more practical daily approach to mastering generic tech than the currently popular WebDev path provides.

So the trick now is to turn your practice into your day to day work. We break the sort of work you need to tackle into 2 phases of startup steps to get you to the exploratory phase I’m at now.

Where I’m at now are just generic tech skills everyone should have, way more basic than all that fad-driven Web Development stuff that dominates the media. This is simply what happens when you turn on a piece of hardware.

Let’s get the necessary pieces down and make a video.

This is what goes in ~/github/pulse/pulse.py

from time import sleep
from datetime import datetime

print("Hello World")

while True:
    anow = f"{datetime.now()}"
    print(anow)
    with open("/tmp/pulse.txt", 'a') as fh:
        fh.write(anow + '\n')
    sleep(5)

This is what is currently in /etc/systemd/system/pulse.service

[Unit]
Description=Run Python script to handle scheduling

[Service]
Type=simple
Restart=always
RestartSec=5
User=ubuntu
Group=ubuntu
WorkingDirectory=/home/ubuntu/github/pulse/
ExecStart=/home/ubuntu/py310/bin/python3.10 /home/ubuntu/github/pulse/pulse.py
StandardOutput=syslog
StandardError=syslog

[Install]
WantedBy=multi-user.target

This is what we need to put into /etc/systemd/system/pulse.service to make it run under GNU screen:

[Unit]
Description=Run Python script to handle scheduling

[Service]
Type=forking
Restart=always
RestartSec=5
User=ubuntu
Group=ubuntu
WorkingDirectory=/home/ubuntu/github/pulse/
ExecStart=/usr/bin/screen -dmS pulse /home/ubuntu/py310/bin/python3.10 /home/ubuntu/github/pulse/pulse.py
StandardOutput=syslog
StandardError=syslog

[Install]
WantedBy=multi-user.target

This is how we install the screen program:

sudo apt install screen

And after we make the changes to the file we have to actually reload the daemon (not simply stop and start the service).

sudo systemctl daemon-reload

Wow, it’s weird that I’m not the only one talking about this stuff:

I learned that to install Linux on your Windows machine for the first time, you can see what Linux distros are available with:

wsl --list --online

Ha ha ha, yeah, so it’s all about knowing the magical incantations. This is an important one. I will have to update the magic-spell-for-Linux. Instead of what I recommend so much, which (today) winds you up with Ubuntu 20.04, I want you to end up with 18.04 to make the lxd init wizard go smoothly. Oh, showing all this should be one of the videos I take the time to edit. The iron is hot! But nonsense. Just do it in your style and keep focusing on your day-job and your kid. Don’t take excessively deep rabbit-hole plunges. Okay, so when I run that command from PowerShell, I get this:

PS C:\WINDOWS\system32> wsl --list --online
The following is a list of valid distributions that can be installed.
Install using 'wsl --install -d <Distro>'.

NAME            FRIENDLY NAME
Ubuntu          Ubuntu
Debian          Debian GNU/Linux
kali-linux      Kali Linux Rolling
openSUSE-42     openSUSE Leap 42
SLES-12         SUSE Linux Enterprise Server v12
Ubuntu-16.04    Ubuntu 16.04 LTS
Ubuntu-18.04    Ubuntu 18.04 LTS
Ubuntu-20.04    Ubuntu 20.04 LTS
PS C:\WINDOWS\system32>

Okay, so the command to get 18.04 installed must be:

wsl --install -d "Ubuntu-18.04"
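If you ever want to script against that listing (say, to check whether a distro name is available before installing), the table parses easily. A sketch, with the sample text trimmed from the PowerShell output above:

```python
# First column of each row after the header is the distro NAME.
listing = """NAME            FRIENDLY NAME
Ubuntu          Ubuntu
Debian          Debian GNU/Linux
Ubuntu-18.04    Ubuntu 18.04 LTS"""

names = [line.split()[0] for line in listing.splitlines()[1:]]
print(names)  # → ['Ubuntu', 'Debian', 'Ubuntu-18.04']
```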

Wow, this streamlines my instructions to people. It will need this context:

There are various rough edges today that will be smoothed out. We’re walking through a portal to another world, the world of Linux.

But we do not want to cross over into just one particular instance of Linux installed side-by-side with your Windows. No, we want to cross over into the world of “owning” generic Linux instances that you can treat like folders in a directory.

Now this might sound like virtual machines to those of you who have been around the block a few times. Rather, it’s more like using all the infrastructure that’s been built to support virtual machines, but for something else so efficient that using it just feels like switching directories, not running virtual machines.

Oh, installing net-tools gets you ifconfig if you don’t have it. That explains why it’s not always there. I like that YouTuber Dave’s Garage.

sudo apt install net-tools

Wed Jul 27, 2022

Setting Up LXD on WSL2 with systemd enabled Ubuntu 18.04

What a wonderful starting point. Make sure I document for myself how to instantiate a new container and make it feel like home:

This is working off of LXD installed with apt (not snap) on a WSL2 Ubuntu 18.04 instance. WSL installs 20.04 by default, so to make this environment you have to install Ubuntu 18.04 from the Microsoft Store. Unfortunately, systemd must be running in order for lxd to install, but systemd is not running on WSL2 Linux. You can’t even just turn it on. There’s an external dependency here. I’m confident it will go away in time, as Microsoft recently hired Lennart Poettering, who created systemd. But until then you have to use [distrod](https://github.com/nullpo-head/wsl-distrod). On an already-existing Linux host under WSL2, this turns on systemd:

curl -L -O "https://raw.githubusercontent.com/nullpo-head/wsl-distrod/main/install.sh"
chmod +x install.sh
sudo ./install.sh install

Okay, once this is done, LXD should install cleanly:

sudo apt install lxd
sudo lxd init

That init command is the way to set up LXD for the first time. It steps you through a wizard with recommended defaults. The reason to use Ubuntu 18.04 with LXD installed from apt (instead of from snap under Ubuntu 20.04) is so that you can just hit [Enter] through the entire wizard. Once that’s done you can use the “lxc” commands. Almost all further use of “lxd” as a command is over. All the creation and manipulation of container instances is accomplished through the lxc command.

lxc ls 

You can also use “lxc list”. Either way, this is the beginning of your new container habits. If there’s stuff there then the general pattern is:

lxc stop ContainerName
lxc start ContainerName

If there’s no container there and you want to create a new one, the concept is “launch”. Ugh, can you believe it? Yet more mental gymnastics of word/meaning remapping for the container world.

lxc launch images:ubuntu/18.04 GlookingLass

This creates a container instance from an image. The image itself is never saved locally as any sort of master image. It just becomes an immediately runnable instance through the lxc launch statement. But there are a few things that ought to be done.

lxc config device add GlookingLass github disk source=/mnt/c/Users/mikle/github path=/home/ubuntu/github
lxc config device add GlookingLass dotssh disk source=/mnt/c/Users/mikle/.ssh/ path=/home/ubuntu/.ssh/
lxc config device add GlookingLass data disk source=/mnt/data path=/home/ubuntu/data

Those are the three locations from the WSL2 Linux “host” that get added through the lxc command. It’s also worth noting that on the host, this was done to make the data location available:

mkdir /mnt/data
sudo mount -t drvfs '\\EchidNAS\data' /mnt/data

NOTE: I learned on a following day that this line had to be added to my /etc/fstab to make it persistent.

sudo vim /etc/fstab

Added this line:

\\EchidNAS\data /mnt/data drvfs defaults 0 0

And so if this container is moved around, how these locations get established may vary. But so long as the containers are on my Windows laptop, these commands work. As an added bonus, I have this in the .bash_profile of the WSL2 Linux host so that containers always have access to the WSL2 host’s nameserver, in case I want Linux containers to use graphical apps through X-Server.

export DISPLAY=$(cat /etc/resolv.conf | grep nameserver | awk '{print $2}'):0
echo "export DISPLAY=${DISPLAY}" > ~/data/display.sh

And in the .bash_profile of the container:

source ~/py310/bin/activate
source ~/data/display.sh
. ~/.bash_prompt
cd ~/github

Oh, that reminds me. To get Python 3.10 installed and a virtualenv created:

sudo apt update
sudo apt install git
sudo apt install software-properties-common
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt install python3.10
sudo apt install python3.10-venv
python3.10 -m venv py310

This is close to the general container-building pattern I’ll be using maybe forever forward now. With this information in-hand I should go ahead and do experiments knowing I can just do it all again. Make the new patterns in my head. And then make a video where my actions are more practiced for the world. 1, 2, 3… 1?

At this time I will git clone my vim repo and from there, copy these files to home ~/

Oh! One more thing. Let’s get my current favorite gratuitous Linux prompt highlighting working. That’s the .bash_prompt file that’s being referred to in the .bash_profile file. Basically just grab it from [bash_prompt.sh](https://github.com/bpeebles/bash_prompt.sh) and save it as ~/.bash_prompt. Refer to it in .bash_profile as instructed and you’ll have a nifty prompt that shows your:

It’s a good time to remind myself how to see if systemd is running. This is useful on both the WSL2 Linux host machine to see whether you need to install distrod, or on the containers to assure yourself systemd is running for the next step. That’s the “ps” command.

Okay, so if you want to know if systemd is running, what you’re looking for is the name of the command running on pid 1. That’s “process ID 1”. In both Unix and Linux, everything running on your system is given an ID in the order it runs. By definition, the “init” task is the very first thing to run. The old name of the init task was init. So this is how we look at it:

ps 1

The “ps” command, which stands for “process status” in Unix/Linux, lists all the running processes. The number 1 tells it to only list the data for the process with ID 1. That’s your init process. You’ll see in the COMMAND column there’s a lot of gobbledygook for the path and such. We can limit the output to just the command column, which makes it easier to read.

ps -o comm 1

This is good but it still shows the column header which is confusing and would mess up automations if you’re writing something that’s just supposed to pull back the name. And so we can suppress the headers:

ps --no-headers -o comm 1

And there we have it. That’s what’s necessary to see if we’re running systemd or not. I’ve got to get that into my head so I don’t struggle every time. The concept is that it’s just looking at ID 1 with the ps command. The rest is formatting.
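If you’re scripting the check rather than eyeballing it, the same probe works from Python too. A sketch assuming a Linux box; the /proc fallback covers systems without ps installed:

```python
import subprocess


def init_name():
    # Return the command name of PID 1 (e.g. 'systemd' or 'init'),
    # mirroring: ps --no-headers -o comm 1
    try:
        return subprocess.check_output(
            ["ps", "--no-headers", "-o", "comm", "1"], text=True).strip()
    except FileNotFoundError:
        # /proc/1/comm holds the same answer on any Linux
        with open("/proc/1/comm") as fh:
            return fh.read().strip()


print(init_name())
```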

Now I feel comfortable experimenting with my first container. Initially, it feels precious. But because you’re in script-land, everything is easily automatable so long as you keep good notes. And I am.

Everything I’ve done so far has really been for this next step.

Just do it without getting hung up on a video. Do it with a video after you’ve done it to a container or two. Hmmm. Snapshot a container before you do the work. What are these moves?

Get a service running:


Tue Jul 26, 2022

Windows 11 is more like Windows Il (for It’s Linux)

category: linux

I’m not going to make the argument that it’s time for you to learn Linux. If you don’t know that by now, it’s your bad. However, I will clarify that this does not mean learning a Linux desktop such as Ubuntu or KDE Plasma. The time of the Linux Desktop is not necessarily upon us, nor would it matter.

No folks, I’m talking about those boring old text-based Terminal windows where you better be seeing ~$ or some variation. If you’re seeing C:> or some variation of that, you’re in trouble. Out with the DOS; in with the *nix!

From the time Steve Jobs went on a rampage against the command-line to promote his point-and-click Macintosh computer, the world has been conditioned to think there’s something inherently bad about that text-based environment. Well, good. It’ll turn a lot of people away, and those who can overcome that fear will have an advantage. So the first lesson of learning Linux is to overcome the fear of the terminal.

There are other ways to interoperate with a computer than all the point-and-click baby-talk Steve Jobs tricked us into abiding by for some forty years. Ah yes, about 40 years since the 1984 introduction of the Mac and the first (laughable) versions of Windows a year later in 1985. They pulled one over on us, didn’t they? Computers are now a predominantly consumer and fashion-driven industry, aren’t they? Their battle over control is a battle over your soul, and you can feel it.

Are you a Mac person or a PC person doesn’t compare to are you an Android person or an iPhone person. And the fact that it’s proprietary vs. proprietary in the former case, and again proprietary vs. proprietary in the latter, is problematic too. There always seem to be two main options: a split between a quality experience at a premium, or a commodity-driven economy-of-scale product from the rest of the industry pulling together enough to beat a mega-competitor. Shouldn’t we be past all this nonsense by now?

We are. Unix-like operating systems won in a broad sense, and Linux won in particular, when it comes to rapidly molding some hardware into something custom. There’s never been a better time to take up all that old-school text-based Terminal stuff. For you see the era of Jobs and Gates has finally come to an end. Text-based terminal is cool again, and the source of all-age power.

I say all-age power because it’s accessible to people of all ages, and the power it provides is relevant regardless of what “age” we’re in, be it the rise of computers or the rise of general artificial intelligence. Learning to touch-type on standard keyboards will make you more powerful in this world and in life than simply relying on forever-improving voice recognition. The idea is to master inanimate matter. For once the matter is self-animated, what’s the point? Take a break from watching the machine evolution for a bit and focus on evolving yourself. Learn the Linux Terminal and a handful of complementary tools. Be relevant today and long into the future.

One irony is that Apple itself switched MacOS to be Unix-based under the hood, under the stewardship of Steve Jobs, with MacOS becoming officially UNIX-certified by 2007. So it is the very same person who drove us to our excessive reliance on point-and-click operating systems who swapped out the proprietary innards for the great new emerging standard. He just didn’t contribute much back beyond that.

Well, that’s not totally true. Linux wouldn’t have systemd today if it weren’t for Steve Jobs. For you see, at the time Apple built its kernel around Mach (with BSD components layered on top), but it didn’t have all the bells and whistles. In particular, it lacked service management, for which Apple developed the init-replacement software called launchd. It even has a launchctl. Sound familiar? It should. It’s the model Lennart Poettering looked at when developing systemd, which has taken over system service management on most Linux distros.

At about the same time, and nearly 20 years after Apple embraced Unix, Windows is now shipping ready-to-install Linux. It’s not technically shipping with Linux. But ever since Windows 11, having a working, true Linux Terminal has been one command in a “Run as admin” PowerShell away. Unfortunately, on Windows 10 there are 3 steps that come before that, but it’s still worth it. Whether on Windows 10 or 11, Microsoft now officially supports your Linux adventure.

All your dev-like work should be occurring with forward-slashes. Even if you’re not a software developer but you need to manipulate data a lot, you should be doing things in a Terminal, and that terminal should have forward-slashes.

This is magic-land. This is where the spells are not merely cast, for that’s true of using any old software. No, this special place called Terminal is magic because it’s what controls the show before there is a show. It has control first, during the hardware’s initial boot procedure.

No, I make almost the reverse argument. Forever-shifting desktops work against achieving mastery, because the tools keep changing.

I’m stubbornly staying on Windows 10 until Windows 11 is cool enough for me. But that’s because I can have the move-to-Linux experience Microsoft has planned for us while still on Windows 10. Most people cannot, because of the complexity of getting the proper WSL2 experience on Windows 10. It takes:

Step #1: Enable Windows Subsystem for Linux

This command must be typed into a PowerShell, I believe as admin.

dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart

Step #2: Enable Hypervisor

This command is similar to the one above, but it specifically turns on your native hardware’s (laptop CPU’s) support for hypervisor features. This allows hardware-level “virtual machine” technology to permit full, true Windows to run simultaneously with full, true Linux. Neither is the other’s “host”. That role is now played by the hypervisor features, as managed by the wsl.exe program on the Windows side. Dadump, dump!

dism.exe /online /enable-feature /featurename:VirtualMachinePlatform /all /norestart

Step #3: Download & Run This Patch

And here’s the step that trips everyone up. A patch needs to be run. I’m not sure exactly what’s going on here, but my speculation is that whatever this is, it’s one of the big reasons Windows 11 exists. There’s some serious system-wrangling going on to make the miracle that is the Windows Subsystem for Linux work. Even with this patch, support for WSLg (Linux graphics capabilities) has not come to Windows 10.

[wsl_update_x64.msi](https://wslstorestorage.blob.core.windows.net/wslblob/wsl_update_x64.msi)

And so, patched up and done with however many reboots this makes you do, you are now able to run your Linux Terminal sessions under WSL2. But if you’ve already been on WSL for a while, you will have to convert your WSL Linux instances from version 1 to 2.

But wait! Before you go upgrading your instance, ask yourself if it isn’t actually a good time to move yourself over not just to Linux Terminal, but Linux Containers too? Of course it is! So for now, as of this writing, you want to be on Ubuntu 18.04 for your main WSL2 Linux system, and that’s not the default.


Tue Jul 26, 2022

Linux Container Under Windows 10 WSL Supporting Graphics

category: linux

I’ve been showing people how to get Linux graphics working under the Windows Subsystem for Linux on Windows 10 using VcXsrv for awhile now. But now that I’m using containers, the question arises whether these containers can use the same Linux graphics. And so…

See if we can get Linux graphics under:

Yes, it was possible. The .bash_profile of the “host” Linux running directly under WSL2 must contain:

export DISPLAY=$(cat /etc/resolv.conf | grep nameserver | awk '{print $2}'):0

This requirement predates containers; it lets X clients find VcXsrv to deliver X-Windows messages. It amounts to finding the IP of the internal DNS server, which is always a trick with these virtual-LAN setups. That command extracts it from a standard Linux file, /etc/resolv.conf, and plugs it into the DISPLAY environment variable.

I had to capture this variable and “pass it down” to the container. It’s dynamically generated on the host, but it must be the same one used in the Linux container’s DISPLAY environment variable. And since my ~/data location is in common to the host and container, I am able to do this in the .bash_profile as well:

echo "export DISPLAY=${DISPLAY}" > ~/data/display.sh

An unexpected surprise is that because the contents of this file reads exactly like an executable bash script:

export DISPLAY=172.30.112.1:0

…I am able to put this one very simple line into the .bash_profile of the container:

source ~/data/display.sh

…which you may recognize as extremely similar to the command that activates Python venvs:

source ~/py310/bin/activate

…because it’s using the same trick!
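The nameserver extraction itself is easy to reproduce in Python if you ever need it outside of .bash_profile (the sample resolv.conf text below is made up):

```python
def nameserver_display(resolv_text):
    # Pull the nameserver IP from resolv.conf-style text and append the
    # X display number, mirroring the grep/awk one-liner above.
    for line in resolv_text.splitlines():
        if line.startswith("nameserver"):
            return line.split()[1] + ":0"


sample = "search localdomain\nnameserver 172.30.112.1\n"
print(nameserver_display(sample))  # → 172.30.112.1:0
```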


Mon Jul 25, 2022

Removing Password Authentication from SSH Services

category: linux

  1. Get a new LXD container that doesn’t trigger the warning.
    • Don’t forget to use NAT networking mode.
    • Add port 2222 to port 22 NAT map.
  2. Login locally to give ubuntu user a password and add openssh-server service.
  3. Ensure that you can login using the ssh program (re-figure-out IP?)
  4. Get rid of the password challenge.
  5. Move keys over to get rid of passwords.
  6. Get keys in place to git clone from github without challenge.

Eliminating Password From OpenSSH on NAS Linux Container

Create new keys? They’re auto-generated in /etc/ssh/ but just in case:

ssh-keygen -t rsa -C "email@address.com"

Do we need to turn off the ability to challenge for passwords? If so, that’s handled in /etc/ssh/sshd_config.

What I’m sure I do need is a list of authorized keys. If only one then it’s the same as the actual public key being used. One public key, one line.
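A sketch of that one-key-per-line idea in Python (the helper name is my own invention; in practice it’s just appending the public key to ~/.ssh/authorized_keys and keeping the permissions tight):

```python
import os
import tempfile

def authorize_key(authorized_keys_path, pubkey_line):
    """Append one public key line to authorized_keys if it isn't already
    there, then keep the file's permissions SSH-friendly (0600).
    Hypothetical helper -- the mechanism is just 'one key, one line'."""
    pubkey_line = pubkey_line.strip()
    existing = []
    if os.path.exists(authorized_keys_path):
        with open(authorized_keys_path) as f:
            existing = [line.strip() for line in f if line.strip()]
    if pubkey_line not in existing:
        with open(authorized_keys_path, "a") as f:
            f.write(pubkey_line + "\n")
    os.chmod(authorized_keys_path, 0o600)

# Demo against a throwaway file (a stand-in for ~/.ssh/authorized_keys):
demo_dir = tempfile.mkdtemp()
keys_file = os.path.join(demo_dir, "authorized_keys")
authorize_key(keys_file, "ssh-rsa AAAAB3... email@address.com")
authorize_key(keys_file, "ssh-rsa AAAAB3... email@address.com")  # idempotent
```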

Much of this video went into creating the ~/data shared folder. Now all locations have the same 3 folders:

That’s a good setup for the videos to come!

systemd-way to restart a service:

sudo systemctl restart ssh

It wasn’t even necessary. But it was very necessary to chmod the files correctly:

sudo chmod 600 id_rsa

And so on.


Mon Jul 25, 2022

New Container Station LXD Ubuntu 18.04 Image and SSH Server Install

category: linux

On a Windows 10 or 11 machine:

  1. Get Linux Terminal running (under Windows Subsystem for Linux / WSL).
  2. Get LXD containerization happily/readily/easily running under that.
  3. Create container that can access my main local folders & feels like home.
  4. Make this Linux container able to access a “network drive” for data.
  5. Reproduce this container (minus local elements) elsewhere on the network.
    • Pulls down Github repos as necessary

Mon Jul 25, 2022

LXD Container on Windows WSL2 Using NAS SMB/CIFS Share

category: linux

A forcible push forward, because now I have lxd/lxc containers running as easily as I had imagined. LXD really does feel like it’s built into Linux now. And I’m running Linux under Windows, so LXD feels like it’s built into Windows!

It’s been suggested to me to run Docker for Windows (or even LXD for Windows, which I think exists), either of which would give me Linux containers under Windows. But that’s not the goal. The goal is Linux containers under Linux in preparation for a Windows-free future. Windows is ALWAYS optional. Any time you’re introducing a new development tool from the Windows-side, beware! You’re being boxed-in by Microsoft.

Issues for today to solve to forge forward are:

I need a place for big SQLite files! I plan on doing a lot of work that will be “database back-ended” (even if just through a Python dict-style key/value API). Throw the results of a lot of requests into a “raw data bucket”. And so the container’s “virtual” hard drive does not seem an appropriate place. I’m going to map in a location on my home network, from my NAS (network-attached storage). I can share drive locations from there, and it’s in the context of my home local area network (LAN), behind the router.
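A minimal sketch of what I mean by a dict-style raw data bucket over SQLite (the class name and schema here are made up for illustration; a library like sqlitedict does this for real):

```python
import json
import sqlite3

class KeyValueBucket:
    """A tiny dict-like wrapper over one SQLite table -- a 'raw data
    bucket' for stashing request results. Hypothetical sketch."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS bucket (k TEXT PRIMARY KEY, v TEXT)"
        )

    def __setitem__(self, key, value):
        # JSON-encode values so dicts/lists round-trip through TEXT.
        self.conn.execute(
            "REPLACE INTO bucket (k, v) VALUES (?, ?)", (key, json.dumps(value))
        )
        self.conn.commit()

    def __getitem__(self, key):
        row = self.conn.execute(
            "SELECT v FROM bucket WHERE k = ?", (key,)
        ).fetchone()
        if row is None:
            raise KeyError(key)
        return json.loads(row[0])

db = KeyValueBucket()  # in real use, point it at a file on ~/data
db["https://example.com/"] = {"status": 200}
print(db["https://example.com/"])  # {'status': 200}
```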

I already have my ~/github folder mapped (which is itself a symlink on the Linux host).

Mount host’s “home” location as container’s “home” location:

lxc config device add container-name home disk source=/home/${USER} path=/home/ubuntu

I need to choose between mapping WSL host-Linux’s home to the LXD container or the github folder, which is itself a symlink on the Linux host. Catch-22. That’s an interesting vid. Do a part 2.

After removing all mounts from container, try this:

lxc config device add GlookingLass home disk source=/home/${USER} path=/home/ubuntu
lxc config device add GlookingLass github disk source=/mnt/c/Users/mikle/github path=/home/ubuntu/github

Ugh, no luck!

Okay it seems like I have to choose the best compromise. And it was very close to my original plan!

So first I have to re-establish this from the WSL Linux Host for LXD:

ln -s /mnt/c/Users/mikle/github /home/healus/github

Okay, so now I have my github folder from the Windows-side symlinked into the Linux-side.

lxc config device add GlookingLass github disk source=/mnt/c/Users/mikle/github/ path=/home/ubuntu/github/
lxc config device add GlookingLass dotssh disk source=/mnt/c/Users/mikle/.ssh/ path=/home/ubuntu/.ssh/

What are the other folders?

Yeah, my initial gut is right. Of all these, only .ssh needs to be mapped in, in addition to github.

But there are a few files that could stand to be moved over:

Hmmm, well .vimrc is already in the vim repo. And my bash script for editing my journals ensures it’s always up-to-date there, so I should really just throw my .gitconfig into my ~/github/vim folder and grab it out of there on a 1-time basis from new containers.

Okay, and now finally to have a home sweet home, we just execute these 2 commands from within the container:

cp ~/github/vim/.vimrc ~/
cp ~/github/vim/.gitconfig ~/

Ugh! When trying to git clone I get this error:

[py310] ubuntu@GlookingLass:~/github $ git clone git@github.com:miklevin/oauthlogin
Cloning into 'oauthlogin'...
error: chmod on /home/ubuntu/github/oauthlogin/.git/config.lock failed: Operation not permitted
fatal: could not set 'core.filemode' to 'false'

All the advice on the net involves doing this:

Save the file, shut down WSL by running wsl --shutdown from PowerShell, then relaunch Ubuntu WSL.

I did this on the Linux host machine, and it miraculously worked.

[py310] ubuntu@GlookingLass:~/github $ ps -aux
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
root         1  0.0  0.1 224520  8188 ?        Ss   16:59   0:00 /sbin/init
root       216  0.0  0.2  94584 13048 ?        Ss   16:59   0:00 /lib/systemd/systemd-journald
root       220  0.0  0.0  42108  3564 ?        Ss   16:59   0:00 /lib/systemd/systemd-udevd
systemd+   234  0.0  0.0  79924  5208 ?        Ss   16:59   0:00 /lib/systemd/systemd-networkd
systemd+   259  0.0  0.0  70496  4900 ?        Ss   16:59   0:00 /lib/systemd/systemd-resolved
root       260  0.0  0.2 170748 17512 ?        Ssl  16:59   0:00 /usr/bin/python3 /usr/bin/netwo
message+   261  0.0  0.0  49932  4420 ?        Ss   16:59   0:00 /usr/bin/dbus-daemon --system -
syslog     262  0.0  0.0 193412  4048 ?        Ssl  16:59   0:00 /usr/sbin/rsyslogd -n
root       263  0.0  0.0  70468  5900 ?        Ss   16:59   0:00 /lib/systemd/systemd-logind
root       264  0.0  0.0  31296  3124 ?        Ss   16:59   0:00 /usr/sbin/cron -f
root       267  0.0  0.0  15964  2436 ?        Ss+  16:59   0:00 /sbin/agetty -o -p -- \u --nocl
root       268  0.0  0.3 187228 20372 ?        Ssl  16:59   0:00 /usr/bin/python3 /usr/share/una
root       406  0.0  0.0  64768  3780 ?        Ss   17:04   0:00 su --login ubuntu
ubuntu     407  0.0  0.1  76404  7232 ?        Ss   17:04   0:00 /lib/systemd/systemd --user
ubuntu     408  0.0  0.0 258496  2852 ?        S    17:04   0:00 (sd-pam)
ubuntu     418  0.0  0.0  21612  3980 ?        S    17:04   0:00 -su
ubuntu     429  0.0  0.0  39672  3580 ?        R+   17:04   0:00 ps -aux

There’s so little running there, I should be able to know each one pretty well. This is the “before”, before I add something running automatically under systemd.

Alright, this is right on the verge of the dam bursting wide open.

It took a while for me to get here.

Take a few deep breaths. Make sure you can build something up in good baby-steps. Document it here. Get it running here. Then use that as your starting point for your next step at work.

Get your dashboards running under Huey on your home network first.

Start development locally on your laptop, on an LXD container under WSL. This is native Linux container development under Windows 10, without any proprietary Windows software beyond the WSL-stuff now built in. This is the beginning of the “into the future” transition plan off of Windows.

Document the precise next steps.

Okay, we’re going to be creating a Python scheduling daemon.

Or should that be a Python Scheduling Demon for clickbait?

It all starts here:

/etc/systemd/system

And the file, which would be named pulse.service, that goes in there follows a specific convention:

[Unit]
Description=Run Python script to handle scheduling

[Service]
Type=forking
Restart=always
RestartSec=5
User=ubuntu
Group=ubuntu
WorkingDirectory=/home/ubuntu/github/heartbeat/
ExecStart=/usr/bin/screen -dmS heartbeat /home/ubuntu/py310/bin/python /home/ubuntu/github/heartbeat/pulse.py
StandardOutput=syslog
StandardError=syslog

[Install]
WantedBy=multi-user.target

Such a file needs to be put in place (or created there) with admin privileges. Merely by virtue of being there, there is now a basic set of systemctl controls for it, or “service” controls if you prefer the older command.
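As for pulse.py itself, here’s a minimal sketch of the kind of scheduling loop that unit file could keep alive (everything here, including the tick function, is hypothetical; the real script just has to keep running):

```python
import time
from datetime import datetime

def tick() -> str:
    """One heartbeat. In a real pulse.py this is where scheduled jobs
    would be checked and kicked off."""
    return f"pulse at {datetime.now().isoformat(timespec='seconds')}"

def run(beats=None, interval=1.0):
    """Beat forever under systemd (beats=None), or for a fixed number
    of iterations when testing by hand."""
    count = 0
    while beats is None or count < beats:
        print(tick())
        count += 1
        if beats is None or count < beats:
            time.sleep(interval)

# A short hand-test run; under systemd you would call run() with no args.
run(beats=2, interval=0.01)
```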

Hmmm. While I feel ready to jump right into systemd stuff finally, there’s just one more bit of housekeeping I need to tend to first. And that’s the use of Windows network shares like SMB/CIFS from a container. Doesn’t look like it’s built in.

I went on a wild goose chase with smbclient. It turns out it’s not necessary! Uninstall it because it also put in a lot of python 3.7 stuff. Remember to do:

sudo apt autoremove

…to clean up after anything that’s been left behind. Okay. So this looks like it works on the WSL Linux host.

The amazingly simple answer was:

sudo mkdir /mnt/data
sudo mount -t drvfs '\\EchidNAS\data' /mnt/data

However, if this is tried on an LXD container, I get:

unknown filesystem type 'drvfs'

Interesting! This drvfs looks like a WSL thing. Since it does not work in a container, one more command is required on the WSL Linux host in order to make this location available to the container. But we should settle on a ~/data convention. So next we do this on the WSL Linux host:

sudo ln -s /mnt/data /home/healus/data

We will still want to connect it from /mnt/data in the container, so we do this:

lxc config device add GlookingLass data disk source=/mnt/data path=/home/ubuntu/data

And I start the container and it’s there!


Sun Jul 24, 2022

Learning *nix Today Means Learning systemd!

category: linux

I am going to initiate you into the world of *nix. Everyone’s got to go through some course in life. At some point, we choose and decide. We break down walls. The world I’m proposing is pretty well called *nix, for Unix-like operating systems. But with the profoundly strong initiative of adding kernel-level hypervisor capabilities, followed soon after by the unification of how system services are handled with systemd, Linux has pulled a wee-bit ahead of the FreeBSD Unix community, and it is they who are forced into the somewhat embarrassing position of having to change to stay Linux-compatible. So perhaps it should be *nux that I advocate. Or just plain nux.

There is a blurred line between when we are following a compelled course set in motion for us by virtue of being born, and later when we take control of our own lives. We eventually see things in a new light, shifting from the perspective of a dependent child to one who must make it out there in life. Whether it’s on your own or not is one of those many decisions you’ll need to make.

Linux has pulled so far ahead that for general system-automation work, systemd is the new standard, replacing SysV init. There wasn’t much wrong with SysV except a needless lack of convention across versions of Linux, plus the requirement for an unreasonable amount of Unix/Bash-scripting skill. This was not what Apple wanted for the Unix soon to be at the heart of Macintosh, so they developed launchd, which came with the control program launchctl. Starting to sound familiar? Well it should! It’s what the controversial FOSS Linux developer Lennart Poettering copied when writing the now-popular systemd for Linux.

And so, yeah, Unix was there first with a replacement for the bash-file-oriented init system.


Fri Jul 22, 2022

From Average Windows User to Linux Terminal User

category: linux

Next step? My next step is what I believe is many people’s starting point into this interesting world, as Linux comes to Windows. I need to get back on track with public videos showing a typical person’s route to the Linux terminal.

These days, you are still most likely on Windows. For those on Macs or Linux desktops, the story is a little different; they’ve already embraced *nix platforms. Mac users can embrace Homebrew, while any Linux desktop has an underlying text-based distro, which will be their natural choice. For everyone else, who is most likely on Windows, there is Ubuntu.

This is because Microsoft partnered with a company called Canonical that makes a popular Linux distro. You are therefore aligned with Debian Linux, one of the great non-commercial branches. It’s a popular choice for cloud use today due to its great software repository, ease-of-use and popularity in various desktop versions. It’s technically not the most Unix-traditional choice because it uses its own package management system. But the Debian package management system (versus the Red Hat Package Manager) has become enormously popular. The fact that Ubuntu Linux has indeed become the “canonical” Linux due to its Microsoft partnership is unironically self-fulfilling.

It’s not a bad bandwagon to jump onto, Ubuntu-based Terminal Linux.

But the story I have to tell is for most of you capable of tuning in and listening to this right now.

The general purpose computer is really just finishing being invented, in the sense that there is enough agreement around how it ought to work that learning it ought to be helpful throughout all of life. Your knowledge and know-how and capabilities won’t suddenly become obsolete because of some next fad.

We know that this time is upon us because of how easy Windows has just made it to fully install Linux on your system, side-by-side with Windows. As such, every Windows system has also potentially become a Linux system, one that can help protect you from obsolescence through over-dependence on technologies that may suddenly change radically or go away because they’re proprietary or tied to vendor services.

No, this is an approach that leverages the power of your existing laptop, which is most likely running Windows 10 or 11. Either way, it’s pretty easy to get to “simultaneous” Linux with very little downside.


Fri Jul 22, 2022

Do Not Lose The Power of Touch Typing Despite AI APIs

category: tech

The ability to actually do a thing is often looked down upon, due either to jealousy or a classist attitude, neither of which are good reasons to not learn how to do a thing.

It’s better to learn how to do a thing and be looked down upon by those who can not than it is to inhibit yourself for their approval or acceptance. So go learn! Go do!

Or stay right here to learn and do, because in the modern Information Age, actualizing can be done remarkably comfortably at a terminal with a keyboard. This is a fundamental ability in life that should not be lost on modern youth because of the popularity of smartphones and their onscreen thumb-keyboards and voice recognition.

Much power comes from learning how to touch type and then by getting good at it through practice over time.

Touchscreens work against full-speed muscle memory due to their shifting layouts. If you want to get better at something over time that relies greatly on little-changing interfaces, a standard keyboard for typing is one of history’s great examples. Another example of long-unchanging interfaces in tech is driving. One must master a series of hand or foot moves to result in control of a machine that should operate like an extension of your body.

There is a certain oneness with tools this soon ends up being about. It is where much of the “love for a thing” actually resides. Because tools are like extensions of our body, and the innovations of new ones are incorporated into our very being, one must make their taking-up, study and mastery into an important topic. Life’s too short to take up the wrong tools too often, and only all the more so if they’re not your vibe.

Your essential inner vibe is the music of you as expressed first by your hardware. This is a certain unlikely arrangement of encoded data that is deeply interdependent with the system capable of “running the code”. When your code starts running, you are a floating mass of dividing cells. Even after birth you are so helpless you can’t even hold your own head up for a few weeks.

This made you very vulnerable once, yet this vulnerability does not define you. In due course of life, you’ve taken control of your own life, at least enough so to define yourself. There’s major philosophical splits in humanity here over this. I believe that once you’re truly self-aware, you control enough of your own experiences that you can override the “default” paths and plot some new and a sometimes surprising ones. Much in life that makes it worth living comes from overriding the cocksure-ness of super determinism. I mean, who could have seen everything coming?

Social mammals such as us start out very trusting, not out of choice. Our parents, colony or tribe had to have made the choice to keep us alive. Now you’re alive under your own volition, but how scary is that? You’re part of a system whose stability comes wholly from the ability of one generation to hand things down to the next. Can you imagine the “whisper down the lane” drift?

How much that is just taken for granted by one generation was the “landing on the moon” miracle to the prior generation. It’s that way with smartphones and the Internet today. These are like Harry Potter magic wands suddenly introduced into our lives. Our mundane Muggle-world is now itself magical.

Today’s youth expect quality voice recognition built into mobile touchscreen keyboards, which are increasingly their main way of interacting with technology. I mean, who even needs to be in front of a larger screen with a keyboard anyway? The augmented and virtual reality interfaces of tomorrow go a similar way as mobile, because while haptics and chambers could enhance the VR experience, they don’t rival always being on you the way mobile is today.

While there is some truth to this statement, why wait for increasingly parental AIs to take over just to avoid developing a highly useful skill? That’s like arguing to stay a baby.


Thu Jul 21, 2022

Moving Is Not Easy To Do: Real-life or Linux

category: tech

Moving is not easy to do. I have moved. I moved myself without a truck. I did it just with my Jeep. It’s taken a lot of trips and will take a few more still, but I get to sort and sift, unlike “the big dump”. I have to be careful not to allow what I’ve done with the Jeep to transform into a big dump.

The big dump is losing sight of what you have, enough to call yourself disorganized again. Or rather, it is losing control of this “digestion” process I’m trying to foster. I’m digesting what I own to pare it down for a freer and happier life and existence.

Okay, so you’re in a new home. You have to reorient yourself. This is neural mapping. This is forced change and perchance, growth. With each such “reset” of your muscle memory, you have a chance to think things through. What’s home? What are your habits? What are your most important cross-cutting habits?

Momentum has been broken. This must be re-established. This sort of makes you a newb again in many ways. You have to quickly reacquire your equilibrium.

Pshwew! I’m gradually getting back on track. And even though the getting back on track is gradual, the actual little steps forward are enormous. Let’s recap yesterday’s work, which was actually a culmination of quite a lot of work. What are the most important things to keep in mind?

I need to do some off-work-time videos like I did at the old Poconos place. Just because I’m “back” here in New York City doesn’t mean those videos have to stop. Keep creating. You’ve only gotten a taste. You’ve only given them a taste. Keep channeling. Keep finding that inspiration and let it run through you. Expand yourself. Push yourself. Use the power of thought.

Make a few people happier than they would have been without you–especially your kid. How can you reach them? Do your own interesting things worth learning from. Do things that make you happy and feel like you are in a wonderful place.

What is life about? You are in a wonderful place.

I am on Windows 10. I remain on Windows 10 primarily because of the screen transitions between virtual desktops. It is almost unbelievable to me that in Windows the trackpad 4-finger gesture can be set to switch desktops using a smooth transition, while Ctrl+Windows plus the left and right arrow keys has no transition at all, but rather an instantaneous and disconcerting pop from desktop to desktop. What a major usability regression for people who love virtual desktops!

Windows 11 Doesn’t Let You Hide Taskbar Icons

Even Windows 11 Hacks to Remove Clock Don’t Work


Wed Jul 20, 2022

Moving Into LXD WSL2 ~/ “Home” Once You’ve Moved Into Containers

category: linux

Think! Hmm, okay. Next step is still the beginning. Wow is that powerful. Things are always in transition and I am always re-establishing “home”.

This post is about that re-establishing of a comfortable and productive home.

I love listening to the YouTubes. I’m listening to one called World’s smartest person wrote this one mysterious book.

Okay, next? Next is rebooting… no, I’m doing it continuously. I’m just doing it slowly.

Transition.

Buckminster Fuller, who was a classmate of this child prodigy who predicted the black hole, reviewed some of the rediscovered works of William James Sidis, a child prodigy “manufactured” by his parents. Wow, Skillshare seems to be sponsoring everything these days. Maybe I ought to pay attention. Think about ongoing life for Adi. The era of the Tardigrade Circus is almost upon us. That ship HAS NOT SAILED. I will contrive to be at the forefront of some interesting stuff going on, which will double as an income source, a big chunk of their higher-grades education and a way to stay relevant as we plummet into the future.

Keep all your eggs in one basket, then copy that basket all over the place. This post concerns making that “all over the place” portion of that wisdom true. We already have Github for making our software repositories portable. We now need our important mapped drives, such as ~/github and ~/.ssh. That literally gives you the keys to the (albeit your own) kingdom.

Okay now, next steps once again? Don’t let yourself get distracted. It’s time to make this journal post public. Then force yourself to “recapture state”. Recapturing state is such an important concept for picking up where you left off in order to continue being productive.

Okay, so I’m here to report to you from the front-lines of staying relevant and resisting obsolescence in a technologically rapidly changing world.

Things need not be so scary on the tech-front. That is, it need not be scary if you learn it pretty deeply, taking advantage of today’s cheap, powerful and abundant hardware, using it first how it was intended to be used. And that is as a proprietary Windows 10 or 11 laptop.

This is cringe to the ears of millions. And I agree.

I’m going to do a release on this journal to start getting some new published material out there. But then I have to force next-step!

Jekyll is a humbling lesson in patience. 1, 2, 3… 1? Step 1 at this point is to enumerate the requirements for a “new home” in Linux. What does it really entail in terms of:

Before I go listing, take care of that hugely important point. I found this page: https://www.cyberciti.biz/faq/how-to-add-or-mount-directory-in-lxd-linux-container/ It says:

To mount the host’s /wwwdata/ directory onto /var/www/html/ in the LXD container named c1, run:

lxc config device add c1 sharedwww disk source=/wwwdata/ path=/var/www/html/

So for my use, it would read:

lxc config device add GlookingLass github disk source=/mnt/c/Users/mikle/github/ path=~/github/

I tried it (spelling out the container-side path absolutely, since ~ would expand on the host) and had this success:

[py310] healus@LunderVand:/mnt $ lxc config device add GlookingLass github disk source=/mnt/c/Users/mikle/github/ path=/home/ubuntu/github/
Device github added to GlookingLass
[py310] healus@LunderVand:/mnt $ lxc exec GlookingLass -- su --login ubuntu
To run a command as administrator (user "root"), use "sudo <command>".
See "man sudo_root" for details.

ubuntu@GlookingLass:~$ ls
github
ubuntu@GlookingLass:~$

That is a success-assured moment. So what are you really doing here? Interesting! Okay, this hasn’t shared your .ssh folder. Try doing that as well.

lxc config device add GlookingLass dotssh disk source=/mnt/c/Users/mikle/.ssh/ path=/home/ubuntu/.ssh/

Success! Wow, this is big. How easy is it now to slam out new home instances? These two commands only need be done once after each new container creation. So go through all the moves, capturing them here again. You are codifying containerization home moves. Moving home when you’ve moved into containers.

Okay, think! From the top:

lxc stop GlookingLass
lxc delete GlookingLass
lxc launch images:ubuntu/18.04 GlookingLass
lxc config device add GlookingLass github disk source=/mnt/c/Users/mikle/github/ path=/home/ubuntu/github/
lxc config device add GlookingLass dotssh disk source=/mnt/c/Users/mikle/.ssh/ path=/home/ubuntu/.ssh/
lxc exec GlookingLass -- su --login ubuntu

Okay, so with the ~/github and ~/.ssh folders from the Linux host linked, why is this container not instantly home? What’s still missing? The running of a script to put a few files directly in ~/ including maybe a .bash_profile.
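A sketch of what that settle-in script might do, in Python (the function name is my own; the dotfile names are the ones from my vim repo, and the demo uses throwaway directories standing in for ~/github/vim and ~/):

```python
import shutil
import tempfile
from pathlib import Path

def settle_in(repo, home, dotfiles=(".vimrc", ".gitconfig")):
    """Copy dotfiles tracked in a repo (e.g. ~/github/vim) into a home
    directory, returning the names actually copied. Hypothetical sketch
    of a one-shot 'make this container feel like home' step."""
    repo, home = Path(repo), Path(home)
    copied = []
    for name in dotfiles:
        src = repo / name
        if src.exists():
            shutil.copy(src, home / name)
            copied.append(name)
    return copied

# Demo with temp dirs standing in for the real repo and home:
repo = Path(tempfile.mkdtemp())
home = Path(tempfile.mkdtemp())
(repo / ".vimrc").write_text("set nocompatible\n")
(repo / ".gitconfig").write_text("[user]\n\tname = Mike\n")
print(settle_in(repo, home))  # ['.vimrc', '.gitconfig']
```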

Is this my new “helpers” repo?

homer

Hmmm. Homer, haha! Maybe. Sure, why not? Nice strong identity. Has something to do with traveling and home… maybe. Sorta.

Okay, made that repo. Since it’s mapped into ~/github already I don’t even need to clone it, though doing so would be good to test the keys.

Wow, it doesn’t even come with git pre-installed.

If we’re using containers and we’re installing Python for the first time, do we even really need virtualenvs?

Okay, so this is the tricky part now. Let’s make the script that sets these containers up.

sudo apt install git
sudo apt install software-properties-common
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt install python3.10
sudo apt install python3.10-venv
python3.10 -m venv py310

Wow, and that pretty much does it.


Mon Jul 18, 2022

Refresh Your Mind With The LXD API

Wow has it been difficult to get my head in the game again, lately. I just moved from the Poconos back to New York City. It has not been easy physically, financially or emotionally. One has to catch one’s self in spiraling. One must exert extreme cognitive and bodily control to undergo such transitions with grace and continuing positivity. This is a large part of what it takes to be human. We are strong.

It’s easy to do “good enough” when you’re following on yesterday’s habits. But when the deck gets reshuffled a bit you can measure yourself and try to do better according to today’s and perchance tomorrow’s standard. The point is not to thrive standing still, but to thrive just as well and better than most in tomorrow’s world.

I’m on a good path with using Linux, Python, vim & git for grounding and nearly all things. But I’m spiraling again. Catch yourself! That disembodied feeling of switching homes, laptops or now as it turns out, containers. I need to settle into a container as a new home and make that LXD container feel like home… every time I open the Terminal.

I want a “new” home that comes from lxd. 1, 2, 3… 1?

Give myself a refresher course. Grounding… grounding? Grounding! It’s what happens when you open a terminal. See if you can change the Linux system you’re running on. Once again, the thing that is important here is learning the lxc command for everyday easy casual use. Or more likely, for a one-time “startup-cost” round of activity, followed by getting back to day-to-day business. We’re in the startup cost phase of new-API-embracing.

Take a look at what containers are available.

lxc ls

Log into it. Check if putting this in .bash_profile works (it does).

lxc exec foo -- su --login ubuntu

Delete the current container:

lxc stop foo
lxc delete foo

Create a new container:

lxc launch images:ubuntu/18.04 GlookingLass

I can log in as root:

lxc exec GlookingLass /bin/bash

I can add a user (with a home directory) and a password:

useradd -m healus
passwd healus

But I don’t think I want to create a new username and password. I can just use the lxc default, ubuntu/ubuntu.

Okay, so it’s really just that easy to create new home directories. This is chroot evolved, for sure. A cookie-cutter template system for slamming out new vanilla Linux systems.

Wow, it’s eerily cool how easy it is to slam out a new container.

They don’t even have a default python!


Sun Jul 03, 2022

Fairly Certain I Had a Brush With Catknappers

Wow, so I’m moved into the new place. One encounters interesting events in life and must move on. Lingering longer does no good, such as with the circumstances surrounding the disappearance of my cat after its escape in transfer to the car.

Sometimes we linger longer
Or a grieving starts to grow
Which needs a thorough sieving
Before we can let go.

So this is that thorough sieving of my thoughts. Ugh! I friggin’ hate this stuff. I brought it on with my own stupidity, not having 2 cat-carriers ready for my move and believing I could use the one for both of them. That was my first mistake. My 2nd mistake was really believing that I was under some deadline pressure to be out of that place early enough to unpack on the other side before it was too late. I had a queen-sized mattress on my car-roof and all the melting contents of my fridge in the car. So I felt an urgency.

Mistakes Are Made In Urgency

Lynnie Cat Was Likely Catnapped But Is Recovered

In that urgency, I put the easier cat into the carrier first. When it became clear the 2nd cat wasn’t going easily into the carrier, I put the carrier in the car and figured I’d focus my full effort on manually transferring the 2nd, but more feisty, cat into the car by-hand.

This was a mistake. I should have stayed another night and gotten a 2nd cat carrier at PetSmart in the morning, or found a cat carrier or some other container some other way. The cat got away. I definitely was stupid on that one, putting the wrong cat in the carrier and trying to transfer the more feisty one by hand. But I was overconfident in my ability to move one cat to a car without a carrier.

I spent a lot of time trying to get Lynnie before I left. I was torn. I almost unpacked the fridge again, but the electricity in the house was off by this time. There would have been no point in unpacking. The logical thing was to get my stuff to my new place an hour-and-a-half away, unpack (including getting that queen mattress up a flight of stairs) and come back later that night to retrieve Lynnie with the now-available cat carrier. I would leave the door open with cat treats inside to get Lynnie back in, and this is exactly what I did.

I Came Back That Night But Not Before Somebody Else Did

I came back that night sure I’d find her inside. But when I got back, the side-door was closed and the cat treats were broken up (by hand; they’re like a chocolate bar) and spread around. There were other things, like the grilling under the deck pushed in and the rake left in a different place. Someone clearly had been in the house.

This is a fact. In talks since, a few folks have insisted this is the way it is with cats: they will disappear for a few days and then reappear. But Lynnie has had her occasional outdoor adventures and has always stayed close and come back. She has well-established hiding places where I could find her.

Someone Was Inside My Place Between 9:30 PM and 3:30 AM

Someone being inside my place struck me as strange and creepy, but I didn’t think they had actually caught my cat, or there would have been some sort of note. Malicious intent was the last thing on my mind. It had to be my wife and her friend, who weren’t accompanying me on the move. But when I asked them, they said they had not been in the house. Spooky!

Spooked But Not Worried

And the cat had my contact info on her collar, and I didn’t receive a call. The alternative, that they did catch my cat and had no intention of contacting me, was so unthinkable that I put it out of my mind. So I figured the cat escaped them too (though I see now I should have pursued this avenue more aggressively), and I came back the following night with no luck. Finally, on my third time back, I was convinced someone had my cat, and I started piecing it together.

I Apparently Had “Bad Karma” I Didn’t Know About

Earlier that week, a neighbor whom I was comforting over their own troubles told me (or perhaps let it slip?) that I had bad karma in the neighborhood from rearranging part of a stone wall on the property to better accommodate a tree swing. I had made the stone wall a few feet shorter as a safety precaution against smashing feet into it from the tree swing, which I had put up in one of the few possible locations. This neighbor as much as told me that I was going to pay. I was quite taken aback, and I was like, “Who?” and “Pay how?” I was met only with a strange “oh, you’ll find out” look.

What Would Have Happened If The Cat Didn’t Get Loose?

It was a strangely passionate warning, and my cat escaping clearly gave someone the opportunity they were looking for. I wonder what would have happened to the property if the cat incident hadn’t happened. The damage they did to the grilling under the deck shows me they clearly don’t have an issue with property damage. Maybe their plan was to get me in trouble with the landlords somehow. Is this not over? Well, document, document, document!

I have pictures of every inside room and outside the house. If something changes between my having left the property and the landlords returning, I will be able to show the before-and-after.

So if it was one of the neighbors that this neighbor told me I had bad karma with, who could it be?

Oh, is it that perpetual Facebook-group bellyacher? There is one person always posting grievances on the unofficial community Facebook page who has the same first name as the reclusive mystery-neighbor my drunk neighbor implied. Could they be one and the same? If so, I had a very powerful lever to try pulling.

I knew that if I posted the situation on Facebook, drawing the attention of everybody who might know somebody who wasn’t above catnapping, it’d ripple through the grapevine fast. So I wrote this post, and 10 minutes later, Lynnie was back.

Let me repeat that. After 3 solid days of calling for Lynnie, shaking her treats, putting food out, setting up a humane trap, checking under every patio deck and hidey-hole, and generally camping out awaiting the return of a cat who had always come right back before, she appeared 10 minutes after I made this Facebook post. It was while I was on the street shaking treats and calling. I even think I saw her bolting back towards the house, but it was mostly shadows and the sound of a fast-moving animal.

When I got back to the house that I had re-opened the sliding-door to, Lynnie was in the house. And yes, I thoroughly checked the house several times over the prior days after hopeful door-opening sessions.

But there she was now suddenly, 10-minutes after the call-out. Coincidence? Maybe. But there were already comments on the post so I knew it was being read. I even scheduled an event for 9:30 AM the following morning to discuss who might have Lynnie, which I deleted along with the post after Lynnie showed up, because that place has enough drama without outing catnappers.

The loose-lipped neighbor called me back the next morning playing it off like it didn’t happen. My gaslighting detector was blaring like a siren. And then at 9:35 AM someone (a name I recognized) tried Facebook-calling me in connection with the cancelled Facebook event. I had the cat back and there was nothing to discuss. If someone had my cat and was waiting until some such-and-such time before calling the cellphone number that was on the collar, they can go eff themselves.

There are not a lot of people who knew I was leaving at 9:30 PM June 30th, and this neighbor was one of them. The sliding-glass door I left open for my cat was closed when I got back, period. My cat was gone for the next 2 nights. I might have wanted to not believe it, but I’m not a dumbass. The drunk neighbor I was trying to comfort through their tough times as much as told me. And if there was any doubt, the cat running back into the house 10 minutes after I published the post confirmed it.

The Facebook Post On The Community Group That Freed My Cat

Between 9:30 PM and 3:30 AM on June 30th, 2022, the last day of my rental at
[mountain resort community], my cat Lynnie escaped in transfer. I brought my
2nd cat and packed car to my new place, having left treats inside and the side
door open. I planned on getting back and closing the door to retrieve my cat,
who would be inside.

The problem is that somebody had already closed it. The sliding door was shut
and my heart sank. I went inside to find the treats broken smaller and spread
around. The cat was not in the house. Outside, the grilling under the deck
where the cat hangs out was broken, as if someone had pushed through, and the
rake was left in an odd place.

I held out hope that this was innocent. I've been coming back looking for
Lynnie every day. I see now that this is in vain and somebody took advantage of
this window to steal my cat. I hope poor Lynnie is still alive and this was not
vengeance for my reworking the end of the previous Michael's stone wall to be
safer against foot-smashing when using the tree swing.

A neighbor told me I was going to suffer bad karma for that. I have
reconstructed the wall close to its original form. When the neighbor who told
me of my bad luck to come saw me this morning and I waved to them in a friendly
way, they zoomed away. I feel I have incurred the wrath of a vengeful neighbor.
Please let Lynnie go. Open your door and let her out if she is still alive.

If she is not still alive, please let this weigh heavily upon your soul as you
await judgement. And I don't mean only in the afterlife, though there is that
too; this will go on Reddit if Lynnie does not reappear. I will be in the
neighborhood tonight calling for her. Find salvation. Anonymous tips welcome.

Tue Jun 28, 2022

Adding the Who, What, Why, When, Where & How To Site Nav

Fundamentally, life is management. But we must think about this management not just as economic management, but as managing life as a whole. In my case specifically, I’m talking about…

Don’t suffer your success because success doesn’t come easy. It’s better to suffer your failures, rinse and repeat until you love your success.

I got the qualifier links into the MikeLev.in navigation: Who, What, Why, When, Where & How, though the pages don’t exist yet. Making those pages is a priority.

WHAT

Pushed most of it over to /what/. This is just a draft:

What I propose is for you to casually create and manage automations throughout life without having to dedicate excessive time to developing and keeping such skills.

I propose that you “internalize it”, passing it onto automatic muscle memory as you do driving and reading-for-pleasure, creating an internal asset better than fame or fortune.

This is a rigid structure for one and only one aspect of your life, that which technically enables you to do other things, pursuing other interests in life unencumbered by startup costs. You are your own tech because being so is just basic literacy today.

To this end, I propose Linux, Python, vim & git


Mon Jun 27, 2022

The Key LXC API Detail I Must Master

I need to understand this command:

su --login ubuntu

…in order to understand this command:

lxc exec foo -- su --login ubuntu

Sat Jun 25, 2022

Figuring Out The LXD/LXC Command-line API

category: linux

For my next trick, I have to get really comfortable with the lxc commands under lxd. Even knowing that that’s my next trick has been quite a trick. The term “lxc” has two very different meanings, especially as the older lxc technology is actively becoming deprecated and unsupported. I got a notice from my QEMU NAS hosting a sort of baby-Kubernetes that my lxc container is now unsupported and I need to upgrade it. However, the “lxc command” on the command-line under lxd is alive and well. And the one I now need to learn more urgently than any other is:

lxc exec <instance_name> -- <command>

This explains why there is a space BEFORE AND AFTER a double-minus. A bare “--” like this is actually a long-standing Unix/Linux convention: it marks the end of options, so everything after it gets passed along untouched. What threw me off is that double-minus prefixes are usually seen glued onto the long version of a command-line argument, such as --help being the verbose alternative to -h and --editable being the verbose alternative to -e. Seeing one standing alone with spaces around it looked strange to me. Okay, the damage is done. Learn the API…
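
A minimal, hypothetical shell demo of that bare “--” (the file name is made up): it marks the end of options, so whatever follows is treated as data rather than flags.

```shell
# '--' is the end-of-options marker: arguments after it are treated
# as operands (patterns, file names), never parsed as flags.
printf '%s\n' '-n' > /tmp/flagfile.txt   # a file whose content looks like a flag

# Without '--', grep would try to parse -n as one of its own options.
# With '--', '-n' is read as the literal pattern to search for:
grep -- '-n' /tmp/flagfile.txt
```

lxc exec uses the same convention: everything after the “--” belongs to the command run inside the container, flags included.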

Okay, so I have an LXD container instance (no longer called an lxc?) named “foo”. To execute commands (in general) inside the foo container, I do this:

lxc exec foo -- [command]

It is important beyond important to understand that the first double-minus is a delimiter: everything after it is the command to run inside the container. That inner command will itself frequently carry double-minus arguments, so in order to read a login instruction correctly, we must read:

lxc exec foo -- su --login ubuntu

…as: “With the lxc command (the one that still controls things under the lxd Linux system daemon and is not really the old deprecated lxc tech, even though it is named that way), execute against the container named ‘foo’ the command that follows this double-minus, which must have a space on either side.” In this case the command is “switch user” (su) to the ubuntu user. I believe this means that the command normally executes as “root”. I believe this is confirmed because if we didn’t want to switch users and actually wanted to stay root, we would use this command:

lxc exec foo -- /bin/bash

That is the equivalent of saying all the above, but instead of immediately switching user, we just launch a bash shell, which leaves us sitting in a bash shell as root.

So:

--login

…is a command-line flag to switch user (su), telling it to start a full login shell, and “ubuntu” is su’s positional argument: the user to become. So strictly speaking this isn’t a key/value pair; --login is an on/off switch, and ubuntu is a separate operand that su reads on its own. The language surrounding APIs is absolutely terrible.

But I think I got this.

9 times out of 10, I’m going to want to:

lxc list
lxc start foo
lxc exec foo -- su --login ubuntu

That will show me what containers I have and start one if it’s not started and log into it to do stuff there.
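
Those three steps could be wrapped in a small helper function. This is just a sketch; the name “lxin” is my own invention, and it assumes the lxc client is installed and a container name is passed in:

```shell
# Hypothetical helper: start a container if needed, then log in
# as the ubuntu user. Fails with a usage message if no name is given.
lxin() {
    name="${1:?usage: lxin <container>}"
    lxc start "$name" 2>/dev/null || true   # no-op if already running
    lxc exec "$name" -- su --login ubuntu
}

# Usage: lxin foo
```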

And now one of the profoundly important tests I need to do is whether Linux container instances running under LXD under Windows Subsystem for Linux (WSL2) themselves run systemd, which is required for the sort of Linux system daemon experimental automation work I’m about to embark on:

Lxd Lxc Instances Under Wsl Wsl2 Running Systemd

…and they do! Woot! Success assured.


Sat Jun 25, 2022

vim tricks: Why The Walrus Fears The Carpenter

category: vim

Today is going to be a very busy day! Put as much as possible in storage. Get the place about as empty as you can, but for the stuff you can move on the day before the official out-day. There’s still time for morning thoughts and the corresponding livecast! That stuff is more important than ever as you massage your website into place and get organized.

The writing associated with this blog post has been moved over to vim tricks.

Magic Happens Where Linux, Python, vim & git Intersect

Here’s a venn diagram of Linux, Python, vim & git that’s been bouncing around in my head for a long time and is only natural to finally get out there.

Linux Python Vim Git Venn Diagram


Fri Jun 24, 2022

LXD Linux Containers in Windows Linux WSL2, But What Is Home?

category: linux

Today I answer some of the biggest questions in life: where is home? And where do you keep all your shit? Labels are stupid. We seek better symbols through example. I talk about the early-days SNL comedian John Belushi a lot as the ultimate Blues viber. For a Green Arrow perspective, we look to George Carlin, the very person who cleared the way for me to open this post with the expletive I do. George tells it to you like it is.

Issues concerning “home” and your true identity cut to the quick, as the movie version of The Wizard of Oz shows. George has some other ways of saying it. Call it what you will. We all go through several rebirths in our lives. Home changes. Where we store our stuff changes. It may or may not change you, depending on what you want and how you let it.

Dorothy was born into bleak, gray Kansas. What the MGM movie did was not just an artistic decision celebrating Technicolor movie production. No, the book, actually named The Wonderful Wizard of Oz, opened in black and white too. Don’t believe me? Here’s an excerpt:

“When Dorothy stood in the doorway and looked around, she could see nothing but the great gray prairie on every side. Not a tree nor a house broke the broad sweep of flat country that reached to the edge of the sky in all directions. The sun had baked the plowed land into a gray mass, with little cracks running through it. Even the grass was not green, for the sun had burned the tops of the long blades until they were the same gray color to be seen everywhere. Once the house had been painted, but the sun blistered the paint and the rains washed it away, and now the house was as dull and gray as everything else.”

Ahh, there’s no place like home, right Dorothy? At least for the mainstream movie audience that needs some good old American Way brainwashing. I told you I was born the same year as UNIX, 50 miles from Murray Hill, NJ, right? For those who bust on New Jersey—we just gave you the modern digital world, is all. And when Washington crossed the Delaware from Pennsylvania to turn the Revolutionary War, where did he land? New Jersey. You’re both free and on your phone or computer right now because of what? New Jersey.

Well, my mom was born the same year as The Wizard of Oz, 1939, just 20 days before its release. I think to myself her problems probably began with such a large cultural event stealing all the thunder of her birth, additionally aggravated by being the 3rd of 3 children and the first girl in a family that didn’t much value females, I’m told—old-school Hungary-fleeing pre-WWII merchant-class Jews as they were. My Pop Pop ran a shoe store while his two sons entertained themselves by throwing tennis balls at my mom growing up. Ahh, home.

Well, Dorothy and my mom escaped. Little-known fact: Dorothy’s experience wasn’t really a dream. Unlike Alice’s adventure in Wonderland, where she woke up and returned to her static state, Dorothy really went to the land of Oz, hidden in the middle of the uncrossable great desert somewhere in the United States. I suspect it was for the solar power, or maybe the geothermal energy sources for the data center Oz would require. Data center locations are kept secret too, when they can be. Baum was a true channeler.

Oz is real, and Dorothy eventually moved her doubting Aunt Em and Uncle Henry there. They started out living fancy in the urban Emerald City but ended up moving to a small farm in Munchkin-land for familiarity and comfort. Ah, home. There’s no place like home, but it need not be bleak gray Kansas (no disrespect to the fine city of Topeka, which birthed the Amiga Computer’s Video Toaster). Home is where you move your Aunt Em and Uncle Henry to, even if that means uploading them to Oz. We’ll deal with the moral implications of what to do with the “original matter” bodies in later posts. It’s the information, and the “living experience” it induces, that actually matters.

Growing up and leaving home, still supported by my parents. Later, supporting ourselves. Getting your shit together for the 1st time for “next-level” stuff—doing it a 2nd time, a 3rd time, rinse & repeat until dead. These are the times we reinvent ourselves. You’re never done reinventing yourself. It’s actually a continuous process. You’re subtly doing it right now just by listening to me. It’s all been said before, but the Pied Piper gets to add their particular flair and charm to the music. The more you let yourself “be static”, the more forced and painful these transitions will be—so says I.

It’s true for your loved ones too. The more you encourage them to be static personalities, to close their minds and take up “your ways 2-point-oh”, the more they’re going to be pained later on when they realize it’s not their ways. It’s not what makes their heart leap for joy and their soul sing. You chose their “home” and slapped a ball-and-chain on their legs to keep them there as a little carbon copy of you—probably out of fear, and not for a true love for life you feel yourself. If you love something you set it free. If the love was real, it may or may not come back to you. It’s not about you.

But what about you? And what about home? Home is where you transition all that is love-worthy. Home is where you organize your heart, mind and a little bit of stuff. Don’t invest too much into matter. It’s just a form of ball-and-chain. Left at your parents’, it’s held hostage. Moved to your living space, it’s choking. Put in public storage, it’s a costly drain. Purge, my friends. Purge and get down to what’s most love-worthy and irreplaceable. That’s home. Everything else can be reacquired or recreated. Better yet, just grab enough information about it so that it can be remembered and release similar neurotransmitters in your brain as the space occupying boondoggle once did.

Home is what you back up in git repos (folders or directories) and then back up again in remote repositories for safety, such as GitHub or GitLab, or on your own private cloud server (or two). It might sound too techie or unachievable today, but that won’t last. Personal secure cloud servers will be as common as external hard drives are today in under a decade. They’re almost there now with network application servers from QNAP and Synology and endless Kickstarter projects. Put all your eggs in one basket, and then back that basket up all over the place.

Next, curate the one particular folder you call home. It holds your .vimrc (vim configuration), .bash_profile (terminal startup script with aliases, etc.) and a few other things, such as your /usr/local/sbin contents, that you can move “home” for the sake of a git repo backup. You’re essentially packaging “home” to go. It’s what the Jews did with the canonical 5 Books of Moses, slapping them into the Ark of the Covenant every time they were kicked out of their home and out into the diaspora. Get the most important basics. Everything else can slip back into oral tradition. If it was really important, it’ll return to your life.
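
As a sketch of that packaging-home-to-go idea (all paths are illustrative, and it assumes git is installed):

```shell
# Gather the "home" essentials into one folder and version it with git.
mkdir -p "$HOME/dotfiles"
cd "$HOME/dotfiles"
git init -q                                       # new repo, quietly
cp "$HOME/.vimrc" "$HOME/.bash_profile" . 2>/dev/null || true  # skip missing files
git add -A
git commit -q -m "package home to go" 2>/dev/null || true  # needs git user.name/email set
```

From there, a `git remote add` pointing at GitHub, GitLab, or a private server gives you the off-site copies.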

Don’t answer these questions in this video. Too big! Just pose the questions to clarify next steps in your mind. Show a few tabs of research. And leave it as a cliffhanger as you go off to investigate on your own. But of course the answer to all these questions is some sort of qualified Yes.

Go, Go Unstoppable Me!

What’s home?

Bonus feature: stay alert to what makes you happy and what works best for you. Have another “vanity home” or public mailbox if you will where you can collect, refine and even advertise these things about yourself. Let’s call it a personal homepage on the Web. My name is Mike Levin so why not have a domain name that is a nearly perfect match? But you can’t have spaces in a domain name and you need to choose a top-level domain (TLD). What about MikeLev.in? Okay but it implies the wrong pronunciation. Emojis to the rescue. I have a 🎤. I like to talk. I 💙 to talk into the mic. I’m 🎤 💙in, so let’s begin.

Why would you do this? Why would you share your current interpretation of who you are and what you’re about on the Internet? It’s to help satisfy the second tier of the Maslow Pyramid of Needs: societal connections and belongings. I’m connecting with Scott M for example in my video chats, a 20-year veteran of the Murray Hill NJ facility where UNIX and half our modern world was invented. There’s that and that he sounds like an interesting guy himself to boot. So, let’s boot. It’s never too late to reboot. Pick yourselves up by the bootstrap and defy the law of conservation of momentum. Be human!

That just may be the purpose of sentient life.

There’s no place like the current instantiation of your system for defining current home.


Thu Jun 23, 2022

Get LXD Running Under WSL2 on Windows 11 with Ubuntu 18.04

category: linux

Teach how to get LXD under WSL2 running, easy peasy!

https://github.com/nullpo-head/wsl-distrod

To enable systemd on an existing WSL2 installed Linux:

curl -L -O "https://raw.githubusercontent.com/nullpo-head/wsl-distrod/main/install.sh"
chmod +x install.sh
sudo ./install.sh install

And then enable it:

/opt/distrod/bin/distrod enable

Test whether systemd is running:

ps --no-headers -o comm 1

If that prints “systemd” you’re all set to install lxd!

If there had been failed attempts:

sudo apt purge lxd
sudo rm -rf /var/lib/lxd

If a fresh attempt:

sudo apt install lxd
sudo lxd init
(yes all the way through wizard)

Reboot: run wsl --shutdown from PowerShell.

lxc launch images:ubuntu/18.04 u1804a
lxc exec u1804a -- su --login ubuntu

Keep this in mind for sharing volumes from pool:

lxc storage volume create default blah
lxc config device add c1 blah disk pool=default source=blah path=/blah
lxc config device add c2 blah disk pool=default source=blah path=/blah

There’s also advice on sharing host volumes (including home):
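
For completeness, here’s the general shape of mounting a host directory into a container as a disk device. The container name (u1804a), device name, and paths below are illustrative only, not from my actual setup:

```shell
# Mount a host directory into the container at /mnt/hosthome.
# 'homedir' is an arbitrary device name; 'source' is the host path.
lxc config device add u1804a homedir disk \
    source=/home/mike path=/mnt/hosthome
```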


Thu Jun 23, 2022

On Being Unstoppable, APIs & LXD / LXD Under WSL2 (Success!)

category: linux

APIs, and not giving up when learning them!

Start out with another reading of my morning thoughts.

Don’t let anyone or anything stop you. The world and many of its players don’t want you advancing in levels. It lessens themselves in their own mind. They think in terms of scarcity and zero-sum games. A zero-sum game is where a fixed amount of rewards are divvied out to winners and losers at the end.

Whether or not existence is a game of zero-sums is a function of whether life itself can overcome entropy, infusing more matter-organizing information into existence than is lost through the 2nd “scrambling” law of thermodynamics. Not all heat (information) can be converted into work (organized matter). Who knows? Maybe it can. We are each Maxwell’s Demon.

This is why I’m guiding you towards Linux Daemons. It gives us a path towards actually becoming Maxwell’s Demon, who by entropy-defeating thought experiments, can un-stir the milk from coffee or un-smash a glass. All you need is a guiding-hand on the application program interface (API) of reality.

Coffee’s on the left and Cream’s on the right. Mix’em. Now put a wall down the center of your cup that has a door. Only open the door when a coffee molecule’s going left towards the door or a cream molecule’s going right towards the door. Thus the arrow of time can be reversed and entropy defeated through life sentience.

The problem is that sentient thought takes energy, as does opening and closing a door. Nothing’s free in this world—but maybe the fruits of your creative thoughts are… ever so marginally… free! It’s the one place where the laws of thermodynamics are seriously challenged, and the whole purpose of life might thus be to make that effect collectively significant enough to overcome the inevitable heat death of the universe.

So back to not letting anyone or anything stop you. Through no fault of their own (it’s just the way existence is rigged), there is more random, scrambling, chaotic energy that decreases order and undermines your will than there is copacetic, order-nurturing energy that decreases the entropy of a system. Things go to hell more naturally than to heaven—and only all the more so if someone views you as their “opponent”.

Therefore, the number one priority is to protect the quality of the information coming into your system. Lies are profoundly damaging, and like Sherlock Holmes in the bloodhound “mode” often described by his chronicler, Dr. Watson, you must always be tuned into the “scent” of truth. Truth has a smell. If something doesn’t smell right, whatever appears to be going on may be a lie no matter what you’re told.

Like Holmes, stay relentlessly on the trail and do the right thing. Document everything and explore many avenues. Don’t close your mind to any possibilities. When anger rears its ugly head against you, you know you’re onto something. Somebody who thinks they’re in a zero-sum game is jealously protecting the truth. The less secure the truth-hider, the more intense the emotional energies you will find rallied against you.

This is when you can expect to be attacked, and beyond channeling just the unstoppable Sherlock Holmes, you must also now channel the unstoppable Juggernaut—Professor X’s more physically endowed brother from the Marvel Universe, who could give the Incredible Hulk a run for his money. The Juggernaut, as his name implies, is physically unstoppable. He’d pass through a planet if one got in his way. The intellectual unstoppability of Holmes must be backed up by the physical unstoppability of the Juggernaut. Eat well. Get enough sleep. Keep yourself hydrated.

Now comes a point about subtlety. It’s a subtle point and is maybe implied by the Holmes paragraph. But more to the point, it’s about application program interfaces, or APIs. In one of the most delightfully ironic and meta lessons of literature, you have to get to the actual 14th and last book of Oz (by Baum) to learn the most important lesson of all about unstoppability. It’s not merely intelligence and strength. There’s a bit of faith.

In the book Glinda of Oz, Dorothy needs to penetrate into the world of the Flatheads and Skeezers in the last unexplored lands of the Quadlings to help stop a war. Dorothy is blocked at the entrance by an invisible wall. There seems to be no way through or around it, but it turns out after a full day’s walk (if I remember correctly) the wall ends and you can walk right around it. Thus Dorothy and her companions gain entrance to the land of the Flatheads and Skeezers.

Finding such a solution to a problem is not easy. You must both guess that the wall ends somewhere that doesn’t seal the land off, and trust that you will eventually reach that end if you just invest the time and energy to try. That’s the faith. You have to have enough faith to try. But you must also have enough wisdom to know when to give up.

You may fail and the whole day’s journey may be for naught. Either the invisible wall ends and you can go around it, or it doesn’t and you can’t. And there’s a practical middle case: if Dorothy had to travel for days to get around the wall, it might as well be sealed, because they would be too late to stop the war. A functionally endless wall means backing off and rethinking. So you must know how much time you’re willing to invest in exploring a solution and eliminating possibilities. But the invisible wall wasn’t endless or sealed off, and Dorothy got around it.

This is where I’m currently at with LXD on WSL2, and in other analogous situations in life. Invisible walls have been constructed. It’s easy to give up. Life and very existence conspires to encourage your giving up. Signals are emitted at you telling you that you have no other choice than to give up. This tells you that you’re onto something and the truth is right around the corner, if you just keep at it like Sherlock Holmes, the Juggernaut, and Dorothy and her companions.

Tell the nice people about the timeline of Unix, which is like its own invisible wall, and how by 2030 they’ll get over it.

Over the next 10 years, Windows is going away. You’ll be on Linux, like it or not and know it or not. “Windows” becomes to Microsoft what “Cocoa” is to Mac.

In other words, it doesn’t matter if it’s Windows, Unix or Linux underneath as far as the graphical desktop is concerned. Mac users hardly felt the shift to Unix in 2001 when OS 9 became OS X. The same will go for Windows.

Backslashes become forward-slashes; i.e., PowerShell and DOS prompts go away. There may still be emulated PowerShell & DOS prompts for compatibility, but the WSL stuff of today “goes native” and the Windows stuff of today becomes the 2nd-class citizen of Windows 2030. Eight years away.

Get your butts on WSL2 today. You need to learn the Linux Terminal. It’s the new just-plain “Terminal”. It already is on Mac (Unix) and will be on Windows by 2030.

  1. Learn Tech. Learning tech is learning Linux Terminal. If you don’t know Linux Terminal, you don’t know tech. Proprietary’s dead, Jim!

Get LXD working. It is working… figure out the API. Go around the invisible wall.

Great Success! I’ve Figured Out Logging Into LXC / LXD from WSL2

Wsl Wsl2 Logging Into Lxd Lxc

Unix/Linux Terminal Is About Literacy

This is about literacy… some might say Technology Literacy, but I say just plain literacy. Reading, writing & aRithmetic, the 3 R’s. Computer, tech, *nix literacy is just literacy now, like a fourth R. That’s because Unix/Linux won. By 2030, you’ll all know and accept it and it won’t even be an issue. It’ll be like it was always that way.

Learn Unix / Linux Now (Instead of 2030)

You get advantages in life by switching now, because even though Windows will be Linux-based by 2030, it’s still all about dependencies on proprietary GUI (graphical user interfaces) even if the underlying stuff is now generic tech.

If you know the generic tech, you’re more powerful than those who rely on the often-disrupted (force upgrades, etc.) GUI tools.

Remaining LXC / LXD under WSL2 challenges:

We talked about the Yin/Yang “energies” or “vibes”

Blue Circle Green Arrow Yin Yang Duality Vibes

We talked about Virtual Local Area Networks & avoiding nested VLANs

Wsl Lxd Vlans Virtual Local Area Network Qemu

We talked about how Unix won & the *nix footprint

Unix Won Compromise 80 20 Rule Footprint

We talked about lesser vim wizards vs. greater emacs wizards

Lesser Vim Wizards Vs Greater Emacs Wizards

We talked about the Idea Capture Funnel and Materializing Into Real World

Idea Capture Funnel


Wed Jun 22, 2022

Fixing Jupyter Desktop Set Python Environment Error Message

category: about

I’m currently on JupyterLab Desktop 3.3.4-2 and got an error message I hadn’t seen since version 3.3.4-1, which I had to skip (because of this error). But it appeared spontaneously on the latest version, which I’ve been using for months, and I can’t let JupyterLab Desktop get taken away from me, so I decided to fix it.

I would have liked to have gotten back to LXD.

In my attempt to fix the Jupyter Desktop Set Python Environment problem, I:

And that “registered” the new Python 3.10.5 kernel with JupyterDesktop and it’s now my default (yay!)

To get JupyterLab working again, I believe the fix was deleting the JupyterLab config data:

jupyterlab-desktop-data

Found in a Windows path like:

C:\Users\[username]\AppData\Roaming\jupyterlab-desktop

I am sorry I had to break this into two videos and that I went the long way in fixing it. However, I’m very happy I’m on Python 3.10.5 in JupyterLab now.

Perhaps it was providence that it spontaneously stopped working (giving that Jupyter Desktop Set Python Environment error).

Part 1: The Failure

Part 2: The Success


Wed Jun 22, 2022

Livestreaming Question & Answers With YouTube Audience

category: about

Hello everyone! This video started out as an attempt to edit a YouTube livestream before it was ready and turned into Q&A with the audience who joins in as the workday begins. I guess that’s only logical and I let it continue because it really is so work-related. It covered a lot of topics of obsolescence resistance, vendor traps and LXD (Linux containers) under WSL2 (Windows Subsystem for Linux).

I had questions about how I navigate around my notes and read them later. I showed my tag system and talked about how it’s not about re-reading; it’s really about learning better in the first place, like taking notes in class. The best way to learn is to teach, and this helps me teach myself. Over time, good ideas will emerge, get polished and be moved to other sites. But I don’t worry about that so much.

As far as a little bit of organizing as I go, I have my own CMS called Skite which lets me use one text file and lots of blog-like entries (in combination with vim macros) to blog (for life or per site or per topic).

All based on taking notes as an excellent way of learning.

Knowledge workers of the information age bring their minds and capabilities as their part of the deal. They have value because they bring already-developed skills and abilities that the organization needs. We don’t work for ourselves because bringing our value to an organization releases it more efficiently than all the overhead of working for ourselves (employees, paperwork, marketing, taxes, etc.).

Peter Drucker… Managing In Times of Great Change… Knowledge workers.

Python Pandas is the money maker… when coupled with OAuth2 savvy.

The cloud MUST BE OPTIONAL.

The new vendor dependency-trap battle (i.e. slapping a leash on you) is taking place through proprietary application programming interfaces… Cloud APIs… but also VSCode, no doubt. Who will learn vim or git when you have VSCode?

The answer is that no one can escape VSCode. You just have to manage the brainwashing.

And thus Microsoft dependency is assured.

Where is safety?

Sqlite3… learn to use it for EVERYTHING locally. You at least have a cross-platform contingency plan for all things high performance-ish data. All Sqlite3 work can be adapted to the cloud for scaling, etc.

The Unix way… The Linux way. Google “The Unix Way”… and toolbox concepts.

Django and Flask are rare safe harbors. Their APIs change what? Every 10 years… maybe.

When Flask embraced concurrency (the async/await keywords), its API didn’t break. Compare that to FastAPI, where pydantic is a dependency, as is web-documentation publishing… very Microsoft.

Look for non-changing APIs in long-term tools. The Unix Way. Also appears in a lot of timeless-esque Python tools.

Obsolescence resistance in Python?

Technical liability & overhead… It’s what vendors hold over your head. It’s liable to be chopped off, leaving you with obsolete skills.

Use the Python dict API… the standard builtin data model. Like int and float, Python also has list, tuple & dict. Of these, the dict is a universal key/value pair API applicable everywhere and in everything… highly optimized, and it can be “back-ended” by SQLite3, Redis, Amazon S3 or what-have-you. By using the standard API, your code will never break, and you can change what back-ends your key/value pairs for persistence.

This is a million-dollar trick.
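A concrete sketch of the trick (class and method names here are my own, not from any particular library): implement `collections.abc.MutableMapping` over SQLite3, and any code written against plain dicts keeps working unchanged:

```python
import sqlite3
from collections.abc import MutableMapping

class SqliteDict(MutableMapping):
    """A minimal dict-shaped key/value store backed by SQLite3.

    Illustrative only: code written against the standard mapping API
    never knows (or cares) what persists its key/value pairs.
    """

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute("CREATE TABLE IF NOT EXISTS kv (k TEXT PRIMARY KEY, v TEXT)")

    def __setitem__(self, key, value):
        self.db.execute("REPLACE INTO kv (k, v) VALUES (?, ?)", (key, value))
        self.db.commit()

    def __getitem__(self, key):
        row = self.db.execute("SELECT v FROM kv WHERE k = ?", (key,)).fetchone()
        if row is None:
            raise KeyError(key)
        return row[0]

    def __delitem__(self, key):
        if key not in self:
            raise KeyError(key)
        self.db.execute("DELETE FROM kv WHERE k = ?", (key,))
        self.db.commit()

    def __iter__(self):
        for (k,) in self.db.execute("SELECT k FROM kv"):
            yield k

    def __len__(self):
        return self.db.execute("SELECT COUNT(*) FROM kv").fetchone()[0]

d = SqliteDict()
d["greeting"] = "hello"
print(d["greeting"], len(d))  # → hello 1
```

Swap the SQLite back end for Redis or S3 by reimplementing the same five methods; the calling code never changes.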

Obsolescence resistance is key to me… but entirely at odds with the:

This is the world everyone will compel you into today:

I’m 51 years old and have been made obsolete by so many things that looked like sure bets. But there is no such thing as a sure bet, except Unix.

We’re all temporarily screwed by JavaScript’s dominance. But it will become much like SQL or RegEx. Who really loves SQL or RegEx, but they’re always present subsystems and resume keywords. So be it. JavaScript? Same thing. User Interface APIs. Concurrent (non-blocking) and always there. Thank goodness it wasn’t display PostScript (it could be worse).

Loosely couple a JavaScript API-layer with a Python back-end server to deal with all of today’s WebDev “gotchas” (a new framework every year) in a Pythonic way.
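A minimal sketch of that loose coupling, using only the Python standard library (Flask would be the more comfortable choice in practice; the endpoint name below is made up): the back end speaks plain JSON over HTTP, so any JavaScript front end can `fetch()` it, and the front end can be rewritten yearly without touching the Python.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    """Tiny JSON API back end; a JS front end would fetch() these URLs."""

    def do_GET(self):
        body = json.dumps({"status": "ok", "path": self.path}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging for the demo
        pass

server = HTTPServer(("127.0.0.1", 0), ApiHandler)  # port 0: OS picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Stand-in for the JavaScript layer: any HTTP client sees the same contract.
url = f"http://127.0.0.1:{server.server_port}/api/hello"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
server.shutdown()
print(data)  # → {'status': 'ok', 'path': '/api/hello'}
```

The only contract between the two layers is the JSON shape, which is exactly what keeps the Python side out of the yearly framework churn.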

The work I’m putting off right now is the LXD work. It’s there. It’s working. It’s ready to go. It’s my new Linux playground under Windows.

I NEED LXD under WSL2… not having it is unthinkable. I think I have it working. I need to do Python Data Pipelining (like Luigi, but Huey).

Not Having Lxd Under Wsl2 Is Unthinkable

I talk about data getting, shape-changing & putting.

Key to that is the Linux tools.

Key to that is playing around with various tools under Linux without FUBAR’ing my main Linux “host”, and keeping it on my laptop (not cloud).

All my pain is because Windows laptops are cheap, cool & very well supported, from drivers to repairs… and Windows offers WSL, which appears good enough… especially if it supports LXD… which is the REAL Linux playground…

There are nested Linux playgrounds. WSL Linuxes can be swapped out easily enough… but not easily enough for rapid daily new Linux playgrounds, which I need for my next phase of work.

I need virtualenv (aka venv) as in Python, but for Linux.

Guile is like a new BASH Script.

Once upon a time there was Unix shell script. Not good enough. So each “shell” had its own: Bash script & Kornshell script… problematic. No one liked them, so then there was Perl. All Unixes and Linuxes started shipping with Perl, thus the P of the original LAMP platform: Linux, Apache, MySQL & Perl (later PHP & Python)… Plone/Zope.

But Perl is overkill and underkill at the same time, so many flocked to Python, and Python became the new Perl. The GNU organization can’t have such dissonance in the GNU tool-set, though, so they replaced shell script with Guile. Guile could let you do away with Python if you just wanted obsolescence resistance in the basics.

SQL-like data manipulation with Python/Pandas is not the basics, though much could be accomplished with:

It is sooooo the Unix-way… however, Python (pip install, etc.) is too good to walk away from.

Please comment on the Video.

Find it on the latest MikeLev.in blog post.

I am organizing them for you.

The greatest thanks will be to help me blow up on YouTube.

I am an SEO after all, and I’m experimenting with different factors, and comments on videos seem to me like one of the most important.

Interaction is invaluable.

A sure sign of quality.

So if you think you’re onto something here, at least give me a hello on comments.

I really appreciate it.

And now back to our regularly scheduled workday.

Peace out.

I know I’m different, but I’m just using the tools Microsoft is including for themselves to not be rendered obsolete. Hilarious turnaround, is it not?

But by opening the doorway to the world of Linux, they are also installing vendor-dependency traps, like the Azure dropdown in Terminal and VSCode, to keep you from working in LXD under WSL2 and from vim.

But each has an ~8GB footprint. LXD footprints are much tinier. So it’s about hard drive space on a laptop (anti-cloud methodology means keeping your “virtual Linuxes” light and small).

There are always alternatives… VirtualBox was once one; then it got bought by Oracle… just like MySQL. So paths to lightweight virtualization that don’t get gobbled up or otherwise ruined by vendors are rare gifts that vendors don’t want you forming habits around.

LXD is one. vim is another. vim plus LXD is Microsoft’s nightmare.

It’s not a conspiracy that systemd isn’t active on WSL by default, but it certainly doesn’t hurt Microsoft’s cause.

systemd is just general Linux system automation that serves as an alternative to many other vendor technologies.

Think about why they named it WSL instead of LSW.

Linux must always be subordinate. Anything showing you Linux can run the show is bad.

LXD is on its way to being like the kernel virtual machine (KVM), the hypervisor for Linux that’s built into the kernel and part of official Linux. It won’t be Docker. Docker’s too strange. LXD works just like KVM, but with containers.

Question from Ben from the Philippines, who is a Scheme/LISP/emacs guy (these are the things I admire most / musical).


Wed Jun 22, 2022

Storytime with Mic Lovin’ - The Wizard of Oz’s Redemption Story Arc

category: about

Welcome to story time with 🎤 💙in’

This morning, I’m going to spoil fourteen books for you, but that’s okay. Consider it like unboxing presents on Xmas morning. Regardless of your religious affiliation this symbolism should resonate. Ever watch The Grinch? It’s an American thing. So let’s get right to the good stuff, Cindy Lou Who…

On Xmas morning kids wake up to a day of surprise, box-opening and playing. The natural release of dopamine and serotonin into the system is quite… uhhh, large. That Xmas-morning feeling is one of the great gifts religion has given American culture. Or was that Coca Cola? Either way, Red & Green means passion and sheen! Serotonin and dopamine… generally for the good.

Red Ironman emotional armor-stockings are hung with care next to a tree that’s quite literally Green Arrow spearhead shaped. Two great emotional story-telling devices that vibe great together! Whether all this evolutionary-triggering imagery was planned or just itself evolved is unclear, but it’s funny looking at it under a microscope. This is not to poke fun at the rampant materialism it institutionalizes once a year, but rather to pick out the bits that can help us improve our lives today… daily!

So what about this morning? There’s always some sort of backdrop of crap in life. I’m in the middle of a move back to an extraordinarily expensive place out of love for someone who is rejecting me but in dire need—and won’t understand my actions for years to come. It’s putting pressure on my new marriage. But everything is fine. I’ll pull through and it will all be for the better for both me and them. This is just life. And it’s not about me. It’s about The Wizard of Oz.

This is their Dorothy adventure, and the Wizard of Oz’s little-known crime and redemption arc that’s as big as Darth Vader’s, which few will know about, not having read the books.

The Wizard started out as a talented ventriloquist but a humbug wizard who was all smoke and mirrors, and found himself, like Dorothy, in Oz as a result of unexpected bad weather.

The Wizard was a circus ventriloquist floating over the Omaha State Fair in his tethered hot air balloon, which got swept into a mysterious land full of witches and munchkins who thought him a powerful Wizard and embraced him as their leader.

Not one to turn down an opportunity, and seeing no better prospects, the Wizard accepted their offer and moved to Oz. Unfortunately, this amounted to a coup: he cast out the ruling family and gave their daughter, the rightful heir to Oz, to the old witch Mombi, who turned her into a him and later tried turning him (now Tip) into a statue in old Mombi’s garden.

This is where the unforgivable act of the Wizard of Oz begins.

To be perfectly clear: the Wizard of Oz, still a humbug wizard accepting his new title and not fully knowing what he was doing, handed over Ozma, the daughter of the overthrown ruling family of Oz and the sweetest, wisest and most powerful person in all of Oz (or at least destined to become so), to an old witch who made a slave of her and turned her into a him to throw off suspicion and keep Ozma hidden. Mombi the old witch further intended to turn Tip into a statue in her garden.

Had Tip (actually Ozma) not escaped with Jack Pumpkinhead, an animated magical friend made by Tip to scare Mombi (but brought to life by Mombi herself, testing a powder of life), Ozma would still be Tip: a stone statue holding a water basin for squirrels to bathe in, in Mombi’s garden.

The Wizard of Oz at this point in the story has been revealed as a humbug and returned to non-magical Omaha in the United States. “Child, you cut me to the quick. I’m an old Kansas man myself,” says the exposed Wizard in the movie when schooled by Dorothy. The book says: “I was born in Omaha—” “Why, that isn’t very far from Kansas!” cried Dorothy.

Later, the Wizard flies off in his restored hot air balloon back to Kansas, Omaha or wherever, but without Dorothy, because Toto ran after a cat and Dorothy missed the trip. That’s where most of the movie-educated public’s knowledge ends: partial, and with a most incomplete story of the Wizard himself.

Well, through extraordinary means involving subterranean cannibalistic vegetable people (you’ll never know if you’re not a reader), the Wizard connects again with Dorothy, becoming part of her regular crew of peeps, thus beginning his Darth Vader redemption story arc, which this post is about. For you see, the Wizard of Oz is actually Dorothy’s father. Remember, he’s a Kansas man himself, and she lives with her Aunt Em and Uncle Henry.

Do you think L. Frank Baum did that by accident? No, of course not. It was completely intentional, and he took that secret to his grave. Nobody knows that when she ran away she stumbled upon none other than her own real father. Take that, MatPat!

So, this is your Xmas-morning gift surprise. Surprising information is better than any material gift. This knowledge and new insight into The Wizard of Oz will be with you for the rest of your life. Something you thought you already knew has new, deeper meaning, and in fact a whole series of fourteen excellent books lies ahead of you to enjoy.

There are fourteen books in the Oz series written by L. Frank Baum himself. The rest, such as The Shaggy Man of Oz, were written by Baum’s successors. That’s okay; start with just the first fourteen (they’re small) and decide whether to continue. You’ll meet The Shaggy Man himself in The Road to Oz, which covers one of Dorothy’s many return visits. But be warned: they’re full of violence (the Tin Woodman can break a mean neck).

Would Ozma have been better off as the daughter of a privileged royal family? Well, she does seem naturally predisposed to compassion, wisdom and intelligence. But who knows? Perhaps her experience as the boy Tip, with his very sheltered life of servitude and cruelty, was necessary so that he (still Tip at this time) could orchestrate his own escape. Would Dorothy have been better off if she had gotten into that storm cellar and was never swept away to meet such challenges and all of her friends?

Both Ozma’s and Dorothy’s experiences were certainly the kind of coming-of-age adventures that classically shape you and help turn static personalities into dynamic ones. Tip into Ozma. Dorothy the evader into Dorothy the embracer. Alice the… no, Alice stayed static. Brits.

Should the Wizard, upon his return to Oz, have remembered his criminal mistreatment of Ozma, his entrusting of her to old Mombi, and immediately done everything in his power to make good? Yes! But the Wizard was still not very powerful and was wrapped up in his own redemption story-arc. He still had some growing to do before he was powerful enough to properly help Ozma. So he helped Dorothy instead and let the two roads gradually converge.

Was the Wizard of Oz a bad guy? Was Dorothy wise to re-embrace him into her life on their joint return to Oz? Was the Wizard hypocritical in his desire to unhumbugify himself and learn real magic? Is the Wizard wrong to actively learn real magic now as the apprentice of Glinda, the Good Witch of the South (the movie moved her north)? I think not.

The Wizard is now filling his bag of tricks with real magic. Good people recognize his redemption arc, and they love Ozma for who she is, including everything that brought her to be the wonderful young person she is today. Even her time with old Mombi is now understood to be a profoundly valuable educational opportunity she would never have had in her previously privileged royal life.

And so that takes us back to the now-moment. I will switch from talking about the Wizard of Oz to my own life and my return to New York City. My child is still rejecting me, and I can understand why. All I can do is lead by example: developing my own story arc and sort of magic, dehumbugifying myself, learning from the Good Linux of the GNUrth, and sharing everything I learn with you, my good friends of the YouTubes.

This has been Storytime with Mike Levin. [insert mic poem]

🎤 💙in’


Tue Jun 21, 2022

Making Every Morning Like Xmas Morning / Non-stop Surprises & Learning

category: about

Every morning should feel like Xmas morning (or your equivalent). We need common language; symbols we can all use to convey meaning and feelings. The feeling I’m trying to convey is that every morning should have surprising gifts awaiting you that make you want to wake up, jump out of bed and rush to engage the world. Be surprised. Be disappointed. Be alternately surprised and engaged, then bored and moving on. Have favorites you keep out while the sweaters and such are sorted and stored.

By the end, exhaustion overtakes you for the day, and the whole thing can repeat the next day. Wouldn’t it be nice if every morning were like that? Yeah, really, it can be. Every single morning can be like Xmas morning. That’s what the age of the Internet and unlimited incoming information means, if only we know what pushes our buttons and have the ability to convincingly package it all up as gifts for ourselves: gifts of information. Gifts of surprising learnings and growth. It is the fountain of youth.

Most matter shouldn’t matter, so these gifts are “gifts of information” of various sorts. The details will vary based on you: who you are and what makes you tick. So there is a certain element of self-discovery in this, if you do not yet know yourself, that we’ll get to. What types of learnings and growth in life are you still hungering for? Does learning and improving and changing even drive you? Or are you just happier static and in your comfort zone? Is material comfort all you really live for?

I know a lot of people view it as a sort of material-things contest. Or a battle over what rung you’re on in the hierarchy of primate tribalism. As such, there’s a natural order, with the luckiest, fiercest and best rising to super-wealth and power, like Putin and Musk. Musk may be the richest person in the world, but Putin is more powerful. And while wild world population growth has slowed down in recent years, the ratio of haves to have-nots continued creeping toward the 1/X power-law Pareto wealth-distribution curve, where the haves are infinitely rich and the have-nots are infinite in number.

If this is our new global primate hierarchy, then research shows there are two places you don’t want to be, due to how stressful they are: the very top and the very bottom. Big male silverback gorillas always have to be on guard against the competition, while the ones at the bottom can be freely picked on by everyone else (cruelty is a fact of life) without recourse or hope. So the very top and bottom mean a life of always watching your back, or of no one ever having your back. Neither is very appealing, and so the wise monkey spends his morning time figuring out how to be, and stay, in the middle. Interestingly, 1/X has no inflection point, and zooming in on any segment looks identical.
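That scale-free property is easy to verify: for f(x) = 1/x, stretching the axis by any factor k satisfies f(kx) = f(x)/k, so every zoom level is just a rescaled copy of the whole curve.

```python
# The 1/x power-law curve is scale-free: stretching the x-axis by any
# factor k merely rescales the curve, f(k*x) = f(x)/k, so there is no
# special inflection point and every zoom level looks identical.
def f(x):
    return 1 / x

for k in (2, 10, 100):
    for x in (0.5, 1.0, 7.0):
        assert abs(f(k * x) - f(x) / k) < 1e-12
print("1/x looks the same at every scale")
```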


Mon Jun 20, 2022

Every Little Thing gets done (ELTgd) Leads to More Epic Days

category: My Static Site Generator (Skite)

My precious time off from work on this Monday, the Juneteenth of 2022, is rapidly winding down. I pulled off some things I previously thought impossible in my mind.

I shit you not, this was a most excellent day. I’ve interacted with my YouTube audience even though I have not livecast in a number of days. I pushed out one of my best videos ever: This is the way the world is saved, not with a bang but a whimper that talks about Alan Turing’s not-told-enough story.

By external measures, things should probably be getting me down. But I know how to exercise my uniquely human faculties and marshal my endless internal energies and skills. Writings like this are a good example. Lead by example.

Don’t be a sourpuss and don’t assign blame. Don’t rain down pessimism on everyone around you looking for the next reason it’s not your fault. Everything in your life past a certain point is of your own choosing. The line demarcating that point is blurred and dim, but that’s no excuse either.

So what now? I’m waiting for some reports to generate for work to double-check some numbers. When it’s done running, I’m going to pull up the Web user interface for the same report and create a screenshot to go side-by-side with the Google Sheets-based dashboard. Okay, that’s done.

Back to what you want and need to think about. Hmmm. Almost everything is done, but not quite. I need to add support for categories to skite. If ELTgd (Every Little Thing gets done) then I need to finish this out, or this little thing isn’t really done. Dependency-blocking… yeah, one thing is almost done but it depends on another thing and another thing and so on down the chain so that not even one little thing can get done. Don’t let that happen. 1, 2, 3… 1?

Look at the 10_slice.ipynb code. Look for the most convenient place to inject the support and HOW you’re going to test it. Think! Okay, put in some category data on MikeLev.in (this file).


Mon Jun 20, 2022

Github Pages Jekyll Previous/Next Arrows For Categories

category: My Static Site Generator (Skite)

One of the most profoundly differentiating and SEO-lovely things I could do on my site is to give the previous/next arrows 2 different meanings. One set, at the top of each post, cuts across the entire blog in the order the posts were published. Another set, at the bottom of each post, holds previous/next arrows just within the category in which the post was published.

There are a couple of issues here. First, this isn’t as easy as regular previous/next arrows within the Jekyll and Github Pages publishing system. I am hopeful I will be able to get it to work, but there are issues that must be worked through. First and foremost is the multiple-belonging issue. If I’m putting alternative prev/next arrows on the one master copy of the published content (no duplicate content), then there can only be one alternative set of previous/next arrows, so each blog post must belong to only a single category; otherwise I would need multiple alternative prev/next arrow sets, or a win/loss scenario.

We start with the official Jekyll documentation on posts.

That page has this line:

Unlike tags, categories for posts can also be defined by a post’s file path. Any directory above _posts will be read-in as a category.

Now that’s interesting. If I were to allow duplicate content, that’s how I would do it. I could also easily support the multiple-belonging data structures made possible by the many-to-many relationships of both tags and categories. But it’s a bit troublesome. I don’t want duplicate content, and I don’t want the many-to-many relationship in categories. Tags I can understand, but there should be more differentiation between categories and tags. They’re giving flexibility, I understand, but my approach must not allow this.

Just do some general research into arrow solutions. I’m not the first person to scratch this itch. See what others have done. There are a number of solutions out there, but this is my favorite:

https://stackoverflow.com/questions/61287058/jekyll-link-next-previous-sorted-posts-within-a-category

I am afraid it may require a plug-in:

https://rubygems.org/gems/jekyll-category-aware-prev-next

Okay, think! What kind of experiments can you do? Well, none of the blog posts even have a category yet. There are 2 ways to create categories: the category can be written into Jekyll frontmatter, or the _posts folders can be placed under category folders.
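For the record (file and category names below are invented for illustration), the two approaches look like this:

```yaml
# Way 1: category declared in the post's front matter
# _posts/2022-06-20-example-post.md
---
title: "Example Post"
category: skite
---

# Way 2: category inferred from the file path; the directory
# above _posts becomes the category:
# skite/_posts/2022-06-20-example-post.md   ->  category "skite"
```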

It has to be a plugin-free solution in order to avoid rabbit hole complications. Here’s one: Tag-aware Previous/Next links for Jekyll

Try showing their tag solution here:

{% for tag in page.tags %}
  {% if tag == "pyesl"%}
    <ul>
      {% assign posts = site.tags['pyesl'] | sort:"weight" %}
      {% for post in posts %}
        {% if post.url == page.url %}
          {% assign last_inx = forloop.index0 | minus:1 %}
          {% if forloop.first == false %}
            <li class="prev"><a href="{{ BASE_PATH }}{{ posts[last_inx].url }}" title="{{ posts[last_inx].title }}">&larr; Previous</a></li>
          {% else %}
            <li class="prev disabled"><a>&larr; Previous</a></li>
          {% endif %}
          <li><a href="{{ BASE_PATH }}{{ site.JB.pyesl_path }}">PyESL</a></li>
          {% if forloop.last == false %}
            {% assign next_inx = forloop.index0 | plus:1 %}
            <li class="next"><a href="{{ BASE_PATH }}{{ posts[next_inx].url }}" title="{{ posts[next_inx].title }}">Next &rarr;</a></li>
          {% else %}
            <li class="next disabled"><a>Next &rarr;</a></li>
          {% endif %}
        {% endif %}
      {% endfor %}
    </ul>
  {% endif %}
{% endfor %}

Wow, that rendered out to my site just fine. Interesting! Okay, that’s a first tiny success. Get a second tiny success.

There are issues. My skite & slice system will overwrite all the files in _posts, but that’s where I need to put the category in as front-matter. So I’m going to do a few git commits and pushes without slicing & dicing. Think! Go the frontmatter route for a test. Do it on a particular page that will be easy to keep showing… like this one! And so make the plan here: do one skite-gen, then put the category arrow code in… in where? Okay, the plan:

Commit and push without skite.

On a related note, I will also be interested in previous/next arrows in normal Jekyll pages, or maybe collections whatever those are. Here’s some folks talking about that: https://talk.jekyllrb.com/t/previous-and-next-links-with-a-collections/3171/6

{% for category in page.categories %}
<h4>Category: {{ category }}</h4>
<div class="spacer">
  <div class="post-nav">
    <div class="post-nav-prev">
    {% assign posts = site.categories[category] | sort:"weight" %}
    {% for post in posts %}
      {% if post.url == page.url %}
        {% assign last_inx = forloop.index0 | minus:1 %}
        {% if forloop.last == false %}
          {% assign next_inx = forloop.index0 | plus:1 %}
          &larr;&nbsp;<a href="{{ BASE_PATH }}{{ posts[next_inx].url }}">{{ posts[next_inx].title }}</a>
        {% endif %}
        </div>
        <div class="post-nav-next">
        {% if forloop.first == false %}
          <a href="{{ BASE_PATH }}{{ posts[last_inx].url }}">{{ posts[last_inx].title }}</a>&nbsp;&rarr;
        {% endif %}
      {% endif %}
    {% endfor %}
    </div>
  </div>
</div>
{% endfor %}

I will have to manually maintain a one-to-many relationship between categories and posts (versus the many-categories-to-many-posts that Jekyll supports) in order for this solution not to have funky side-effects. I’m iterating through a loop, but expecting one and only one category (if any) to come back for any given post.
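Since Jekyll itself won’t enforce that one-category-per-post rule, a small guard (entirely hypothetical, function name my own) can parse each post’s raw front matter and fail loudly when a post claims more than one category:

```python
import re

def post_categories(front_matter: str) -> list:
    """Return the categories declared in a post's raw front matter,
    raising if the one-category-per-post rule is violated."""
    # `category:` (singular) is always a single value in Jekyll
    m = re.search(r"^category:\s*(.+)$", front_matter, re.MULTILINE)
    if m:
        return [m.group(1).strip()]
    # `categories:` may be space-separated or an inline [a, b] list
    m = re.search(r"^categories:\s*(.+)$", front_matter, re.MULTILINE)
    if not m:
        return []
    raw = m.group(1).strip()
    if raw.startswith("["):
        cats = [c.strip() for c in raw.strip("[]").split(",") if c.strip()]
    else:
        cats = raw.split()
    if len(cats) > 1:
        raise ValueError(f"post belongs to {len(cats)} categories: {cats}")
    return cats

print(post_categories("title: Hello\ncategory: linux"))  # → ['linux']
```

Run over every file in _posts before a build, this turns the “funky side-effects” case into an immediate error instead of silently wrong arrows.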


Mon Jun 20, 2022

Qnap Container Station Lxc End Of Support Notification, Move To Lxd

category: linux

Going into QNAP Container Station, a sort of Kubernetes Jr. built into the QNAP NAS (Network Attached Storage) suite, I got this message:

Qnap Container Station Lxc End Of Support Notification Lxd

Hmm, I thought lxc files were the containers for the LXD service. I got some learnings to do.


Mon Jun 20, 2022

Gutting Dependencies Out Of Your Projects & Life

category: My Static Site Generator (Skite)

Gradually get better at everything. Every little thing gets done. ELTgd. And it felt good. Good God, it feels good to get every little thing done. This is my poetry. The poetry of life in this awkward material world where everything is difficult and the likelihood of your existence is mind-bogglingly improbable, and that’s just your starting point. Let’s see… every little thing.

Today’s Juneteenth and that’s a day off from work. Nice! Get some stuff done that you normally wouldn’t. Of course there’s packing for the move, but that will happen sooner or later per desperation. Get stuff done that takes that special inspired energy that only mornings like this can provide.

Jekyll static site generator stuff. What’s most broken?

Dependency on Github Pages templates. Sever that connection. On which site? This one, of course! Sever the connection. Generate the site and be ready to go back in a pinch. What creates the connection? It’s a selection in Github Pages settings that writes into your _config.yml file. There are also sometimes plugins connected to that theme which would also need to be removed.

Okay, I just removed the 2 lines in the config file that set the theme and plugin. I committed and pushed the change. Now be ready to go look at the site. This is going to be an interesting process, and very symbolic: cutting out dependencies.
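For reference, the severed connection is just a couple of lines in `_config.yml`. The theme name comes from the SCSS import quoted later in this post; the plugin name below is a stand-in, since the post doesn’t name it:

```yaml
# Lines deleted from _config.yml to cut the Github Pages theme dependency
theme: jekyll-theme-hacker
plugins:
  - some-theme-plugin   # stand-in; the actual plugin name isn't in the post
```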

Okay, watch for errors in your email. Yep. They’re much more informative than they used to be. It reads:

---------- Forwarded message ---------
From: GitHub <support@github.com>
Date: Mon, Jun 20, 2022 at 8:14 AM
Subject: [miklevin/MikeLev.in] Page build failure
To: Mike Levin gets you onto Linux, Python, vim & git!


The page build failed for the `main` branch with the following error:

Your SCSS file `assets/css/style.scss` has an error on line 1: File to import
not found or unreadable: jekyll-theme-hacker. Load path:
/hoosegow/.bundle/ruby/2.7.0/gems/jekyll-theme-primer-0.6.0/_sass. For more
information, see
https://docs.github.com/github/working-with-github-pages/troubleshooting-jekyll-build-errors-for-github-pages-sites#invalid-sass-or-scss.

For information on troubleshooting Jekyll see:

  https://docs.github.com/articles/troubleshooting-jekyll-builds

If you have any questions you can submit a request at
https://support.github.com/contact?tags=dotcom-pages&repo_id=395107155&page_build_id=351029098

That css file it’s referring to currently reads like this:

@import 'jekyll-theme-hacker';

.container {
    max-width: initial;
}
body {
    font-family: medium-content-sans-serif-font, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen, Ubuntu, Cantarell, "Open Sans", "Helvetica Neue", sans-serif;
    line-height: initial;
    font-size: 125%;
    text-rendering: optimizeLegibility;
    -webkit-font-smoothing: antialiased;
    margin: 1%;
}
#main_content h1 {
    font-size: 1.6em;
}
#main_content h2 {
    font-size: 1.4em;
    margin-top: .5em;
}
#main_content h3 {
    font-size: 1.2em;
}
#main_content h4 {
    font-size: 1.4em;
}
ul li {
    list-style-image: url(/assets/images/green-arrow.png);
}
img {
    vertical-align: middle;
    display: block;
    margin-left: auto;
    margin-right: auto;
}
header {
    width: 100%;
    font-family: monospace;
    white-space: pre;
    font-size: 1.8vw;
    background: black;
    border-bottom: initial;
    padding: initial;
}
nav {
    text-align: center;
    font-size: 1.3em;
    margin-bottom: .5em;
}
footer {
    text-align: center;
    font-size: .75em;
}
.spacer {
    padding-top: .5em;
    padding-bottom: .5em;
}
.post-nav {
    display: flex;
    justify-content: flex-end;
    // justify-content: space-between;
    flex-wrap: wrap;
}
.post-nav-prev {
    margin-right: auto;
    // text-align: left;
    padding-top: .5em;
    padding-bottom: .5em;
}
.post-nav-next {
    text-align: right;
    padding-top: .5em;
    padding-bottom: .5em;
}
.rotate {
    top: 50%;
    left: 50%;
    transform: translate(-50%, -50%);
    width: 65vh;
    animation: rotation 8s infinite linear;
    object-fit: cover;
}
@keyframes rotation {
    from {
        transform: rotate(0deg);
    }
    to {
        transform: rotate(359deg);
    }
}

I also deleted some empty frontmatter double-lines so as not to mess up rendering here. I don’t know if the indent would have neutralized it. Clearly I need to delete the line:

@import 'jekyll-theme-hacker';

Commit and push… wait for the email, refresh the site. One of the precautions I took was to bring up the site in a second web browser with the developer tools open, so I can inspect the cached CSS even after it’s gone. I can then transpose the bits I want to keep from the cached version back into the css file. I may change it from scss to just plain css.

Bingo! The background turned white.

The text and links, while still in the fonts that I like using, are all a much more traditional color with unvisited links in blue and visited links in purple. Oh, a beautiful connection to the past. Keep it this way for a little bit. No more Github Pages templates, at least on this site.

And without my old content management system pushing stuff from ~/github/helpers/templates out anymore, it won’t revert. Woot! Okay, now we’re gutting. Let’s keep gutting.

Hit @p to publish THIS page and see if it looks correct.


Sun Jun 19, 2022

Removing a Package From PyPI.org (a pip installable app I created)

category: My Static Site Generator (Skite)

It’s not often I break the flow of a single-session getting-something-done into two articles here. Just a few days ago I did that epic post about getting my personal CMS-system rebooted and mixed about a zillion topics. But in this case I need to merge the function of two repos and remove one of them from PyPI! And the removal of a previously published and shared bit of functionality from PyPI is a topic worth focusing on in and of itself.

Similar to the previous post, I’m renaming a notebook from 00_core.ipynb, this time to 10_slice.ipynb. I’m also moving it from ~/github/blogslicer/ to ~/github/skite/ and my world is improved. I test-render it with nbdev_build_lib and it checks out. Finally, I fix the path and filename reference to it in the recently renamed (also from 00_core.ipynb) 00_skite.ipynb. Now test-generate the site… nice. Works like a charm.

Next, take a nice long hot morning Father’s Day bath and research on your phone removing packages from PyPI…

https://blog.ovalerio.net/archives/1971

I wish to completely eradicate the blogslicer repo. It hasn’t been out long and is a very special use case which I believe nobody else has, or scratches an itch nobody else feels or knows that they do. I’ll fix that later, but probably with skite. pip install skite is probably in the future, but now that I know how to pip install -e (--editable), I don’t need to put it in PyPI for development purposes. Game changer, for sure!

Okay, so think! Log into PyPI… 2FA! I don’t have 2FA set up for PyPI… or at least I never remember setting it up. Ugh! Okay, put in an account recovery request. Releasing into existing packages still works. I’ll probably want to put some sort of warning in the blogslicer repo, but it’s become a very low priority now.

I’m going to switch to making “sub-groups” of blog posts. Probably the prev/next arrows at the top of a blog post page will be different from the ones at the bottom of a blog post page. I found this:

https://stackoverflow.com/questions/28331879/jekyll-next-and-previous-links-with-different-categories

and this

https://github.com/jekyll/jekyll/issues/260

It’s not going to be a clear win, but there is some hope.


Sun Jun 19, 2022

Having The Confidence To Diverge From nbdev File Naming Convention

category: My Static Site Generator (Skite)

I could not be more pleased with how my content release system has developed over the past few days. The reason I have the time to work on it is unfortunate, but it is nonetheless useful and will help me remedy the cause by virtue of getting more organized. I am embarking on a new chapter. I am seeking to understand things more deeply, to document my process of understanding, and to make more immediate use of that deep understanding and documentation. You can see this start to take place in how I’m managing and updating my .vimrc file.

I no longer use the blast.sh bash file. In fact, it’s time to go delete it. Done. There’s quite a bit of revision that could be done in the helpers repo. I’ve adopted the skite repo now for the CMS, and there should be aspects of it that closely reflect the Jekyll static site generator system, such as the _layouts, _assets and _includes folders. Another interesting thing I want to do is get rid of the concept of “core.py” for the skite system. I’m not really following that object-oriented package model that nbdev promotes. So think through good nicknames for functionality. There should be:

Okay, go ahead and make these changes… 1, 2, 3… 1?

Okay, that part of it went very well. But now I want to eliminate the blogslicer repo, and even remove it from PyPI! Wow, I should cut this journal entry and make that the focus of a wholly new article.


Sat Jun 18, 2022

I’ll Just Put This In My /usr/local/sbin/

category: My Static Site Generator (Skite)

And here we go! I just renamed journal.txt to journal.md to keep it consistent and compatible with my new content management system. This is a huge move for me. Previously, I couldn’t have cared less about how markdown rendered in my personal private journal. It was not for sharing. Events of recent times have made it painfully obvious that almost the entire purpose of this location is to “seed” content for other locations. You need somewhere to just begin writing without inhibition or particularly committed purpose, but still with the open-endedness to just copy/paste it over to where it belongs… and just by virtue of doing that, POOF! It’s published. Or at the very least, it’s organized into another private Github repo. Not everything maintained with my content management system need necessarily be published via the Github Pages system. But the system that publishes is identical to the one that does not (and keeps it private). The only difference is the public/private setting in Github and the configuration of Github Pages… brilliant!

Okay, so we pay out annual dues to Microsoft for unlimited private repos. So what? It’s nothing compared to the boon of abilities this yields in one’s life. Plus it keeps all your personal information management skills compatible with what’s necessary for a professional developer. Double-win! And now that I’ve adjusted my Linux Terminal prompt to show which git repo branch is active (although I’m almost always in main), I can start to take more advantage of git branches by virtue of their being kept in the front of my mind. Previously (currently, really), I only use git as a sort of personal unlimited undo feature. I’m not a professional developer, so I don’t really collaborate on code. All my “code” is personal, or really just journal-writing like this. That git and vim are thought of only as developer tools does a disservice to humanity. They should be incorporated into mainstream education so everyone has these abilities.

Okay, so where am I going with this? Getting organized, of course. For months I’ve had a list of Github Pages sites to assure myself I could get them published easily and add content to them with vim copy/paste between buffers of always-loaded files. It worked well. But how much do I really have to update TicTacUFOSightings.com or TardigradeCircus.com? They’re both great ideas, but their time has not come (or is over). As of yesterday, my content management system supports site-files in arbitrary locations and of arbitrary lengths, so I could keep all these sites in there and more, while only generating out the ones I need while I’m typing in them. If I’m in my personal journal, I hit the @l macro (or I may switch it to @g) in order to slice & dice it and output it to the _posts directory within. Ugh, am I going to have to neutralize all the markdown in this file now? Yeah, maybe. Casual project over time. It would have great utility for even this journal to be sliced & diced as if published.

Hmmm, okay so next steps? Take ~/github/skite/sites.csv and put a copy of it into ~/github/journal. Rearrange them for the order you want to use them and add in the private repos that are not part of public Github Pages publishing. I’ve got a tremendously important letter to write tomorrow. I want to get a first draft done today. It’s about an incident that happened today, and I want to capture all the details (in a longer version) while it’s still fresh in my mind, talking about it as the incident that happened yesterday. Technically it actually happened today, so I can wait until tomorrow (Father’s Day) and still talk about it as yesterday. That gives me a chance to gather my thoughts and get organized in these other “higher level” ways that have been long overdue. Okay, so go and copy sites.csv into journal.

Okay, so now I should be able to just :badd sites.csv and have it in my vim buffer along with this journal. Check! Okay, now add the entries for the other repos. Interesting! My WorkJournal.zd line works as an example. So I already had one in there that worked this way and the slice & dice system was already working on it, so I know I have it working. Of course, the only difference is whether the Github repos it pushes into are public and Github Pages or not, so this is just a solid system for managing any journal. Sweet!

#!/home/healus/py310/bin/python

from os import system
import pandas as pd


# Read the pipe-delimited site list and strip the visual padding
# from both the cells and the column headers.
df = pd.read_csv("~/github/journal/sites.csv", delimiter="|")
df = df.applymap(lambda x: x.strip())
df.columns = [x.strip() for x in df.columns]
# Build a single vim command that opens every journal.md in the list.
alist = df[['path']].values.tolist()
journals = " ".join([f'~/{x[0]}/journal.md' for x in alist])
edit_journals = f"vim {journals}"
system(edit_journals)

Sweet. Now I edit ~/github/journal/sites.csv to whatever I like and then just type “all” from any Linux Terminal, and I’m editing all the files listed in the csv.
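The script above assumes sites.csv is pipe-delimited with visual padding and has at least a path column. Here’s a minimal sketch (with made-up rows, not my real file) of how the strip-and-join cleanup behaves:

```python
# Hypothetical two-row sites.csv; the padding mimics a visually aligned file.
import io

import pandas as pd

sample = """path                     | apex
github/MikeLev.in        | MikeLev.in
github/Levinux.com       | Levinux.com
"""

df = pd.read_csv(io.StringIO(sample), delimiter="|")
df = df.applymap(lambda x: x.strip())          # strip padding inside cells
df.columns = [c.strip() for c in df.columns]   # strip padding in the headers
paths = [f"~/{p}/journal.md" for p in df["path"]]
print(" ".join(paths))
```

The same pattern scales to however many rows the real file has; the padding never leaks into the vim command.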

Hmmm, while I’m at it I ought to always be editing my .vimrc and on quitting, backing it all up to my vim repo, along with my .bash_profile and /usr/local/sbin/all file.

#!/home/healus/py310/bin/python

import shlex
import shutil
import pandas as pd
from os import system
from sys import stdout
from subprocess import Popen, PIPE

df = pd.read_csv("~/github/journal/sites.csv", delimiter="|")
df = df.applymap(lambda x: x.strip())
df.columns = [x.strip() for x in df.columns]
alist = df[['path']].values.tolist()
journals = " ".join([f'~/{x[0]}/journal.md' for x in alist])
edit_journals = f"vim {journals} ~/.vimrc"
system(edit_journals)


# Print a subprocess stream line by line as it arrives.
def flush(std):
    for line in std:
        line = line.strip()
        if line:
            print(line)
            stdout.flush()


# Run a git command in the given repo directory, streaming its output.
def git(cwd, args):
    cmd = ["/usr/bin/git"] + shlex.split(args)
    print(f"COMMAND: << {shlex.join(cmd)} >>")
    process = Popen(
        args=cmd,
        cwd=cwd,
        stdout=PIPE,
        stderr=PIPE,
        shell=False,
        bufsize=1,
        universal_newlines=True,
    )
    flush(process.stdout)
    flush(process.stderr)


# Back up shell and editor customizations into the vim repo.
shutil.copyfile("/home/healus/.vimrc", "/home/healus/github/vim/.vimrc")
shutil.copyfile("/home/healus/.bash_profile", "/home/healus/github/vim/.bash_profile")
shutil.copyfile("/usr/local/sbin/all", "/home/healus/github/vim/all")

loc = "/home/healus/github/vim"
git(loc, 'commit -am "Pushing .vimrc to Github..."')
git(loc, "push")

print("Done")

The utility of this is pretty amazing. I edit all my files at once whenever I like, but I keep the list of master files in a visually formatted csv file that serves other purposes in the content management system. Whenever I quit out of this vim session, all my system customizations are backed up to the vim repo.

I wish I had worked this way ever since I discovered stuff like this at 18 years old on the Amiga. Well, I know now. Share it with the world. Okay, done.


Sat Jun 18, 2022

Education Of The Woogle Bug

category: My Static Site Generator (Skite)

Wow, yesterday was epic for so many reasons. Not only did I rewrite my content management system, but my child rejected me when I picked them up for Father’s Day weekend by calling the police on me… for raising my voice and telling them the truth. Their current feelings towards me are a direct result of my ex sharing them in on a private conversation, which was a gross violation not only of our agreement but of common decency, and now my child’s mind is poisoned against me. The good news is that they are still very young and smart, and in time they will come to understand what happened. The police were very sympathetic and gave some good suggestions about trying to get my child into public school. I agree.

So here I am on a morning I did not think I was going to have, and I am free to write. Part of what I am going to do is document the events of yesterday and last night while they are fresh in my mind. But more important still, I am going to condition my mind for kindness and gentleness. There is a Tin Woodman who wouldn’t hurt a fly (though you ought to read the books), a Scarecrow who can think his way through any situation, and a Cowardly Lion who is anything but. I must channel a little of this and a little of that. Above all, I will be a friend to Dorothy. I will be one of the good and wonderful characters of my child’s journey and not play the role that others are trying to cast upon me. I write my own role, and no egregious preconditioning of my child need change me.

I am the person I am, and I will continue to love my child unconditionally and continue to enforce our agreement to the letter, documenting and setting right many past violations that I should not have let slip by. I will wake up earlier and do better as a father, husband, employee and yes, even as an online personality. Moving closer to my child comes at great personal hardship, so I now need to supplement my income if I am not to be destroyed financially by my upcoming hardships. Attempts to communicate this to my ex resulted in my child being brought into a conversation that was out of context, a violation of our agreement and just completely inappropriate, and that began their shift in attitude towards me.

Well, it’s never too late to set things right. And so based on the same spirit of the momentum I created yesterday in rewriting my content management system, I will start using it to flesh out my personal story and publish it here and on my YouTube videos. My stories are mine to tell, and I will tell them. I have lived, and continue to live, a very full and interesting life. Even though I’ve gone through some interesting and difficult times, including the collapse and betrayal of family and technology, my biggest challenges are clearly still ahead of me. Raising a child, righting my technological ways and telling my stories promises to be the best time of my life. I will shift my algorithm for living from the exploration of life to the enjoyment of life. The book I read, Algorithms to Live By, calls it the explore/exploit algorithm, but I think it ought better to be called explore/enjoy.

Hmmm, and so what next this morning? Documenting yesterday/last-night is certainly coming up, but that’s not where my good and best morning energy should be funneled. That would be a defeat. I should finish out this blog entry and jump right back into where I left off yesterday. My content management system that builds off of Github Pages is first-pass complete in terms of the rewrite. I have it processing each site I’m working on (such as this) with a keyboard press from within vim. The old version used to be very time-wasting, reprocessing every site with a master-script for all sites every time the keyboard macro was run. Per that last post… oops, it’s broken!

Time to fix. It’s never too late to fix. When something goes badly wrong, you first collect information. Collecting good information is the most important thing. What are the symptoms? Charlie Brown kicks at the football and winds up flat on his back. It’s not Lucy’s fault. Refer to the story of the Scorpion and the Frog. Stinging is just the scorpion’s nature, no matter what it tells the frog, and the frog should know that. Lucy is what she is, and Charlie Brown should just know that and accommodate accordingly. So the first step is to see the situation clearly. Collect and record good information. Think it through out loud. Condition your mind to be a good football player, Charlie Brown!

When I use the keyboard shortcut from within vim, it blanks all the [site]/_posts content exactly as it should, but it doesn’t put the slice & diced new posts into location as it should. But the git commit and push works so the pages of the blog content disappear! Oops, that’s bad. I see the symptom of the problem, which reveals a lot about the problem itself. Let’s get this fixed in that good solid diagnostic way and feel good about myself, use it to get interesting parts of my story out there, and document my way to the next, good step, rinse and repeat! 1, 2, 3… 1? Better diagnostic output!

I dislike the current output from my new SSG CMS…

Yucky Output Poor For Diagnostics Fix Cli Command Line Interface

What’s wrong with this interface? No Figlet, hahaha! No, really I need to make this look good first. I need a big fat label.

And on the personal front, I need to capture the most offending language:

Let me know if you don’t feel safe…

That’s planting subliminal suggestions that the child should be thinking of themselves as being in a situation where they don’t feel safe. Planting ideas like that in a child’s head over and over is a form of emotional abuse. Record the output. It is always important to have a record of the output. You cannot prove the situation to yourself and others if you don’t have hard and fast evidence, so bank that evidence. Put it in a git repo, and make sure that git repo is hosted somewhere easily accessible on-demand.

And now look at the output. Organize, label and make it more useful for when it’s looked at later. Get organized! So you think a big, fat label is a silly thing? Compare the above output to this:

So the initial label really sets the mind. When you open an interaction with “Tell me when you don’t feel safe”, it sets the mind of the child. Don’t pretend like you don’t know what you’re doing. And so first we use a big, fat label to make the first interaction with a system work for us instead of against us.

First Headline Label Sets Mindset Interaction With System

Okay, all these hash-symbols when stuff is shown in the Linux terminal are actually markdown that shows as actual headlines, subheads, etc. in Jupyter, but it’s working against me here. I’d be better off with an ALL-CAPS clear labeling system that keeps it small and readable on both sides. Fewer headlines! Less noise!

Here you can see my precision control of output in a Jupyter Notebook by showing the full absolute paths of my Python interpreter and the other program that’s going to be invoked. Full, non-broken paths are always a good early step to making sure nothing breaks. Relative paths break. Absolute paths break less and reveal themselves as the problem more when they do. I also throw in a SystemExit() so all the stuff later in the script doesn’t try running, and so I can immediately test it Linux Terminal-side as well.

Precision Control Of Output Jupyter Notebook Figlet Paths Systemexit

And here is the same output Linux Terminal-side. See how nice Figlet is. You don’t think so, but it’s life-changing from a clarity-of-thought in software development point-of-view.

Nbdev Same Script From Jupyter But In Linux Terminal Using Figlet

Wow, okay I’m already getting the feeling that the problem is as good as fixed. I simply move the SystemExit() down little by little, improving the output of the script only showing what’s relevant, eliminating noise, adding signal and spotting what’s broken when I get to it. This is the way you do it in life too. It’s easy to make someone feel like something is broken beyond repair on you by sprinkling in symptoms here and there. It feels like a complex disorder, but if you catch it early enough and take preventative countermeasures, it’s really quite simple. Catch it early and don’t let it become complex!
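The progressive-SystemExit() trick can be sketched in toy form. In a real script you just drop a bare raise SystemExit() between steps and move it down as each step checks out; here it’s caught so the sketch can show which steps ran (the step names are made up):

```python
ran = []

def script():
    ran.append("step one")
    ran.append("step two")
    # Move this line down as each step above is verified.
    raise SystemExit("stop here while debugging")
    ran.append("step three")  # unreachable until the raise moves past it

try:
    script()
except SystemExit as exc:
    print(f"Halted early: {exc}")

print(ran)  # only the steps above the raise have run
```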

Oops, change of plan! Instead of moving a SystemExit() progressively down the code for debugging, I am approaching a loop. A much better way to deal with progressive program-running allowance is to comment out large blocks of code. You do this in Jupyter Notebook (and JupyterLab, VSCode and various other text editors) by highlighting the block of code with your mouse and pressing Ctrl+Forward-slash. How do you even write this? Control Forward-slash? Ctrl+/. There’s no ideal way to explain keyboard combinations in English.

Python Commenting Out Large Blocks Of Code Ctrl Control Forward Slash

With the exception of the “Done” which is showing as an HTML H2, this output will be identical in Linux Terminal so no need to show it twice. That’s actually a big part of the goal here. Done is a good place to allow it to show as a ## Done Terminal-side because it doesn’t detract but looks great in Jupyter. Okay, so… I rarely use anything other than the default figlet font, Standard Medium, but this is a case where seeing what site you’re processing scroll by in glorious Figletease would be wonderful. But Standard Medium will be too big for these long sitenames. If you’re looking for a Figlet Font that’s readable yet small, Cybermedium is always a good bet.

Figlet Font Cybermedium Readable Yet Small

All this to diagnose and fix a problem? No! It’s more than just to diagnose and fix. It’s to make a wonderful show of it to help all others in the world facing similar problems while I do, for you see I’m a very generous and pedantic person. I both have the excess time to put on such a show, having this Father’s Day weekend to myself as I do, and I have the ability and excess capacity to put on such a show, having put in my 10-years / 10,000 hours of practice on tools such as Linux, Python, vim & git such as I have. Plus, it makes problem-solving fun!

Next!

Hmm, let’s show the nice people all the code so far:

# export

import os
import sys
import shlex
import argparse
import pandas as pd
from git import Repo
from pathlib import Path
from art import text2art
from collections import namedtuple
from subprocess import Popen, PIPE


if os.name == "nt":
    git_exe = r"C:\Program Files\Git\cmd\git.exe"
else:
    git_exe = "/usr/bin/git"
os.environ["GIT_PYTHON_GIT_EXECUTABLE"] = git_exe

lr = "\n"
python = sys.executable
home = Path(os.path.expanduser("~"))
blogslicer = Path(home / Path("github/blogslicer/blogslicer/core.py"))
if hasattr(__builtins__, "__IPYTHON__") or __name__ != "__main__":
    is_jupyter = True
    from IPython.display import display, Markdown, HTML

    h1 = lambda text: display(Markdown(f"# {text}"))
    h2 = lambda text: display(Markdown(f"## {text}"))
    h3 = lambda text: display(Markdown(f"### {text}"))
    file = "sites.csv"
    # apex = "/mnt/c/Users/mikle/github/MikeAtEleven.com"
    apex = ""
else:
    is_jupyter = False
    h1 = lambda text: print(f"# {text}")
    h2 = lambda text: print(f"## {text}")
    h3 = lambda text: print(f"### {text}")
    aparser = argparse.ArgumentParser()
    add_arg = aparser.add_argument
    add_arg("-f", "--file", required=True)
    add_arg("-x", "--apex", required=False)
    args = aparser.parse_args()
    file = args.file
    apex = args.apex


def fig(text, font="Standard"):
    if is_jupyter:
        display(
            HTML(
                f'<pre style="white-space: pre;">{text2art(text, font=font).replace(lr, "<br/>")}</pre>'
            )
        )
    else:
        print(text2art(text, font))


fig("Making Sites...")

file_obj = Path(file)
df = pd.read_csv(file_obj, delimiter="|")
df = df.applymap(lambda x: x.strip())
df.columns = [x.strip() for x in df.columns]
if apex:
    apex = Path(apex).name
    df = df[df["apex"] == apex]
Site = namedtuple("Site", "path, apex, title, gaid, tagline")


def git(cwd, args):
    cmd = [git_exe] + shlex.split(args)
    h2(f"git cmd: {shlex.join(cmd)}")
    process = Popen(
        args=cmd,
        cwd=cwd,
        stdout=PIPE,
        stderr=PIPE,
        shell=False,
        bufsize=1,
        universal_newlines=True,
    )
    h3("git stdout")
    for line in process.stdout:
        print(line.strip())
        sys.stdout.flush()
    print()
    h3("git stderr")
    for line in process.stderr:
        print(line.strip())
        sys.stdout.flush()
    print()


print(f"INTERPRETER: {python}")
print(f"SLICER: {blogslicer}")

for index, series in df.iterrows():
    site = Site(**series.to_dict())
    fig(site.apex, font="Cybermedium")
    here = Path(home / site.path)
    print(f"PATH: {here}")
    # [x.unlink() for x in Path(here / "_posts/").glob("*")]

#     # Blog Slicer
#     cmd = f'{python} {blogslicer} -p {here} -t "{site.title}" -s "blog" -a "Mike Levin"'
#     print(cmd, end="\n\n")
#     with Popen(args=cmd, cwd=here, stdout=PIPE, stderr=PIPE, shell=True) as pout:
#         for line in pout.stdout.readlines():
#             print(line.decode().strip())

#     git(here, "add _posts/*")
#     git(here, 'commit -am "Publishing Blog Posts"')
#     git(here, "push")

h2("Done!")

This way they can see the figlet function and how I’m commenting out the blocks of code and progressively activating lines to debug. I’ll also be improving the git function output.

It’s also worth pointing out that when you work this way in Jupyter, you’re actually trying to keep your function usage to a minimum to keep the lovely exploratory coding environment of Jupyter’s REPL (read-eval-print loop) working the way it should. All code inside functions takes things “out-of-flow”. Like a good story, a good script should read from the top down, telling a story and having a logical flow. Functions interrupt this flow and make debugging more challenging. So as you can see, I turned both figlet and git into functions… because they should be, and not because someone is chasing me with a stick telling me to turn everything into functions.

Okay, so now this next step is a hoot! It’s the construction of the absolute-path command to do the blog-slicing. It re-populates the [site]/_posts folder for each site. It’s also highly OS-dependent, so I can’t copy/paste one of these commands from Jupyter-side and expect it to run in the Linux Terminal. I’d have to run the core.py file in the Linux Terminal and copy/paste a command from there back into the Linux Terminal. And that’s exactly what I’m going to do!

One Python Script Calling Another From Jupyter Absolute Paths Os Dependent

Here’s the program output in Linux Terminal. Notice that even absolute paths are OS independent when constructed with pathlib in Python. Think about that. This is absolutely amazing. I don’t need to purge Windows Jupyter Desktop out of my process yet because Python itself is an OS-independence layer when used well.

Absolute Paths Are Os Independent When Constructed With Pathlib In Python
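A small sketch of why this works: pathlib builds paths with the correct separators for whatever OS it runs on, so the same joins yield a valid absolute command on Linux, Mac or Windows (the site and title below are just examples lifted from the output above):

```python
import os
from pathlib import Path

# Build every path from joins; never hand-assemble slashes or backslashes.
home = Path(os.path.expanduser("~"))
slicer = home / "github" / "blogslicer" / "blogslicer" / "core.py"
site = home / "github" / "MikeLev.in"

cmd = f'python {slicer} -p {site} -t "Mike Levin" -s "blog" -a "Mike Levin"'
print(cmd)
```

Run on Linux, the f-string interpolates forward-slash paths; run on Windows, the very same joins interpolate backslash paths.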

People live in fear of difficult problems. They feel they are not up to them. They become reclusive, not wanting to step out into the purview of a Great and Powerful Wizard, even though it’s just a man behind a curtain. The real brave ones like the Lion are made to feel cowardly because they’re told they are, but what’s inside belies the truth. The Scarecrow is the really intelligent one, always using his intellect but not giving himself credit. And who has more heart than the Tin Woodman?

NO! No, I say! Step out into the center of the room and announce yourself to the great and powerful humbug! Humbug, I tell you! I call you out on your humbug, and my own genuine wizardly talents supersede yours. Take that!

Okay, think! Think, Scarecrow. Copy/paste one of the Linux-side blogslicer commands. Look at the output.

I Caught Myself A Woogle Bug Gotcha Debug Python

I caught myself a woogle bug! LOL, Gotcha! Sheesh, debugging Python can be so easy if you’re just not calling one program from another through the Unix / Linux API… my own fault.

Still, I love the design of my program and going through this debugging exercise was very much worth it for the lessons in cleaning up program output and keeping meticulous pedantic records.

Alright, I caught the bug and I’m also doing a good bit of cleanup in blogslicer. It doesn’t need all the headline stuff. Trim it down. Blogslicer is about as trimmed down as I can make it, just outputting the list of files it generated with absolute paths. This looks very nice inline with the Figlet labels:

 __  __         _     _                 ____   _  _
|  \/  |  __ _ | | __(_) _ __    __ _  / ___| (_)| |_   ___  ___
| |\/| | / _` || |/ /| || '_ \  / _` | \___ \ | || __| / _ \/ __|
| |  | || (_| ||   < | || | | || (_| |  ___) || || |_ |  __/\__ \ _  _  _
|_|  |_| \__,_||_|\_\|_||_| |_| \__, | |____/ |_| \__| \___||___/(_)(_)(_)
                                |___/

INTERPRETER: /home/healus/py310/bin/python
SLICER: /home/healus/github/blogslicer/blogslicer/core.py
____ _  _ ____ ____ _ _    _    ____ ___ ____ ____ _  _  ____ ____ _  _
| __ |  | |___ |__/ | |    |    |__|  |  |___ |    |__|  |    |  | |\/|
|__] |__| |___ |  \ | |___ |___ |  |  |  |___ |___ |  | .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/GuerillaTech.com -t "Guerilla Tech" -s "blog" -a "Mike Levin"

/home/healus/github/GuerillaTech.com/_posts/2022-04-22-post-1.md
_    ____ _  _ _ _  _ _  _ _  _  ____ ____ _  _
|    |___ |  | | |\ | |  |  \/   |    |  | |\/|
|___ |___  \/  | | \| |__| _/\_ .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/Levinux.com -t "Levinux" -s "blog" -a "Mike Levin"

/home/healus/github/Levinux.com/_posts/1970-01-01-post-1.md
_    _ _  _ _  _ _  _ ___  _   _ ___ _  _ ____ _  _ _  _ _ _  _ ____ _ ___  ____ ____ _  _
|    | |\ | |  |  \/  |__]  \_/   |  |__| |  | |\ | |  | | |\/| | __ |  |   |    |  | |\/|
|___ | | \| |__| _/\_ |      |    |  |  | |__| | \|  \/  | |  | |__] |  |  .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/LinuxPythonvimgit.com -t "Linux, Python, vim & git" -s "blog" -a "Mike Levin"

/home/healus/github/LinuxPythonvimgit.com/_posts/2022-05-21-post-6.md
/home/healus/github/LinuxPythonvimgit.com/_posts/2022-05-21-post-5.md
/home/healus/github/LinuxPythonvimgit.com/_posts/2022-05-20-post-4.md
/home/healus/github/LinuxPythonvimgit.com/_posts/2022-05-02-post-3.md
/home/healus/github/LinuxPythonvimgit.com/_posts/2022-05-01-post-2.md
/home/healus/github/LinuxPythonvimgit.com/_posts/2022-04-21-post-1.md
_    _  _ _  _ ___  ____ ____ _  _ ____ _  _ ___   ____ ____ _  _
|    |  | |\ | |  \ |___ |__/ |  | |__| |\ | |  \  |    |  | |\/|
|___ |__| | \| |__/ |___ |  \  \/  |  | | \| |__/ .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/LunderVand.com -t "LunderVand" -s "blog" -a "Mike Levin"

/home/healus/github/LunderVand.com/_posts/2022-04-22-post-1.md
_  _ _ _  _ ____    _    ____ _  _ _ _  _  ____ ____ _  _
|\/| | |_/  |___ __ |    |___ |  | | |\ |  |    |  | |\/|
|  | | | \_ |___    |___ |___  \/  | | \| .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/Mike-Levin.com -t "Mike Levin Dot Com" -s "blog" -a "Mike Levin"

/home/healus/github/Mike-Levin.com/_posts/2022-04-22-post-1.md
_  _ _ _  _ ____ ____ ___ ____ _    ____ _  _ ____ _  _  ____ ____ _  _
|\/| | |_/  |___ |__|  |  |___ |    |___ |  | |___ |\ |  |    |  | |\/|
|  | | | \_ |___ |  |  |  |___ |___ |___  \/  |___ | \| .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/MikeAtEleven.com -t "Mike At Eleven" -s "blog" -a "Mike Levin"

/home/healus/github/MikeAtEleven.com/_posts/2022-06-06-post-23.md
/home/healus/github/MikeAtEleven.com/_posts/2022-06-05-post-22.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-07-post-21.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-06-post-20.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-05-post-19.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-04-post-18.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-04-post-17.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-04-post-16.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-04-post-15.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-04-post-14.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-04-post-13.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-04-post-12.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-03-post-11.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-03-post-10.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-03-post-9.md
/home/healus/github/MikeAtEleven.com/_posts/2022-05-02-post-8.md
/home/healus/github/MikeAtEleven.com/_posts/2022-04-20-post-7.md
/home/healus/github/MikeAtEleven.com/_posts/2021-09-02-post-6.md
/home/healus/github/MikeAtEleven.com/_posts/2021-08-26-post-5.md
/home/healus/github/MikeAtEleven.com/_posts/2021-08-24-post-4.md
/home/healus/github/MikeAtEleven.com/_posts/2021-08-23-post-3.md
/home/healus/github/MikeAtEleven.com/_posts/2021-08-22-post-2.md
/home/healus/github/MikeAtEleven.com/_posts/2021-08-20-post-1.md
_  _ _ _  _ ____ _    ____ _  _  _ _  _
|\/| | |_/  |___ |    |___ |  |  | |\ |
|  | | | \_ |___ |___ |___  \/  .| | \|


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/MikeLev.in -t "Mike Levin" -s "blog" -a "Mike Levin"

/home/healus/github/MikeLev.in/_posts/2022-06-18-post-140.md
/home/healus/github/MikeLev.in/_posts/2022-06-17-post-139.md
/home/healus/github/MikeLev.in/_posts/2022-06-16-post-138.md
/home/healus/github/MikeLev.in/_posts/2022-06-16-post-137.md
/home/healus/github/MikeLev.in/_posts/2022-06-15-post-136.md
/home/healus/github/MikeLev.in/_posts/2022-06-15-post-135.md
/home/healus/github/MikeLev.in/_posts/2022-06-15-post-134.md
/home/healus/github/MikeLev.in/_posts/2022-06-15-post-133.md
/home/healus/github/MikeLev.in/_posts/2022-06-14-post-132.md
/home/healus/github/MikeLev.in/_posts/2022-06-14-post-131.md
/home/healus/github/MikeLev.in/_posts/2022-06-14-post-130.md
/home/healus/github/MikeLev.in/_posts/2022-06-13-post-129.md
/home/healus/github/MikeLev.in/_posts/2022-06-12-post-128.md
/home/healus/github/MikeLev.in/_posts/2022-06-11-post-127.md
/home/healus/github/MikeLev.in/_posts/2022-06-10-post-126.md
/home/healus/github/MikeLev.in/_posts/2022-06-10-post-125.md
/home/healus/github/MikeLev.in/_posts/2022-06-09-post-124.md
/home/healus/github/MikeLev.in/_posts/2022-06-09-post-123.md
/home/healus/github/MikeLev.in/_posts/2022-06-08-post-122.md
/home/healus/github/MikeLev.in/_posts/2022-06-08-post-121.md
/home/healus/github/MikeLev.in/_posts/2022-06-08-post-120.md
/home/healus/github/MikeLev.in/_posts/2022-06-07-post-119.md
/home/healus/github/MikeLev.in/_posts/2022-06-07-post-118.md
/home/healus/github/MikeLev.in/_posts/2022-06-07-post-117.md
/home/healus/github/MikeLev.in/_posts/2022-06-03-post-116.md
/home/healus/github/MikeLev.in/_posts/2022-06-01-post-115.md
/home/healus/github/MikeLev.in/_posts/2022-06-01-post-114.md
/home/healus/github/MikeLev.in/_posts/2022-05-31-post-113.md
/home/healus/github/MikeLev.in/_posts/2022-05-31-post-112.md
/home/healus/github/MikeLev.in/_posts/2022-05-31-post-111.md
/home/healus/github/MikeLev.in/_posts/2022-05-30-post-110.md
/home/healus/github/MikeLev.in/_posts/2022-05-30-post-109.md
/home/healus/github/MikeLev.in/_posts/2022-05-29-post-108.md
/home/healus/github/MikeLev.in/_posts/2022-05-29-post-107.md
/home/healus/github/MikeLev.in/_posts/2022-05-28-post-106.md
/home/healus/github/MikeLev.in/_posts/2022-05-28-post-105.md
/home/healus/github/MikeLev.in/_posts/2022-05-27-post-104.md
/home/healus/github/MikeLev.in/_posts/2022-05-26-post-103.md
/home/healus/github/MikeLev.in/_posts/2022-05-26-post-102.md
/home/healus/github/MikeLev.in/_posts/2022-05-25-post-101.md
/home/healus/github/MikeLev.in/_posts/2022-05-25-post-100.md
/home/healus/github/MikeLev.in/_posts/2022-05-25-post-99.md
/home/healus/github/MikeLev.in/_posts/2022-05-23-post-98.md
/home/healus/github/MikeLev.in/_posts/2022-05-23-post-97.md
/home/healus/github/MikeLev.in/_posts/2022-05-22-post-96.md
/home/healus/github/MikeLev.in/_posts/2022-05-21-post-95.md
/home/healus/github/MikeLev.in/_posts/2022-05-20-post-94.md
/home/healus/github/MikeLev.in/_posts/2022-05-20-post-93.md
/home/healus/github/MikeLev.in/_posts/2022-05-19-post-92.md
/home/healus/github/MikeLev.in/_posts/2022-05-19-post-91.md
/home/healus/github/MikeLev.in/_posts/2022-05-19-post-90.md
/home/healus/github/MikeLev.in/_posts/2022-05-18-post-89.md
/home/healus/github/MikeLev.in/_posts/2022-05-18-post-88.md
/home/healus/github/MikeLev.in/_posts/2022-05-18-post-87.md
/home/healus/github/MikeLev.in/_posts/2022-05-17-post-86.md
/home/healus/github/MikeLev.in/_posts/2022-05-17-post-85.md
/home/healus/github/MikeLev.in/_posts/2022-05-16-post-84.md
/home/healus/github/MikeLev.in/_posts/2022-05-16-post-83.md
/home/healus/github/MikeLev.in/_posts/2022-05-16-post-82.md
/home/healus/github/MikeLev.in/_posts/2022-05-15-post-81.md
/home/healus/github/MikeLev.in/_posts/2022-05-13-post-80.md
/home/healus/github/MikeLev.in/_posts/2022-05-13-post-79.md
/home/healus/github/MikeLev.in/_posts/2022-05-13-post-78.md
/home/healus/github/MikeLev.in/_posts/2022-05-13-post-77.md
/home/healus/github/MikeLev.in/_posts/2022-05-13-post-76.md
/home/healus/github/MikeLev.in/_posts/2022-05-11-post-75.md
/home/healus/github/MikeLev.in/_posts/2022-05-10-post-74.md
/home/healus/github/MikeLev.in/_posts/2022-05-10-post-73.md
/home/healus/github/MikeLev.in/_posts/2022-05-08-post-72.md
/home/healus/github/MikeLev.in/_posts/2022-05-08-post-71.md
/home/healus/github/MikeLev.in/_posts/2022-05-08-post-70.md
/home/healus/github/MikeLev.in/_posts/2022-05-07-post-69.md
/home/healus/github/MikeLev.in/_posts/2022-05-07-post-68.md
/home/healus/github/MikeLev.in/_posts/2022-05-07-post-67.md
/home/healus/github/MikeLev.in/_posts/2022-05-07-post-66.md
/home/healus/github/MikeLev.in/_posts/2022-05-06-post-65.md
/home/healus/github/MikeLev.in/_posts/2022-05-06-post-64.md
/home/healus/github/MikeLev.in/_posts/2022-05-05-post-63.md
/home/healus/github/MikeLev.in/_posts/2022-05-05-post-62.md
/home/healus/github/MikeLev.in/_posts/2022-05-05-post-61.md
/home/healus/github/MikeLev.in/_posts/2022-05-05-post-60.md
/home/healus/github/MikeLev.in/_posts/2022-05-04-post-59.md
/home/healus/github/MikeLev.in/_posts/2022-05-04-post-58.md
/home/healus/github/MikeLev.in/_posts/2022-05-04-post-57.md
/home/healus/github/MikeLev.in/_posts/2022-05-04-post-56.md
/home/healus/github/MikeLev.in/_posts/2022-05-03-post-55.md
/home/healus/github/MikeLev.in/_posts/2022-05-02-post-54.md
/home/healus/github/MikeLev.in/_posts/2022-05-02-post-53.md
/home/healus/github/MikeLev.in/_posts/2022-05-02-post-52.md
/home/healus/github/MikeLev.in/_posts/2022-05-01-post-51.md
/home/healus/github/MikeLev.in/_posts/2022-05-01-post-50.md
/home/healus/github/MikeLev.in/_posts/2022-05-01-post-49.md
/home/healus/github/MikeLev.in/_posts/2022-05-01-post-48.md
/home/healus/github/MikeLev.in/_posts/2022-05-01-post-47.md
/home/healus/github/MikeLev.in/_posts/2022-05-01-post-46.md
/home/healus/github/MikeLev.in/_posts/2022-05-01-post-45.md
/home/healus/github/MikeLev.in/_posts/2022-05-01-post-44.md
/home/healus/github/MikeLev.in/_posts/2022-04-30-post-43.md
/home/healus/github/MikeLev.in/_posts/2022-04-30-post-42.md
/home/healus/github/MikeLev.in/_posts/2022-04-30-post-41.md
/home/healus/github/MikeLev.in/_posts/2022-04-30-post-40.md
/home/healus/github/MikeLev.in/_posts/2022-04-30-post-39.md
/home/healus/github/MikeLev.in/_posts/2022-04-29-post-38.md
/home/healus/github/MikeLev.in/_posts/2022-04-29-post-37.md
/home/healus/github/MikeLev.in/_posts/2022-04-29-post-36.md
/home/healus/github/MikeLev.in/_posts/2022-04-29-post-35.md
/home/healus/github/MikeLev.in/_posts/2022-04-28-post-34.md
/home/healus/github/MikeLev.in/_posts/2022-04-27-post-33.md
/home/healus/github/MikeLev.in/_posts/2022-04-27-post-32.md
/home/healus/github/MikeLev.in/_posts/2022-04-26-post-31.md
/home/healus/github/MikeLev.in/_posts/2022-04-26-post-30.md
/home/healus/github/MikeLev.in/_posts/2022-04-25-post-29.md
/home/healus/github/MikeLev.in/_posts/2022-04-23-post-28.md
/home/healus/github/MikeLev.in/_posts/2022-04-23-post-27.md
/home/healus/github/MikeLev.in/_posts/2022-04-22-post-26.md
/home/healus/github/MikeLev.in/_posts/2022-04-22-post-25.md
/home/healus/github/MikeLev.in/_posts/2022-04-21-post-24.md
/home/healus/github/MikeLev.in/_posts/2022-04-21-post-23.md
/home/healus/github/MikeLev.in/_posts/2022-04-15-post-22.md
/home/healus/github/MikeLev.in/_posts/2022-04-15-post-21.md
/home/healus/github/MikeLev.in/_posts/2022-04-13-post-20.md
/home/healus/github/MikeLev.in/_posts/2022-04-10-post-19.md
/home/healus/github/MikeLev.in/_posts/2022-04-09-post-18.md
/home/healus/github/MikeLev.in/_posts/2022-04-08-post-17.md
/home/healus/github/MikeLev.in/_posts/2022-04-07-post-16.md
/home/healus/github/MikeLev.in/_posts/2022-02-21-post-15.md
/home/healus/github/MikeLev.in/_posts/2022-02-18-post-14.md
/home/healus/github/MikeLev.in/_posts/2022-02-17-post-13.md
/home/healus/github/MikeLev.in/_posts/2022-02-16-post-12.md
/home/healus/github/MikeLev.in/_posts/2022-02-15-post-11.md
/home/healus/github/MikeLev.in/_posts/2021-11-14-post-10.md
/home/healus/github/MikeLev.in/_posts/2021-08-30-post-9.md
/home/healus/github/MikeLev.in/_posts/2021-08-26-post-8.md
/home/healus/github/MikeLev.in/_posts/2021-08-24-post-7.md
/home/healus/github/MikeLev.in/_posts/2021-08-20-post-6.md
/home/healus/github/MikeLev.in/_posts/2021-08-20-post-5.md
/home/healus/github/MikeLev.in/_posts/2021-08-20-post-4.md
/home/healus/github/MikeLev.in/_posts/2021-08-06-post-3.md
/home/healus/github/MikeLev.in/_posts/2021-08-05-post-2.md
_  _ _ _  _ ____ _    ____ _  _ _ _  _ ____ ____ ____  ____ ____ _  _
|\/| | |_/  |___ |    |___ |  | | |\ | [__  |___ |  |  |    |  | |\/|
|  | | | \_ |___ |___ |___  \/  | | \| ___] |___ |__| .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/MikeLevinSEO.com -t "Mike Levin SEO" -s "blog" -a "Mike Levin"

___  _ ___  _  _ _    ____ ___ ____  ____ ____ _  _
|__] | |__] |  | |    |__|  |  |___  |    |  | |\/|
|    | |    |__| |___ |  |  |  |___ .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/Pipulate.com -t "Pipulate" -s "blog" -a "Mike Levin"

/home/healus/github/Pipulate.com/_posts/2022-04-22-post-1.md
___  _   _ ___ _  _ ____ _  _ _ ____ ____ _    _    _   _  ____ ____ _  _
|__]  \_/   |  |__| |  | |\ | | |    |__| |    |     \_/   |    |  | |\/|
|      |    |  |  | |__| | \| | |___ |  | |___ |___   |   .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/PythonicAlly.com -t "Pythonically" -s "blog" -a "Mike Levin"

/home/healus/github/PythonicAlly.com/_posts/2022-04-28-post-1.md
___  _   _ ___ _  _ ____ _  _ _ ____ ____ _    _    _   _  ____ ____ ____
|__]  \_/   |  |__| |  | |\ | | |    |__| |    |     \_/   |  | |__/ | __
|      |    |  |  | |__| | \| | |___ |  | |___ |___   |   .|__| |  \ |__]


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/PythonicAlly.org -t "Your Pythonic Ally" -s "blog" -a "Mike Levin"

/home/healus/github/PythonicAlly.org/_posts/1970-01-01-post-1.md
____ ____ _  _ ____ _  _ ____ ___  _    ____ ____ _ _  _ ____ ____ ____  ____ ____ _  _
|__/ |___ |\/| |  | |  | |__| |__] |    |___ |___ | |\ | | __ |___ |__/  |    |  | |\/|
|  \ |___ |  | |__|  \/  |  | |__] |___ |___ |    | | \| |__] |___ |  \ .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/RemovableFinger.com -t "Removable Finger" -s "blog" -a "Mike Levin"

/home/healus/github/RemovableFinger.com/_posts/2022-05-01-post-1.md
___ ____ ____ ___  _ ____ ____ ____ ___  ____ ____ _ ____ ____ _  _ ____  ____ ____ _  _
 |  |__| |__/ |  \ | | __ |__/ |__| |  \ |___ |    | |__/ |    |  | [__   |    |  | |\/|
 |  |  | |  \ |__/ | |__] |  \ |  | |__/ |___ |___ | |  \ |___ |__| ___] .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/TardigradeCircus.com -t "Tardigrade Circus" -s "blog" -a "Mike Levin"

/home/healus/github/TardigradeCircus.com/_posts/2022-04-22-post-1.md
___ _ ____ ___ ____ ____ _  _ ____ ____ ____ _ ____ _  _ ___ _ _  _ ____ ____  ____ ____ _  _
 |  | |     |  |__| |    |  | |___ |  | [__  | | __ |__|  |  | |\ | | __ [__   |    |  | |\/|
 |  | |___  |  |  | |___ |__| |    |__| ___] | |__] |  |  |  | | \| |__] ___] .|___ |__| |  |


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/TicTacUFOSightings.com -t "Tic Tac UFO Sightings" -s "blog" -a "Mike Levin"

/home/healus/github/TicTacUFOSightings.com/_posts/1970-01-01-post-1.md
_ _ _ ____ ____ _  _ _    _   _ ____ ____ ___  ____ ____ ___ ____  ___  ___
| | | |___ |___ |_/  |     \_/  |__/ |___ |__] |  | |__/  |  [__     /  |  \
|_|_| |___ |___ | \_ |___   |   |  \ |___ |    |__| |  \  |  ___] . /__ |__/


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/WeeklyReports.zd -t "Mike Levin's Weekly Reports" -s "blog" -a "Mike Levin"

/home/healus/github/WeeklyReports.zd/_posts/2022-05-07-post-2.md
/home/healus/github/WeeklyReports.zd/_posts/2022-05-14-post-1.md
_ _ _ _  _ ____ ___ ____ ____ _  _ ____ ___ ____ ____ ____ ____  _ ____
| | | |__| |__|  |  [__  |__| |\/| |___  |  |__| |___ |  | |__/  | |  |
|_|_| |  | |  |  |  ___] |  | |  | |___  |  |  | |    |__| |  \ .| |__|


/home/healus/py310/bin/python /home/healus/github/blogslicer/blogslicer/core.py -p /home/healus/github/WhatsaMetaFor.io -t "What's A Meta For" -s "blog" -a "Mike Levin"

/home/healus/github/WhatsaMetaFor.io/_posts/2022-05-07-post-4.md
/home/healus/github/WhatsaMetaFor.io/_posts/2022-05-06-post-3.md
/home/healus/github/WhatsaMetaFor.io/_posts/2022-05-03-post-2.md
/home/healus/github/WhatsaMetaFor.io/_posts/2022-04-30-post-1.md
## Done!

Very nice.

The last thing is all that ugly git output; the script doesn’t yet know when to show stdout and when to show stderr.

I’ve done an excellent job so far fixing this bug and improving the output from the program. I think it’s even going to play well online. I can’t wait to talk people through this post. Currently the only issues left are how the git output displays and actually adding a new folder to the git commit in the release process.

Okay, and now finally it’s done. I have the assets/images/* folder being added every time. It makes this post look correct and now any journal.md file I’m editing in vim is one @l away from being published.

# export

import os
import sys
import shlex
import argparse
import pandas as pd
from git import Repo
from pathlib import Path
from art import text2art
from collections import namedtuple
from subprocess import Popen, PIPE


if os.name == "nt":
    git_exe = r"C:\Program Files\Git\cmd\git.exe"
else:
    git_exe = "/usr/bin/git"
os.environ["GIT_PYTHON_GIT_EXECUTABLE"] = git_exe

lr = "\n"
python = sys.executable
home = Path(os.path.expanduser("~"))
blogslicer = Path(home / Path("github/blogslicer/blogslicer/core.py"))
if hasattr(__builtins__, "__IPYTHON__") or __name__ != "__main__":
    is_jupyter = True
    from IPython.display import display, Markdown, HTML

    h1 = lambda text: display(Markdown(f"# {text}"))
    h2 = lambda text: display(Markdown(f"## {text}"))
    h3 = lambda text: display(Markdown(f"### {text}"))
    file = "sites.csv"
    # apex = "/mnt/c/Users/mikle/github/MikeAtEleven.com"
    apex = ""
else:
    is_jupyter = False
    h1 = lambda text: print(f"# {text}")
    h2 = lambda text: print(f"## {text}")
    h3 = lambda text: print(f"### {text}")
    aparser = argparse.ArgumentParser()
    add_arg = aparser.add_argument
    add_arg("-f", "--file", required=True)
    add_arg("-x", "--apex", required=False)
    args = aparser.parse_args()
    file = args.file
    apex = args.apex


def fig(text, font="Standard"):
    if is_jupyter:
        display(
            HTML(
                f'<pre style="white-space: pre;">{text2art(text, font=font).replace(lr, "<br/>")}</pre>'
            )
        )
    else:
        print(text2art(text, font))


fig("Making Sites...")

file_obj = Path(file)
df = pd.read_csv(file_obj, delimiter="|")
df = df.applymap(lambda x: x.strip())
df.columns = [x.strip() for x in df.columns]
if apex:
    apex = Path(apex).name
    df = df[df["apex"] == apex]
Site = namedtuple("Site", "path, apex, title, gaid, tagline")


def flush(std):
    for line in std:
        line = line.strip()
        if line:
            print(line)
            sys.stdout.flush()


def git(cwd, args):
    cmd = [git_exe] + shlex.split(args)
    print()
    print(f"<< {shlex.join(cmd)} >>")
    process = Popen(
        args=cmd,
        cwd=cwd,
        stdout=PIPE,
        stderr=PIPE,
        shell=False,
        bufsize=1,
        universal_newlines=True,
    )
    flush(process.stdout)
    flush(process.stderr)


print(f"INTERPRETER: << {python} >>")
print(f"SLICER: << {blogslicer} >>")

for index, series in df.iterrows():
    site = Site(**series.to_dict())
    fig(site.apex, font="Cybermedium")
    here = Path(home / site.path)
    [x.unlink() for x in Path(here / "_posts/").glob("*")]

    cmd = f'{python} {blogslicer} -p {here} -t "{site.title}" -s "blog" -a "Mike Levin"'
    print(cmd, end="\n\n")
    with Popen(args=cmd, cwd=here, stdout=PIPE, stderr=PIPE, shell=True) as pout:
        for line in pout.stdout.readlines():
            print(line.decode().strip())

    git(here, "add _posts/*")
    git(here, "add assets/images/*")
    git(here, 'commit -am "Publishing Blog Posts"')
    git(here, "push")

h2("Done!")

Fri Jun 17, 2022

Throw All The Meat & Veggies Into The Stew & Stir The Blog / Vlog

category: My Static Site Generator (Skite)

Wow, that was one of the most epic blog posts I’ve ever written. The genesis of that was that the innocent-seeming downgrade of my Ubuntu 20.04 to 18.04, done to have a better chance of getting LXD Linux containers working on WSL (Windows Subsystem for Linux), broke my fragile chewing-gum-and-paper-clips blogging release system. And it was for the best. It was in desperate need of an overhaul, and it took precedence over my streak of YouTube livecasting.

I originally documented that whole Unix Script-based version over on MikeLevinSEO.com, figuring I’d separate technical content (there) versus more personal content here. But after all my talk about one big integrated life, I see how futile that was. I need everything dropped in one big vat, initially. We throw all the meat and veggies into the stew pot and stir it up.

Every time we revisit something we want to rewrite it. Things should be short and easily rewritable. They should also remain “living” code in Jupyter Notebooks. These things should remain fluid and flexible so that when inspiration hits you can mold it to taste. I’m in that stirring and molding phase. I’m going to go for an 80/20-rule win: a vim keyboard macro to generate just the site I’m editing. That should be possible by just feeding the script the path of the file being edited.

vim will let you show the full path up to the “head”:

:echo expand('%:p:h')

We could put that value into a vim macro with this:

:let @l = expand('%:p:h')

Okay, wow I got it working:

let @l = ":execute '!python ~/github/skite/skite/core.py -f ~/github/skite/sites.csv -x ' . expand('%:p:h') . '; read -p \"Press Enter to Continue...\"'^M"

Amazing! Pshwew, a real game-changer. Only generate out the sites I’m working on. Simply add sites that I may generate out to sites.csv in the skite repo. I need some figlet output! I started out thinking I would need a bash script wrapper. I don’t! Oh boy, okay.


Thu Jun 16, 2022

The Epic nbdev Static Site Generator Post

category: My Static Site Generator (Skite)

This recent downgrading from Ubuntu 20.04 to Ubuntu 18.04 broke my blog release system. Ugh! It’s not that files were lost. It’s that my paths subtly changed and hardwired locations broke. And I don’t particularly want to just bring it back alive the way it was. I was already mixing Unix shell scripts (bash scripts, technically) with Python programs, and as such I should really commit entirely to Python programs because they’re much easier to read and maintain. Nobody wants to have to code in bash script, which is why Perl became so popular as the pee-in-the-pool sysadmin devops language of default Linux installs, and now Python is taking over that role. Python is a Unix/Linux system automation language. Face it and use it that way. Okay, 1, 2, 3… 1?

We want to switch from Unix Scripts to Python Programs, but we very much do not want to give up the timeless Unix/Linux API. What do I even mean by that? I mean the way *nix operating systems let you execute programs with parameters, arguments, switches and what-have-you with no muss and no fuss. They just run from the command-line as invoked, and can thus be scheduled and even strung together with the output of one program going into the input of another. It’s a huge part of what made Unix so great and the inevitable winner (albeit through the Linux implementations) in the Operating System Wars. In case you haven’t noticed, *nix OSes won… period. It’s not even a qualified statement. Unix and Linux won. Windows lost. Proprietary Mac lost. VAX lost. LISP lost. CP/M lost. DOS lost. TOPS lost. The list of corpses goes on. Need I continue?

Unix/Linux won because its API (application programming interface) is best. This is the part we do not want to lose when we adapt our blog release system from bash scripts, which naturally lean into the Unix-way, to Python programs, which may or may not. So how do we write our Python programs so that they are Unixy? Easy! We modularize them and make them work from the command-line with arguments. We use argparse. We don’t use Click and we don’t use DocOpt and we don’t use Typer. We don’t take on dependencies outside the Python Standard Library if we don’t really need to, based on the Standard Library alternative’s API being good enough, per the 80/20-rule. That means getting 80% of the benefit from the first 20% of the effort. And the good old Python Standard Library argparse module does exactly that.
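A minimal sketch of that argparse pattern, assuming the same -f/--file and -x/--apex flags I end up using in the release script later in this post (flag names and help strings here are just illustrative):

```python
import argparse


def parse_args(argv=None):
    # Standard-library argparse: a Unixy CLI with zero pip dependencies.
    parser = argparse.ArgumentParser(description="Publish sites listed in a CSV.")
    parser.add_argument("-f", "--file", required=True, help="path to sites.csv")
    parser.add_argument("-x", "--apex", required=False, help="limit to one site")
    # Passing argv explicitly (instead of letting argparse read sys.argv)
    # keeps the function testable from a notebook or a unit test.
    return parser.parse_args(argv)
```

Call `parse_args()` with no argument from a script and it reads sys.argv as usual; pass a list and you can exercise the CLI logic without launching a subprocess.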

Okay, let’s get some code down. When we start a project like this, we start with bare-bones basics. The first code-sample should be an eternal principle and as Elon Musk would say, first principles. What’s first principles of smushing Python programs into Unix-like interfaces? It’s making sure we understand how to execute Python code from a command-line interface, more colloquially known as the CLI. Start saying CLI. It will serve you well. I came from the Amiga Computer community and they knew that CLI was the correct terminology for the API you’re looking for. CLI is the API you’re looking for. You have to trust someone. I’m not trying to make any money off of you except for the YouTube advertising bucks. I’m not trying to turn you into micro soft sheep. I’m the Better Pied Piper. Trust me.

What are my principles in making this new content release system?

1, 2, 3… 1? The API for Python under *nix looks something like this… get examples. Beware! Everyone wants to run advertising in front of you. Go to the genuine Python documentation first. Start from here: https://docs.python.org/3/

From there, we can find Python subprocess to run from terminal and start to learn about the general API options. There appear to be many approaches and we need not leap into the first without a deeper look.

subprocess.run appears to be the first and obvious choice, but right at the opening of the documentation it says:

The recommended approach to invoking subprocesses is to use the run() function for all use cases it can handle. For more advanced use cases, the underlying Popen interface can be used directly.

This is very suspicious to me. run() accommodates the 80/20-rule, they appear to be saying, but in the same breath they’re making us aware of subprocess.Popen. Let’s look deeper! When we find Popen’s introduction, it says:

The underlying process creation and management in this module is handled by the Popen class. It offers a lot of flexibility so that developers are able to handle the less common cases not covered by the convenience functions.

There is also a big red warning here:

For maximum reliability, use a fully-qualified path for the executable. To search for an unqualified name on PATH, use shutil.which(). On all platforms, passing sys.executable is the recommended way to launch the current Python interpreter again, and use the -m command-line format to launch an installed module.
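That advice can be sketched like this: a fully-qualified executable via sys.executable, launched with -m, no PATH lookup and no shell. The `platform` module is just a stand-in target for whatever installed module you actually want to run:

```python
import subprocess
import sys

# Relaunch the current interpreter with -m, per the subprocess docs:
# sys.executable is always a fully-qualified path to *this* Python.
result = subprocess.run(
    [sys.executable, "-m", "platform"],  # prints the platform string
    capture_output=True,
    text=True,
)
print(result.returncode)       # 0 on success
print(result.stdout.strip())   # the platform string from the child
```

Because the command is a list, there is no shell quoting to get wrong, and because the path is fully qualified, you get the same interpreter (and the same venv) as the parent process.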

Clearly a first step is going to be getting a virtualenv (or venv) set up again at login. On the last video, I got the ~/github folder to be served from the Windows-side, which is a huge step forward in path simplification and data security (make sure OneDrive backs it up since you’re paying for it anyway?).

The example they give is:

Popen(["/usr/bin/git", "commit", "-m", "Fixes a bug."])

I know from doing a lot of this stuff in the past that I want to use Popen over run() for maximum options. My needs get pretty particular as we get into it, and switching from the more limited run() API to the Popen API is a pain, so we start out with Popen. It’s also worth noting that we keep the command-paths short by doing the import like this:

from subprocess import Popen

This provides the best compromise of “standard code” (not renaming Popen using “as” on the import as is my temptation) and short, snappy commands (as opposed to using subprocess.Popen() everywhere).
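For example, here is a bare-bones Popen call with that short import. The child command is just an illustrative relaunch of the current interpreter printing a line; swap in git or anything else:

```python
import sys
from subprocess import PIPE, Popen

# Popen gives stream-level control that run() hides behind convenience
# arguments; here we simply collect stdout and stderr after exit.
with Popen(
    [sys.executable, "-c", "print('hello from a child process')"],
    stdout=PIPE,
    stderr=PIPE,
    universal_newlines=True,  # decode bytes to str for us
) as proc:
    out, err = proc.communicate()

print(out.strip())  # hello from a child process
```

The context manager makes sure the pipes are closed and the process is reaped, and `proc.returncode` is available afterward for error handling.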

Ugh! I just realized as I’m getting ready to code, this is another one of those cases of “Before I tell you that story, I have to tell you this story.” This story being that of venv and virtualenv, but before even that, the story of Python 3.10 under Ubuntu 18.04 and perhaps of dead snakes.

Okay, deep breath! 1, 2, 3… 1? Try installing Python 3.10 directly from Ubuntu. Do an update/upgrade first.

Okay, now try this without any Dead Snake PPA modifications:

sudo apt install python3.10

Woot! It worked! Welcome to 2022! Oops, no, not at all! I actually got the wrong thing installed. Delete and purge!

healus@LunderVand:~$ sudo apt install python3.10
Reading package lists... Done
Building dependency tree
Reading state information... Done
Note, selecting 'postgresql-plpython3-10' for regex 'python3.10'
postgresql-plpython3-10 is already the newest version (10.21-0ubuntu0.18.04.1).
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.

Now I have to clean it up with:

sudo apt --purge autoremove python3.10

Ugh okay, back to dead snakes.

sudo apt install software-properties-common -y
sudo add-apt-repository ppa:deadsnakes/ppa

And now we can:

sudo apt install python3.10

Which shows me this (I should have read more carefully the first time):

Reading package lists... Done
Building dependency tree
Reading state information... Done
The following additional packages will be installed:
  libpython3.10-minimal libpython3.10-stdlib python3.10-minimal
Suggested packages:
  python3.10-venv binutils binfmt-support
The following NEW packages will be installed:
  libpython3.10-minimal libpython3.10-stdlib python3.10 python3.10-minimal
0 upgraded, 4 newly installed, 0 to remove and 0 not upgraded.
Need to get 5086 kB of archives.
After this operation, 19.5 MB of additional disk space will be used.
Do you want to continue? [Y/n]

I answered Y, and now an attempt to run Python 3.10 succeeds:

healus@LunderVand:~$ python3.10
Python 3.10.5 (main, Jun 11 2022, 16:53:29) [GCC 7.5.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>>

We exit out with the exit() function and move onto our venv or virtualenv step. This is critical for “locking us into Python 3.10” everywhere under Linux forever forward, until we change it deliberately.

There is also some ambiguity here as to whether the correct pip is installed for getting Python packages from PyPI, but we will cross that bridge after we set up our Python virtual environment.

The good news here is that the virtual environment manager, venv, is now built into Python. We want to use the latest one from Python 3.10 (and not the Python 3.6 version pre-installed under Ubuntu 18.04), and to do that we can use this command. I make sure I cd home because that’s where I want to keep my venvs:

cd ~/
python3.10 -m venv py310

…and I get the error:

Error: Command '['/home/healus/py310/bin/python3.10', '-Im', 'ensurepip',
'--upgrade', '--default-pip']' returned non-zero exit status 1.

Hmmm, I suppose I was mistaken about venv being built-in. I guess like pip you have to ensure it’s there. So I do this command:

sudo apt install python3.10-venv

It succeeded and now this command:

python3.10 -m venv py310

…worked. If I ls from ~/ I see:

healus@LunderVand:~$ ls
abouts.sh  all.sh  github  hps.sh  install.sh  py310
healus@LunderVand:~$

And that’s what I expect to see. I now have a virtual environment for Python 3.10 and I have to make sure it’s activated all the time. I do that by editing my .bash_profile file.

We can activate our new venv on a 1-time basis to test it with:

source ~/py310/bin/activate

Try it:

healus@LunderVand:~/github/helpers$ source ~/py310/bin/activate
(py310) healus@LunderVand:~/github/helpers$

Nice! It works with color coding and all.
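A quick way to double-check from inside Python which interpreter the activated venv actually gives you (inside a venv, sys.prefix diverges from sys.base_prefix; this snippet is a generic check, not specific to the py310 setup):

```python
import sys

# Inside an activated venv, sys.prefix points at the venv directory
# while sys.base_prefix still points at the base system install.
in_venv = sys.prefix != sys.base_prefix
print(sys.executable)          # full path to the running interpreter
print("venv active:", in_venv)
```

If `in_venv` prints False after you expected activation, the shell is still handing you the system Python and the .bash_profile sourcing needs another look.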

I have an old .bash_profile file but I’m going to establish a new one and build it up with only what I need and to do that I’m going to cd into the github repo where I back-up my .bash_profile and load 2 files at once. This will have the effect of making a new one in my home ~/ directory:

(py310) healus@LunderVand:~/github/helpers$ vim .bash_profile ~/.bash_profile

And now I just navigate between the 2 files in vim and transpose over the line:

source ~/py310/bin/activate

…into the new .bash_profile file. It is the only line in the file currently. Thankfully I can test it by just exiting out of the terminal and launching a new terminal. Confirmed, it’s working as expected but I do lose the nifty Ubuntu color coding on the prompt. No big deal. There are ways to fix that but me detects rabbit holes… at least for now.

Okay, so now back to the story of Popen and argparse.

Honestly, it should be simpler, and the fact that it is not “in my fingers” yet is an issue. Maybe it is a problem with the Python standard library that has made these APIs elude me. But before going on some sort of third-party pip-installable scavenger hunt, see if you can’t actually get by with the standard library.

Okay, pshwew! Am I going to go with the blogslicer identity?

What is the criteria?

How is this going to work?

First and foremost, it should be nbdev-based so it has a duality life in both a Jupyter Notebook and as a pip installable package. To that end, be sure that you pip install nbdev.

pip install nbdev

Wow, that installs a lot! And on its advice:

/home/healus/py310/bin/python3.10 -m pip install --upgrade pip

Okay and this reminds me. We can add new kernels to… no, don’t go that route. That would be Windows-side Python installs. Stick with whatever version (for now) that ships with Jupyter Desktop. But on the Jupyter Desktop-side, upgrade all your pip packages… I have a notebook called pipulate.ipynb in the github mlseo repo currently which does this. Run that. It won’t get the Windows Jupyter Desktop side and WSL Ubuntu 18.04 Python side exactly in sync, but at least all the pip versions of everything will be the latest. Okay, done.

Check for held-back versions because of dependencies:

Package Version Latest Type
------- ------- ------ -----
h11     0.12.0  0.13.0 wheel
mistune 0.8.4   2.0.2  wheel
pip     22.1    22.1.2 wheel
pywin32 303     304    wheel

Okay, next?

Are we going to make this part of the blogslicer package that already exists and was created in exactly this way? Maybe… no. Keep the two things separate. I know how to do this:

pip install -e .

…and that is a game changer! You just cd into that repo and type that. In the case of Jupyter Desktop, you would use a Terminal from Jupyter when doing that, or maybe even alternatively a Jupyter magic command right from a notebook (not sure if this will work) such as:

%pip install -e .

I may have to pip uninstall things before using this technique, but I should be able to have a series of side-by-side repos, each of which is a different command used in a Popen() managed set of Linux commands.

So I’m basically designing a series of separate Linux commands in Python, which I’ll invoke using the classic full-path and -m usage so that there’s no misunderstanding, fragility or required OS trickery (eliminating the extensions like nbdev does).
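A minimal sketch of that no-tricks invocation pattern, using the stdlib `this` module as a stand-in for one of my future commands:

```python
import sys
from subprocess import run, PIPE

# Full interpreter path plus -m: no PATH lookup, no shebang, no OS trickery.
result = run([sys.executable, "-m", "this"], stdout=PIPE, text=True)
print(result.returncode)  # 0
```

The point is that `sys.executable` pins down exactly which Python runs the child, no matter what the shell environment thinks.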

I want to do this as quickly as possible to be “back in business” since I did this Ubuntu downgrade that broke the old release system that was in desperate need of an overhaul anyway. Take advantage of this situation.

This is really becoming an epic post. But what I’m trying to do is epic. A true and righteous way of working. I can’t wait to get Windows-side Jupyter Desktop out of the picture. Soon I’ll be on wslg… maybe. They either have to bring it to Windows 10 or they have to make Windows 11 virtual desktop transitions work as well as they do in Windows 10.

Okay, so we need a name for this. We already are using blogslicer. allsites? Nice and simple and says what it does. ssg? ssites? Yeah, that’s sounding short and sweet. ssgites? skites? skite!

mkdir skite

Ugh! I got sick of looking at the non-color-coded prompt in the Linux Terminal after seeing it so much during my recent work. So I googled up this solution: https://gist.github.com/insin/1425703

PS1="${PYTHON_VIRTUALENV}${GREEN}\u@\h ${YELLOW}\w${COLOR_NONE} ${BRANCH}${PROMPT_SYMBOL} "

It works great! I also had to change the Terminal colors to Tango Dark, which are awesome! And finally, I transposed over just enough of my old .bash_profile file for things to be comfortable:

source ~/py310/bin/activate
export DISPLAY=$(cat /etc/resolv.conf | grep nameserver | awk '{print $2}'):0
cd ~/github/
alias profile="vim ~/.bash_profile"
alias systemd="cd /etc/systemd/system/"
alias open="explorer.exe ."
. ~/.bash_prompt

Nice. It’s feeling like home again. I’m really excited about getting this static site generation system (skite) going! 1, 2, 3… 1?

I made the skite folder. Really now it’s time to do the nbdev stuff. Everything I start out this way now is automatically also going to be an nbdev project. And so it should start out as a public repo on Github… or should it? Isn’t that only for the integration hooks? It’s got to at least be a git repo. So use Jeremy’s naming convention with a new notebook… one more 00_core.ipynb!

Okay, one thing I’m pretty sick of is remembering to run nbdev_clean_nbs every time before I do a git push. Nah, don’t introduce another release script yet. Just remember to do it. Okay, get this thing as a git repo and see if nbdev_new works in it.

You can’t use nbdev_clean_nbs without there being a settings.ini. There’s no settings.ini until you run nbdev_new. You can’t run nbdev_new until it’s a git repo. So you’ve got to put some file in there that doesn’t have the notebook metadata pollution problem. What? Foo! And then we can delete foo later. Just touch foo.txt.

I’m using a better bash prompt that tells which git branch you’re in. I may change it. https://gist.github.com/miki725/9783474

Okay, I did a git init, git add and git commit. But nbdev_new still needs a remote.origin.url. Interesting! So nbdev doesn’t work unless a repo is also in a remote repository. Okay, fine. I’ll github it… and done! Wow, this is looking pretty:

Color Coded Bash Prompt Python Venv Virtualenv Git Branch
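For my own future re-googling, the bootstrap order that worked can be sketched like this (repo name and remote URL are placeholders):

```shell
mkdir -p skite && cd skite
touch foo.txt                     # a metadata-free seed file for the first commit
git init -q
git add foo.txt
git -c user.name=me -c user.email=me@example.com commit -q -m "seed"
# nbdev_new refuses to run without remote.origin.url, so wire up GitHub first:
#   git remote add origin git@github.com:user/skite.git
#   nbdev_new
# ...and remember: nbdev_clean_nbs before every git push
```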

Friggin’ wow! This is a brave new world. Keep forging forward. Lift the techniques you’re using in blogslicer and document them here in one super-long blog post until you can release this! Very meta!

Okay, the program has to take as input a sites.csv file. Keep it csv so there are rigid rules about columns. It could also give decent pandas practice. Yeah, practice pandas with this. It’s overkill, but beautiful overkill.

Think! Get the nbdev trick going ASAP. To that end, load the 00_core.ipynb file from blogslicer. It’s okay to have 2 files by this name loaded at once. One informs the other. 1, 2, 3… 1?

Git Repo Immediately After Nbdev_new

Alright, this is something that could be documented better on the interwebs. I know it’s going to be immediately dated, but so what. I am forging the way and this has archival value. I wish I did it back in the day. I’d get so much credit for all the ahead-of-its-time static site generator tech I innovated. So be it. Do it now. Use the great SSG stuff out there to better effect.

Here’s what a Jupyter Notebook looks like when you’re taking your deep breath in preparation for it to become a CLI command and a pip installable PyPI.org package:

Sprinkling In Nbdev Magic Fairy Dust

It’s hardly worth git committing it and all at this point, but I will for posterity’s sake.

Adding All Of Nbdev Crap

Okay… next? The time has come, the Walrus said, to make it know whether it was invoked from a Jupyter Notebook “run” or from the command-line. That’s the only purpose this program will have here at first: knowing how it was invoked. It will work one way if run as a Notebook and another way if run from the CLI…

Jupyter Detect If Run From Notebook Or CLI
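The idiom boils down to a few lines (the same test that shows up in the full code later in this post):

```python
# Detect execution context: IPython sets __IPYTHON__ on builtins,
# while a plain CLI run is __main__ without it.
if hasattr(__builtins__, "__IPYTHON__") or __name__ != "__main__":
    context = "notebook"
else:
    context = "cli"
print(context)
```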

Okay, simple enough. But now we want to immediately export a “Python package”, which is a fancy way of saying a core.py file generated as a result of that # export comment… oops, no can do without a settings.ini file! Rip it off from blogslicer then modify minimally. Okay, done. Also had to add the comment line that defines the name of the exported library. Now I have ~/github/skite/skite/core.py which is the Python package containing similar code to the Jupyter Notebook. We’re getting there.

Nbdev_build_lib Default_exp Core

And now we can try running core.py. It should output “bar” and it does:

Detecting If Run From Jupyter Notebook Or Command Line

Nice. We have the basic scaffolding now to build out. We need to parse args coming in, if from command-line and we need to set those same args in an in-Notebook fashion if run from Jupyter.

Ugh! Just as I suspected, I’m starting to get those annoying Github Run Failed: CI workflow messages. I don’t really want continuous integration at this point. There’s going to be a lot of stuff going on that would cause fails, so let’s turn those off!

We go into that repo’s Settings / Actions / General and Disable Actions

Github Repo Settings Actions General Disable Actions

Okay, now let’s arg parse. The main thing we’re going to want to take in is a filename of a csv file. Here’s the script detecting it was run from Jupyter:

Jupyter Detecting Script Run From Jupyter

And here is the script detecting it was run from the command-line interface (CLI):

Python Script Detecting It Was Run From Command Line Interface CLI

It’s another good time for a git commit here. Next we import both pandas and the Path class from the pathlib package in the standard library. It’s good practice to funnel filenames through Path() for interoperability of file paths between different host operating systems. I additionally made a simple .csv file.

foo,bar,spam,eggs
foo1,bar1,spam1,eggs1
foo2,bar2,spam2,eggs2

Jupyter Reading Csv File With Pandas
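In code, the step looks like this (recreating the little sample file first so the sketch is self-contained):

```python
from pathlib import Path
import pandas as pd

# The same toy csv from above, with the filename funneled through Path().
Path("sample.csv").write_text(
    "foo,bar,spam,eggs\nfoo1,bar1,spam1,eggs1\nfoo2,bar2,spam2,eggs2\n"
)
df = pd.read_csv(Path("sample.csv"))
print(df.shape)  # (2, 4): two data rows, four columns
```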

Next we want to test this from the command-line version, but the csv file is one directory-level up from where core.py is run, so we accommodate that in how we feed the filename. Unix-like paths are universal.

Nbdev Exported Package Taking Argument Displaying Pandas Dataframe

This may not look like much, but we are laying a foundational framework… ugh! I hate frameworks! But our framework is as close to “The Unix Way” as we can stay while in a Jupyter Notebook.

Okay, I’m getting tired. Here’s the last item for today. I’m replacing the sites.csv file with this visually formatted pipe-delimited “csv” file. CSV stands for comma-separated values, but in reality you can delimit a CSV file with almost any character you want because Pandas lets you declare it on the read_csv method.

<pre>
site                   |title                       |gaid
GuerillaTech.com       |Guerilla Tech               |G-N4RYB5DCV4
Levinux.com            |Levinux                     |G-0H19QDRNTL
LinuxPythonvimgit.com  |Linux, Python, vim & git    |G-ZNMVJFLRD2
LunderVand.com         |LunderVand                  |G-TJRX2PSWKT
Mike-Levin.com         |Mike Levin Dot Com          |G-K86B8JW5Q5
MikeAtEleven.com       |Mike At Eleven              |G-L7L2XR3J2G
MikeLev.in             |Mike Levin                  |G-RX2D1N1P2Y
MikeLevinSEO.com       |Mike Levin SEO              |G-RNSSPXFB53
Pipulate.com           |Pipulate                    |G-45KYH6XTTX
PythonicAlly.com       |Pythonically                |G-K5EQ2QQG5D
PythonicAlly.org       |Your Pythonic Ally          |G-1JFHT28DRL
RemovableFinger.com    |Removable Finger            |G-L3QGENNVJ3
TardigradeCircus.com   |Tardigrade Circus           |G-9YEC9X0GDW
TicTacUFOSightings.com |Tic Tac UFO Sightings       |G-YXNMZY6Z6F
WeeklyReports.zd       |Mike Levin's Weekly Reports |G-foobarbaz1
WhatsaMetaFor.io       |What's A Meta For           |G-HL9DEK1TSG</pre>

…and here’s the latest status of the Notebook. You will notice the lambda function where I strip out all the extra whitespace that the visual formatting adds (formatting I love to do for its own love-worthy sake).

Parsing Visually Formatted Pipe Delimited Csv File Pandas
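The cleanup amounts to this (with a shortened stand-in table instead of the real sites.csv):

```python
import io
import pandas as pd

raw = """site                 |title    |gaid
Example.com          |Example  |G-XXXXXXXXXX
"""
df = pd.read_csv(io.StringIO(raw), delimiter="|")
df = df.applymap(lambda x: x.strip())         # strip padding from every value
df.columns = [x.strip() for x in df.columns]  # ...and from the column names
print(df.loc[0, "title"])  # Example
```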

I’m losing steam, but this very much sets the stage for tomorrow. Maybe an early-morning stream? Perhaps.

Okay, just took a nap. It’s just about 3:00 AM now… getting back to it in the same post. I’m making such good progress… gotta hit this thing home. Close!

Go ahead and add a tagline for each site. Okay done, as well as adding sites.csv to the repo as a sample input file. That’s fine.

Okay, next?

Next we should use Popen against each entry. Don’t do anything too filesystemy yet. Oh, but first we actually want to be able to simply loop through the Pandas dataframe using each row as… hmmm, I want a list of namedtuples, one for each row. If I loop through the dataframe, each row is a Series object. Series can be turned into dicts…

Loop Through Pandas Df Each Row As Dict

Now each dict can be turned into a namedtuple. For those not into Python’s namedtuples yet, they’re very much like dictionaries but with a different API allowing Java-like dot notation (instead of square-bracket notation) for grabbing values inside of an object. They’re for other things too, but that’s how I’m using them now.

Pandas Dataframe To List Of Namedtuples Better Api
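Strung together, the row-to-namedtuple hop looks like this (tiny stand-in dataframe):

```python
from collections import namedtuple
import pandas as pd

df = pd.DataFrame({"apex": ["Example.com"], "title": ["Example"]})
Site = namedtuple("Site", "apex, title")

# Each iterrows() row is a Series; .to_dict() then ** unpacks it into the namedtuple.
sites = [Site(**row.to_dict()) for _, row in df.iterrows()]
print(sites[0].title)  # dot notation instead of square brackets
```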

You have no idea how painful it is to be making this epically long blog post and not being able to generate it with Github Pages to look and see how it’s coming along. I could be running Jekyll locally to do that and that may come later. But for now I’m really just in a race to get it finished enough so that I can generate the site. It’s all very meta and has me up at 3:45 AM working on it. Every day should feel like Xmas Eve with gifts awaiting you!

I just added one more column to sites.csv for a path to where the site’s files can be found on the filesystem relative to where the script is run. Currently, it is exactly the same as the “apex” column, but that is only because I name my directories using a site’s apex domain. I will probably end up putting ~/github/ before each of the path entries.

Pipe Delimited Visually Formatted Csv

And in case you’re wondering, yes I am still test-running the nbdev Python package from the command line interface:

Nbdev Test Running Python Package From Cli

Okay, now’s where it gets real. From subprocess import Popen. Refresh myself with its API…

We should start by looking at the recommended run() function first:

subprocess.run(args, *, stdin=None, input=None, stdout=None, stderr=None,
capture_output=False, shell=False, cwd=None, timeout=None, check=False,
encoding=None, errors=None, text=None, env=None, universal_newlines=None,
**other_popen_kwargs)

But instead, I’m actually using Popen() for its greater edge-case capabilities, which as I recall I needed when last doing stuff like this:

subprocess.Popen(args, bufsize=-1, executable=None, stdin=None,
stdout=None, stderr=None, preexec_fn=None, close_fds=True, shell=False,
cwd=None, env=None, universal_newlines=None, startupinfo=None,
creationflags=0, restore_signals=True, start_new_session=False,
pass_fds=(), *, group=None, extra_groups=None, user=None, umask=-1,
encoding=None, errors=None, text=None, pipesize=-1)

I’m getting all this from the official Python subprocess documentation at https://docs.python.org/3/library/subprocess.html which always has the problem of inadequate examples. It does at least give simple examples and important patterns to consider. For example, this is another “open” operation, which by now should (always) trigger the context manager in your mind, and indeed they have this example:

with Popen(["ifconfig"], stdout=PIPE) as proc:
    log.write(proc.stdout.read())

While this example is interesting, I need a much more real-world example that demonstrates to me operating system independence. Until such time as I’m running Jupyter Desktop under Linux (with WSLg… soon!), I need to suffer the backslashes. Back-slash, back-slash, back-slash… oh Microsoft, how cruel you are. Anyhoo, here’s my initial success building on knowledge from prior projects. I like this pattern. Keep in mind I’m using the .read() method to get the output, but there are options such as .readline() and .readlines() as well.

Python Subprocess Popen Pipe Windows Shell Stdout Read

One more test before we start building based on this. Happily the “cd” command is multi-platform (Windows & Linux), but so is (amazingly) the “echo” command… at least between Windows and Bash-based Terminals under Linux. I have to make sure I understand how parameters or arguments are passed in as members of a Python list. This is a little bit strange at first glance, but it’s consistent across the Python subprocess APIs so you’ve just got to understand and live with it. Commands that go on a single line in the CLI are broken into members of a list when defined Python-side.

Python Subprocess Popen Run Input Arguments Parameters List
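For an OS-independent smoke test I can reach for the interpreter itself (a sketch, not the actual command from my project):

```python
import sys
from subprocess import Popen, PIPE

# "python -c print('hi')" as one list: each CLI token is its own list member.
cmd = [sys.executable, "-c", "print('hi')"]
with Popen(cmd, stdout=PIPE) as proc:
    out = proc.stdout.read().decode().strip()
print(out)  # hi
```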

Python and Linux
OS Independence
Stuff making sense
To our AI Descendants (almost)
These are a few of my favorite things!

—Mike Levin, 2022

So in almost all fields of technical endeavor, when you’re trying new things and pushing your boundaries and success is unclear, there is often a Eureka! moment, as there was with Archimedes and the gold crown in the bathtub, and as with Orville and Wilbur Wright getting the ratios for flight right in their wind tunnel experiments. You just know everything has come together correctly. As humble as it may be in comparison, this is one of those moments for me on a personal level. Everything is going to pay off.

Success Guaranteed Moment New Static Site Generator System Is Born

Up to this point the code is pristine and bare-bones beautiful. I am a bit frozen. I know I must move forward, but I do an nbdev_clean_nbs and name this git commit “Barebones beautiful”.

What is the lightest touch I can possibly do to make this system spring to life again? Well, for starters I can ensure that the files I need for the blog slice & dice system are actually inside each folder. I can make sure I can actually cd to each folder in question. I can set the criteria for doing this next step. First and foremost:

Python pathlib Path() to the rescue! Let’s keep it OS-independent, people. 1, 2, 3… ? Simply cd into each location. That’s all. Change your current directory to reside inside each folder.

Okay wow, this one was a hard one. I had to set the path to each of the folders containing site source-contents for the static site generator process (really just slicing & dicing since Github Pages will do the generation, but still). Now I was tempted to put a ~/github as a prefix to every path entry, but not all operating systems use ~/ as a shortcut for home and I’m trying to keep this OS-independent. So I have to allow for one convention. Paths are relative to your “home” directory, which Python pathlib is thankfully able to get for you on both Windows and Linux. Here it is working in Windows:

Python Find User Home Path Os Expanduser Tilde Windows

And here is the same script running in a Linux Terminal:

Python Os Independent Home Path Expansion Tilde Linux
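The trick in both screenshots is essentially one line (Path.home() is an equivalent spelling; the repo path below is just a hypothetical example):

```python
import os
from pathlib import Path

home = Path(os.path.expanduser("~"))   # /home/user on Linux, C:\Users\user on Windows
repo = home / "github" / "skite"       # hypothetical site-source location
print(home.is_absolute())  # True
```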

So as you can see, we’re going to be able to set our current working directory correctly for the blog slice & dice command that I’m so urgently working towards.

This is where a project either stays tightly within the grip of your reins or the wagon goes careening out of control. Keep a tight grip!

Charlie Brown, you’ve been making solid contact with this football. Don’t go losing it now. Think!

Set your current working directory using the cwd parameter of Popen. Test it. Good, no errors.

I very much do not want a split between the Linux and Windows versions of this program, but in order to confirm we are in the correct current working directory, the command is cd from Windows but pwd from Linux. Ugh! Okay, we will do one bit of OS-detecting code-splitting bullshit so we can keep a tight grip on the reins. We have to confirm the cwd argument is working on both sides. It’s hateful to me that this is necessary, but here it is confirmed Windows-side:

Popen Current Working Directory Cwd Windows Vs Linux Python Os Name Nt Cd Pwd

And here it is confirmed Linux-side:

How To Check Current Folder Linux Pwd Print Working Directory Symlinks
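The one grudging os.name split, sketched:

```python
import os
from subprocess import Popen, PIPE

home = os.path.expanduser("~")
cmd = "cd" if os.name == "nt" else "pwd"   # "where am I?" per OS
with Popen(cmd, cwd=home, stdout=PIPE, shell=True) as proc:
    out = proc.stdout.read().decode().strip()
print(out)
```

With shell=True, `cd` with no arguments on Windows prints the current directory; on Linux, `pwd` does the same job.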

This is going to work, but it’s interesting to note that the symbolic link that made the github folder appear to be underneath my Linux-side home directory (~/ or /home/[username]) has been expanded to the Windows Subsystem for Linux (WSL) mount point (/mnt/c/Users/[username]). Okay, I can live with that.

We’re close. That was tightening the grip on the reins because this is precisely where things are going to go wrong.

I will not attempt to create bash scripts under Python wherein one command builds upon stateful variables and such of the previous commands. I will not attempt to maintain state or session between different code execution environments. Remember what we are doing here.

We are essentially replacing a “parent” bash script, which previously ran the show (and is quite a logical way to do this sort of thing), with a Python script as the new “parent”. This is smart because Python is tons easier to read and manage over the long term than bash script.

However, we will be calling sub-processes (sequentially) exactly as if we were still working from a bash script. Those subprocesses may (and will) be other Python scripts. There is a temptation here to “keep state” within subprocesses. We must not give in to this temptation.

All data regarding a “next step” must be passed in explicitly from the context of the parent script every time. Subscripts are phantoms. They have no memory. We call them using everything they need to know on the call, and then we exit. After that, they were never there. There is no spoon.

Okay, deep breath. It is time to slice & dice your blogs again. Forget about supplementary pages like homepage, about pages, style.css files and all the rest of it. Oh, we will need to add and commit all new files in assets/images since those are so required for building pages like this. But that’s the only extra thing beyond the blogslicer command… the blogslicer command… the blogslicer command…

This is not your first project like this. There is another. Show the nice people blogslicer:

# export

import argparse
from pathlib import Path
from dateutil import parser
from slugify import slugify
from dumbquotes import dumbquote


if hasattr(__builtins__, "__IPYTHON__") or __name__ != "__main__":
    from IPython.display import display, Markdown

    h1 = lambda text: display(Markdown(f"# {text}"))
    h2 = lambda text: display(Markdown(f"## {text}"))
    h3 = lambda text: display(Markdown(f"### {text}"))

    folder_name = "../PythonicAlly.com"
    blog_title = "Pythonic Ally"
    blog_slug = "blog"
    author = "Mike Levin"
else:
    h1 = lambda text: print(f"# {text}")
    h2 = lambda text: print(f"## {text}")
    h3 = lambda text: print(f"### {text}")

    aparser = argparse.ArgumentParser()
    add_arg = aparser.add_argument
    add_arg("-p", "--path", required=True)
    add_arg("-t", "--title", required=True)
    add_arg("-s", "--slug", required=True)
    add_arg("-a", "--author", required=True)
    args = aparser.parse_args()

    folder_name = args.path
    blog_title = args.title
    blog_slug = args.slug
    author = args.author

index_front_matter = f"""---
layout: default
author: {author}
title: "{blog_title}"
slug: {blog_slug}
permalink: /blog/
---

"""
index_front_matter += f"# Welcome to The {blog_title} Blog"


journal_path = f"{folder_name}/journal.md"
output_path = f"{folder_name}/_posts/"
slicer = "-" * 80

Path(output_path).mkdir(exist_ok=True)

dates = []
counter = -1
date_next = False
with open(journal_path, "r") as fh:
    for line in fh:
        line = line.rstrip()
        if date_next:
            try:
                adate = line[3:]
                date_next = False
                adatetime = parser.parse(adate).date()
            except:
                adatetime = None
            dates.append(adatetime)
            date_next = False
        if line == slicer:
            date_next = True
            counter = counter + 1
dates.reverse()

table = []
at_top = True
index_list = []
with open(journal_path, "r") as fh:
    for i, line in enumerate(fh):
        line = line.rstrip()
        if line == slicer:
            if at_top:
                at_top = False
                table = []
                continue
            adatetime = dates[counter - 1]
            filename = f"{output_path}{adatetime}-post-{counter}.md"
            h3(f"FILE: {filename}")
            with open(filename, "w") as fw:
                title = f"Post {counter}"
                slug = title
                if table[0] == slicer:
                    table = table[1:]
                maybe = table[1]
                has_title = False
                if table and maybe and maybe[0] == "#":
                    title = maybe[maybe.find(" ") + 1 :]
                    has_title = True
                slug = title.replace("'", "")
                slug = slugify(slug)
                top = []
                top.append("---\n")
                top.append("layout: post\n")
                top.append(f'title: "{title}"\n')
                top.append(f'author: "{author}"\n')
                top.append(f"categories: {blog_slug}\n")
                top.append(f"slug: {slug}\n")
                top.append(f"permalink: /{blog_slug}/{slug}/\n")
                try:
                    fdate = adatetime.strftime("%m/%d/%Y")
                except:
                    fdate = None
                link = f"- [{title}](/{blog_slug}/{slug}/) {fdate}"
                index_list.append(link)
                top.append("---\n")
                top.append("\n")
                top_chop = 2
                if has_title:
                    top_chop = 3
                table = [f"{x}\n" for x in table[top_chop:]]
                table = top + table
                print("".join(table))
                fw.writelines(table)
            counter = counter - 1
            table = []
        table.append(line)

index_page = index_front_matter + "\n\n" + "\n".join(index_list)

with open(f"{folder_name}/blog.md", "w") as fh:
    fh.writelines(index_page)

Look familiar? It should. It’s the previous project where I worked through many of these same issues.

This program takes the following arguments: -p/--path, -t/--title, -s/--slug and -a/--author.

Now you might think I should have put all of this in sites.csv. You’d be correct that I didn’t, however the slug is always going to be /blog/ and the author is always going to be yours truly. If in the future I use this system to manage blogs with different “slugs” and “authors”, I’ll update the system. Because it’s so readable now, such updates will not be a problem.

I rewrote this system, in case you haven’t inferred by now, because I lost grip on the reins. The wagon careened out of control. And when I downgraded my Ubuntu OS because LXD broke it, I took it as a sign to do this tight re-write. Well, the re-write is done and it is tight. It’s tighter than blogslicer. I will have to revisit that sometime in the future, but the 80/20-friggin’ rule, gotta use it, it’s a tool.

Okay, so construct the blogslicer command for each site.

Let’s nail down the path of that Python executable. Each running instance of Python knows where its own executable is stored, and it is knowing steps like this that keeps projects like this from careening. Absolute paths are your friends. Fixed positions are your friends. Commands that can execute independently of each other (albeit with sequential order dependence) are your friends. THAT’S the point I’m up to here. Tight grip! Find that Python.exe… sys.executable! I’m already importing os, and what do I care about memory? This isn’t one of those scaling apps. So import sys too!

python = sys.executable

No problem! Now blogslicer is going to need to be called from that.
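sys.executable hands back the absolute path of the interpreter currently running, which makes the constructed command unambiguous:

```python
import sys
from pathlib import Path

python = sys.executable
# Absolute path means the child process can never pick up a different Python.
print(Path(python).is_absolute())  # True
```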

Okay, the first attempt to execute the blogslicer command is succeeding on some sites and failing on others. We need to zero in on the point of failure.

Okay, remember the dependencies:

pip install python-slugify
pip install dumbquotes

Okay, wow… I think I have it basically working. I am waiting for this epic post to push out and sync on Github Pages.

The code is getting too long for screenshots, so I’ll just put this here:

# export

import os
import sys
import argparse
import pandas as pd
from pathlib import Path
from collections import namedtuple
from subprocess import Popen, PIPE

python = sys.executable
home = Path(os.path.expanduser("~"))
blogslicer = Path(home / Path("github/blogslicer/blogslicer/core.py"))
if hasattr(__builtins__, "__IPYTHON__") or __name__ != "__main__":
    from IPython.display import display, Markdown

    h1 = lambda text: display(Markdown(f"# {text}"))
    h2 = lambda text: display(Markdown(f"## {text}"))
    h3 = lambda text: display(Markdown(f"### {text}"))

    file = "sites.csv"
else:
    h1 = lambda text: print(f"# {text}")
    h2 = lambda text: print(f"## {text}")
    h3 = lambda text: print(f"### {text}")

    aparser = argparse.ArgumentParser()
    add_arg = aparser.add_argument
    add_arg("-f", "--file", required=True)
    args = aparser.parse_args()
    file = args.file

h1("Generating sites...")
file_obj = Path(file)
df = pd.read_csv(file_obj, delimiter="|")
df = df.applymap(lambda x: x.strip())
df.columns = [x.strip() for x in df.columns]
Site = namedtuple("Site", "path, apex, title, gaid, tagline")

h2(f"Python: {python}")
h2(f"Blogslicer: {blogslicer}")
print()
for index, series in df.iterrows():
    site = Site(**series.to_dict())
    h3(site.apex)
    here = Path(home / site.path)
    cmd = f'{python} {blogslicer} -p {here} -t "{site.title}" -s "blog" -a "Mike Levin"'
    print(cmd)
    print()
    with Popen(args=cmd, cwd=here, stdout=PIPE, stderr=PIPE, shell=True) as pout:
        for line in pout.stdout.readlines():
            print(line.decode().strip())
h2('Done!')

Are we done here? Hmmm, no. I have to make the git portion of this work as well. Ugh, that means for the multi-OS aspect, I need git for Windows. Check if I have it… I don’t. And I really need it to be available not just under Windows, but from the Jupyter Desktop Terminal context. Google git for Windows and learn the latest gory details: https://git-scm.com/download/win

Good news, everyone! It seems there is a command you can execute from PowerShell to install git. And since the Jupyter Desktop Terminal is PowerShell, there is a decent chance that git will be available as a command under Terminal.

winget install --id Git.Git -e --source winget

After a huge install of mingw64 and what looks nearly like a cygwin install, I still can’t run git from a Jupyter Terminal. Ugh! It does run from a normal PowerShell. Damn it, figure out what to do. Look at the Path.

This lets you look at PowerShell environment variables:

dir env:

Looking over the printout, I think the culprit is PSMODULEPATH

Under Jupyter Terminal vs. Under Microsoft Terminal:

C:\Users\mikle\Documents\WindowsPowerShell\Modules;C:\Program Files\WindowsPowerShell\Modules;C:\WINDOWS\system32\WindowsPowerShell\v1.0\Modules
C:\Users\mikle\Documents\WindowsPowerShell\Modules;C:\Program Files\WindowsPowerShell\Modules;C:\WINDOWS\system32\WindowsPowerShell\v1.0\Modules

Nope, they’re identical! Figure out where git is located:

(get-command git).Path

Returns:

C:\Program Files\Git\cmd\git.exe

Okay, pshwew! Make sure that runs from Jupyter Terminal!

C:\Program` Files\Git\cmd\git.exe

Okay, got it. You need a back-tick before the space. Wow! Make sure you’ve got git on both sides.

Ugh, nonsense! Okay…

pip install GitPython

Nope! I take that back too. However, the GitPython package did give me the answer:

import os
if os.name == 'nt':
    git_exe = r"C:\Program Files\Git\cmd\git.exe"
else:
    git_exe = "/usr/bin/git"
os.environ["GIT_PYTHON_GIT_EXECUTABLE"] = git_exe

And I think I’ll introduce my first function.

def git(args):
    cmd = [git_exe] + shlex.split(args)
    with Popen(args=cmd, cwd=here, stdout=PIPE, stderr=PIPE, shell=True) as pout:
        print(f"GIT: {shlex.join(cmd)}")
        print(f"STDOUT: {pout.stdout.read().decode()}")
        print(f"STDERR: {pout.stderr.read().decode()}")
        print()

Ugh, Windows git was wreaking havoc on my files, especially the big repos.

git config --global core.autocrlf true
git config --global core.eol crlf

And I need a better method of streaming output from Popen. As much as I like the context manager, it has to go:

def git(args):
    cmd = [git_exe] + shlex.split(args)
    process = Popen(
        args=cmd,
        cwd=here,
        stdout=PIPE,
        stderr=PIPE,
        shell=False,
        bufsize=1,
        universal_newlines=True,
    )
    print(f"GIT: {shlex.join(cmd)}")
    for line in process.stdout:
        print(line)
        sys.stdout.flush()

I’m still having issues with Windows git changing every file in a repo.

I think I have to do this too:

git config --global core.filemode false

And a few tweaks, including blanking the folder beforehand so renamed blog posts don’t sneak in as duplicates:

# export

import os

if os.name == "nt":
    git_exe = r"C:\Program Files\Git\cmd\git.exe"
else:
    git_exe = "/usr/bin/git"
os.environ["GIT_PYTHON_GIT_EXECUTABLE"] = git_exe
import sys
import shlex
import argparse
import pandas as pd
from git import Repo
from pathlib import Path
from collections import namedtuple
from subprocess import Popen, PIPE


python = sys.executable
home = Path(os.path.expanduser("~"))
blogslicer = Path(home / Path("github/blogslicer/blogslicer/core.py"))
if hasattr(__builtins__, "__IPYTHON__") or __name__ != "__main__":
    from IPython.display import display, Markdown

    h1 = lambda text: display(Markdown(f"# {text}"))
    h2 = lambda text: display(Markdown(f"## {text}"))
    h3 = lambda text: display(Markdown(f"### {text}"))

    file = "sites.csv"
else:
    h1 = lambda text: print(f"# {text}")
    h2 = lambda text: print(f"## {text}")
    h3 = lambda text: print(f"### {text}")

    aparser = argparse.ArgumentParser()
    add_arg = aparser.add_argument
    add_arg("-f", "--file", required=True)
    args = aparser.parse_args()
    file = args.file

h1("Generating sites...")
file_obj = Path(file)
df = pd.read_csv(file_obj, delimiter="|")
df = df.applymap(lambda x: x.strip())
df.columns = [x.strip() for x in df.columns]
Site = namedtuple("Site", "path, apex, title, gaid, tagline")


def git(cwd, args):
    cmd = [git_exe] + shlex.split(args)
    h2(f"git cmd: {shlex.join(cmd)}")
    process = Popen(
        args=cmd,
        cwd=cwd,
        stdout=PIPE,
        stderr=PIPE,
        shell=False,
        bufsize=1,
        universal_newlines=True,
    )
    h3('git stdout')
    for line in process.stdout:
        print(line.strip())
        sys.stdout.flush()
    print()
    h3('git stderr')
    for line in process.stderr:
        print(line.strip())
        sys.stdout.flush()
    print()


h2(f"Python: {python}")
h2(f"Blogslicer: {blogslicer}")
print()
for index, series in df.iterrows():
    site = Site(**series.to_dict())
    h3(site.apex)
    here = Path(home / site.path)
    # Blank the _posts folder so renamed posts don't linger as duplicates
    for x in Path(here / "_posts/").glob("*"):
        x.unlink()

    # Blog Slicer
    cmd = f'{python} {blogslicer} -p {here} -t "{site.title}" -s "blog" -a "Mike Levin"'
    print(cmd, end="\n\n")
    with Popen(args=cmd, cwd=here, stdout=PIPE, stderr=PIPE, shell=True) as pout:
        for line in pout.stdout.readlines():
            print(line.decode().strip())

    git(here, "add _posts/*")
    git(here, 'commit -am "Publishing Blog Posts"')
    git(here, "push")

h2("Done!")
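For context, here is a hypothetical site.csv in the shape the script above parses (pipe-delimited, whitespace-padded, with the five Site namedtuple columns); every value below is made up for illustration:

```shell
# Columns match the Site namedtuple: path, apex, title, gaid, tagline.
# The script strips the padding whitespace after loading.
cat > /tmp/site.csv <<'EOF'
path              | apex       | title          | gaid        | tagline
github/MikeLev.in | mikelev.in | Mike Levin SEO | UA-XXXXXX-1 | Future-proof your skills
EOF
cat /tmp/site.csv
```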

I need to tie this process to a vim macro again.


Thu Jun 16, 2022

There’s No Place Like Home & You Get To Decide Where That Is… Or Do You?

category: about

This video amounts to:

We are turning ~/ from /home/healus to /mnt/c/Users/mikle

The rest is foundational background knowledge.

For the years that I’ve been using WSL (Windows Subsystem for Linux), I’ve functionally been using my Windows home folder C:\Users\[username], cd’ing to it with my .bash_profile script that runs every time a Linux terminal is opened. When I deleted my Ubuntu 20.04 to downgrade to 18.04 for some technical reasons, it turned out that having all my important “home” files on the Windows side saved my butt. It gave me the freedom to delete the Linux container without fear because the home directory lived outside of it.

In the video I made on the topic, I made a big deal of being able to pull down all your work again from GitHub, but honestly I knew I had a safety net in having used this other approach, which I felt I was ready to abandon. Now, having gone through the process, I’ve realized there are files here and there I didn’t have in GitHub and am glad I had Windows-side. The straw that broke the camel’s back is my vim spellcheck library. I’m going to improve my approach. I’m going to do it here live at 5:00 AM as the sun rises.

I’ve given myself a good half-hour cushion between announcing (scheduling) the stream and now. So get your ducks in a row. Know the issues.

It looks like there’s plenty of information on this one askubuntu.com link:

https://askubuntu.com/questions/465493/how-can-i-symlink-my-home-folder-from-another-drive

Looks like a Stack Exchange site. Explain to the nice folks the difference between Stack Exchange (the reputation-driven Q&A CMS) and the individual sites created with it, like Stack Overflow.

Anyhoo, the magic words are:

ln -s /extra-home/username /home/username

…which for me means:

ln -s /mnt/c/Users/mikle /home/healus

And that should do it!

However, as healus you can’t very well go changing such a major thing about yourself while you are you, so for a short while it’s good to be somebody else. That somebody is superuser:

sudo su

It pains me to be doing such foundational stuff OS-wise still into my 50s. But this symlink stuff is lifetime foundational. Under *nix operating systems, it’s akin to piping and everything being a read/write file. It teaches you about the Unix way and will serve you for your entire life. I should have used symlinks more through my life, but it’s never too late.

Bring up docs on the ln command. Explain the -s switch.
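A sketch of what those docs boil down to, using throwaway paths rather than the real home directories (-s makes a symbolic link instead of a hard link):

```shell
# Make a fake "extra home" and a fake /home to link it into
mkdir -p /tmp/extra-home/username /tmp/home
rm -f /tmp/home/username

# Same shape as the askubuntu recipe: ln -s <target> <link-name>
ln -s /tmp/extra-home/username /tmp/home/username

readlink /tmp/home/username   # prints: /tmp/extra-home/username
```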

Oh! Show the nice people manual pages and how they eliminate the need to tack --help onto commands like ls! I should have shown that long ago. Maybe a separate video?

Okay, but did it work? It’s going to take a shutdown and restart to be sure. From a PowerShell or CMD prompt:

wsl --shutdown

And afterwards, it’s still going to look like ~/ in the prompt, so print the working directory:

pwd

Hopefully, that’s the successful video.

Maybe go into setting up your virtualenv as a bonus. Oh, that’s going to make my new startup script create errors because there are locations that aren’t there. Whoops. Disable your old .bash_profile so those errors don’t occur. Just rename it. Hmmm, no. That’s one of the few files that’s gone. I have it in the helpers repo so it’s preserved but won’t cause the error. This is good.

I am ready for the video.

Learnings:

Symbolic links are easy to make.

They are made using the source’s “folder name”

The source’s folder name is created in the target location.

You cannot so easily remap your ~/ Linux/Unix home location to the WSL /mnt/c/Users/[username] location even though it’s soooooo tempting.

Instead, create symbolic links from your favorite Windows-side folders (github) to /home/[username] locations Linux-side.

This is a compromise, but satisfies my 80/20-rule needs. If you forget to commit to git repos and delete your WSL Linux container, your work is still there.

But beware losing .config files from your Linux ~/ (they need backing up)
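The compromise above, sketched with a stand-in home directory (the real link would land in your actual Linux ~/; the Windows-side path is the one from my WSL prompt and will vary by username):

```shell
# Stand-in for the Linux home (assumption: demo path, not the real ~)
demo_home=/tmp/demo-home
mkdir -p "$demo_home"

# Link the Windows-side github folder into the Linux home.
# -sfn: symbolic, overwrite an existing link, don't dereference.
ln -sfn /mnt/c/Users/mikle/github "$demo_home/github"

readlink "$demo_home/github"   # prints: /mnt/c/Users/mikle/github
```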


Wed Jun 15, 2022

Microsoft Edge history is now syncing (turn off)

category: tech

I just loaded Edge tonight after a reboot and was greeted with this happy message:

Microsoft Edge history is now syncing. We’ve added sync for browsing history and open tabs so you can continue browsing on any device. You can always customize what to sync in settings.

Microsoft Edge History Is Now Syncing

Not good. It’s bad enough how much Google tracks us through our gmail logins. Now Microsoft is trying to stay in parity. Let’s shoot a video on how to turn this off, protect your privacy (a little bit) and make this the video and page when people go searching on this topic.


Wed Jun 15, 2022

LXD on WSL2 is a Few Months Ahead Of Its Time

category: linux

I am almost always doing what people are really going to want to do a few months from now.

Next try! Remind everyone why this is so worth it… that pic!

Do this until it is done. Don’t give up and let the nice folks see the Charlie Brown routine. Kick at the football and end up on your back as many times as it takes. But don’t be tricked. Intelligently adjust technique.

There still seems to be confusion about LXD vs. Docker. Clarify. Show how Canonical is the project lead (a recent discovery of mine).

https://linuxcontainers.org/

And THAT’S where we begin!

This is a holding pattern… the same way I’m on Windows 10 but telling people on Windows 11 to wsl --install, I’m biting the bullet and chasing the rabbit for benefit today.

It’ll come into better shape tomorrow, guaranteed.

healus@LunderVand:~$ sudo lxd init
Would you like to use LXD clustering? (yes/no) [default=no]:
Do you want to configure a new storage pool? (yes/no) [default=yes]:
Name of the new storage pool [default=default]:
Name of the storage backend to use (btrfs, dir, lvm) [default=btrfs]:
Create a new BTRFS pool? (yes/no) [default=yes]:
Would you like to use an existing block device? (yes/no) [default=no]:
Size in GB of the new loop device (1GB minimum) [default=50GB]: 10
Would you like to connect to a MAAS server? (yes/no) [default=no]:
Would you like to create a new local network bridge? (yes/no) [default=yes]: n
Would you like to configure LXD to use an existing bridge or host interface? (yes/no) [default=no]: y
Name of the existing bridge or host interface: eth0
Would you like LXD to be available over the network? (yes/no) [default=no]:
Would you like stale cached images to be updated automatically? (yes/no) [default=yes]
Would you like a YAML "lxd init" preseed to be printed? (yes/no) [default=no]:

I have overcome big obstacles:

Charlie Brown kicked the football, but doesn’t know how to play the game now. I need to put the time in on learning the lxc commands and general API.

I am feeling bleeding edge pain of early adopters.


Wed Jun 15, 2022

Reverting WSL2 From Ubuntu 20.04 to 18.04 To Get LXD Containers Working

category: linux

In a previous video, I Charlie Brown kicked at the football… and missed. In the voice of Lucy:

Oh, Charlie Brown. When will you stop trying to win the prize?

NEVER! Just don’t chase the rabbit forever.

Improving Backoff & Retry Methodology

Figure out what backoff and retry in this case properly means. I’m not alone in wanting this. Others have explored this particular Wonderland. Figure out how to walk in their footsteps.

Make the above the beginning to a new video!

lxd-on-wsl-ubuntu-20.04-focal-fassa-to-18.04-bionic-beaver.JPG

(py310) ubuntu@LunderVand:/mnt/c/Users/mikle/github$ sudo lxd
internal error, please report: running “lxd” failed: timeout waiting for snap system profiles to get updated

Ah, information! That helps. But get straight the documentation you’re going to be following. Make this primarily about:

https://wsl.dev/wsl2-lxd-funtoo/ https://blog.simos.info/how-to-run-lxd-containers-in-wsl2/

After all this, it may still be a colossal fail! Let the world see you try to kick that football Charlie Brown!

Don’t be embarrassed!

The Methodology of venv, VirtualEnv and LXC / LXD

LXD is like Docker, but part of Linux proper.

LXD is the container system allowing virtual machine-like Linux instances, but better, because they are faster and more efficient (sharing the host machine’s kernel, i.e. its resources). It descends from the Unix chroot command, but is much more user-friendly now.

Put your keys back in place so that git cloning works, and you get your stuff back!

.vimrc
Key work repos (git repositories, usually off of GitHub)

ZFS is better because the old Sun Microsystems is generally better at everything than everyone. Research OpenSolaris (or just Solaris) and ZFS in particular. It was designed for nearly zero compromise against data loss.

The OS tied truly to the hardware is special. The OSes tied to virtual machine management layers are interchangeable and more flaky.

Good fixed-file-location habits… you’ll never wonder where that thing is again!

There’s no place like ~/github

Docker won… in a lot of places. Better for the resume. However, Docker is not an official part of Linux. LXD is core Linux knowledge now… Docker isn’t.

LXD is just like venv or virtualenv… it’s just about packaging dependencies and keeping play-work from polluting your important main OS.

Think about Python without venv or virtualenv… it’s unthinkable.

WSL2 installs on Windows 11 easily… not so on Windows 10, so my Pied Piper message of an easy wsl --install ONLY WORKS if you’re on Windows 11. Why?

Windows 10 makes you perform a series of manual steps (too much for the average user):

On Windows 11, you just type:

wsl --install

…and it does all the above steps.

tmux vs. screen: screen is part of the official GNU command set (every Unix or Linux has it).

LXD (if not today then in the near future) is always there. It’s generic tech.

Generic tech (as unsexy as it sounds) is ALWAYS better than fad du jour.

Python is better than JavaScript… huh?

Python has replaced Perl in default Linux distros… more important than JavaScript in a browser, because the browser is just one API (to humans). Linux is the API to EVERYTHING. Python is the new Unix bash script.

Timeless LXD… timeless.

3 Timeslots you might catch me:

~8:00 AM EST, before work
~12:00 PM Noon EST (lunch)
~6:00 PM EST, *wildcard (whenever I’m inspired)

I plan on getting new *nix commands “into my fingers”… that means it’s ready to use any moment of your life from now forward… no question of whether it’s there. No questions of how it works or whether it might ever let you down. No, it’s now like it’s part of your body.

As an in-house SEO very happy where I am, I don’t have to pander to resume keywords… I can develop my skills how I like. Kubernetes?

Heavyweight tech liability, both on the actual hardware (even if cloud) and in your mind. MySQL, PostgreSQL, Redis, MongoDB… because mikey likey SQLite.

SQLite good

I’d be on venv full-time, except all sample code uses virtualenv


Wed Jun 15, 2022

Windows Updates KB5014699 & KB5013887 & LXD on WSL2 Under Ubuntu 20.04

category: tech

Wow, this latest Windows Update was a rough one. I had been doing a number of low-level OS things lately, especially on the Linux side of the Windows Subsystem for Linux (WSL, WSL2), trying to get the LXD container system to work, and figured it was time for a full system reboot. I saw the little red dot on the power icon of Windows 10 indicating there were downloaded updates to apply, and I thought, “Great! These must be the updates I’m waiting for.”

21H2 View Update History Windows Feature Updates

These Are Not The Windows Updates You’re Looking For

But after a very long update process on the blue screen, it went to a black screen and never came back. I gave it a good generous 10 minutes… and no luck! Force a restart with the “soft” hard reset button (there are no longer real hardware reset buttons on laptops; that’s insane).

After the forced restart I went into Settings / Windows Update to see if I could get a description of what just happened to my machine and it occurred to me to do this livecast. It’s early. It’s of interest.

Get into the Zone… The Flow… The Bear Necessities…

Forcibly get into the zone this morning because of distractions that could not be allowed. One controls one’s own mind. Your mind is yours. No one and nothing has the right to derail or otherwise precondition you for the day in ways you do not want for yourself. Journals like this and playlists like that are tools to help you condition your mind.

Yesterday I tried getting LXD (standard Linux containers) working under WSL2 (Windows Subsystem for Linux)… and failed due to a number of reasons. Things didn’t work as advertised. The remedies were deep rabbit holes. I didn’t have a successful journey through Wonderland yesterday.

Windows Update KB5014699

June 14, 2022—KB5014699 (OS Builds 19042.1766, 19043.1766, and 19044.1766)

On May 19, 2022, we released an out-of-band (OOB) update to address an issue that might cause machine certificate authentication failures on domain controllers. If you haven’t installed the May 19, 2022 or later releases, then installing this June 14, 2022 update will also address that issue. For more information, see the Before installing this update section in this article.

Windows Update KB5013887

June 14, 2022-KB5013887 Cumulative Update for .NET Framework 3.5 and 4.8 for Windows 10, version 20H2, Windows Server, version 20H2, Windows 10 Version 21H1, and Windows 10 Version 21H2

The June 14, 2022 update for Windows 10, version 20H2, Windows Server, version 20H2, Windows 10 Version 21H1, and Windows 10 Version 21H2 includes cumulative reliability improvements in .NET Framework 3.5 and 4.8. We recommend that you apply this update as part of your regular maintenance routines. Before you install this update, see the Prerequisites and Restart requirement sections.

These are not the updates I’m looking for.

So Give A Try At LXD under WSL Again

So try getting sudo lxd init “unlocked”

Edited zfs out of /etc/modules

Restarted Linux under WSL

Following instructions under https://blog.simos.info/how-to-run-lxd-containers-in-wsl2/ pedantically.

No luck, no luck, no luck!

LXD on Ubuntu 18.04 works vs. 20.04, where it doesn’t?

Ugh! Is it being on Ubuntu 20.04 versus 18.04? Maybe. Yes, just maybe.

Switch your main laptop over to Ubuntu 18.04? Simultaneously? Delete 20.04 for fewer moving parts? Make my next video out of it? Ugh! Maybe. Just maybe.


Tue Jun 14, 2022

Failed Getting LXD Working Under WSL2 Under Ubuntu 20.04 systemd & snap

category: linux

Trying to get LXD working under WSL2

The prize is big;
On turf I’m first.
To crack this nut
I have a thirst.
The itch I scratch
Make no mistake’s
The bug you catch:
Windows ESCAPE!

—Mike Levin, 2022

First Attempt Failed Despite Heroic Rabbit-Chasing Efforts

Play pied piper showing the way to fullscreen Linux Terminal goodness.

Ugh! The last livestream attempting to

sudo lxd init

Failed on 2 fronts:

  1. It did not present zfs as an option
  2. Then it did not result in anything listed when trying:

    sudo lxd list

There were error messages and it seemed network-related, so I have 2 things I’m going to try:

More information on making zfs available as an option: https://stanislas.blog/2018/02/lxc-zfs-pool-lxd/

It suggests that you first install the kernel headers to allow us to compile and install kernel modules:

apt install linux-headers-$(uname -r)

It then suggests we install the ZFS and DKMS modules (old instructions didn’t have DKMS):

apt install zfs-dkms zfsutils-linux

Next it says we need to load the module:

modprobe zfs

And it also says that after every restart it needs to be reloaded, which we can do with this:

echo "zfs" >> /etc/modules

Tue Jun 14, 2022

Trying Again To Get LXD Working Under Microsoft Subsystem For Linux

category: linux

Wow, get right to the wizard:

sudo lxd init

This one is unusual because it is during the workday, but it is a major dependency I have to satisfy before I proceed even with my day-job, so get ‘er done. Turn off monetization on the livestream on this one to avoid any issues. This falls under “the best way to learn is to teach,” and if livestreaming this step is the best way to get over this hump, so be it.

What comes after sudo lxd init?

Load the official LXD instructions: https://linuxcontainers.org/lxd/getting-started-cli/

Be on the lookout for the image server and image names. They get used in an instantiation command:

lxc launch <image_server>:<image_name> <instance_name>

For example:

lxc launch images:ubuntu/20.04 ubuntu-container

If you get this far in this video, you’re golden.

STOP!

Name of the new storage pool [default=default]:
Name of the storage backend to use (btrfs, dir, lvm, ceph) [default=btrfs]: zfs
Invalid input, try again.

Name of the storage backend to use (btrfs, dir, lvm, ceph) [default=btrfs]:
Create a new BTRFS pool? (yes/no) [default=yes]:
Would you like to use an existing empty disk or partition? (yes/no) [default=no]:
Size in GB of the new loop device (1GB minimum) [default=30GB]: 5
Would you like to connect to a MAAS server? (yes/no) [default=no]:
Would you like to create a new local network bridge? (yes/no) [default=yes]:
What should the new bridge be called? [default=lxdbr0]:
What IPv4 address should be used? (CIDR subnet notation, “auto” or “none”) [default=auto]:
What IPv6 address should be used? (CIDR subnet notation, “auto” or “none”) [default=auto]:
Would you like LXD to be available over the network? (yes/no) [default=no]:
Would you like stale cached images to be updated automatically? (yes/no) [default=yes]
Would you like a YAML "lxd init" preseed to be printed? (yes/no) [default=no]:
Error: Failed to create network 'lxdbr0': Failed adding outbound NAT rules for network "lxdbr0" (ip): Failed apply nftables config: Failed to run: nft
table ip lxd {
chain pstrt.lxdbr0 {
        type nat hook postrouting priority 100; policy accept;
        ip saddr 10.49.21.0/24 ip daddr != 10.49.21.0/24 masquerade
}
}
: Error: Could not process rule: No such file or directory

      ^^^^^^^^^^^^
Error: No such file or directory; did you mean chain ‘out.lxdbr0' in table ip ‘lxd'?

      ^^^^^^^^^^^^
ubuntu@LunderVand:/mnt/c/Users/mikle/github$

Kicked at the football… and missed. Oh, Charlie Brown. When will you stop trying to win the prize?

NEVER! Just don’t chase the rabbit forever. Figure out what backoff and retry in this case properly means. I’m not alone in wanting this. Others have explored this particular Wonderland. Figure out how to walk in their footsteps.

Make the above the beginning to a new video!


Tue Jun 14, 2022

Sun Microsystem’s Zettabyte File System (OpenSolaris ZFS) Under WSL2

category: linux

Sun Microsystems… Son, Mike Knows Systems

WHAT: Linux under Windows under WSL gets you a Linux terminal, but you don’t want to FUBAR it.

WHY: Commitment & Consistency… tell the nice people you’re doing it and you will force yourself to do it. (life-hack)

Thank You Robert Cialdini For Your Work (Influence, Science & Practice)

There’s no surer way to push yourself forward than commitment & consistency. Remind the folks about Robert B. Cialdini and correct the pronunciation.

The Best Laid Plans of Charlie Brown (Kicking The Football)

Don’t make a big long rambling video like before. Just set up the steps necessary. Get the preview going now! Okay, done by 6:45. Not bad. Some folks will get notified in time. I’m getting more and more of a feel for this. Like anything else, it comes with practice. But being on the bleeding edge as I am, expect your muscle memory to have to be retrained and retrained. This is NOT like driving a car. It’s like flying a jerry-rigged Wright brothers bicycle airplane. So be it. To get the benefit today, it’s worth it.

AFTER THE FACT NOTE: This video started out being just a quick lesson on installing the ZFS filesystem (from Sun Microsystems) as a prerequisite to having Linux containers (LXD) under the Windows Subsystem for Linux (WSL) but due to changes in how the Linux service manager (systemd) works under Ubuntu 20.04 (under WSL?) and difficulty installing the SNAP store, things became much more complicated. But I overcame! I prevailed. And on the next video, we’ll actually activate LXD containers under WSL… a vital and life-changing Linux trick.

Come See How The Sausage Is Made (Not For The Faint Of Heart)

Document process and jump right into it with the folks this morning…

I’ve been putting this off. Find the YouTube link to the guy. KeepItTechie. Go subscribe to KeepItTechie. He’s awesome.

When is a good time? Now is a good time!

zfsutils

You have access to all the best developer stuff now that you have a Linux terminal under Windows.

Windows 10 is Wonderland (no friends, everything is harder)
Windows 11 is Oz (you have friends, everything gets easier with friends)

Installing the Zettabyte File System (ZFS)… talk history (quickly)

Sun Microsystems… Commodore of Workstations

Like Commodore, Sun did everything better than the industry at large (at first). It must not have been that great, because they’re both gone.

Sun bought by Oracle… Berkeley DB (NoSQL before SQL) and MySQL (of the LAMP stack before that died). Redis MongoDB CouchDB… PostgreSQL

Before Oracle bought Sun, Sun free and open sourced a whole bunch of stuff… so before the kibosh was put on Sun’s awesomeness, Sun let much of their tech free…

OpenSolaris, a version of Unix you can think of like a branch different from FreeBSD.

Yay! I have ZFS on my system… next step is getting LXD active.

Everything I’ve shown you so far in this video is to be able to answer zfs on the question:

Name of the storage backend to use?

Docker is nice… but Docker is for devs… not everyday Linux users, unless you’re just running stuff distributed on Docker (SNAP store).

But when you want a persistent but throw-away local Linux playground, Docker is way confusing and overkill. Linux governance thinks so too and didn’t adopt it (or its methodologies) for standard Linux. They used generic Linux containers under a generic Linux daemon (system service), which have come to be known as LXC and LXD, respectively. You run LXC’s on an LXD.

Like this…

systemd is the daemon manager for Linux akin to Windows Service Manager.

The Linux version took inspiration from Apple’s launchd, their system-management work on the Darwin branch of Unix (BSD lineage).

Okay, so now jump to it:

sudo lxd init

Wait, what? It’s not on my system? Okay so then:

sudo apt install lxd

No luck. Here’s the output. That doesn’t look good.

(py310) ubuntu@LunderVand:/mnt/c/Users/mikle/github$ sudo apt install lxd
[sudo] password for ubuntu:
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following NEW packages will be installed:
  lxd
0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded.
Need to get 5532 B of archives.
After this operation, 79.9 kB of additional disk space will be used.
Get:1 http://archive.ubuntu.com/ubuntu focal-updates/universe amd64 lxd all 1:0.10 [5532 B]
Fetched 5532 B in 1s (9364 B/s)
Preconfiguring packages ...
Selecting previously unselected package lxd.
(Reading database ... 58500 files and directories currently installed.)
Preparing to unpack .../archives/lxd_1%3a0.10_all.deb ...
=> Installing the LXD snap
==> Checking connectivity with the snap store
===> System doesn't have a working snapd and LXD was never used, skipping
==> Cleaning up leftovers
System has not been booted with systemd as init system (PID 1). Can't operate.
Failed to connect to bus: Host is down
Failed to disable unit, unit lxd.socket does not exist.
Failed to disable unit, unit lxd.service does not exist.
Failed to disable unit, unit lxd-containers.service does not exist.
umount: /var/lib/lxd/shmounts: no mount point specified.
umount: /var/lib/lxd/devlxd: no mount point specified.
Unpacking lxd (1:0.10) ...
Setting up lxd (1:0.10) ...
(py310) ubuntu@LunderVand:/mnt/c/Users/mikle/github$

HAVING DIFFICULTIES

WTF? FUBAR? Get up, Charlie Brown! Okay, so you kicked at the football and missed. Lucy pulled it away. This is not your fault. This is not the way it’s supposed to work. Don’t get down Charlie Brown. Backoff. Retry. But corner Lucy. Make her acknowledge what’s happening. Control conditions. Restart your Linux WSL2 system. This is low-level stuff going on because of systemd and snap store messages. Reboot. What was that command again?

PS C:\Users\mikle> wsl --help
Copyright (c) Microsoft Corporation. All rights reserved.

Usage: wsl.exe [Argument] [Options...] [CommandLine]

Arguments for running Linux binaries:

    If no command line is provided, wsl.exe launches the default shell.

    --exec, -e <CommandLine>
        Execute the specified command without using the default Linux shell.

    --
        Pass the remaining command line as is.

Options:
    --cd <Directory>
        Sets the specified directory as the current working directory.
        If ~ is used the Linux user's home path will be used. If the path begins
        with a / character, it will be interpreted as an absolute Linux path.
        Otherwise, the value must be an absolute Windows path.

    --distribution, -d <Distro>
        Run the specified distribution.

    --user, -u <UserName>
        Run as the specified user.

Arguments for managing Windows Subsystem for Linux:

    --help
        Display usage information.

    --install [Options]
        Install additional Windows Subsystem for Linux distributions.
        For a list of valid distributions, use 'wsl --list --online'.

        Options:
            --distribution, -d [Argument]
                Downloads and installs a distribution by name.

                Arguments:
                    A valid distribution name (not case sensitive).

                Examples:
                    wsl --install -d Ubuntu
                    wsl --install --distribution Debian

    --set-default-version <Version>
        Changes the default install version for new distributions.

    --shutdown
        Immediately terminates all running distributions and the WSL 2
        lightweight utility virtual machine.

    --status
        Show the status of Windows Subsystem for Linux.

    --update [Options]
        If no options are specified, the WSL 2 kernel will be updated
        to the latest version.

        Options:
            --rollback
                Revert to the previous version of the WSL 2 kernel.

Arguments for managing distributions in Windows Subsystem for Linux:

    --export <Distro> <FileName>
        Exports the distribution to a tar file.
        The filename can be - for standard output.

    --import <Distro> <InstallLocation> <FileName> [Options]
        Imports the specified tar file as a new distribution.
        The filename can be - for standard input.

        Options:
            --version <Version>
                Specifies the version to use for the new distribution.

    --list, -l [Options]
        Lists distributions.

        Options:
            --all
                List all distributions, including distributions that are
                currently being installed or uninstalled.

            --running
                List only distributions that are currently running.

            --quiet, -q
                Only show distribution names.

            --verbose, -v
                Show detailed information about all distributions.

            --online, -o
                Displays a list of available distributions for install with 'wsl --install'.

    --set-default, -s <Distro>
        Sets the distribution as the default.

    --set-version <Distro> <Version>
        Changes the version of the specified distribution.

    --terminate, -t <Distro>
        Terminates the specified distribution.

    --unregister <Distro>
        Unregisters the distribution and deletes the root filesystem.
PS C:\Users\mikle>

Questions From The Chat… Yay! Keep Those Questions Coming In!

What’s Better, Docker or LXD?

Docker is better for distributing packaged software in the SNAP store due to how it contains dependencies and composites software like patches… transparent overlaid layers. Cool, but difficult.

LXD is better for tire-kicking Linux seekers who need a playground where they can:

rm -rf /

Everyone should have this experience. It is asserting control over Linux. But you don’t want to do it on your main Linux system. That is why I created the great and tiny resettable Levinux distro of Linux… more technically a “respin” than a distro, but still.

LXD is official from the Linux governance organization… who? Research this for a future video.

systemd is the issue… must be enabled.

Whoah! Here is the answer:

This apparently worked under Ubuntu 18.04 but then stopped working under 20.04 due to a change in systemd and the snap store.

I’m frankly surprised that the snap store is required, but whatever. And I don’t even see how it gets installed from his instructions, but let me document here what I typed to get sudo lxd init to work…

sudo apt install -yqq daemonize dbus-user-session fontconfig

sudo daemonize /usr/bin/unshare --fork --pid --mount-proc /lib/systemd/systemd --system-unit=basic.target

exec sudo nsenter -t $(pidof systemd) -a su - $LOGNAME

And to test it:

(py310) ubuntu@LunderVand:/mnt/c/Users/mikle/github$ snap version
snap    2.54.3+20.04.1ubuntu0.3
snapd   2.54.3+20.04.1ubuntu0.3
series  16
ubuntu  20.04
kernel  5.10.102.1-microsoft-standard-WSL2
(py310) ubuntu@LunderVand:/mnt/c/Users/mikle/github$ sudo lxc list
[sudo] password for ubuntu:
If this is your first time running LXD on this machine, you should also run: lxd init
To start your first instance, try: lxc launch ubuntu:18.04

+------+-------+------+------+------+-----------+
| NAME | STATE | IPV4 | IPV6 | TYPE | SNAPSHOTS |
+------+-------+------+------+------+-----------+

OMG, everything is matching the video. Remarkable. It’s time to test:

(py310) ubuntu@LunderVand:/mnt/c/Users/mikle/github$ sudo lxd init
[sudo] password for ubuntu:
Would you like to use LXD clustering? (yes/no) [default=no]:

OMG, the setup wizard for LXD started. Wow, wow, wow!

Okay, this livestream has gone on for long enough (>1 hour).

Cut it here. Bank your win. Document it for the blog.

Done, done and done!

See you folks soon for the really exciting part: LXD under WSL, a.k.a. local Linux playgrounds for Linux system service (daemon) development.


Mon Jun 13, 2022

My Formative Years at All-Boys Science Summer Camp / Integrated Life

category: about

Wow, it’s still only 6:45 AM. The early-bird really does catch the worm. Get a 7:00 AM livecast in. Go set it up so people know it’s coming…

Growing a backbone, I am. (in the voice of Yoda).

Joke: Two Pigeonholed Millennials Were Standing In An Elevator…

Life is one giant integrated thing—at least that’s my belief. How is everyone and everything not interconnected? I mean come on, how could anyone believe in the disjointed disunion of things anymore? But they do. Case in point: when I was standing in an elevator in Manhattan at one of my marketing agency jobs, I forget which, I heard one of the flocking millennials say in a contemptuous tone to a co-millennial:

[Such-and-such person] views life as one big integrated thing with no separation between work and personal. Can you believe that?

“Yes! Yes, I can,” I thought silently to myself:

Who are these pishers to pigeon-hole a soul into you-are-this here but you-are-that there? Didn’t Voldemort’s failed horcrux slicing-and-dicing of his soul in Harry Potter teach you anything?

Ugh, but I kept my mouth shut of course. I am not these pishers’ parents. If their mommies and daddies didn’t teach them to do what you love and love what you do then it is not up to me. What their parents should have taught them was that if you love what you do, you will never work a day in your life.

If for no other reason, don’t split yourself down the middle so that you don’t have to spend energy keeping your act up. Experiences from one area of life being able to connect to experiences from other areas is part of having realizations, knowing yourself and growing. Don’t deny yourself that with little firewalls in your being. But if you have, I guess you’ve made your own bed, so sleep in it.

Speaking Of Beds And Things You Spend A Lot Of Time In:

What else is like a bed that once you “make” you have to sleep in?

These are seemingly small things that have a profound impact on your life. They are given too little thought.

You live in these things. You live in them more than in anything else in life, and if you’re not a tech person like me you can probably replace the keyboards and text editors with whatever the tools of your trade are. You do have a trade, right? A craft you can feel good about having mastered? Something of value that you can trade for economic power without having to rely on handouts and the charity of others, right?

No? Well that’s a problem because it’s closely tied to self-image. And no I’m not saying:

A Man Is What A Man Does

But rather, we are of the material world and how you weave matter into the stuff of life matters.

You Should Weave Stuff Of The Material World Because We Are Material Beings

(…in the voice of Madonna)

Most of us drive, right? Or maybe you hope to some day. If nothing else, being a driver is an easy way to turn everyday skills attainable by anyone into economic product. Driving was once a privilege of the rich, like books and literacy. But just like books and literacy, driving is now one of the great easily obtainable options for those in need. Driving is just like literacy: you’re not born with it; you have to work for it.

Like Dune’s Pain Box, Driving Is A Test Of Your Human-ness

The real blocker to driving is the modern pandemic of anxiety. You have to overcome crippling anxiety to drive. Like the gom jabbar test of the Bene Gesserit, the ability to learn to drive is what separates those who are fit to lead (real humans) from those who are not (lower animals). Losing control of yourself while driving is simply not an option, and some people are unwilling or unable to overcome that. So just as reading for pleasure opens doors in life by letting you run the vicarious experiences of others in your head without the risks they took to acquire them, driving similarly helps evolve you into being human by giving you access to worlds previously unavailable.

Not reading for pleasure is another worrying sign. Pick a book and start reading for pleasure. I make some good recommendations in this video:

Be Alert For Dystopian Futures

Join Me In Disbelief These Came From One Author

If you’re waiting for self-driving cars, you’re waiting for vicarious parents to step into your life and let you remain some sort of adult-child. Shame on you!

Not With A Bang But With Wimps

How can I communicate such depth of life-experience in one quick elevator pitch? That person you feel contemptuous of, who views life as one big integrated thing, is worlds-wiser than you, the upwardly mobile Manhattanite that T.S. Eliot addresses so succinctly in The Hollow Men. Oh! Poets, yes, they can do a better job at expressing it than I. Let me be as humble as the narcissist I am accused of being.

I am an unpoetic dumbass.

I am 51 years old and greatly unaccomplished by the measures of the green-arrow yangs I covet. But so what? I do not expect myself to be one of the greatest of all possible wizards that I covet. I am not them. I am not:

The Tools I Advocate:

Credit Where Credit Is Due (The Real Heroes are The Tool Makers)

Clever Thieves Are Heroes Too

Am I putting down the clever thieves? NO! The world is run on clever thieves. Why?

You know that expression “We all stand on the shoulders of giants”? Same thing. Another word for shoulders in this expression is stolen intellectual property. We all stand on the stolen intellectual property of others—some of it more stolen than others, but stolen nonetheless.

Implementation Counts

Linux is the implementation you all know and have today because of Linus being so vocal… such a git… and such a strong personality.

On Windows 11, in a PowerShell with admin rights:

wsl --install

See? These are the brilliant wizards. And there are tons of others not credited here who are not part of the Unix/C-programming language lineage that swept through the world, forever changing it like a DNA-reprogramming RNA virus. Think I’m the first one to think so? Nope! Google:

The UNIX-Haters Handbook

Credit where credit is due.

This is my sports. This is my music. This is my pastime.

The Unix Haters Handbook Unix Is A Virus

See? Can you imagine such hate for a thing that a hater’s handbook is published… and is successful! Much of this was written for a time when proprietary operating systems provided a sort of polish that free and open source software could not yet match. In fact, when this was published it wasn’t even clear yet that a free and open source Unix would win out. The only fact that was true was that Unix was a fact of life that wasn’t going away. There had to be acceptance whether you liked it or not, hahaha!

You Don’t Know *Nix

And that was 1994 when it was first published. It seems so recent for an operating system that was developed almost a quarter-century prior. The virus took 24 years to spread so effectively that cartoon-drawing Luddites felt compelled to speak out against it. If it was already too late by that time, and Apple jumped on the bandwagon in 2007 (13 years later) and then Microsoft jumped on the *nix bandwagon in 2022 (an additional 15 years after Apple, 28 years after The UNIX-Haters Handbook and over 50 years after the invention of Unix itself), can you imagine how far behind you are today?

Knowing *Nix Is Just Plain Literacy (I’m No Better Than You)

Yes, people. We’re talking literacy. Not computer literacy. Not technology literacy. Just plain literacy.

And no, I am not a pathological narcissist trying to set myself apart and above everyone else by reading spirituality into the fact that I was born the same year as Unix, 50 miles away from where it was invented, and got sucked into the gravity well of the Unix-like Commodore Amiga, also 50 miles away from me, during the 1980s when I was at my most impressionable to such things… Amiga computer… Atari 2600… which was in my house in the 1970s.

But I do believe in providence. And I do believe in serendipity. And I do believe in happenstance messengers having the responsibility to deliver a message…

…while staying humble.

I am the baker of goods.

I am not the maker of baking goods. Those are the wizards I enumerated above.

I am not saying I am better than anyone else.

I am saying I am slow on the uptake and that it took me 40 years to get it through my thick skull.

All Boys Overnight Science Camp Watonka at Lake Wallenpaupack

I was on a TRS-80 at Camp Watonka near Lake Wallenpaupack, 50 miles from where I currently live as a result of the pandemic of 2020. I rejected computer-sciencey type things when I was there the first time, in favor of hand magic, Dungeons & Dragons, sci-fi, comic books and art, but I have been forced to “come back home” (such as it were) from New York City, at least for a little while.

All Boys Science Camp Watonka On Lake Wallenpaupack Poconos Pennsylvania

Magic Is Real. Magic Is *Nix

I now see that:

It took me a long time to realize this.

I am deeply flawed.

I wasted much of my youth on things that could have made me Bill Gates, Elon Musk, Michael Rubin 1000x over.

The Journey Is The Reward

But am I sorry? A little bit, sure. But we’re all human. Coveting heroes doesn’t make us pathological narcissists.

The ability to control your life… doesn’t mean you’re a control freak.

It makes us ask ourselves why our life is the way it is.

What is it about us that makes us not those super-achieving people?

Is it circumstances beyond our control, or is it actually a choice?

…and so, I teach.

I love to teach and I’ve got a mic.
I love to talk into the mic.
I’m mic lovin’
So let’s begin.

—Mike Levin, 2022

There we go. Done.

Smile to myself, knowing I’m going to be more effective today because of this.

Going from:

wsl --install
to lxc/lxd being installed (container tech like docker, but the Linux standard)
to systemd Linux daemons (scheduling)
to data pipelining (python huey) to stay compatible with the new guy's work

Huey, by the way, is a data-pipeline job-queue thing that’s an alternative to Python’s Celery. I’m not really that familiar with such stuff and need a pure Linux playground.
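Until I learn Huey’s actual API, here’s a stdlib-only Python sketch of the underlying idea, a producer/consumer job queue; Huey and Celery layer persistence, retries and scheduling on top of exactly this pattern:

```python
import queue
import threading

# A minimal job-queue sketch: one worker thread pulls (function, args)
# jobs off a FIFO queue and collects results. This is the bare pattern
# that task-queue libraries wrap with durable storage and workers in
# separate processes.

tasks = queue.Queue()
results = []

def worker():
    while True:
        job = tasks.get()
        if job is None:          # sentinel: shut the worker down
            break
        func, args = job
        results.append(func(*args))
        tasks.task_done()

def add(a, b):
    return a + b

t = threading.Thread(target=worker)
t.start()
tasks.put((add, (2, 3)))
tasks.put((add, (10, 4)))
tasks.put(None)                  # tell the worker to stop
t.join()
print(results)  # -> [5, 14]
```

A single worker drains the queue in FIFO order, so the results come back in the order the jobs were enqueued.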

Solaris is a Unix variant that in many ways outdid its Unix and Linux peers… Sun Microsystems had a habit of doing everything better than everyone else, including filesystems. LXD can use ZFS, the filesystem Sun built for Solaris, as its storage backend for containers.

Lxd Lxc Vm Virtual Machine Wsl Windows Linux Playgrounds

And A Tweet For Today:

Unix invented 1970. The UNIX-Haters Handbook 1994 (its viral nature exposed). Mac switches to Unix 2007, saves Mac computer. All mobile based on Unix or Linux. 51 years later, Microsoft releases Linux subsystem. You’re behind if you don’t know *nix

If You Don’t Know *Nix


Sun Jun 12, 2022

s-expressions, LISP, Linked Lists, Associative Arrays, Reductionism, Luddites & Dogma

category: tech

Yesterday I responded to Zoronoa01 who wrote: I can listen to the history of computer languages and OS narrated by you all day long!

Thank you Zoronoa01, and what a boost! I responded while on one of the most difficult car trips I’ve had to make in my life. Positive feedback loops are a good thing! Good and right decisions lead to more good and right decisions, and engaging with my YouTube audience is definitely feeling like a good path for me in life. It helps me:

Yeah, I’ve had concerns in the past about pandering to the wants of my audience versus talking about (and working through) the things that I wanted to talk about (and work through). But there’s such a thing as serendipity, and the simple statistical fact that these things are bound to align: there are billions of people in the world, more online all the time, and YouTube really does such a good job connecting such people through the vlogging format.

Google was bound to happen and something needs to serve this function in the global digital nervous system. And I am a “search engine optimizer” (SEO) after all. Even though it all started with AltaVista, Google is the gift that keeps on giving, and so I should use my skills to be more and more on the receiving end of those gifts… in all the best ways that align to my vibe, morals and evolving mission and purpose.

I missed many of the early dot-com and AdSense money-grabs in the SEO-world. I hate empire-building. I don’t like being held under the sway of money-grubbing alpha-investors. They judge themselves and others by money, money, money and the love gets sucked out of things. I’m sensitive to that deflation-feeling… with what’s being deflated being the inherent love of things. It’s fragile and disappears easily when the host body gets corrupted.

This must be why the story of Fernando Corbato fighting the corruption of computer-time-hoarding elitists, leading to Multics, appeals to me, as does the subsequent fighting of the corruption of Multics by big business and Ken Thompson’s reaction of making Unix, which found its way to Berkeley and spawned a distribution there (BSD). So what led Linus Torvalds to copy Unix? Unix was too expensive! Wow, what a story, right?

Ugh! Okay, why not live-cast a bit. Think through your next steps. Get it very clear in your mind.

Go go pack & think on it!


Sat Jun 11, 2022

Say The Magic Words: Open Terminal!

category: about

Show the nice people how the sausage is made. Don’t bore the magic seekers, but do give deep insight to the spiritual seekers. Break this one in two. Yeah, break it in two, then go see your kid.

All I have to do is blow up on YouTube and all my woes are over, you say? The first 1000 followers are the easiest, you say? [Tab 2]

It’s time to address the elephant in the room. Few of you here are really here to learn about Linux from me and how it’s going to change your life. Most of my happenstance audience just wants to see more hand magic. [Tab 3]

More finger magic. Well, here’s some finger magic for you… no really, it’s called the French Drop and I learned it from my camp counselor Mike Silver, the first openly gay person I knew, at science Camp Watonka at Lake Wallenpaupack 40 years ago when I was 12 years old—right around the corner here from where I live today… at least for the next 2 weeks. [Tab 4]

This was the summer before my trip to Israel with my friend Guy Bruchstein and his family. Yeah, that’s a lot to unpack, right? So more videos to come on my most singular life experiences sandwiched between the rise of Unix and the death of Commodore. I’ve got so much to teach this generation of unreachable TikTokers and Boomer-knockers. Maybe you’re a Boomer…

I am not a Boomer (in Arnie voice… anyone? Anyone? Bueller?)

Well, let’s focus now on Camp Watonka and the most important magic, the kind I did learn there and the kind I didn’t. I took the hand magic and I left the computer science. It was booooooring and not for me. I mean adding and shifting in binary? Really? I had to know that? What kind of brain-fart people were into computers anyway? Me? Give me art, magic, comic books and Dungeons & Dragons any day. [Tab 5]

Why did nobody teach me about abstractions, and about switches gradually fading away into a sort of magic as layered-up APIs simplify human interfaces until all that remains is a sort of spontaneous creative expression, like Mickey in The Sorcerer’s Apprentice, but gone right? [Tab 6]

Sigh, couldda wouldda shouldda, am I right? Well, if I knew then what I know now, what would I have done? I would have told someone like you, looking for mere hand-trick magic, to listen to someone like me who offers a more beautiful, life-changing sort of magic, every bit as spiritually satisfying and audience-impressing. Ever watch finger-magic computer hackers in the movies, working their magic with nearly telepathic control of computers, near and far? WarGames from my generation comes to mind, but there’s also Mr. Robot from more recently. Yeah, I offer you that. [Tab 7]

Twelve and thirteen were very formative years in my life, and if the removable-finger trick is the bookmark to my 6-year-old self and my relationship with my “special” mailman Mr. Sucro, the French Drop is the bookmark to my adolescence and my “special relationship” with my overnight camp counselor and U.K. magician Mike Silver. Both of these men imprinted on me deeply. Deeply in all the good and right ways. This is not going where you think, and the fact that that is the automatic response in today’s society is a problem. May I suggest you all read The Road to Oz, where Dorothy meets the Shaggy Man, the original shaggy man long before Scooby-Doo’s buddy and a fine overlooked example for our culture to emulate. [Tab 9]

We’ll happily the Web puts a giant firewall between Shaggy Men and your kids but still let that special mailman or camp counselor into your life without todays stranger danger paranoia. And so a buddy like me can show you the French Drop in the spirit of the great Penn & Teller, bad boys of magic who revealed the disappearing handkerchief trick to the world before it was cool to admit Magic was all tricks. It’s all tricks, people! Stranger danger is a trick to control you. The weird isn’t really that dangerous and some of the most valuable information needing to be transmitted between generations is going to come from peeps like my mailman Mr. Sucro, my camp counselor Mike Silver, or indeed me right here right now with the French Drop. [Tab 10]

I’m one-take Mike. I rarely edit, but when I do I get a million views. So come with me now as we go behind the scenes now and wax-on and wax-off. Wax-on and wax-off. Listen to Mr. Miyagi. He knows he can’t convince you to internalize the movements through practice without tricks. Without force. You have to want something so badly, like freedom from bullies, that you’re willing to put in the work on faith. You won’t know you’re learning exactly the skills, the very muscle memory, the precise movements you will need to defend yourself and ultimately kick butt in life. [Tab 11]

How do you know you’re onto something important or new? A thread you must follow? How do you know you found something loveworthy? How do you know you’re walking into the circles of your Ikigai? It feels like Christmas Eve. You know gifts are coming. You know a light is about to come into your life.

Let me show you something. It’s called The French Drop.

Let me introduce you to someone. Her name is Miss Direction.

Next step: More finger magic, hand magic

Lure ‘em in with tricks… move onto what’s important…

Life-skills… most valuable… most relevant… and just feeling good about your abilities in the most number of situations.

All tech is text.

The single most important skill (a.k.a. trick) in life is developing your wax-on, wax-off ability in the vim text editor.

Text wars…

VSCode… while crazy-great… also blocks you from vim!!!


Fri Jun 10, 2022

category:

Okay, so I produced what I think could be a break-out video for the whole “wsl --install” thing. Technically, it will be the first of a series that gets to all my best Unix/Linux tricks. Tricks… tricks… ugh!

Let’s Look at My YouTube Analytics. Is It All Still Finger Magic?

Yes, most of my YouTube search and suggest traffic comes from finger magic and hand magic. Long ago the Raspberry Pi traffic faded away, certainly replaced by all the competition that jumped on the bandwagon since. But my finger-magic has continued in steady recurring spikes. The public has spoken:

Finger Magic Trick Hand Magic Trick

There’s something appealing to both people and the YouTube algorithm about that quick little off-the-cuff video I shot to capture one of my all time favorite tricks. Perhaps it was because it was accompanied by a very human story of my mailman growing up who taught me that trick and had a tremendous impact on me for life.

I didn’t learn the trick easily. I think part of why it stuck is that I would wait for that mailman to come around again each day to see how I was doing and to give me tips and pointers. I had one of those precious and innocent adult-child relationships with him that would be suspicious and frowned upon in today’s world. What is that mailman doing? Stay away from my child!

I love the Shaggy Man from the Oz series. Even back then, such a character had to carry a “Love Magnet” for his relationships with people, including children such as Dorothy, to be believable. One of the sweeter moments in the Oz series is when the Shaggy Man gave up his Love Magnet to Ozma, the ruler of Oz, in order to live in the Emerald City and ended up being universally loved by all his friends anyway.

That’s magic

This goes deeper than just teaching people magic.

They need that mailman in their life.

They need Mr. Sucro.

I will be Mr. Sucro to a new generation.

I will teach you hand magic. I will teach you finger magic.

Then I will move you along to the magic of adulting.

I will move you along to the magic of lifetime expertise in things more valuable than school or money.

I will teach you the tricks of future-proofing your skills, resisting obsolescence in your expertise and becoming immune to disruptions to technology, the economy and society.

Unix/Linux won and nobody really gets it.

I really need to settle on the way to refer to them together. *nix makes the most sense, but then you have both the markdown and the upper-case ambiguity problem. But seeing nothing better, *nix it is.

*nix magics?

StarNix Magicks?

Oh, the possibilities!

Don’t go registering new domains. Just use the term. Domain grabs are stupid now with so many new top-level domains (TLDs).

I’ve got domains. Use LPvg.org. Make some sort of introduction or elevator pitch.

The pied piper of vim?

Yeah, probably something like that.

That’s what the YouTube algorithms are rewarding me for and likely want to see more of from me. Bring them in with the hunger for magic.

So you’re hungry for magic, my child? YouTube tells me you probably are not a child. You are the same age I was when I was trying to transition from old childhood magic hobbies to more powerful adult magic. My YouTube analytics tell me that the seekers of finger and hand magic lean surprisingly older than I would have thought. The peaks are from 25 to 44 years old, with a decent amount of 18 to 24 and surprisingly few of the youngsters I’d have imagined. This may be a data-skewing issue, but it’s fair to assume it’s bored adults looking for neat attention-grabbing tricks whom I’m attracting.

Finger Magic Hand Magic Audience Age

Hello you! I can help you grab something better than attention. I can help you grab skills for life which will feed the same hunger you’re trying to feed by learning magic tricks. I can teach you good old-fashioned magic tricks too, but I have moved on from them. I want to visualize you better. You’re all guys, right?

Youtube Analytics Male Or Female

Oh, will you look at that! A full third of you are female. A large portion of you must be 25 to 44… not the demo I thought I had. Well it just goes to show you. I’m going to assume that people on the web searching for finger and hand magic are trying to entertain someone, and therefore have the performance bug.

Well, old-fashioned magic tricks were a part of my past, just like comic books, Dungeons & Dragons, Magic: The Gathering and a whole bunch of other interests I outgrew. But that’s because the hunger I was satisfying with those things I now satisfy with other things that serve me much better in life. The myth-building of those interests helped, though I would credit the science fiction reading perhaps more than the rest.

More Venn diagrams, methinks. I can make it work. Be a pied piper of a new generation. Of course.

Be of interest to my own child, and probably even my own wife for that matter, by making it clear that what I have to offer is of great value to the population at large. Allow them to be curious about what’s going on.

Magic tricks… ugh! Finger magic. Coin magic. Linux magic. vim magic.


Fri Jun 10, 2022

Remain Positive & Know Signs Of When You’re Doing Something Right

category: tech

The iron is hot, as they say. Things are busy on all fronts in life and I have to start really focusing on my move back to New York now. I’m really going to miss the Poconos, but my child is only going to go through this critical 11-year-old Alice/Dorothy coming-of-age period once in their life. I need to be closer, to set a better example of how to live no matter how people force you into unfortunate situations. But if you’re going to be Charlie Brown and keep kicking at the football, get acknowledgement from Lucy that she won’t pull the football away. That way when she does, you have proof, which opens the door to greater learnings and better remedies.

The human brain is powerful. Activate it more fully in everyday interactions. Look at things in a way so that when you look at them again, you know you have already seen them. Take note. Activate the other senses if you can. Smell, touch, taste, whatever is available. The brain works by cross-linking messages, a little like how indexes work in an actual database. Not everything is the “raw data” going in. That’s like an undifferentiated data-stream. It’s like how a baby experiences the world before they have language and… well, experience. If you stay in that state and do not let your experiences actually change you, then you are slowing down growth, and indeed cheating yourself out of the opportunity of growing and bettering yourself as a person… by your own criteria.

For you see, even just choosing what kind of human being you want to be is part of that open-mindedness and mindfulness when encountering new experiences. We start out as infants just taking in a data-stream. We gradually make sense of things and knit together cause-effect relationships. Some of us create if-this-then-that theories and strive for more in life by testing those theories and moving our lives forward, sort of like a scientist. Others get stuck in that infant-like state, just crying out like babies and demanding feeding (and later, money) from those around them. Any form of higher human interaction (which such child-adults will automatically label “fighting” and “your fault”) brings out a sort of crying and escalation. Root causes are not allowed to be addressed. That ends now.

Continue to do what’s right.


Thu Jun 09, 2022

wsl --install Rocks Your World with Linux (Learn Linux Fast!)

category: linux

Install Linux in 6 Minutes on Windows 11, Teach Yourself Linux!

Can you teach yourself Linux? Can you learn Linux fast? Yes, if you’re on Windows 11, you can now install and start learning Linux in under 10 minutes! And really, if that’s all you want, start this video at 3:16 and watch only to 9:08. That’s 6 minutes from opening a PowerShell in Windows to having Ubuntu 20.04 installed, and that’s with plenty of talk and instruction. Those six minutes will change your life.

If That’s What 6-minutes Can Do, Imagine The Whole Video!

If you only get 10 minutes into this video, your life will be on a new path. Incorporate the full-screen Linux Terminal into your everyday life. The single most important thing after getting Linux onto your machine with the “wsl --install” command is to start practicing the eternal vim text editor. Microsoft doesn’t want you on it. They especially don’t want you using git to manage your text files. No indeed, they developed VSCode and bought GitHub for the express purpose of keeping you from following that route.

Get to vim FAST, Keep a Daily Journal to Practice

I will teach you that route. I will help you use a full-screen Linux terminal to run the vim text editor to keep a daily journal.

This is my second biggest gift to you, using the need to learn vim as an excuse to start keeping a textfile-based daily journal, thereby giving you the perfect place to practice your vim-skills before even taking up a programming language!
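A sketch of the habit itself (the journal path and filename here are only my suggestion, not a requirement): one ever-growing text file with a dated header per entry, edited in vim.

```shell
# Create a journal directory and stamp today's entry header onto the file.
mkdir -p "$HOME/journal"
date '+%a %b %d, %Y' >> "$HOME/journal/journal.txt"
# Then open it and write:
#   vim "$HOME/journal/journal.txt"
```

New entries just get appended under a fresh date line; the whole history stays in one plain text file that vim, grep and git all handle happily.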

FACT: Unix & Linux Won. It Is Modern Literacy.

Unix and Linux won. Apple knew it in 2007 when they switched Mac OS9 to Unix-based OS X. Fifteen years later, Microsoft is finally following suit, but with Linux instead of Unix, and Ubuntu in particular, meaning Windows users now have access to hundreds of thousands of free and open source software packages a mere “apt install” away. Learn the Linux Terminal and learn vendor-independent generic technology—indeed, *nix capability is literacy in the modern world.

Chase The Rabbit

Linux is not as intimidating as you think. Chase the rabbit into Wonderland or accompany Dorothy into Oz. Get used to your main user interface to the computer being a beautiful full-screen text terminal. Learn the basic commands in a Unix/Linux crash course, promote yourself to superuser and learn how you might destroy the system with such elevated privileges, but don’t (there’s Levinux for that). Lose your fear and own your future.

Install & Teach Yourself Linux In Minutes

Windows 11 lets you easily install Ubuntu Linux on your Windows system, providing the best way for beginners to teach themselves Linux or to learn Linux fast. But don’t think Linux has much to do with desktops. While WSL does support graphical apps that come up just like Windows apps, the real value is in the text-based interface known as the Terminal or Shell.

You Already Have a Graphical Desktop

Linux is not a graphical desktop. All graphical desktops are the same. Windows, Mac, GNOME, KDE, Xfce, it doesn’t matter. The desktop says almost nothing about the underlying operating system, the part that’s really important. That’s why learning Linux is about those text-based type-in windows and not point-and-click. Microsoft’s approach ditches the idea of adding the Ubuntu, Mint or other popular desktops. Graphical desktops prevent you from becoming technologically literate.

Technological Literacy Is About Terminals

Supporting genuine Linux terminals is a step Microsoft had to take for Windows to continue being taken seriously as a development platform, but it now gives you a way to break free of Microsoft dependency. Future-proof yourself by learning the terminal interface. Ironically, it never terminates. Make yourself resistant to obsolescence and immune to disruption by knowing what won over both Apple and Microsoft and is considered mandatory for developers (*nix availability).

Desktop User Interfaces Come & Go and Do You Harm

Particular graphical desktops will come and go over the years. They are tied to particular platforms like desktop and mobile. They are tied to fashion and style. They are the grounds for fighting for your attention through notification distractions, news, alerts, noisy icons, desktop clutter and… well, I could never stop. Desktop UIs are just the enemy of productivity and the development of good long-term internalized habits.

Become Forever Faster, Develop Muscle Memory, Stay In The Flow

Your habits, skills, muscle memory and general expertise will go with them (remember when Microsoft Office switched to the “Ribbon” interface?). Think what an interruption to your flow-state it is when you stop typing to move your hand to the mouse. Worse yet, if you don’t know exactly where that feature moved to: the hunt. This makes you perform at a lower skill level, proficiency level and indeed intelligence level than you actually possess and deserve. Microsoft and Apple are actually doing you harm. And so learning Linux has nothing to do with a graphical desktop. Not GNOME. Not KDE. Not Xfce.

Flocked to VSCode

Microsoft does everything they can to keep you in the flock. Mostly, they want to flock you with VSCode.

It’s Mike Levin on Thursday June 9, 2022 and I usually don’t script these things, but this is a very special video, perhaps the most important one I’ve made in my life and I want to get it right.

It’s Never Too Late (& The Past Matters)

This is Rafiki, perhaps the most important character ever to appear in a Disney movie. Rafiki knows that the past matters. Rafiki knows that if you made mistakes in the past, they are real… and that it is never too late to set things straight. We all get a chance at redemption.

Apple Switched to Unix in 2007

In 2007, Apple switched their proprietary OS9 operating system to Unix. It was an invisible switch from a user perspective, but it is credited with saving the Mac line. That was a decade and a half ago, when Apple was in dire straits against the rising tide of commodity Windows PCs.

Michael Dell told Apple to close shop and return money to shareholders.

Fast-forward 15 years. Apple was brilliant. They did the right thing. They knew it and stuck to their guns. Today, Microsoft has not switched Windows over to Linux, but they have made installing it historically easy with the “wsl --install” command that I’m going to show you today.

Ivory Tower Linux Vs. Linux Groundswell (GPL2)

Whereas Apple sided with “Unix” (the pure ivory-tower *nix), Microsoft sided with “Linux”, the bad boy of *nix, albeit a good canonical choice (more on geek jokes later).

Windows 11 Gets The Linux Install Right

It’s far easier to get on Linux today on Windows 11 than on Windows 10, and it is in fact now as critical to the survival of the Windows platform as Apple’s move to Unix was fifteen years ago, because no platform has credibility if it does not make the generic *nix tools available to developers. Microsoft jumps on this bandwagon not because they want to, but because they have to.

It’s not too late for Microsoft.

It’s not too late for you.

Microsoft is giving you the greatest tool ever… to free you from them! To make sure it doesn’t actually free you, they hired the creator of Python, bought Github and created VSCode.

Then they turn around and give you all the new lingua franca of tech: Linux, Python, vim & git…

Get Your Butt to vim

…except for vim.

They want you on VSCode.

You can depend on it.

I will teach you how to get to vim, right now in this video. :q!

This will make all the difference

I promise.

There is much to know, but what I’m showing you here in this video are the most important bits.

They will make you free.

I’m Advocating Today’s True Basic Literacy

Without *nix know-how, you are illiterate in the technology world. You may not think so with whatever crop of power-tools you’re on, but they will let you down. As platforms change, as fads change, as hot skills change, only *nix will remain.

It will make you fear.

You must not fear.
Fear is the mind killer
Fear is the little death that brings total obliteration
I will face my fear
I will allow it to pass over me and through me
And when it is gone, I will turn the inner eye to see its path
Where the fear has gone there will be nothing
Only the Linux terminal will remain

Future-Proofed & Timeless (Welcome to OZ)

I propose to teach you this world… to bring you to Oz… to help you chase the rabbit down the rabbit hole and master Wonderland.

To give you forever-skills

It will someday be a part of our descendants’ DNA.

Make it part of yours today.

How To Pronounce Ikigai

It is the fourth circle of my Ikigai…

How to pronounce it (thank you, Kamil Havlicek (am I pronouncing that correctly?)).

This Should Be Million View Video #3

I’ve had two million-view videos:

Follow me!

Script For The Video

How to Promote wsl --install


Thu Jun 09, 2022

Installing WSL2 on Windows 10 Requires Kernel Update And Hypervisor Turned On

category: linux

Get your ass to Linux! If you’re on Windows, it’s most easily done immediately after the Windows 11 update. But if you’re anything like me, you’re going to stay on Windows 10 as long as possible. But did you know that your attempts to turn on “the good stuff”, i.e. WSL2, are undermined by three separate steps, none of which is all that easy or intuitive? If you try, you’ll get this error message:

WSL2 requires an update to its kernel component. For information, visit https://aka.ms/wsl2kernel

Translated: WSL2 on Windows 10 Requires These Steps

What’s the big deal? The big deal is performance, network context, a real Linux kernel and connectivity between the Windows and Linux OSes (file/drive sharing). In other words, it’s a question of whether the Linux you’re using under Windows is as good as genuine Linux (WSL2) or not (WSL1). It’s worth jumping through a few hoops for WSL2.

So what to do? If you search Google on the error you get trying to upgrade a WSL instance to version 2, you get this incredibly useful but silent guy:

…but the dummy jerk doesn’t even talk during his video. He just sort of pantomimes his way through expecting you to watch and know what he’s not even talking about. I decided to embed his video here and break it down for you.

Okay, first, that shortcut URL is going to forward you to this actual address: Manual installation steps for older versions of WSL. This seems like a misnomer because it jumps right into checking requirements for running WSL 2. That does not seem like “older versions of WSL” to me. I think someone’s a little confused. It’s no wonder this stuff is nearly impossible on Windows 10. Anyhoo, there are really just 3 steps:

Step #1: Enable Windows Subsystem for Linux

dism.exe /online /enable-feature /featurename:Microsoft-Windows-Subsystem-Linux /all /norestart

Step #2: Enable Hypervisor

dism.exe /online /enable-feature /featurename:VirtualMachinePlatform /all /norestart

Step #3: Download & Run This Patch

And here’s the step that trips everyone up. A patch needs to be run.

https://wslstorestorage.blob.core.windows.net/wslblob/wsl_update_x64.msi

Did I say three steps? Well, also do this. It will make the next WSL instance you create version 2 automatically. You don’t want to have to go converting WSL 1 instances into WSL 2, especially if you just went through this rigmarole.

wsl --set-default-version 2

And that’s it! Follow these steps and you can start your journey to getting your ass off Windows too. But you’re not getting off of Windows if you’re not in the process of mastering (10 years or 10K hours) Terminal-based Linux (or Unix).

Follow this dude on YouTube. He seems to know what he’s talking about.


Wed Jun 08, 2022

Thirty-Year Anniversary of Dad’s Death (I Am Not a Check Casher)

category: about

Today is the 30th anniversary of the death of my father, Charles Levin, on June 8th, 1992. The Amiga computer died. Dad died. Early signs of my mother going crazy were setting in, and my perpetually angry sister and her crazy friend Trava, whom she brought with her from California, moved in on me, being in town for my graduation and suddenly showing no signs of leaving. I had to take over my dad’s check cashing store because, ostensibly, it was all he had to leave my sister and me, and as executor of the estate, I was legally bound to preserve the assets.

Charles Levin: Thirtieth Anniversary of Dad’s Passing

And in the course of running that check cashing store, a man tries to rob me and I shoot him four times with a gun I didn’t yet have a license to carry. I’m charged for a crime myself and quickly released, but my “career” coming out of college with stars in my eyes being the kid beamed up to the Commodore mother ship came crashing down around me. Meanwhile, a snot-nosed neighbor I had a childhood rivalry with shoots onto the Forbes 40 under 40 list. And why do I even care? The idea of being him is reprehensible. So what do I want? And why do I even write this?

Well, I write this because it’s been thirty years since Dad’s death, and it honors his memory for what it’s worth to remember him, and maybe does some good for me. I was twenty-one. I’m now fifty-one. Thirty-one and forty-one came in-between. Where was I then? Was I keeping a journal like now? Forget about what I would have done differently. What do I know now that I would have liked to have known then? What gifts can I give to my child based on the advantage of years and having experienced a thing or two?

Easy! Listen more closely to your own internal voice than to the voices of others, pop-culture, expectations of parents, or even your teachers, college professors, heroes or whoever. There’s a voice deep inside you that’s hard to hear, but it’s there. And be careful ‘cause there’s really a lot of different voices inside of you, and some belong to the worm, the fish, the frog, the lizard, the ferret, the poo-flinging primate, and then finally that most special and most meta of all, the ringmaster.

We all have a circus inside of us. The analogy to a 3-ring circus is good. There’s different acts like the lion-tamer, the high-diver, the tightrope walker, the trapeze, the elephants balancing on balls and all the rest. As I point out more and more, all the animal acts are actually quite inhumane and should maybe be retired like the Ringling Bros. and Barnum & Bailey Circus actually had to do. But for now, the image is still strong and can probably do you some good. We all need the Ringmaster in charge. The inner voice you should be listening to is the Ringmaster’s. The Ringmaster’s voice is the one that will eventually come out if you write regularly into a diary or a journal, like I am essentially doing right now.

Okay, so what does the Ringmaster say to me right now? The most important concept I have encountered in recent years is that of Ikigai. It is a Venn diagram of the intersection of:

What you love
What you’re good at
What you can get paid for
What the world needs

Is this a post about my dad’s death after 30 years or about other stuff I found along the way? It’s about honoring his memory, and whoever I am today is a result of many of the lessons that he knowingly and unknowingly instilled into me. I didn’t like to disappoint Dad. On the few times I did, it still resonates in my head like it was yesterday. He taught me about much I feel in this concept of Ikigai. He told me about his dreams and disappointments in life and how it was from listening to the bad advice of his uncles, Jews from the textile industry where a Jew could actually get jobs and stay employed.

You have to remember he came from a time when IBM wouldn’t hire Jews. His dream of going to Drexel and becoming an aeronautical engineer seemed unattainable. I went to Drexel. I started out in mechanical engineering with similar dreams of actually being one of the people helping to colonize our solar system. I think today I might have been happier with microbiology, like that other Michael Levin working on epigenetics. Shaping who you are through force of will, as if Green Lantern were real, is beyond all hope and imagining… and yet that is the type of world we live in. Wow! How could anyone be a pessimist?

Everything relates to our journey here on this material world. My dad took his turn here. I’m taking my turn here. You’re taking your turn here. Doors are opened for others through having kids. My parents opened the door for me. Your parents opened the door for you. If you were to calculate the probability of your existence you’d see it’s rather unlikely and quite a gift. It doesn’t hurt to keep reminding yourself of that. It’s yet another reason for gratitude. Thanks Dad! Thanks for you being here and thanks for opening the door for me. Thanks to you too, Mom. I’ll give you your homage later.

So maybe we’re just emergent properties of evolution, or maybe we’re spirits from another realm invited in by virtue of the “vibe” established by the process of sperm and egg combining in conception and a new unique code. That’s your most basic vibe and the “you” that you may be happy with, or may wish to override when you become meta at 9, 10, 11 or 12 years old. By the time you’re 13 or 14, you’ve done your first self-crafting, either intentionally or unintentionally. If you’ve done a good job and kept yourself open-minded and dynamic, you’re golden. If you’ve become close-minded, embittered and angry, you’re going to have a tough life full of blame and not a lot of control over your own destiny.

Human free-will is a thing. Or as the Green Lantern corps would point out, it’s actually sentient-life free-will that’s a thing. Lots of being-types can override the highest probability predetermined clockwork-universe type outcomes of things. If you were a betting God, that’s where things get interesting because that’s where things become surprising. How boring it must be to God, knowing all that ever was and ever will be. The past is the future and the future is the past and all that exists exists like a crystalline sculpture to pick up and examine as a piece of art, but where’s the surprise? Where’s the excitement of creation?

Surprise God. Make your life a life worth tuning-in and paying attention to. Don’t do what your uncles tell you to do because of their view of the world. Observe for yourself. Have confidence in your ability to help shape and change the world right as you engage in it. Observe patterns and step into the path of love-worthy things. Do what you love. Do what you’re good at, although that may take a bit of time to accomplish. Do what you can get paid for. And ultimately, try to align it to what the world needs.

My dad knew the world needed aeronautical engineers. Who from his generation didn’t (the space race)? But whereas my dad allowed his dream to be crushed, got into a bad marriage, and only barely transmitted to me his love of things before he clocked out way too early (at 63), I say: find your love earlier. I’m fifty-one years old. I’m only twelve years younger than my dad was when he died. I do not think I only have twelve years left. I have so much to live for, and I won’t let the circumstances of my life crush me like they crushed him. No, I’m going to love what I do going into these later years.

I am not a check-casher. Thank you, Dad, for teaching me that when I was only twenty-one. I can imagine it would otherwise have been a lesson for me to learn in my fifties and sixties, like it was for you.


Wed Jun 08, 2022

Do The Right Thing Morally, Spiritually & Technically

category: about

If you haven’t made it by now you’ll never make it, say the nattering nabobs of negativism. Is my posting here and on YouTube despite never having “blown up” actually the foolish consistency warned against as the hobgoblin of little minds? I have to ask that about myself. Should I just give up and behave like some beaten-down dog?

No! The foolish consistency is, and has always been, actually listening to the voices of negativity! Pessimists are wrong! Listen instead to your own strong inner voice. Is it telling you that you are right?

If You Know You’re Right, Stick To Your Guns

If yes on all counts, then stick to your guns! Grow a backbone, get up, and break your improper consistency in favor of a more effective consistency. Fear is the mind-killer.

Tweak process, but don’t give up.

Correcting Course Is Okay. How To Try New Things.

If something hasn’t been working for you:

And so it’s pretty obvious the stuff I’ve been doing spot-on correctly is:

I’m Mic Lovin’

Who am I? My name is Mike.
I have a Mic & like to talk.
I like to talk into the Mic.
I love to talk into the Mic.
My name is Mike, I’m Mic Lovin’
That’s how you say it. Let’s begin!

—Mike Levin, 2022

Of Chicken Lips & Mother Ships

My tales are mine.
That’s what life’s for.
Now let’s begin with Commodore:

The box of grey…
Or was that beige?
For ten years straight, was all the rage.

They hired me.
Adventure… Weeeeee!
Little did I know

The mothership
Blew up. I quip
From here, where do I go?

—Mike Levin, 2022

A Foolish Consistency is the Hobgoblin of Little Minds of QA Engineers

So I went to a Norwegian software company that got taken over by the same foolish hobgoblins of consistency, the very same little minds who ruined Commodore by sacrificing the future on the altar of yesterday. In other words, who they were (remaining compatible) inhibited their growth (which required breaking compatibility).

Realizing this and internalizing the message and applying it to me and my own life has been the single hardest lesson of my life. My own father was a quality assurance engineer, hahaha, albeit in textiles and not technophiles. There’s something oddly dream-crushing in the “vibe” of the field of quality assurance, at least in my opinion. Someone is set up as the arbiter of what is good and what is bad, and they bring it with a holier-than-thou attitude to areas of life in which they’re not qualified. My dad was less guilty of this than the Commodore Q/A-engineer that everyone hated but for some reason, we could never get rid of.

We are who we are both in the moment and from the past. But the mistake is to limit who we can become now and in the future based on who we were in the past. We are not so excessively colored, stained, tarnished, cast or whatever other crafty metaphor you want to use, by our pasts as we allow ourselves to believe.

Re-invent yourself!

Break compatibility with past selves.

Cast off your old Ironman suits and try on new attire for size & features.

It doesn’t even need to be armor if you’re in a new situation in life that does not call for it.

You might be surprised how much more you you can be.

You may not be (and probably are not) the you who you think you are.

This is terribly hard to see and even harder to admit once you see it. Truth is effing hard. And often, truth can be disparaging to yourself or others… by one way of seeing it. By another way of seeing it, that disparaging is really just healthy self deprecation. Don’t take yourself too seriously, and don’t let others make things more serious than they are.

Honk their noses. They’re clowns. They need their noses honked.

Who I Was, Excessively Tied to Commodore

So you have my childhood story. Philly suburbs, tossed into Commodore in West Chester, PA due to some unlikely connections with friends whose fathers were engineers at Commodore and the related MOS semiconductor facility in Norristown, PA. I had “an in”. And this was enhanced by my going to Drexel University, where they required all incoming students (CompSci majors or not) to buy a computer… and a Macintosh, at that! The timing was such, with my going to school there, that the new “Educational Marketing” department at Commodore was looking to recruit and nurture “on-campus consultants” to give out literature and answer questions regarding Commodore computers. I was the president of the Philadelphia Amiga Users’ group (plural possessive, always plural possessive) at the time, and so Howard Diamond, the man who gave me my first copy of The Art of War, sought me out at a local Commodore trade-show.

And so I went to work for Commodore. And so I became involved in marketing, which seemed only natural given my predilection and natural abilities in art. There’s a lot of art in marketing, right? Seems like a good fit.

But then Commodore went away. And there I was stuck holding a lot of shitty inapplicable Commodore skills and nowhere to go…

In The Company of Other Rats Jumping Off A Sinking Ship

…except for all the other boats the rats were jumping to from that sinking ship, as the draconian sysadmin who was to be my hateful mentor used to love saying. You learn many of your most important and deep meaningful lessons from some of the most hateful and despicable people you will ever meet in your life. It’s one of life’s great ironies that life itself sometimes only moves forward due to the actions of people who would have fried you in butter if they could have.

The sysadmin of the company I went to work for after Commodore (after a 1-year detour of running my recently deceased father’s check-cashing store where I had the misfortune of having to shoot a robber) was one such person. I learned a lot from this person who would spec out and order equipment for me then keep it for himself, take sound cards and other parts out of my and other employee PCs during the night because he “needed them for testing” and endless other cases of subtle punitive actions against people he disliked.

Just Because You’re Paranoid Doesn’t Mean They’re Not Out To Get You

Watch out for subtle punitive actions against you by people who dislike you. They dislike you because they recognize qualities in you that they wish they themselves had. They are of course jealous of you and think it’s unfair you can live your life that way and make it work for you, when their attempts at similar have come up empty. That’s because subtlety and nuance count, and they can’t get all those finer points down. They’re stuck on surface detail.

And so, fast-forward to my move to New York City. I had to get out of that friggin’ poisonous environment of that once-Norwegian but now firmly suburban (Pennsylvania) company. I contractually earned a percentage of company gross (not net!) there, and it was sooooo hard to give up. I only gave it up to become a vice president at the same public relations company that launched Amazon.com to success. Can you believe that? Yup, I had stars in my eyes and skills under my belt, as imminently obsolete as those skills were soon to become.

The Safe Harbor of Microsoft Was Not Actually So Safe

Huh? Yeah, by this time I had gotten off all the Commodore tech that was no more and moved onto all the Microsoft Web-tech. Microsoft is safe harbor, right? Fool me once, I won’t get fooled again, right? Well, I was fooled again, because Microsoft Active Server Pages (.asp files) were deprecated, along with all their supporting tech, in favor of Microsoft’s .NET stuff, which was all copied from Java and horribly jerry-rigged for the Web with a whole bunch of framework nonsense you had to religiously buy into in order to get back even rudimentary capabilities you had under the old .asp system. I’m talking about stuff they called code-behind, postback and viewstate (just for starters). It was not for the faint of heart, nor for the non-professional programmer. And it basically required you to use VisualStudio.NET, something else entirely from the VSCode that almost shares its name and role in Microsoft forced-tool-usage today.

Mike Levin, SEO in NYC… No More. Or Again?

This takes me up to the move to NYC, when I branded myself Mike Levin, SEO in NYC. This is where the next chapter begins. I wind it up here because I hardly think I could talk about it without all the negativity that I’m trying to put behind me; suffice to say that I have to man-up now for the child that I had during this time-period.

They are 11 years old now and really starting to think. With that ability to think comes the freedom for me to act… to act correctly morally, spiritually and technically. I have not been able to do that until recently for lack of energy following the divorce and rigors of a new marriage and the pandemic.

But I’m gradually marshalling my energies and the next chapter is beginning. The next chapter has begun.


Wed Jun 08, 2022

I’ll Blow Up On YouTube When I Want To

category: about

Part of the YouTube algorithm wants to reward me. Here’s the last 365-day view of my YouTube analytics. I’m different than other YouTubers in that I go back to 2006 and resist both dropping off the map and polishing my act. Either you wash out or, in Geoffrey Moore’s terms, you go inside the tornado. I do neither. I live at the vibrating edge, always the fuel and spark to blow up, but never the ignition. There are 8,760 hours in a year, so 178,223 hours of watchtime is 20.3 years of continuous watchtime among my audience; small compared to big YouTubers, but not bad for low on-the-side effort.

A Year Of Watchtime On YouTube Analytics


The story gets even more interesting if I switch to the “lifetime” time-period. The evidence is strong that had I dedicated myself to YouTube back in the early days, before all the bombastic hyperbolators jumped on the bandwagon, I’d be a big YouTuber today. Of course, I would have needed to have polished my act, done lots of editing and pandered to the audience. Watch time isn’t available before September 2012, and that’s around the time I was the first (or among the first) to unbox a Raspberry Pi on YouTube, recognizing its significance. Happily for me, I recognized that gaining expertise in Linux, Python, vim & git was even more important than becoming a YouTuber.

The Times I Almost Blew Up on YouTube


While not as valid a metric as watchtime, views are still interesting. Here you can see my pattern of million-view videos. While I don’t produce them all the time, they do occur. And clearly the rule-of-threes is in play here. In addition to being the first to introduce the world to $35 computers easily incorporated into inventions and science projects, I also taught a generation of children the removable finger trick. Others did it later, but I was first and was testing theories in the ever-evolving field of SEO (search engine optimization). You have to walk the walk if you want to talk the talk. Google “removable finger trick” to see what I mean.

Pattern of Million View Videos In YouTube Analytics


Again, I am the first to point out how significant wsl --install is today under Windows 11. It’s bigger for computing than the Raspberry Pi and a better magic trick than the removable finger. And this time, I’m going to take advantage of the new developments in the world, namely that it’s easier than ever to record and upload video now that iPhone lets me upload one while recording the next. That’s the key to lifecasting (yes, Android could do it for a while, but iPhone is what’s always on me). And Github Pages helps too. Honestly, getting people off Windows (eventually) and onto generic text-based Unix/Linux with a devil-may-care attitude towards the graphical desktop is now my calling in life… the “what the world needs” circle of my Ikigai.

More Open Minded People On YouTube (Subscribers)


And finally, and probably most important of all, is a measure of how many more open-minded people there are on YouTube. In the past, my channel-focus would have had to be narrower, which in my mind is pigeonholed, myopic and disintegrated. I live the great big “everything is the same” integrated lifestyle. Integrated mind, environment, job and life. This doesn’t mean narrow interests. It means everything’s interrelated in meaningful ways. Lockdown, quarantine and work-from-home have certainly helped my cause, driven more seekers like me onto YouTube, and I think ultimately help my mission to blow up on YouTube despite all the polishistas. The world just might be ready for me. I feel twitches in the web.


Tue Jun 07, 2022

Is There Anyone Cheesier Than Whitney Avalon?

category: tech

I don’t consume much mass media but when I do, it’s witty lyricists. If Linux, Python, vim & git didn’t end up being my calling, I would almost certainly have been drawn to this sort of performance art with all my snarky double entendre poetry and a sort of magic about it—but I’m not nearly as cute.

Don’t Be So Humble, Whitney!

                         OKAY SO, THESE ARE
                        DAIRY-FREE CHEESE BALLS
                             AND SINCE I'M
                           A LONGTIME VEGAN
                         WHO MAKES SILLY FACES
                             FOR A LIVING
                              I AM ALSO A
                        DAIRY-FREE CHEESE BALL
                       LET'S SEE WHO'S CHEESIER!

                             WHEN YOUR DAY
                            HAS BEEN ROUGH
                         YOU DESERVE SOMETHING
                           CRUNCHY AND YUMMY
                       A SNACK THAT'S NOT TOUGH
                        ON YOUR DIGESTIVE TRACT
                             OR YOUR TUMMY

                     THEY'RE ADDICTIVELY DELICIOUS
                       YET INCREDIBLY NUTRITIOUS
                        SO DO YOURSELF A FAVOR
                      AS YOU SAVOR ALL THE FLAVOR
                     THE SNACK IS NEVER BLAND 'N'
                    IT'LL MAKE YOUR DAY OUTSTANDIN'

                             I'LL TELL YA 
                           ONCE AND FOR ALL

                           JUST HAVE A BALL!

                            THEY'RE SO GOOD
                              AND SOMEHOW
                         EVEN CHEESIER THAN ME

If You Don’t Love Cheesy Poofs, You’d Be Lame!

OMG, OMG, OMG, your lyrics never disappoint, Whitney! You are the first performer I supported on Patreon and the best double entendre artist since Benny Hill. Glad you love the Cheesy Poofs. You are definitely not lame. Not Night Court in its fifth season lame—no, you are fresh as the day you told Cee Lo where to go. Whit, you’re a top-tier Disney Princess in my book.

Who Controls The Balls? You Control The Balls!


Tue Jun 07, 2022

Learning Linux Lesson #2: RUN! Run it from CLI!

category: linux

Two legs good. Four legs baaad. The second lesson in learning Linux is to not have panic attacks when you open those little black type-in boxes. And believe me, no one will tell you this or teach you how to get over it but me. Suppress the frustration. It’s not nearly as bad a place as you think. It is a good place. It is a love-worthy place. It is your future (and the future of everyone else of substance). Microsoft knows it. Apple knew it. And you should know it now too.

Nuke the place for morbid… uh, I mean from orbit; it’s the only way to be sure. Windows was born bad… bad to the BIOS, but we’re stuck with it, at least for a while. So in the meanwhile, RUN! I mean, run commands from the command-line interface, also colloquially known as the CLI, Terminal or Shell. Any will do and all are equally confusing. Don’t be confused. It’s just the same as Dorothy landing in Oz. Instead of B&W going color, color goes black… but oh, what a beautiful black, with highlights in emerald green and turquoise blue. Give it a try. Trust me, you’ll love it. And you’ll never go obsolete again.

Linux is lots of little commands. Get started with ls, cd, and pwd (for “print working directory”) to move around directories. Most people are on Windows and Windows 11 makes installing Linux historically easy. And so my message starts at Windows 11. And you’re already on a graphical desktop (Windows 11) so it’s a perfect time to understand they’re all the same and you don’t need yet another. It’s the black command-line window that’s important. You have to RUN. So take my advice and RUN. Make sure you’re running with copies of things and that no single instance is all that important.
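To make those first three commands concrete, here is a minimal sketch of a first terminal session (the `demo` directory name is just an example I made up for illustration):

```shell
pwd            # print working directory: where am I right now?
ls             # list the files and folders in this directory
mkdir -p demo  # make an example subdirectory (any name will do)
cd demo        # change directory into it
pwd            # confirm the move: the path now ends in /demo
```

That loop of looking around (ls), checking where you are (pwd) and moving (cd) is most of day-one terminal life.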


Tue Jun 07, 2022

Start Learning Linux by Learning About What Pissed Off Fernando Corbato

category: linux

The first thing to do when learning Linux is to understand that a young MIT student got frustrated by snooty elitists keeping him from having computer time in the 1950s, when there were only a handful of computers in the world. He pioneered “time-sharing” with a system called Multics, which got corrupted by evil corporations striving for a dystopian future. Fernando’s corrupted Multics panicked young Bell Labs rebel Ken Thompson, who made a version that carried over many good principles but otherwise chopped its balls off. It was called Eunuchs… uh, I mean UNIX.

But Unix got embroiled in a custody battle, handed from one sue-happy foster parent to another, until FreeBSD… uh, I mean Linux came into the picture. Linus Torvalds said: Oh, this computer-sciencey OS that allows piping standard out to standard in, that treats everything like a read/write file, and that lets software get ported between different hardware easily is the right way to go with computer operating systems, so I’ll make changing the world into a summer project. And so he did. And any idea of learning a “graphical desktop” as anything having to do with Linux (even GNOME, KDE or Xfce) is the entirely wrong idea.
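That “piping standard out to standard in” idea can be shown in one line of any *nix shell. This is a minimal sketch using only standard commands (printf, sort, head); the fruit names are, of course, arbitrary:

```shell
# The stdout of printf becomes the stdin of sort,
# whose stdout in turn becomes the stdin of head.
printf 'banana\napple\ncherry\n' | sort | head -n 1
# prints: apple
```

Each little command does one thing; the pipe character chains them into something bigger. That composability is the whole Unix philosophy in miniature.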


Fri Jun 03, 2022

Go For Broke Connecting The Dots

category: tech

Connect dots that I have been too scared to connect. First, speak freely about your life to your YouTube audience. Readily admit that you’re using your channel for self-therapy. One needs to be able to honestly talk to one’s self. Admittedly, there are dark corners we’re not going to want to over-share to a YouTube audience, and we all have them. It’s called our Shadow, and Carl Jung has taught this for ages. So, talk to your shadow in other ways. But the YouTube audience, they get your inner Johnny Carson… The Great Carnac!

Ah, showing people my vim-typing. Showing people my greatest skill. Some people become surgeons. Some, auto-mechanics. Some have green thumbs, and some play with 2-headed worms. Ah, how I covet that other Michael Levin, who is unravelling the most important mysteries of life: mind-over-matter. Well, I’ll try tackling it in my own way, and that is much of the impetus for all this journaling… all this dot-connecting, all this thinking out loud, public self-therapy, and edge-of-over-sharing.

So the eff-what? We’re only on this planet for like less than 100 rotations around the sun. If each of your fingers had 10 fingers, count them down. Can you imagine that? That’s how many 1-year rotations around the sun you’re going to be given on this material world—and that’s only if you’re really lucky and mindful. Mindful, mindful, mindful. I used to think meditation was bullshit. I’m not so sure anymore. Green arrow personality-types think meditation is bullshit. Soft blues need it to assert voluntary control over their overly excitable selves.

And so what are some of those dots that need connecting that you’ve been so scared to connect?

Yup. Connect the dots. Go for broke… OMG, there’s an expression for that!

Correcting Wrong Turns On Road of Life

Packing The Comics & Saying by Goodbye to Winona Lakes

Game Theory & The Lightning Bruiser

Symbols are Real

When People Get Angry At You, You Know You’re On The Right Track


Wed Jun 01, 2022

How EMDR Lives Between Tetris-Therapy and Psychotherapy

category: about

Wow, June 1st already. June used to bother me like nothing else. Dad died June 8th, 1992. OMG, it’s 30 years! How can I remember this date and allow their (Mom & Dad’s) birthdays to slip by barely noticed? I have so many mental-blocks, it’s insane. I work hard to make sure I’m not insane. Worrying about whether you’re insane helps. I’m a child of the 1970s and I’m grateful to Alan Alda and the whole MASH series. I think he was the world’s therapist and self-deprecating trustworthy friend. In addition to his wonderful work on that series, going crazy in front of our eyes, he delivered the most important quote I ever heard in my life, which went along the lines of:

If you have the choice between fame and fortune, pick fortune.

—Alan Alda

Fame is overrated, he said. It makes you a prisoner in a way completely opposite to how you might imagine. My Ex-Father-in-Law (herein, my EFIL), whom I ended up loving like a father, has such great stories. One time he ran into Alan Alda on the street with a mob of people coming after him. EFIL stops his car, opens the door and says “Hop in”… and Alan Alda does! I believe this to be true, as EFIL has many larger-than-life stories consistent with his powerful job. You can think of him as a sort of Dick Marcinko of the Code Red Navy Seals book.

30-Year Anniversary of My Dad’s Passing

Ugh, June again. Well, I’ve started to lighten up again in June, but there is something deeply emotional. Father’s Day right on the heel of the anniversary of Dad’s death, 30 years ago a week from tomorrow. I don’t remember any dates in my life like I remember that. I found him dead in the bedroom next to mine at around 10:00 AM one morning when I was sleeping in. Did I hear him? Could I have done anything?

He was only 63 years old, only 12 years older than I am today, and a year shy of retirement age (back then), when he could have cashed in on the Social Security he’d been paying into all his life… cheated! And my effing mom got it, she who divorced him and forced the sale of my childhood home and the pride of his life. Well, she’s dead now too, though kept, I believe, in my Uncle Bob’s closet because I wouldn’t come kiss his ring, LOL!

To Go Meta Or Not To Go Meta… THAT Is The Question.

Anyhoo, this is a nice friendly post, isn’t it? Well, I’m pretty good at introspection, at tweaking out of my subconscious how I really feel, bitterness and all. I believe people are often scared of this about themselves, but I think it’s really quite necessary. Like bringing all those impurities to the surface during detox, you can’t really be happy with yourself if you’re harboring all that poison. And you can only let it out by speaking and actualizing it–a two-edged sword, always.

There are two lines of thought on this:

Tetris-Therapy, The Most Effective Treatment You’ve Never Heard Of

Huh? Tetris-style? Yup! There are studies showing that when a likely traumatizing experience happens to you, the best thing to do is to not wallow in it. Play Tetris or anything else that’s guaranteed to engage you and transition you into the zone. This will prevent scarring synaptic pathways from being reinforced through a shock-induced feedback loop. In other words, immediately do something to distract yourself. Otherwise, it could become the chip on your shoulder, the cross that you bear, the elephant in the corner, for the rest of your life. This is also good advice for after you stumble upon certain subreddits.

But We All Have To Talk It Out, Don’t We?

Oh, the crybabies? Yes, yes. Let’s address the crybabies who will say you have to talk it out. Okay, well that Tetris trick is what’s recommended to soldiers after, say, having to bayonet someone to death, or the modern equivalent of pressing a button to let a death-drone strike. Same thing. So to keep those soldiers functional, working assets of the military, of course the Tetris approach is desirable. We can’t have our volunteer murderers face the reality of being trained and somehow legitimized or “authorized” murderers. No, we can’t have that. And so, Tetris-therapy. Practical, but dehumanizing.

Suck It Up, Soldier! Life Won’t Wait For Adult Crybabies.

We are all soldiers. We must get on with life. That’s just the harshness of reality. Sure, talking it out can help, but not at the cost of a life of paralysis and dysfunction. All culture and society is a form of community brainwashing anyway so we can all just get along and not be killing each other all the time. The only difference between a cult and a culture—one that will decimate a country and whole geographical region over WMDs (weapons of mass destruction) that didn’t exist because the actual country “didn’t have good targets”—is mind-tricks.

These Tricks Are Your Tricks. These Tricks Are My Tricks…

Yup… mind-tricks. That’s the only difference. We are all equally immoral from a standpoint as close to objective as we can hope to achieve in this subjective world. Cults and our own culture are equally guilty of horrible atrocities. This country was born on terrorist acts (refer to Washington’s crossing the Delaware). Whether these things are justified or unforgivable is a matter of how the winners write history.

So why not use those very same mind-tricks to your advantage, crafting yourself to be functional? No reason why not. You should! Tetris-therapy is legit. Don’t let your leaders work you like a puppet, or the profession of psychoanalysis milk you for all your money. Pull your own effin strings.

If You Gotta Cry, Try Crying Into a Journal (and learn vim)

Write! Keep a journal and write. In doing so, you will administer your own psychotherapy. At first you will start to think you are going crazy, then you will realize you are more sane than you’ve ever been. Refer to the comedy-drama MASH and the book The Artist’s Way.

After a while (everything takes practice), you will feel a healing, cathartic process happen. It will work to heal you gently over time, because your journaling activity will be persistent over time. Layer after layer after layer. It will take as many years to come out of trauma as it took to become traumatized and wallow in it like a crybaby. Sure, your experiences were probably horrific beyond imagining. But so were those of Holocaust survivors, and many of them went on to have functional lives, re-discovering the love-worthy bits that make life worth living forward-in-time rather than backwards.

Eye Movement Desensitization And Reprocessing (EMDR)

So is there something in-between Tetris-Therapy and Psychotherapy? Yes! Eye Movement Desensitization and Reprocessing (EMDR). This has been a recent discovery of mine. Actually, the discovery has been recent but I do believe I’ve been applying it to great effect for some time in the form of the Google Keep (Notes) app in which you can put images. When the images are perfectly square, they are presented in a 2xN scrolling grid. And so to “take them in”, your eyes dart from left-to-right reading the picture on one side and caption on the other. Is this not the perfect application of EMDR? Go Google it and see how such therapy sessions go and tell me that I haven’t been doing it intuitively all along!


Wed Jun 01, 2022

Fluorescent Ultraviolet Psychedelic Blacklight Snake Cobra Poster

category: about

Remember this? I do! It was on my wall growing up, and now it’s burned into my psyche like the red-hot colors it’s made from. If you remember it too, you may be a kid of the seventies! Spencer’s Gifts! Oh, what a gift it is that Spencer’s still exists. The USSR may come and go. The mall where you bought it may be gone. But such precious images persist.

Fluorescent Ultraviolet Psychedelic Blacklight Snake Cobra Poster

Wow, do I miss the 70s. I was born in 1970 so it’s always real easy to calculate my age. I was 10 in 1980, 20 in 1990, 30 in 2000… wow, that seemed so far away! And here we are in 2022! My goodness! Maybe I’ll buy this poster again and re-listen to all the awesome music of the 70s.

This page is dedicated to Bob O from the YouTubes. Thanks, Bob!

Coming of Age Movies of The 1970s and 1980s Were The Best

That’s probably all I need to write about that. However, my mind goes racing. There were other things on my wall. Ahhh, a Tale of Two Heathers: Locklear and Thomas. I should show their posters here, though there’s really no particular tale to tell except the one you can infer about puberty. Refer to Fast Times at Ridgemont High.

Ahh, the brat-pack and coming-of-age movies of my generation were the best. I don’t care what you say about American Pie (yeah, I’m that old, to consider that new). Nothing holds a candle to Sixteen Candles or The Breakfast Club. And while not precisely coming-of-age movies, I came of age during Rocky, The Godfather, Alien and Star Wars. I mean, come on now. The Greatest Generation may have saved our asses from the Nazis, but we got to watch Indiana Jones.

Now Let’s Pick Some More Movies for Movie Night!

Coming Of Age Movies Planning Movie Night For The Kid


Tue May 31, 2022

What To Do? How To Always Be Doing Step #1!

category: about

Time running out to get your work done today? Drawing a blank on “what to do”? I sure know I am. Don’t worry, it happens to everyone. Sometimes it feels like lack of air. Sometimes like writer’s block. Whatever it is, the following things will help:

First Steps to Almost Anything Requiring Motivation

Force Yourself To Do Jumpstart-Activities That Don’t Take Thought

Shake yourself. Maybe jump up and down. Maybe jumping jacks. Doldrums lead to doldrums lead to doldrums. Act-up, be-up! It’s the good version of “fake it to make it” because you’re not really faking it. The idea is to get more oxygen and nutrients into your blood and to get the blood circulating through your body. So this is a case of merely going through the motions being exactly the same as doing the real thing.

Going through the motions of getting your blood circulating actually gets your blood circulating.

Sometimes you’ve just got to force yourself to take that next step. But even if you have the will-power to force it, that doesn’t help with the question of what to do. That has to come not from force, but from finesse.

The answer is simple. Fire up your journal and start writing. Work through it.

Done With The Force. Now Onto The Finesse.

When it’s super-tough to start doing anything at all, just start writing in free-form free-association style. Just write and write and write some more. You may be surprised what happens. This is why you keep a journal. This is a place you should be practicing your vim-powers daily.

The key life-habit you’re trying to develop here is just being able to start writing anything at all. It’s just like getting your blood circulating.

So, start writing. Write anything. Maybe just the date. Ask yourself a few questions like you’re interviewing yourself.

Start Writing, But Not Writing About Writing!

Don’t get too meta about your feelings, however. Thinking about thinking about thinking about thinking about thinking will spiral you into a dark place. STOP THAT! That’s how you ensure writer’s block persists.

Instead, make some attempts at actually talking about that thing you need to do. You can do it! And no, me telling you to not think about thinking does not mean that it’s my fault you can’t stop it. That’s always the case, and with such weak-willed thinking, you will always find an excuse to fail, POOSER.

Interview Yourself

So grab yourself by the bootstraps and ask yourself:

You will feel yourself gradually gaining power.

Actualizing Into The Material World

That feeling of gradually gaining power comes from “actualizing” or “materializing” thought into the real world. Putting words onto paper (or bits onto storage media) as you funnel your inarticulate thoughts through the language-center of your brain, thus articulating them, is a somewhat magical, uniquely human experience.

That’s The Feeling of Being Human

So right off the bat, it reminds you that you are in fact a human and that you do have voluntary control over a certain amount of your thoughts and actions—perhaps surprisingly less than you might think at first, but still some, similar to the way we can take control of our breathing and blinking when we want to.

Try Breathing Exercises. Always Start With Breathing Exercises.

Okay, so to take control of your breath you can do breathing exercises:

There’s almost no way to count to 5 on every inhale and again on every exhale, then keep track of the overall count of breaths up to 10, without shutting out, at least temporarily, the screaming voices of your less-evolved inner animal parts that desperately want you to stop doing this.

Stop That Out-of-Control Cortisol Production

Similar exercises can be used with blinking and walking, sometimes all in combination, to utterly and completely yank control away from the panicking inner worm of the pituitary-adrenal axis pumping out the stress-inducing cortisol hormone, which isn’t doing you any good if the threat isn’t actually life-threatening. I’ve really got to cross-link the explanation here!

It’s Always The Same, It’s Just Fight-or-Flight, That’s All.

If this sounds familiar to you and such problems plague your life, shame on your parents. It means your parents didn’t let you rough-house and beat the shit out of each other, scrape your knees, throw rocks, and all that happy horseshit that makes us resilient and able to tell real life-threatening situations from bullshit.

1, 2, 3… 1? You’re Always At Step #1!

Okay, so what’s the exercise to get yourself working and finishing that project you’re putting off? It starts with counting, again. Why? Because every little thing gets done in steps and you can always list them 1, 2, 3… and you’re always at step #1. So you can always repeat the exercise listing 1, 2, 3… 1? Observe:

1, 2, 3… 1?

Visualize The Results

Describe what you’re trying to get done so you can at least have an objective measure of what you’re trying to do and what it will look like when you’re finished.

What I’m working on right now will be much like a copy of an instance of something that already exists.

Okay, so step one is copy the thing that currently exists. Rename it and change all the naming references to the new name… done! Next! 1, 2, 3… 1?
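The little procedure can even be sketched as code. Here’s a toy loop (the step names are made up for illustration) where whatever sits at the front of the list is, by definition, step #1:

```python
# Toy sketch of "1, 2, 3… 1?": list the next few steps, do the first one,
# and re-ask. Whatever remains at the front is always step #1.
steps = [
    "copy the existing instance",
    "rename it and update the naming references",
    "compare old and new property lists",
]
log = []
while steps:
    step_one = steps.pop(0)   # you are always at step #1
    log.append(f"did: {step_one}")
```

The point isn’t the loop; it’s that you never plan past the head of the list.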

Done Step #1? Okay, Back to Step #1…

Oh, step one is now finding the new list of “properties” that will be used on the new instance of the thing I must produce. Go get that new list and put it where it needs to go, keeping the old list intact so I can compare and contrast for differences… okay, done. The old list is “commented out”, something you can do in programming languages to disable lines of code while keeping them in place. See, not so hard. Okay, next?

Put Step #1 In Front Of Another And Soon You’ll Be Walking ‘Cross The Floor!

1, 2, 3… 1? Well, there’s definitely a difference in the number of items on the list, and the grouping of items in the list.

The new list has 19, 10 and 4 members.

The old list had 19, 10 and 5 members.

Wow, even though the members of the list changed a bit, the number of members in each sub-list is actually very close. I need only delete one row out of the copied template on the bottom group. This is much less work than I thought!
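That compare-and-contrast of sub-list sizes is a one-liner in Python. A hedged sketch, with group names invented for illustration, matching the 19/10/5 vs. 19/10/4 counts above:

```python
# Member counts per sub-list (group names are hypothetical placeholders).
old_counts = {"group_a": 19, "group_b": 10, "group_c": 5}
new_counts = {"group_a": 19, "group_b": 10, "group_c": 4}

# Which groups changed size, and by how many rows?
changed = {name: new_counts[name] - old_counts[name]
           for name in old_counts
           if new_counts.get(name) != old_counts[name]}
```

Here `changed` comes out to a single entry: the bottom group shrank by one row, so only one row needs deleting from the copied template.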

Put Step #1 In Front Of Another And Soon You’ll Be Walking Out The Door!

Lesson: you don’t even know how much or how little work you’re facing until you do a process like this!

Next step? Double-check your numbers against the original files… done.

Okay, now delete that last row from the copied instance.

See? It’s getting easier. I don’t have to really do these “force myself” steps anymore. I just intuitively know now to update a few additional little things in the file then run the script… DONE!

But Are You Really Done? THIS Is Where You’ll Screw Up

And then we think through whether there’s anything we’re forgetting. Just when you think you’re done, you’re at your most vulnerable to mistakes.

Enumerate everything you know about the thing you just did…

Oh yeah! Add it to the scheduler and restart scheduler.

And then remember to look at it tomorrow to ensure it ran on schedule.

Now I’m done.


Tue May 31, 2022

Movie Night: From Mean Girls to Heathers and Beyond!

category: about

Last night was Movie Night with The Kid. The Kid is 11 years old, rapidly approaching twelve. Movie nights of the recent past have included movies of their choosing, mostly anime involving cats. Prior to that, I made sure I got in ET and Back to The Future. Some movies are just too important to skip if you want to understand ol’ Dad, and I’d argue life in general, though as Lex Luthor says in the Christopher Reeve Superman:

Some people can read War and Peace and come away thinking it’s a simple
adventure story. Others can read the ingredients on a chewing gum wrapper
and unlock the secrets of the universe.

—Lex Luthor, 1978

Yeah, the 1978 Superman is surely going on that list, as is the 1979 original Muppet Movie that came out close on its heels. My 8 & 9 year-old years were quite formative in my life with the type of suburban summer-camp my parents were able to provide me, but which I was not able to provide my own child through those years due to living in New York City and the divorce. Sighhh. Well, Christopher. Thank you for touching me and my life. Now you will touch my child’s. One never need lose their backbone, even if you are a stranger in a strange land, and no matter what unfair knocks life deals you.

1978 Christopher Reeve Superman Movie

Redemption & Refinement

Making good begins with Movie Night. It started with a very strange request to watch The Godfather, given the non-stop references to it in pop culture. I totally jumped at the chance. I checked with The Kid’s mother, who was OK with it. So I made The Kid an offer they couldn’t refuse: I’d sit and watch the movie with them all the way through, and if it got to be too much for them, we could stop it at any time. No strings attached. I wouldn’t be asking for any favors. It was a completely unconditional offer. The theory being that I’d lay down some contrast between myself and the characters they were about to encounter… check!

A Tale of Three Court-Holders

Adi needed to see the court-holding personality types like Don Corleone, Lex Luthor and my Uncle Bob doing their ring-kissing thing. Favors for favors. Court-holding. By contrast, there are no conditions or strings attached when it comes to your good ol’ dad, except where I can’t help it due to financial constraints or responsible parenting (which is a problem when it comes to getting them to clean their room). Oooh, ooh! Karate Kid… go add that to the list!

So The Kid made it up to the horse-head in the bed scene and sort of tuned out. Good. They don’t need the whole epic and sad story of Michael Corleone… yet. Suffice for them to be in a position to compare & contrast the Michael Corleone they just met with the Mike Levin they’ve known for a lifetime. Just when I thought I was out, they pull me back in!

Just When I Thought I Was Out They Pull Me Back In

Carrie On There’s Nothing To See Here

We don’t need no stinkin’ endings. Anyhoo, last night was Mean Girls. Important lessons? Soft blue Yin personalities covet and can easily get sucked into hard Yang (high school) cliques. Ugh! Ouch. Moaaaaan. My high school bully stories, while not as bad as Stephen King’s Carrie, did end in a couple of deaths. Frank Sessa who tagged me with the nickname “AIDS victim” drove me out of my middle school lunch-table and taught me who my friends were, which were very much not the people of my Synagogue culture. He fell down some steps and died. No connection.

Happily, and happily right up to this day, I found true friendship in a true Israeli who toughened me up and taught me to laugh, Guy Bruchstein. Thank you, Guy! And thank you Howard Gorchov as well. I missed Howard at the 30-year high school reunion (he moved to New York too), but I caught up with Guy and we laughed at Jordan Robinson and Ira… something-or-other… both sipping wine and obviously looking down on everyone, judging. But then so was I (projection much, haha!). Oh, and that Kevin-what’s-his-face is still a bombastic showboating clown. I never held it against him, but I love hearing Guy point it out. He says what he thinks. Guy Bruchstein has been my Guy Gardner.

Oh, I love Guy. Thank you, again.

But I digress. The lesson of Mean Girls is resoundingly that:

Lessons Regarding True Vibes & Friendship

Lessons Regarding Strategy & Capability

From Movie Night to Stories Around Campfire Night

There are so many other good things to say about Mean Girls, I neither know where to begin nor care to try. Let’s move on. Besides movies, there’s other critical media of the reading variety I’d love Adi to wrap their mind around. I think they need to read, or at least hear, the Audible performances of:

Coming of Age Movies

I’m sure there are others, but these are the three that pop right to mind. My Alice/Dorothy/Harry Potter coming-of-age stories are something of an obsession, as I feel I never quite got over that phase of growth myself. In fact, that I’d even think to state it that way is a problem in our society. One should never “get over” coming-of-age stories. One should always and forever remain a dynamic personality. And so:

Become Dynamic or Parrish

Again, I could probably go on listing movies for movie night forever. Suffice to say, these get across the next deep learnings The Kid must internalize. Jumanji shows how people get trapped in their own minds (or games, fantasies, YouTubes, etc.) and risk becoming adult-children, stuck frozen in time, never growing into and enjoying the benefits of adulthood—namely, freedom from anxiety by virtue of knowing better.

Robin Williams As Alan Parrish In Jumanji

Getting frozen in a dissociative child-like state is, I believe, the source of crippling modern anxiety. If you can’t grow up, you learn as a disabled adult-child to manipulate the more capable adults around you, just like a crybaby having all their needs fulfilled by having everything brought to them. This is what the online narcissism community means when they talk about narcissistic supply. It’s a baby-wants-their-bottle sort of thing. Or more accurately:

Baby want want their loophole?

Otherwise, you’ll just leech off of other people your entire life, learn to project an image of dignity but really have imposter syndrome all your life, hollow inside and ready to collapse under any sort of challenge, though not until after threatening to rain hellfire from the sky in empty gestures of “Karen” posturing.

My child needs to have their eyes opened to people choosing to live this way, and to be better than that themselves. Get trapped in the games of your own mind (Jumanji) without a certain outward-looking mindfulness, an awareness of the objective world and how to make it on your own without relying on a surrogate parent to bottle-feed you, and BAM, you perish… uh, I mean you’re Alan Parrish.

Happily, Alan Parrish in the end turned out to be a dynamic personality and escaped the feedback-loophole of his own mind. All’s well that ends well… another important Movie Night lesson.

Thank you, and R.I.P, Robin Williams. I love you. You’re definitely on a lot of Movie Night choices.

Thank You EVERYONE Involved in 1978 Superman! Thank You!

Regarding Superman, I’ll stick to my guns. There are too many reasons this is a must-see to enumerate; suffice to say: Otisburgh (which I incorrectly called Otisville in prior videos). Global warming, second chances, true importance, deep channeling and parallels in the lives of the actors… I’m choking up just thinking about it. Excuse me while I go blow my nose.

Know But Beware The Dark Recesses Of The Mind

And finally, Heathers. My kid took an immediate liking to Edgar Allan Poe. Good Philadelphian, Adi. I’m proud of you. And I’m extra proud that you can recite the first verse of The Raven. And yes, people were a lot harder to scare back then. But if you want a modern Tell-Tale Heart, you can’t do much better than Heathers, especially as a follow-on to Mean Girls. Oh, this is going to be powerful. And I get the extra bonus of highlighting passages of Moby-Dick, which I strongly feel I will transition to in dramatic readings around the final campfires here in the Pocono mountains this June, the last month of our experience here, as a character straight out of such stories yanks us back to where the air is stinky and the expenses are crippling.

Framing? Okay, Sure, Framing.

Think deeply, My Child. Think deeply. Yes, while this is framing—it is framing such that you think for yourself with the aid of writers and authors and master storytellers (I will not stop using the word “master”). It is the role of your father, of Good Ol’ Dad, to rough-house with your mind right along with your body to inoculate you against the frail fragility of bitter brittle prattling prima donnas of the world. They will empower your pathology. They do not want you to be well. There is nothing I can say or do that will teach you this better than the story-telling masters can.

“I will instill in you a love of reading.” Terrible labels! Just that phrase has the opposite effect, closing avenues of the mind, clamping shut the open-mindedness of a child with the tedium of chores and lessons. Ugh! The tyranny of language! Math… that’s another. Math lessons are the same in any country… is that not powerful channeling of the True Spirit of Math by Cady Heron of Mean Girls? Cady, pronounced Katie and not Catty… get it, kid? Get it? Katie becomes catty to contend with Regina, who’s a catty predator!

Well, if you liked Mean Girls, you’re gonna love Heathers. It goes dark in just the way you like, channeling the spirit of Edgar Allan Poe. It will call out to your inner Philadelphian. Think in extremes, but act in moderation. Just because your mind can go there doesn’t mean your body has to. But media lets your mind go there. Entertain the fantasy, but maintain a sharp line of demarcation. Learn from the fantasy. Be able to weigh the consequences of making the worst decisions possible in life without having to suffer those consequences except in the freedom and imagination of your mind.

Empathy & Compassion Always Win Over Fear & Anxiety

Compassion is a survival trait. Anxiety, and teenage “angst” by extension, is an out-of-whack survival trait. The need for empathy and compassion persists. The need to strike out with deadly force over minor altercations and dark fanciful whims does not. The Tell-Tale Heart will do you in. You can’t suppress your nobler inner human. And I’ll stop there because I may be remembering Heathers totally wrong.

Stone walls doe not a prison make,
Nor iron bars a cage;
Mindes innocent and quiet take
That for an hermitage;
If I have freedome in my love,
And in my soule am free,
Angels alone that sore above
Enjoy such liberty.

—Richard Lovelace, 1642


Tue May 31, 2022

Boosting YouTube Watch Time / YouTube Analytics

category: about

Last day of May. It’s March that comes in like a lion. I got that so wrong on the videos. Just forge forward relentlessly. Also, show the nice people your YouTube metrics and how you’re figuring out your next step. Don’t pander, but YouTube is probably my best prospect for the supplementary income I require for going from $1500/mo rent to $2200/mo. Can you believe it? And that’s not even for living in Manhattan. That’s the suburban outer borough of Staten Island! Shit, I’ve got to tighten my belt and hunt.

Continue doing a kick-ass job “at the office”. Everything you do for your online presence and personal skills must also apply directly to doing a better job at work. The money I make at the “real” job is critical. I cannot sustain going unemployed for the stretch of time you’re supposed to be able to. And so do great work. That’s so important.

Okay, so work on boosting YouTube watch time. This is my current analytics:

Boosting Youtube Watch Time Analytics

So the question is, how to do that without compromise? How to do that while helping yourself in your day-job?


Mon May 30, 2022

One More Month of Respite From City Life

category: about

I’m at the Winona Lakes pool on our second day of Memorial Day swimming. What a gift this community has been over the past two years–a true cure for the cabin-fever of tiny-NYC-apartment living. I am so grateful for the little rental cottage and that fateful drive out Route 280 to Route 80 into the Delaware Water Gap. I had no idea we would end up minutes from my childhood “secret” ski mountain, and I’m sad we won’t have another winter season here. I only got Adi out on the mountain snowboarding once, but that’s okay.

Insert pool pic (very zoomed out)

It’s been nonstop trampolining, campfires, ziplining, canoeing, go-kart racing and the like. I mean really, it was an experience of a lifetime… or maybe not. With year-to-year leases and the mobility of a homeschooler and telecommuter, you never know. Don’t rule anything out. For now, we can make returning to this area a special weekend activity, scheduled lessons and all. I can help make snowboarding and whatever else stick.

Insert picture of the yard

Posting and sharing this idyllic existence while I’m enjoying it really isn’t in my nature. Call it superstition, but is there not an evil eye jealously looking down on us, punishing us when things look too good? There is, I tell you. There is! Not that I’m superstitious, but spit, spit. Maybe it’s for the best that circumstances conspired to compel me back to the land of pretentious faux Greek columns and Manhattan-hating New York locals. The rule is grin and bear it… as opposed to my current existence, which is more like grinning at bears while watching them knock over the trash cans.

Insert picture of bear

I signed a lease last Friday on a nice 3-bedroom around the corner from one of the great “Green Belt” parks of Staten Island. On the whole, the parks are much nicer than the people, in my estimation. Staten Island is most decidedly not my vibe, but my child is having a difficult time, and I’ve received my last crying phone call that I couldn’t do anything about but offer remote-time. No, now it’s time to be there and face the issues driving them to dark places in their mind. I start by making sure they are involved in the selection and the “yes, this is where we’re going to live” approval process. So much of this kind of mental state comes from lack of say over one’s own life.

Insert YouTube here

Saturday was rainy, so Sunday was our first day here at the pool. I am grateful that I did not have to return Adi last night for the Memorial Day parade that their mother told me they had to be back for on Monday. I got a text mid-day yesterday saying they weren’t attending the parade, and so Adi had the opportunity to stay. Adi is instantly hitting it off with the local kids as they always do, and I’m sitting here typing as I hear shouts of Marco Polo and the giggles of a good time. Adi loves using “My Dad told me to socialize” as an easy way to introduce themselves to new kids. There’s an exchange of names, and bammo whammo, let the good times roll. I am so proud of their socialization skills.

Maybe Another Pool Pic

Of course it’s the pride of a parent talking, but in addition to that deep comfort level in the water and the easy socializing, the kid is also a natural mimic. Adi does their spot-on Mickey Mouse impression and I say I wish I could do a Fry impression, as I’m wearing my Planet Express T-shirt today in anticipation of the upcoming 1.5-hour drive. Without missing a beat, the kid busts out a perfect “Shut up and take my money.” I am both floored and proud beyond measure. My movie-quoting, bantering roommates of yore would have laughed their asses off as I called out the nonsense prattle they called banter. It seems I can’t escape Darmok and Jalad at Tanagra. Sighhh. When the walls fell.

Well, the walls are not falling anymore. This dumb delivery boy is at ease with his holophonor without a single smart-worm up my butt. I’ll have a bunch of trips back and forth as I move most things into cheap public storage here PA-side, like the canoe and the bicycles. Yes, the 2 Mongooses and the Schwinn are a problem, living up a flight of stairs as we do. But maybe not. Maybe I just carry them up and down the stairs. There’s the electric bike too, which maybe I’ll be able to keep folded at the bottom of the flight of stairs. But these are the little details to iron out; the nuance and subtlety that don’t really matter. Nothing involving matter really matters much.

The matter that matters is the matter that lives. Live, laugh, love. If the matter can do any of those things, it matters more than any other matter. Pets and other living things come in a very close second. Hand-crafted or nature-made irreplaceables come in very distant third. There’s nothing so irreplaceable that it should put much of a dent on your spirits if lost. What, are you going to take it to the afterlife? Nope! In fact, the only thing that might even go with you to the afterlife is the information that permeates your awareness and being; your experiences.

Okay, wrap this one up, but definitely capture more mental snapshots and vignettes of your life. You’re wrapping up a time here at a place you’ll have lived at for 2 years by the time it’s done. There have been so many places over the years, but this one was different because it was during the first few years of the pandemic, and it was during Adi’s 10-to-11-year-old years, when there’s so much important stuff going on in the brain. Adi deserves to see Dad working at his best. I may not have a lot of money to work with, but I have a lot of brains and heart, and they saw them in action.

Fast-forward to about 8:00 PM. Have to start the drive back to Staten Island. It was a gift to have Adi this “extra” day of the Memorial Day holiday. I’m starting to think that the cutting-short of time may be deliberate and my diligent documenting of it may be paying off. We just watched Mean Girls with Adi. It’s feeling like the highlights of my life, the simple pleasures of watching a movie with and connecting with my kid.


Mon May 30, 2022

Gathering My Stories

category: about

This post is about as open-ended as it gets. Beware, read at your peril.

To blow up on YouTube here on this “dirty” channel without a squirrel obstacle course or counting to 100,000 is going to take some creative thinking and connecting some unusual dots… dots… dots… dots… caught in a feedback loop. Oh, there goes another one of those fugue states. You really need to know about Batman’s Azrael character. Here, let me go get that comic. Yes! Put solid images in peoples’ heads about these silly metaphors you use. Hmmm, what’s a meta for… I/O! WhatsaMetafor.io. Organize your material later. Don’t slow down! Oh, show the people a vim command:

Bonus vim! I really should start tagging this stuff. Do yourself a favor because you’re probably never going to come back and organize this stuff. Fine then. Build up MikeLev.in more and more. Put most of your eggs in one basket. These other domains you register are really just side-bets and ones to keep the stuff here from getting TOO overwhelmed by shop-talk (MikeLevinSEO.com). I can shop talk until you’re blue in the face. No need for that… move on!

Ah! It’s time… it’s time to list a few topics… my stories. These are just a few of the topics and sub-topics that roll off easily…


Sun May 29, 2022

Levinux, a Small Linux Distro without GUI for Education

category: linux

In this post, I answer the following questions:

  1. Can I teach myself Linux? Yes!
  2. How long will it take to learn Linux? About 5 minutes.
  3. Is Linux hard for beginners? No, not with an open mind.
  4. Can you learn Linux for free? Yes.

…and then go off on a complete tangent, fixing an obscure little issue in my distribution of Linux called Levinux.

I have a Linux of my own.
My name is Mike Levinux.
It’s not a distro I have grown
So much as a respinux!

—Mike Levin, 2022

But Not in This Article. This Is Not Even a Levinux Lesson.

This article is me rambling on forever about all the profoundly cool stuff that went into this timeless fossil of a learning-piece, still applicable today for those taking up Unix and Linux (collectively known as *nix) operating systems. For a lighter overview, may I suggest:

But by all means, continue.

Face your FEAR OF HEADLESS SERVERS with this ~20MB Virtual Linux w/ Python, vim & git. Stick your hand in the *nix gom jabbar with a double-click from the desktop of your Mac, Windows or Linux PC.

Now Onto Some Truly Geeky *nix Talk

Genuinely new Linux distros are few in number, or at least few in mainstream popularity (plenty of tire-kickers), because it’s not easy to build Linux from source and actually add value in a way that’s of interest and use to the public. There are a few, though, and the two kings-of-the-hill were (as of the last time I cared to research):

A Few From-Scratch Linux Distros

You know, this list really probably goes on a bit. Maybe genuine Linux distros are few in number versus the nearly infinite second-order derivatives like Ubuntu. But I’ve hit some of my favorite high points. There are others mixed in, and some are surprisingly popular Unix versions, one of which you’re running right now without even knowing it.

The Invisible Hands Reaching Into Your Intel-PC Right Now

Minix in particular is of interest to me. I looked at basing Levinux on it back in the day. Intel has been baking it into its processors’ management engine for years now. I didn’t start looking for the beating heart of Levinux until around 2012 or so, when my kid was 2 years old and I realized I needed a sort of “Noah’s Ark” for my work. Looking around, I found and fell in love with Tiny Core Linux, which comes from this guy Robert Shingledecker, who is the first person to bring Linux to municipal government in the United States. He’s got a great story and is another one of those decisive and divisive individuals.

I guess I ought to shout out QNX. It’s a proprietary microkernel Unix in much the same spirit as Minix, baked into everything from BlackBerries to coin-op laundry machines. If it’s got a computer inside, nobody has time for truly proprietary shit anymore, so it’s going to be some tiny form of Unix or Linux… always. Did I mention drones?

Curses! Foiled Again.

I tried building my own Linux from scratch (refer to LFS above) along with compiling QEMU from source. I could have gotten the 20MB zipped distro down to probably 2MB to 5MB and avoided the QEMU disappearing-pointer problem. The key to this appeared to be MinGW (today MinGW-w64), an offshoot of Cygwin (a way of giving Windows PCs a POSIX-compliant *nix layer), plus a wonderful library (I still want to use it) called Curses. Curses lets you do a lot of neat tricks with text terminals that you’d think were reserved for graphics. The popular Python Rich library (pip install rich) pulls off the same kind of terminal wizardry today, validating my suspicions from 10 years ago when I was researching this stuff.

A Very Deep Rabbit Hole

Roomba can only crash into a wall, turn a bit and retry so many times. Channeling the spirit of The Incredible Hulk with only a touch of Bruce Banner to manage the turns (how I work) wasn’t good enough. This task was going to take a whole lot more Bruce and a whole lot less Hulk.

Ugh! Did I mention rabbit-hole evaluations? This rabbit hole was too deep for me. Controlling the linking of libraries in C to bake in static libraries that resisted for a thousand tiny reasons… plus learning Curses and about a zillion QEMU compile options.

Uncle! Uncle! I cry uncle.

80/20-Rule To The Rescue!

Oh wait, wasn’t everything just working perfectly a moment ago, but for the disappearing pointer? And you know, maybe someone out there wants the options for graphics in the Levinux console. And a 20MB download isn’t really so bad. And many other sour grapes I could conjure.

Just focus on a good Alice in Wonderland-style experience. People will appreciate it in any case. Did I mention the 80/20-rule? And those QEMU binaries… there’s that one from Japan everyone uses. And there’s that one for the Mac that was mainstream there for a while. Okay, okay, mix them together in this magical cocktail based around the Mac’s packaging system, which echoes mine and Howard Harrison’s WBScript for the Amiga back in the 90s (I later found out Sun Microsystems also used this directory-as-an-app trick), shake well, then bash a few times… voila!

Bob’s your uncle! (he really is.. I’ve got an Uncle Bob)

The Basic Ingredients for a Tiny Linux Server

So, what I chose for Levinux and some of the ingredients and components that went into it are:

Pshwew! Honestly, it was a miracle this whole thing worked in the first place. It’s a miracle it still works as well as it does today and still serves its purpose for education so quintessentially. A paragon of perfection, as I said in the YouTube description. Every time I see the Dropbear SSH server doing its thing and the busybox-httpd webserver doing its thing, and using the default vi program built into BusyBox Linux, I get the chills. It is a true example of how Unix/Linux skills are generic in nature and when you go this route, you’re not learning one particular thing, but really technology in general.

That’s because Unix/Linux simply has ascended to unchallenged already-won dominance. Even Microsoft has come around in the most “I meant to do that” way.

And a final note before we get onto the nitty gritty of what I’m fixing here: QEMU is actually using the lately much-maligned technique of PC EMULATION (vs. virtualisation or containerization). It’s maligned because emulation is very slow compared to virtualisation, and practically standing still compared to today’s wildly popular containerization (Docker, LXC/LXD).

Stick Your Hand In The Pain-Box. Go Ahead-less!

In other words, don’t expect Levinux to do anything for you other than to kick off your schooling in old’s kool text-based Unix/Linux. I view it as the Gom Jabbar pain-box of Dune that is used by the Bene Gesserit sisterhood (kind of like Minix) to determine whether you’re fit to be a leader of the human race. Levinux is a similar little black pain-box testing whether you’re fit to learn *nix.

Dune Gom Jabbar Pain Box Levinux Linux

Don’t Get All Diffie w/Me. You Can Go To Hellman.

Go ahead. Download Levinux, unzip it and double-click the icon for your OS. You’ll get the server-build phase and be invited to SSH in. I bet you can’t. Not with the instructions I give. That’s because since I released Levinux, SHA-1, the hash algorithm behind its old key-exchange method, has been broken, and modern SSH clients have dropped it. Something on the client side changed: the ssh programs run from PowerShell, PuTTY, the Mac’s default Terminal and such now refuse the old algorithms or warn about them. It couldn’t be Levinux, because nothing much has changed there… for 7 years!

QEMU Virtual LAN & Network Address Translation

So I researched it and figured it out. SSH programs can be told to go ahead and accept the legacy SHA-1 key exchange anyway. The nattering nabobs who will yell at you for this don’t know what they’re talking about. It’s about the same as the ninnies who think you shouldn’t pickle data in Python even when the program will never be exposed to the public. That’s like saying you shouldn’t shoot a gun in a shooting range. Levinux runs in QEMU’s virtual LAN (local area network) context. It’s got its own subnet and DHCP server. No traffic is getting in unless you create a NAT rule (network address translation). It’s safe.
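For the curious, this is roughly what a QEMU user-mode networking launch looks like. This is a sketch, not Levinux’s actual launch script (which differs per platform), and the image filename is invented, but the networking flags show how the guest gets its own NATed subnet with SSH forwarded from host port 2222:

```shell
# Sketch of a QEMU user-mode ("slirp") networking launch.
# The guest sees a private 10.0.2.x subnet with its own
# built-in DHCP server; no outside traffic reaches the guest
# unless a hostfwd (port-forward) rule like this one exists.
# "tinycore.img" is a placeholder disk image name.
qemu-system-i386 \
  -m 64 \
  -hda tinycore.img \
  -nic user,hostfwd=tcp::2222-:22
```

The `hostfwd=tcp::2222-:22` rule is the NAT rule mentioned above: connections to port 2222 on your machine get translated to port 22 (SSH) inside the guest, and nothing else gets in.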

To understand more about the technology being used here:

OpenSSH Legacy Option

Whoah, whoah there Hulk! Slow down there a minute. No amount of smashing is going to make it work. You need to let Bruce Banner take the wheel for a minute. Just for a minute, Hulk. Then you can smash Linux with

sudo su
rm -rf /* 

…all you want. But to actually log into Levinux using a modern ssh client, you’re going to have to type:

ssh -oKexAlgorithms=+diffie-hellman-group1-sha1 tc@localhost -p 2222
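If you tire of typing that flag every time, OpenSSH also lets you park the legacy option in `~/.ssh/config`. A sketch, with a host alias of my own invention:

```
# ~/.ssh/config -- "levinux" is a made-up alias; after adding
# this, "ssh levinux" does the same as the long command above.
Host levinux
    HostName localhost
    Port 2222
    User tc
    KexAlgorithms +diffie-hellman-group1-sha1
```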

Hulk SMASH!

Okay, got it? No? Well, the password is “foo”. Stop and read man, will you?!?! Oh yeah… The Hulk. Anyway, type in:

foo

…as the password for user “tc” when challenged. You won’t see it… no, Hulk! No smash. It’s okay… not seeing passwords is intentional. It’s a security precaution. Think about it… oh yeah, The Hulk.

You okay, boy? You in? Okay now. Follow the instructions. Maybe install Python. Maybe check out vim. You don’t really have to do either if you’ve got some vi-skills or are willing to develop some.

Bruce Banner Controlling The Hulk

When I advocate vim, I do so in part because its little sibling, vi, is built into everything including 500KB BusyBox Linux, I shit you not.

Is that good, Hulk? Hulk vi?

No? Not of interest? Still want to smash… sigh, okay. Type:

sudo su
rm -rf /

Now you can go into the Reset folder, double-click whichever reset script appears to be the right one for your OS. If Windows, override all the resistance. If Linux, the execute-bits may not be set right or the double-click action not registered or any one of a thousand other little details.

Eesh, Hulk! The Gom Jabbar test will tell you if you’re human.

Still with me?

Congratulations, You’re Human

You may even be qualified to lead humanity into the future. Just learn yourself the Bene Gesserit ways of Linux, Python, vim & git.

Mom! He’s using the voice again!

Candace Flynn Phineas Turned Into A Badass

Next Step After Levinux

I’ve really given up on tiny versions of Linux. I wanted them as a sort of Noah’s Ark so that my (mostly) Python code was not tied to one particular server, Raspberry Pi, Cloud service, or whatever. I wanted my code to live on my keychain, always ready to run with a double-click, in a sort of Kung Fu way.

I was so hurt so many times by technology going obsolete on me that my desire to future-proof myself led to a sort of paranoia, which led to the concept of tiny virtual Linux (or maybe Unix) servers. I don’t feel this way anymore. And I still totally think tiny Linux servers are as cool as talking frogs. I’ll still kiss the frog. Otherwise, this frog is cooked.

But A Talking Frog Is Cooler Than A Princess Engineer Joke

Play with tiny Linux servers from time to time, but don’t rely on them as your main development machine or anything. Maybe if you work on embedded systems like those in drones, they become your main thing. But 99 times out of 100, just use a mainstream OS that supports a good Unix or Linux. Macs ARE Unix with fully operational bash-like Terminal shells, and Windows has recently become good enough, thanks to Windows Subsystem for Linux (WSL). And ChromeOS has a good Debian-based bash Terminal.

The Promise of Unix / Linux has Borne Out

In other words, the promise of Unix / Linux has borne out and you don’t have to carry an extra one around with you as a Noah’s Ark backup. Just keep your code in GitHub. Clone it onto any machine. Run it.

So long as you use forward-slashes instead of back-slashes and beware of the Mac’s slightly non-standard directory structures, your code will pretty much run anywhere. No Noah’s Ark required.

Python, vim & git play into this delivered promise too. Python 3.x code will pretty much run on anything with Python 3.x. For an added bonus, resist direct reference to file-systems and instead go through Python’s pathlib. But even that’s not really necessary. The world has yielded to forward-slashes and the Unix/Linux directory structure.
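The pathlib advice above can be sketched in a few lines (the file names here are invented for illustration):

```python
from pathlib import Path

# Compose paths with "/" regardless of OS; pathlib renders
# them natively (backslashes on Windows, slashes elsewhere).
notes = Path("data") / "projects" / "notes.txt"

# as_posix() guarantees forward slashes, handy for URLs,
# git paths, and anything else that expects *nix style.
print(notes.as_posix())
```

The same `Path` object works unchanged on Windows, Mac, and Linux, which is the whole point: the code never hard-codes a separator.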

Death by broken Windows paths is over.

This will now be true for every hardware platform past, present and future, which will always include some version of vi or git (if those platforms want to pretend to exist). Developers won’t allow otherwise.

And with standardization, comes the great unwashed hordes of developers, with which comes the great groundswell of a new literate class of global citizens who would laugh at a proprietary Windows without putting WSL Linux capabilities first.

The new lingua franca of tech has settled down, and it’s known as LPvg:

Final Shout-Out to Jupyter

Getting started ain’t easy. The universal code-runner that I sought in Levinux is now replaced in my heart with JupyterLab Desktop. Download it. Install it. Forget all this Linux, vim & git stuff for a few hours. Just let the Pythonic pythoniness of Python wash over you for a little.

In JupyterLab Desktop, the “Hello World” program is:

"Hello World"

Sun May 29, 2022

If I Were Green I Would Die. Abba Di Abba Di.

category: about

My kid would like to be a professional YouTuber or TikToker and I tell them:

Those who can do and those who can’t,