Future-proof your skills with Linux, Python, vim & git as I share the most timeless and love-worthy tools in tech through my two projects that work great together.

A Place to Put Things


Embark on a Tech Literacy Adventure: Learn PyTorch, Group Keywords, and Build Robots!

By Michael Levin

Thursday, December 22, 2022

There is general knowledge and there is specific knowledge. This is where a lot of people (including myself) go wrong. Specific knowledge saves the day. It’s what gets implemented in the here and now, because it applies to the specific situation you’re in. But specific knowledge goes bad at a faster rate than general knowledge. This is the case with JavaScript frameworks, with the great flocking from React to Vue to Svelte. The irony is that in trying to generalize Web app development, the frameworks become overly specific, get an expiration date and become perishables.

I now have to get something up on Pipulate. Continue refining the look & feel to be as simple, professional and cool as possible, but also get some real content there. Get people to set up a server, then have the first editable pip install in place. I need a library that can be imported in Python but which is still a folder in the repos directory. Figure that out. I’ve had difficulty with ohawf, but I think it’s because of the __init__.py files I needed for the old nbdev. Perhaps porting ohawf to the new nbdev should be one of my side-project priorities. I need the new nbdev 2.0 experience anyway.

Yes, I need to start building out Pipulate, project by project. I need to copy/paste the best stuff from my blogging over here and put it over there on Pipulate. Domains provide such a good place to put stuff. I have a few standards now for myself. Blog directly onto this site, MikeLev.in. Extract and distill the best examples and plug them in over there on Pipulate.com.

Few places let you appreciate a standard so much as how you plug one thing into another, usually to provide power or transfer information. Without compatible parts, no powering-up occurs and no information gets exchanged.

Linux is powerful because it rarely makes you wonder where and how you can plug things into each other. It’s usually a file system, but when it’s not, it’s posing as one. Except for drivers. Always except for drivers. I’ll talk about exceptions later. For now, stick to the most common use cases… standards.

You need a place to plug things in. But avoid the use of plugins. If it’s a file system, use that. If it’s a key/value storage system, use that. But the flexibility of “pure” Linux runs out fast when you start exploring how to solve a large variety of the world’s problems. Most solutions these days lead you to pip install something.

Python’s pip program, which uses the repository at PyPI.org, is the right sort of place to plug things in to gain superpowers without becoming overly dependent on plugins.

Commonly available and commonly used pip-installable packages are better than fragile, long-term habit-endangering plugins for vim, VSCode, Photoshop or the like. Even though capability-enhancing plugins are very sexy, they’re bad for you in the long run. They make you dependent on things that might not be there in the future. Even when you pip install, you should choose your libraries carefully.

Be good at using the most conventional ways of plugging wonderful new capability in, and lean into the strengths of whatever tools and medium you’re using. In the Python world that usually means using things built into the Python Standard Library first. This is the stuff that’s part of the main standard CPython distribution from Python.org.
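To make the stdlib-first point concrete, here’s a minimal sketch of grouping keywords with nothing but the Standard Library. The sample phrases and the leading-word grouping rule are my own illustrations, not anything from a particular project:

```python
from collections import defaultdict

# Group keyword phrases by their leading word -- pure Standard Library,
# nothing to pip install. The sample phrases are illustrative.
keywords = [
    "linux daemon tutorial",
    "linux command line basics",
    "python pip install",
    "python standard library",
]

groups = defaultdict(list)
for phrase in keywords:
    head = phrase.split()[0]  # the leading word becomes the group key
    groups[head].append(phrase)

for head, phrases in sorted(groups.items()):
    print(head, phrases)
```

Only when a job outgrows tools like this should you reach for a pip install.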

When the time does come to pip install secret weapons, Pandas and Requests are good first choices. When Requests can’t keep up, there’s the API-compatible httpx stepping up, keeping all your Requests abilities relevant. This is the kind of API timelessness and momentum you want to find, even in 3rd-party packages. Xarray uses the Pandas API, but for labeled N-D arrays instead of rows and columns. Python makes my heart sing.
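Here’s a tiny taste of what the Pandas API buys you, assuming you’ve pip installed pandas. The data is made up for illustration:

```python
import pandas as pd

# Labeled rows and columns, grouped and aggregated in one short chain.
# The keywords and click counts are invented for this example.
df = pd.DataFrame({
    "keyword": ["vim tips", "vim macros", "git rebase"],
    "clicks":  [120, 80, 200],
})
df["tool"] = df["keyword"].str.split().str[0]  # leading word as a label
totals = df.groupby("tool")["clicks"].sum()
print(totals)
```

That same groupby-then-aggregate idiom carries over to Xarray, which is part of why learning the Pandas API pays off twice.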

When you’re getting ready to do some simple Ai@Home machine learning work, look at scikit-learn first. Do some linear regression. Do some random walks. Familiarize yourself with the basics before jumping onto PyTorch. Group keywords, then build robots. NLTK, the Natural Language Toolkit, though not part of the Standard Library, is in this category. These are reliable long-term tools. Non-disrupt-able craftsmanship and mastery over time is achievable.
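A random walk is a good example of a basic you can do before touching scikit-learn or PyTorch at all; the Standard Library’s random module is enough. This is a minimal sketch, with the function name my own:

```python
import random

def random_walk(steps, seed=None):
    """One-dimensional random walk: step +1 or -1, starting at 0."""
    rng = random.Random(seed)  # a seed makes runs reproducible
    position = 0
    path = [position]
    for _ in range(steps):
        position += rng.choice((-1, 1))
        path.append(position)
    return path

path = random_walk(1000, seed=42)
print(path[-1])  # final position after 1000 coin-flip steps
```

Plot a few of these, fit a line through some noisy points, and you’ve touched the intuitions that the big frameworks build on.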

So where to plug things in? You gotta get command of the Linux command-line. Know how to open one on your desktop. Know how to find home and jump to a few different locations. Have a running Linux server under your command where a 24x7 service (called a Linux daemon) is running. Take this on as a challenge: here’s how to make your own little tech lackey who’s going to listen to and carry out your every command.
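Those same “where am I, where is home” ideas show up on the Python side too. A small sketch with the Standard Library’s pathlib; the repos folder is just an illustrative convention, not a requirement:

```python
from pathlib import Path

home = Path.home()      # where `cd` with no arguments takes you
cwd = Path.cwd()        # the shell's `pwd`
repos = home / "repos"  # an illustrative spot to keep your git clones
print(home, cwd, repos, sep="\n")
```

Knowing these few locations cold, in both the shell and Python, is most of what “finding your way around” means.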

That’s a place to plug things in. It’s a small surface area, so you know where to look. It’s probably one task or queue controlling everything that runs, so you know precisely what to do. More things to schedule? No problem. Write one Python .py file that does the work, another .py file that manages the schedule, and a third file that goes into /etc/systemd/system, and voila! The Python scheduler is now a Linux service.
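A minimal sketch of what such a scheduler .py file could look like. The file name, function names, and the max_ticks escape hatch are all my own illustrative choices; a real systemd unit in /etc/systemd/system would simply point its ExecStart at a script like this:

```python
# scheduler.py -- a minimal long-running worker loop (names are made up).
# Under systemd this becomes a 24x7 service: a small unit file in
# /etc/systemd/system runs it and restarts it if it ever dies.
import time

def do_work(tick):
    """Stand-in for the real job: swap in your own worker .py here."""
    print(f"tick {tick}")

def run(interval_seconds=60, max_ticks=None):
    """Loop forever (or max_ticks times, which makes testing easy)."""
    tick = 0
    while max_ticks is None or tick < max_ticks:
        do_work(tick)
        tick += 1
        if max_ticks is None or tick < max_ticks:
            time.sleep(interval_seconds)
    return tick

if __name__ == "__main__":
    run()
```

The unit file itself is just another text file, an INI-style one with [Unit], [Service] and [Install] sections, which is exactly the “magical text files in particular places” idea.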

That’s a place to plug things in. Forget plugins. Just know which magical text files go where to control the world. Information infuses machines with automation. Automation in the digital age is always a matter of putting text files in particular places, in particular ways, with particular meaning. And these days that has a standard: Linux with systemd. It’s a wind-up spell-casting musical box. Jupyter is a spell-casting machine with a lever. One pull, one run.

There’s a vast chasm separating those who can automate in the Linux daemon 24x7 sense from those stuck in the Jupyter sitting-there-pressing-the-button sense. The Jupyter button pushers rely on always paid-for, always platform-locking cloud services to make the transition. I say it’s just first-principles Linux knowledge everyone should know.

Better still, writing and running Linux daemons is just foundational ability everyone should have. You should know how to write and run an actual Linux daemon, or else you’re not really modern tech literate. Linux won. Systemd won. Vim won. Python is winning, so far so good. And of course, you just have to use git. And not through VSCode. At least get yourself infinite undo in your text files: git reset --hard HEAD^

And yes. And yes. Until you set up your own GitLab repo at home, GitHub.com with its private repos is a place you plug things in. Microsoft still manages to code a dependency upon them into your day-to-day work. Just know that down the path I’m taking you on is eventual freedom, even from that.