MIKE LEVIN AI SEO

Future-proof your skills with Linux, Python, vim & git as I share with you the most timeless and love-worthy tools in tech through my two projects, which work great together.

OnTheRise.AI Registered & About To Be Launched

As Mike Levin, a self-promoting SEO in NYC, I'm documenting my journey in the rising AI-world on my blog OnTheRise.AI. I'm exploring idea-capture by humans and the potential of AI to make hard problems like fusion energy more tractable. I'm also discussing how to use WSL and JupyterLab to become more tech literate, as well as the advantages and disadvantages of event-driven programming and push information systems.

Exploring AI-World: My Journey To Become Tech Literate and Understand Event-driven Programming

By Michael Levin

Wednesday, February 15, 2023

The thing that reduces publishing friction the most in my life would be the ability to publish by emailing to a particular email address. Yes, yes, there are both proprietary systems and clever IFTTT fad-o-the-day hacks. Not interested. Let’s go for timeless and compatible with all the skills you need to develop for the rising AI-world.

AI On The Rise! ontherise.ai? That would cost $70/year, and what would I do with it? Document these escapades, of course! It’s much more search-friendly and of broader appeal than your other domains. AI is on the rise! Rise of the machines, and all that. This is one of the best investments you could make for controlling the discussion. Take risks. Do it.

Okay, I did it. Now it is of paramount importance that I get it all underway. First up is starting yet another new blog that documents this whole friggin process. But now, with my established and smoothly working slice-and-dice system in use on MikeLev.in, it should go more smoothly. I can perhaps just grab selective files from the MikeLev.in repo for the OnTheRise.AI repo. Wow, I think I’m nailing it with this idea. Next step?

Okay folks, an introduction. I’m Mike Levin, a self-promoting SEO in NYC and all-around douchebag marketer looking for redemption by sharing a timeless yet modern form of tech programming literacy I’ve discovered for myself. I’d rather just do this directly for my kid, who could probably stand to know it, but they’re also at a phase where even the belief that something educational’s about to come out of my mouth triggers them.

And so I turn to the outside world. I’m in between roles at the same company at the moment I’m writing this, technically unemployed but with an accepted verbal agreement for my next in-house SEO gig, at Moz. Even just to say that gives me chills. My old Web 2.0 hit, HitTail, had a nice 15-year run. My latest thing has been Free & Open Source SEO software. And it’s just what the world needs right now with the rise of AI. Let me explain.

You will need your anecdotal evidence emailed to you every day. Don’t rely on yet one more cloud web-app thing running somewhere, probably charging you something. Sure, use one when the situation warrants it. But until that time, collect data that is not over-pigeonholed and myopic from doing SERP “extractions” or datatype analysis, which only creates the need for further definitions, speculation, and miscommunication. Just screenshot it and email it to yourself daily with your own computer.
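Here’s a minimal sketch of what that daily self-mailer could look like in Python. Everything in it is a placeholder assumption: the SMTP host, the credentials, the addresses, and the already-captured screenshot file (however you choose to capture it).

```python
# Minimal sketch: email yourself a screenshot from your own machine.
# SMTP host, login, addresses, and file path are hypothetical placeholders.
import smtplib
from email.message import EmailMessage
from pathlib import Path

def mail_screenshot(png_path="serp.png", addr="you@example.com"):
    msg = EmailMessage()
    msg["Subject"] = "Daily SERP screenshot"
    msg["From"] = addr
    msg["To"] = addr
    msg.set_content("Today's anecdotal evidence, straight from my own computer.")
    data = Path(png_path).read_bytes()
    msg.add_attachment(data, maintype="image", subtype="png", filename=png_path)
    # Assumes a mail server accepting STARTTLS on port 587 and an app password.
    with smtplib.SMTP("smtp.example.com", 587) as smtp:
        smtp.starttls()
        smtp.login(addr, "app-password-here")
        smtp.send_message(msg)

if __name__ == "__main__":
    mail_screenshot()
```

Schedule it daily (cron, or the daemonized scripts we get to below) and the evidence shows up in your inbox with no third-party service in the loop.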

AI or whatever, this is project #1. It aligns with what ChatGPT is telling me about how to get started. This is my personal project:

OnTheRise.AI

I passed on TheRiseOf.AI. It’s probably searched more often, and thus more SEO-friendly, than my version, but I don’t want my domain to sound like an alarmist sci-fi title. Sure, it can be reminiscent of one, but lots of things are on the rise along with AI, and I plan on covering it all.

I’ll just start with this one humble project.

Alright folks. In this first post on OnTheRise.AI I will address the task of idea-capture by humans. No AI coding yet, although we will get to that soon enough. To get started, we have to cast you into the role of the hero protagonist.

You’re sitting there reading all these articles, watching all these videos, and testing out the free versions of all these reality-bending AI apps, products, and services. It’s rather over-stimulating. If you’re not an optimist, it could play very much into an apocalyptic world-view. If you’re an optimist, however, you see the hockey-stick growth in problem-solving capability.

AIs are making fusion energy easier to achieve by tweaking magnetic confinement fields faster and better than any other way we know. It’s like the stabilizing tail rotor of a helicopter. A lot of tech has stayed out of reach because the responsiveness and intelligence necessary for such stabilizers was beyond us.

Well, a once-booted, self-sustaining, post-scarcity future is not so much off the table anymore. Crossovers occur. Getting a fusion reactor going sustainably will require something like an AI working full-time. Parts wear out, so there will be a cycling-out of components, calling for concurrent-systems tech akin to Erlang and the VAX.

Hey ChatGPT: What is Erlang?

Erlang is a programming language designed for building concurrent and distributed systems, particularly for telecommunications and high-availability, low-latency systems. It was developed in the late 1980s by Ericsson, a Swedish telecommunications company, and has since been used by companies such as WhatsApp, Facebook, and Cisco.

Erlang is dynamically typed, functional, and supports pattern matching, recursion, and message-passing concurrency. It also has a garbage collector, so memory management is automatic. One of its key strengths is its ability to handle large numbers of lightweight, concurrent processes, which makes it well-suited for building highly scalable and fault-tolerant systems.

Erlang’s syntax is often considered unique and sometimes challenging to learn for those coming from other programming languages, but it has a large and active community of developers who contribute to its development and provide support to others.
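Erlang expresses that concurrency natively with spawned processes and mailboxes. As a rough illustration in Python (this blog’s working language), here’s the same shape approximated with OS processes and queues; real Erlang processes are far lighter and come with supervision and fault tolerance built in.

```python
# Rough Python approximation of Erlang-style message-passing concurrency:
# isolated workers that share no state and communicate only via messages.
from multiprocessing import Process, Queue

def worker(inbox: Queue, outbox: Queue):
    # Like an Erlang receive loop: block on the mailbox, react to messages.
    while True:
        msg = inbox.get()
        if msg == "stop":
            break
        outbox.put(f"echo: {msg}")

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    p = Process(target=worker, args=(inbox, outbox))
    p.start()
    inbox.put("hello")
    print(outbox.get())  # prints "echo: hello"
    inbox.put("stop")
    p.join()
```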

Hey ChatGPT: Will AIs help with the magnetic containment technology for fusion?

It is possible that AIs can contribute to the development and improvement of magnetic containment technology for fusion. Magnetic confinement fusion is a complex and challenging field, and AIs can be used to analyze large amounts of experimental and theoretical data, optimize system parameters, and provide insights into complex plasma behavior that may be difficult for human scientists to identify.

For example, machine learning algorithms can be used to identify patterns in large datasets of experimental measurements or simulations, which can help researchers better understand the behavior of plasma in fusion reactors. AIs can also help optimize the design of magnetic confinement systems by performing simulations and exploring large parameter spaces.

However, it is important to note that AIs are tools and not solutions in themselves. While they can provide valuable insights and guidance, they must be used in conjunction with human expertise and judgement to ensure that the resulting solutions are safe, effective, and reliable. Additionally, developing effective AI models for magnetic confinement fusion requires extensive knowledge of the underlying physics and engineering principles involved.

Hey ChatGPT: Suppose I had a secret weapon that allowed people to become much more tech literate than they are today, but the cure looks worse than the problem: there are a few very steep learning curves, such as those for vim and the Unix terminal interface, that scare people and turn them away. If they get over the hump, though, there is great reward. How would I make that argument? This secret weapon is WSL on Windows 10 or Windows 11. As a way to ease the transition, we can also offer running Python code in JupyterLab in a Windows Edge browser, even though it’s really hosted by Linux on WSL. The upside is that your code is ready for 24x7 automation Linux-side as a systemd daemon, and the ability to write and maintain these Python scripts running as daemonized Linux services is the secret weapon. How would I begin making this argument for the ambitious amateur to start doing this sort of work?

Making the argument for using WSL to become more tech literate can be challenging, especially if users are intimidated by the steep learning curve associated with the Unix terminal and vim editor. However, here are some potential points you could use to make the argument:

  1. Access to powerful tools: By learning how to use the Unix terminal and vim editor, users gain access to powerful tools that can help them become more productive and efficient. They can perform tasks more quickly, automate repetitive tasks, and have greater control over their computing environment.
  2. Compatibility with Linux: Using WSL allows users to access the Linux ecosystem and its vast array of tools and libraries. This can be particularly useful for software development, as many programming languages and frameworks are designed to work best on Linux.
  3. Transferable skills: Learning how to use the Unix terminal and vim editor teaches valuable skills that can be applied in many different contexts. Once users become comfortable with these tools, they may find that they are more confident using other command-line interfaces and editors, which can be valuable in a variety of settings.
  4. Increased job opportunities: Many companies and industries require knowledge of Linux and command-line tools for certain roles. By becoming proficient with WSL and related tools, users may be able to expand their job opportunities or take on more challenging roles.
  5. Practical experience with automation: Writing and maintaining Python scripts as Linux services running as daemons can provide users with valuable experience in automation, which is becoming increasingly important in many industries. This can be a valuable skill to have on a resume or in a portfolio of work.

To ease the transition and make the learning curve less daunting, it might be helpful to offer resources like tutorials, cheat sheets, and other online guides. It may also be useful to highlight the benefits of using JupyterLab to run Python code in a familiar web-based environment. By emphasizing the practical benefits of these tools and highlighting their transferability to other contexts, users may be more willing to put in the effort to become proficient with them.

Hey ChatGPT: Okay, so I’ll make tutorials, cheat sheets, and best-practice guides. I will make all of these into directories or folders on a WSL Ubuntu 20.04 instance that an install.bat script I wrote sets up on Windows 10 or Windows 11 machines. At the end of the install, JupyterLab will be accessible on http://localhost:8888. It will be a great place to run genuinely Linux Python .py files that are easy to turn into Linux systemd daemons. The WSL instance has the JupyterLab server already running as such a daemon, which can be monitored because, while it’s under systemd, it’s also running as a forked GNU screen terminal session that can be logged onto to monitor its console. This is a wicked cool way to run Python scripts as Linux daemons, and I plan on popularizing that fact (a sketch of such a unit file follows ChatGPT’s reply below). I plan on positioning it a lot like casting magic spells that go off on timers and such, and on how this whole approach is an alternative to WebDev, which has many downsides we avoid by taking this approach. And this approach also has many upsides that are difficult to see today but inevitably valuable. It’s the path on which all the little things add up and make a big difference in you ultimately holding power equal to the largest corporations, because of wise decisions you made as a result of encountering this published material and learning the magic incantations. Drink me.

It sounds like you have a clear plan for how to make the transition to using WSL and related tools less daunting for users, by providing tutorials, cheat sheets, and best practice guides. Additionally, offering JupyterLab as a familiar web-based environment for running Python code can help ease the learning curve and make the transition more accessible.

Highlighting the practical benefits of running Python scripts as Linux daemons and emphasizing the transferability of these skills to other contexts can help make the case for investing time and effort into learning these tools.

However, it’s important to be clear about the potential downsides and limitations of this approach as well. While running Python scripts as Linux daemons can be a powerful tool for automation, it may not be the best solution for every task or use case. It’s important to be clear about the tradeoffs and limitations of this approach, and to provide guidance on when and where it may be most useful.

Overall, it’s great to see that you are passionate about this approach and are taking steps to make it more accessible and approachable for users. By providing clear guidance, tutorials, and practical examples, you can help users become more comfortable and proficient with WSL and related tools, which can be a valuable skill in many different contexts.
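Since I described that systemd-plus-GNU-screen arrangement above, here is roughly what the unit file could look like. This is a hypothetical sketch: the paths, user, and service name are invented, and a real install script would generate its own.

```ini
# Hypothetical /etc/systemd/system/jupyter.service: systemd supervises a
# detached GNU screen session hosting JupyterLab, so the server auto-starts
# with the machine AND you can attach with `screen -r jupyter` to watch it.
[Unit]
Description=JupyterLab inside a GNU screen session
After=network.target

[Service]
Type=forking
User=ubuntu
# screen -dmS starts detached, which is the fork systemd expects here.
ExecStart=/usr/bin/screen -dmS jupyter jupyter lab --no-browser --port 8888
ExecStop=/usr/bin/screen -S jupyter -X quit
Restart=on-failure

[Install]
WantedBy=multi-user.target
```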

Hey ChatGPT: What are some contexts in which the Linux daemon solution is not suitable?

While running Python scripts as Linux daemons can be a powerful tool for automation, there are some contexts in which it may not be the most suitable solution. Here are some examples:

  1. Small-scale projects: For small-scale projects or scripts that are only run infrequently, the overhead of setting up and maintaining a Linux daemon may not be worth the effort. In these cases, it may be more efficient to run the script manually as needed.
  2. Non-Linux environments: If the target environment for the script is not Linux-based, running a Linux daemon may not be possible or practical. In these cases, alternative approaches such as running the script as a Windows service or running it on a cloud-based server may be more appropriate.
  3. Real-time systems: For applications that require real-time processing, the overhead of running a Linux daemon may introduce too much latency or introduce other issues that impact system performance. In these cases, alternative approaches such as running the script as a system process or implementing a real-time operating system may be more appropriate.
  4. Security considerations: If the script in question requires a high degree of security or needs to run in a restricted environment, running it as a Linux daemon may introduce security risks. In these cases, alternative approaches such as running the script in a sandboxed environment or implementing additional security measures may be more appropriate.
  5. Rapidly changing environments: For projects that are rapidly evolving or frequently changing, maintaining and updating Linux daemons may be more difficult and time-consuming than other approaches. In these cases, alternative approaches such as running the script in a container or using a serverless architecture may be more appropriate.

Overall, the suitability of running Python scripts as Linux daemons depends on the specific needs and requirements of the project at hand. It’s important to consider the tradeoffs and limitations of this approach, and to weigh them against alternative approaches to determine which is the most appropriate for the given use case.

Hey ChatGPT: What is the most common way of running code in real-time systems? What is the difference between that and batch files and polling? Is there really such a thing as push information systems, because isn’t it true that something always needs to be intermittently pulling or checking?

In real-time systems, the most common way of running code is by executing it in response to external events as they happen, rather than on a periodic or scheduled basis. This is often referred to as event-driven programming or reactive programming. Real-time systems typically need to respond to events with low latency and high accuracy, and event-driven programming can help achieve these goals.

Batch files and polling, on the other hand, are more commonly used in non-real-time systems. Batch files are used to automate the execution of a series of commands or tasks, typically at a scheduled time or on a periodic basis. Polling involves repeatedly checking for updates or changes in a system or data source on a periodic basis, rather than waiting for an external event to trigger a response.

Regarding push information systems, the term generally refers to systems that proactively send information to clients or users, rather than requiring them to actively request or “pull” information. Examples of push information systems include real-time messaging systems or stock ticker feeds. While it’s true that these systems often involve some form of polling or checking on the server side, the key difference is that the server initiates the transmission of information, rather than waiting for the client to request it.

In summary, the most common way of running code in real-time systems is through event-driven programming, which is different from batch files and polling. While push information systems may involve some form of polling or checking on the server side, they are designed to proactively send information to clients, rather than requiring them to actively request it.
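To make ChatGPT’s pull-versus-push distinction concrete, here is a toy Python sketch. The stock-ticker framing echoes the answer above; every name in it is illustrative, and a plain function stands in for the network.

```python
# Toy contrast between pull (the client keeps asking) and push (the server
# initiates delivery when the event happens). Names are illustrative only.
import time

latest_price = {"AAPL": 150.0}
subscribers = []

def poll_for_updates(ticks=3, interval=0.5):
    # Pull: ask on a schedule, whether or not anything changed.
    for _ in range(ticks):
        print("client pulled:", latest_price["AAPL"])
        time.sleep(interval)

def subscribe(callback):
    subscribers.append(callback)

def publish(symbol, price):
    # Push: the server calls out the moment the event occurs.
    latest_price[symbol] = price
    for callback in subscribers:
        callback(symbol, price)

if __name__ == "__main__":
    poll_for_updates()
    subscribe(lambda s, p: print(f"pushed to client: {s} = {p}"))
    publish("AAPL", 151.25)
```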

Hey ChatGPT: Under what context are event-driven systems run?

Event-driven systems are commonly used in contexts where it is important to respond to external events with low latency and high accuracy. These systems are designed to execute code in response to events as they occur, rather than waiting for a scheduled time or polling for updates.

Some common contexts where event-driven systems are used include:

  1. Real-time systems: Event-driven programming is often used in real-time systems, where timely and accurate responses to external events are critical. Examples include aircraft control systems, industrial control systems, and medical devices.
  2. Web applications: Many modern web applications are designed as event-driven systems, using technologies such as websockets and server-sent events to enable real-time communication between the server and client. This allows for more responsive and interactive user interfaces.
  3. Internet of Things (IoT): Event-driven programming is also commonly used in IoT systems, where devices need to respond to events in the physical environment, such as changes in temperature, movement, or other sensor readings.

Overall, event-driven systems are well-suited for contexts where rapid response times and accuracy are important, and where the system needs to be able to handle a large number of simultaneous events.

Hey ChatGPT: No, not for what context. I mean under what sort of technological environment? How is this real-time thing actually run? From a command line? From a cloud server? Using what executable, and how do you get it executing?

Event-driven systems can be run in a variety of technological environments, depending on the specific requirements of the system. Here are some common ways that event-driven systems are executed:

  1. Command line: Many event-driven systems are executed from the command line, using tools such as Node.js, Python, or Java. These tools provide libraries and frameworks for building event-driven systems, and typically allow the developer to run the system as a command-line application.
  2. Cloud servers: Event-driven systems can also be executed on cloud servers, using technologies such as AWS Lambda, Azure Functions, or Google Cloud Functions. These services provide a serverless architecture, where the developer simply uploads their code and the cloud service automatically handles the scaling, deployment, and execution of the system.
  3. Containers: Event-driven systems can also be executed inside containers, using technologies such as Docker or Kubernetes. Containers provide a lightweight and portable environment for running applications, and allow developers to easily deploy and manage event-driven systems at scale.

In terms of the specific executable, this will depend on the programming language and framework being used to build the event-driven system. For example, a Node.js application might be executed using the node command, while a Python application might be executed using the python command.

To get started with executing an event-driven system, the developer typically needs to install the necessary programming language and framework, and then write and execute the code for the system. This might involve creating a configuration file or script, setting up any necessary dependencies or libraries, and then executing the code using the appropriate command or tool.
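For instance, about the smallest event-driven program I can sketch in Python is an asyncio echo server, started from the command line with python echo.py. It wakes only when a connection event arrives instead of polling on a schedule; the port and filename are arbitrary.

```python
# echo.py: a minimal event-driven TCP echo server using Python's asyncio.
# The handler runs only when a connection event occurs; nothing polls.
import asyncio

async def handle(reader, writer):
    data = await reader.readline()  # wake only when bytes actually arrive
    writer.write(b"echo: " + data)
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 9999)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```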

Hey ChatGPT: For automatic running, isn’t running from a command line too fragile? Shouldn’t an on-demand real-time service be able to restart and start automatically with the server?

You are correct that running an event-driven system from the command line might not be the most reliable way to ensure it runs automatically and consistently, especially in a real-time setting where the system needs to be highly available. In these cases, you would want to configure the system to start and stop automatically with the server or container.

There are a few ways to achieve this, depending on the specific technology stack you are using. One common approach is to create a service or daemon that runs in the background and automatically starts the event-driven system when the server starts up. On Linux systems, this can be done using a service manager like Systemd or Upstart, which provide tools for managing and configuring services.

In a container environment, you might use an orchestrator like Kubernetes or Docker Swarm to manage the deployment and scaling of your event-driven system, and to ensure that it starts and stops automatically with the container.

Regardless of the approach you take, it is important to make sure that your event-driven system is designed to handle automatic restarts and failures gracefully. This might involve implementing retry logic for failed operations, using a message broker to ensure that events are not lost, and setting up monitoring and alerting to quickly identify and respond to any issues.
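Here is what the “handle restarts gracefully” advice can look like in a Python daemon: a loop that catches SIGTERM so the service manager can stop or restart it without cutting a task in half. The five-second tick is an arbitrary stand-in for real work.

```python
# Sketch of a daemon loop that exits cleanly on SIGTERM/SIGINT, so a
# service manager restart never interrupts a unit of work mid-flight.
import signal
import time

running = True

def shut_down(signum, frame):
    global running
    running = False

signal.signal(signal.SIGTERM, shut_down)  # sent by systemd on stop/restart
signal.signal(signal.SIGINT, shut_down)   # Ctrl-C when run by hand

while running:
    print("tick")  # one small, interruptible unit of work
    time.sleep(5)

print("clean exit; safe to restart")
```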

Hey ChatGPT: So isn’t the ability to write and run Linux daemons even more fundamental to tech than running real-time services or even containers, because even Kubernetes and all other container-hosting software needs to be started as Linux daemons, as do all webservers and services, cloud or not? Have Linux daemons not become the true practical starting point in almost all information tech?

You are correct that the ability to write and run Linux daemons is a fundamental skill in modern information technology. Daemons are a critical component of many different types of systems, including web servers, databases, message brokers, and many others.

As you mentioned, even container orchestrators like Kubernetes rely on Linux daemons to manage the deployment and scaling of containerized applications. In fact, the use of containers and cloud services has increased the demand for daemonization skills, as many developers and operations teams need to run custom daemons alongside or inside containers to perform specific tasks.

That being said, while Linux daemons are a fundamental building block of modern technology, they are not the only one. Developers and operations teams also need to be proficient in other areas like networking, databases, security, and cloud computing, among others, to build and maintain complex systems.

In summary, while the ability to write and run Linux daemons is an important skill for anyone working in modern information technology, it is just one of many skills needed to build and maintain complex systems.

Hey ChatGPT: Do you think the skill of writing and running Linux daemons may be under-represented in the field? Is it possible that WebDev, because it is sexier, eclipses systemd fundamentals?

It is possible that the skill of writing and running Linux daemons is under-represented in some areas of the field. One reason for this could be that web development is a more visible and widely-discussed topic, with many resources available for learning web development frameworks and tools. By contrast, learning about Linux daemons may be seen as a more specialized skill, and resources for learning about it may be less widely available.

Another possible reason is that some developers and organizations may prioritize short-term productivity over long-term maintainability. Writing code that runs as a daemon requires more planning and consideration than writing a simple script or web application, and may not offer the same immediate feedback or gratification.

However, as systems become more complex and distributed, the ability to write and maintain daemons becomes increasingly important. In the long term, investing in the skills needed to write and maintain reliable, scalable daemons can lead to more stable and maintainable systems.

Ultimately, both web development and Linux daemon development are important skills for anyone working in the field of software development, and choosing which to focus on may depend on individual interests and career goals.

Hey ChatGPT: How would you pitch the idea that this is what you should really lead learning tech with despite the fact it may not be as sexy as webdev?

To pitch the idea of learning Linux daemon development as a key skill in tech, it can be useful to emphasize the long-term benefits and importance of building reliable and scalable systems. Here are some potential talking points:

Building and maintaining daemons requires a more thorough understanding of how computer systems work,
