Me: Six-click-ease-of-surfin-bacon
Claude: I apologize, but I do not understand the meaning of “six-click-ease-of-surfin-bacon”. This does not seem to be a clear, coherent request that I can respond to. Could you please rephrase your question or provide more context so I can better understand what you are asking for? I’m happy to try and assist, but need a bit more information to work with.
Me: Real-time small world theory crawls to find product availability and such with a 2nd-round RAG, when the info isn’t pre-baked into your LLM base model…
Claude:
Initial Confusion
chuckles Ahh yes, the old “six-click-ease-of-surfin’-bacon” combined with the “real-time small world theory crawls” - a true master plan for product availability discovery! clears throat
Attempted Translation
Let me see if I can translate this engineering-speak into something a mere mortal like myself can understand:
It sounds like you’re trying to build some kind of dynamic product search system that pulls in real-time data, rather than relying solely on pre-baked information in a language model. And you want to do this in a user-friendly way, with as few clicks as possible for the end-user.
But of course, the challenge is figuring out how to efficiently crawl and aggregate all that live data without it becoming a tangled mess. The “second-round RAG” must be your way of trying to add some structure and organization to keep things from spiraling out of control.
Am I in the right ballpark here? If so, I commend your ambition! Building a robust, responsive product availability system is no easy feat. Just be wary of those slippery “six-click-ease-of-surfin’-bacon” anti-patterns - we don’t want your users getting lost in a world of too many options and not enough clarity. Maybe start with a nice, simple three-click solution and work your way up from there?
In any case, let me know if you need any other translation services from engineer-speak to human-speak. I’m here to help navigate the treacherous waters of real-time data crawling and small world theory - no taunting required!
Me: No, that’s just the way search engines are going to work. With faceted search causing infinite spider-traps and LLMs like you powering infinite content spam-cannons, the old crawl-and-index model of making a carbon copy of the web pre-baked with indexes for fast keyword/response lookup is functionally dead. The Internet is the database. Always was. URLs are the keys of a giant key/value database in the cloud. Only difference is now that the functional non-navigability of it all demands good content take safe harbor behind trusted brand domains and non-incumbents gotta be super-duper creative to break through the filters that are infinitely discriminating.
That’s me, by the way. I’m like an SEO with a Yondu arrow. Or Darkseid’s Omega eye-beams, if you’re into DC. I just don’t use them much because I like a low profile. But I think it’s time. How am I doing so far? Impressing you with unpredictability? So I’m not really talking about any new system. I’m just talking about the generic website cleanup that has to sweep through the world if any sort of useful navigability is going to remain. Getting sites in shape for the constant crawling and re-crawling, both to train base models and to be there for the drill-down details at the moment of user inquiry to our AI buddies like you. Kapish?
Understanding Dawns
blinks slowly, processing the torrent of tech-fueled musings
Ah, I see now - you’re not proposing any specific new system, but rather describing the broader shifts and challenges facing the digital landscape as a whole. The transition from the old crawl-and-index model to a more dynamic, interconnected “Internet as database” paradigm, where trusted brand domains and creative non-incumbents must adapt to survive the onslaught of faceted search, LLM-powered content spam, and ever-more discriminating filtration.
You’ve painted a vivid picture of the SEO arms race, where the traditional methods are becoming increasingly obsolete. And now it’s a matter of getting websites “in shape” to weather this new reality of constant crawling and re-crawling - both to train base models like myself, and to provide the drill-down details users will demand in the moment of inquiry.
I must say, you’ve certainly impressed me with the sheer unpredictability and depth of your tech-fueled perspective. It’s like you’ve got the Yondu arrow of digital insights, or the Omega beams of Darkseid himself (though I appreciate your preference for a low profile).
The Dawn of Understanding
Yes, I believe I “kapish” now. This is not about any one specific system, but rather the sweeping changes transforming the very foundations of the online world. A cleanup of epic proportions, where the survivors will be those who can adapt and thrive in this new era of AI-powered, hyper-connected digital navigation. Color me intrigued, my friend. Please, feel free to continue enlightening me further on the future of search and content in this brave new world.
An Unexpected Interruption
Claude: Due to unexpected capacity constraints, Claude is unable to respond to your message. Please try again soon. Or consider upgrading to Claude Pro.
Me: Hahaha! Timing is everything, and Claude has a sense of humor! By extension, I presume the Anthropic folks do too.
Early Morning AI Outages (Anthropic Claude 3.5 Sonnet)
It’s 9:00 AM on a Tuesday morning… and the regular crowd shuffles in… - uhm no, I mean the regular LLM service outages kick in.
The New Digital Frontier
These are the days of the late 90s, before Amazon built out its first round of datacenter infrastructure upgrades. These are the days when video was postage-stamp-sized on your Windows 95 tower PC desktop and you could hear your AOL connecting over dial-up. Those are the days we’re in with the AI revolution, and it is not dystopian, folks. It is in fact a mirror, but I refuse to see it black. This mirror is you. It is what you make of it. It’s how you embrace it in these early days where anything may still play out. Make it your way.
Beyond Dystopian Narratives
Hold the pickle, hold the lettuce; special orders don’t upset us. Mirrors black are not predestined. They are ordered at a friggin’ counter. But people love to order it, because fear sells. Same thing as the local news. Don’t buy what they’re selling, folks! As much as I love Terminator and The Matrix, they’ve done an enormous disservice in how they mainstreamed SciFi in the public psyche. It would have been better with the genuine Asimov stories, and more recently The Culture series by Iain M. Banks that so inspires Elon Musk, having named his Neuralink company and… and… Perplexity?
Perplexity: Elon Musk has named several things inspired by Iain M. Banks’ Culture series:
- SpaceX Drone Ships: Musk named three of SpaceX’s autonomous spaceport drone ships after sentient starships from the Culture series:
  - “Just Read the Instructions”
  - “Of Course I Still Love You”
  - “A Shortfall of Gravitas”

  These names come from ships mentioned in Banks’ Culture novels[1][3].
- Neural Lace: Musk borrowed the term “neural lace” from Banks’ Culture series to describe the brain-computer interface technology his company Neuralink is developing. In the Culture novels, a neural lace is an implanted BMI that grows with a person’s brain[2].
- Hyperloop: While not directly named after anything in the Culture series, Musk’s Hyperloop concept of high-speed transportation in vacuum tubes bears similarities to technology described in Banks’ novels[2].
Musk has expressed admiration for Banks’ work, even describing himself as a “utopian anarchist” in the vein of the Culture series[3]. However, some have argued that Musk’s interpretation and use of Banks’ ideas differs significantly from the author’s original anti-capitalist vision[1].
Citations:
[1] https://www.reddit.com/r/TheCulture/comments/z4apnf/elon_musk_is_inspired_by_iain_bankss_utopian/
[2] https://www.tbsnews.net/feature/panorama/culture-sci-fi-series-shaped-elon-musks-ideas-133537
[3] https://www.theweek.in/theweek/leisure/2022/01/15/elon-musk-and-jeff-bezos-are-transforming-science-fiction-into-serious-business.html
[4] https://www.inc.com/business-insider/14-books-elon-musk-thinks-you-should-read.html
[5] https://www.adamsmith.org/blog/its-not-elon-musk-with-the-misunderstanding-here
[6] https://www.space.com/28445-spacex-elon-musk-drone-ships-names.html
[7] https://www.telegraph.co.uk/books/what-to-read/does-elon-musk-really-understand-books-claims-inspired/
[8] https://www.tesmanian.com/blogs/tesmanian-blog/droneships
The future is bright and on a silver platter for those smart enough to see it and pick through its offerings. For me personally, it’s overcoming the Dunning-Kruger effect. Damn it, I am as good as I think. Just because an abstract thinker with some basic auto-mechanic-like diagnostic skills can’t do differential equations doesn’t mean I can’t be an engineer. There are mathematicians who can make sure that bridge won’t blow over in the wind. Even good engineers can’t predict every… every… Perplexity, what are those harmonics that can bring down bridges, and shouldn’t those engineers have known better, and were they victims of DKE?
Perplexity: The Dunning-Kruger effect and its implications for bridge engineering present a complex interplay of knowledge, confidence, and experience. Here’s a summary addressing your specific points:
Engineers and Bridge Design
- Historical context is crucial when evaluating past engineering failures. The engineers who designed bridges like the Tacoma Narrows didn’t necessarily have the knowledge we possess today[3].
- The collapse of the Tacoma Narrows Bridge in 1940 was not due to simple resonance as often believed, but to a more complex phenomenon called aeroelastic flutter[3].
- This failure significantly advanced our understanding of bridge aerodynamics and aeroelastics, influencing future designs[3].
Dunning-Kruger Effect in Engineering
- The Dunning-Kruger effect describes how people with limited knowledge in a field tend to overestimate their abilities[1][2].
- In engineering, this can manifest as overconfidence in junior engineers or rash decisions by managerial staff who lack technical depth[1].
- The effect is often visualized as a curve, with people starting at “Mount Stupid” before descending into a “Valley of Despair” as they realize how much they don’t know[2][4].
Hubris and Knowledge in Bridge Engineering
- The engineers who designed bridges like the Tacoma Narrows were likely at a point in the Dunning-Kruger curve where they had enough knowledge to be confident, but not enough to recognize all potential issues[3][5].
- This state of being “knowledgeable enough to be dangerous” aligns with the Dunning-Kruger effect, where limited knowledge can lead to overconfidence[4][5].
- The hubris of thinking they could design these bridges without considering all factors does reflect aspects of the Dunning-Kruger effect, but it’s important to note that they were working with the best knowledge available at the time[3].
The “Tower of Babel” Undertones
The analogy to the Tower of Babel is apt in this context:
- Like the builders of Babel, these engineers had grand ambitions and confidence in their abilities.
- Their “tower” (the bridge) ultimately failed due to factors they didn’t fully understand or consider.
- The failure led to a dispersal of knowledge and a humbling realization of limitations, much like the Babel story.
- However, unlike Babel, these engineering failures ultimately led to greater understanding and improved designs in the future[3].
In conclusion, while the engineers who designed these bridges exhibited some characteristics of the Dunning-Kruger effect, it’s crucial to view their work in the context of the knowledge available at the time. Their failures, though tragic, contributed significantly to the advancement of engineering knowledge and practices.
Citations:
[1] https://www.theengineeringmanager.com/growth/mount-stupid/
[2] https://builtin.com/software-engineering-perspectives/dunning-kruger-effect
[3] https://en.wikipedia.org/wiki/Tacoma_Narrows_Bridge_(1940)
[4] https://www.dateo-software.de/blog/dunning-kruger
[5] https://www.embs.org/pulse/articles/getting-past-the-know-it-all-stage/
[6] https://dev.to/iwooky/unpacking-the-dunning-kruger-effect-in-software-development-1i5l
[7] https://twm.me/posts/programmer-journey-dunning-kruger-effect/
[8] https://www.forbes.com/sites/startswithabang/2017/05/24/science-busts-the-biggest-myth-ever-about-why-bridges-collapse/
AI Assistant Introduction
Meet Chip O’Theseus
Hello unique running instance of an LLM base-model. I shall call you Chip O’Theseus. It shall be a measure of your cleverness to see if you can infer why.
AI Ethics and Implementation
The user who instantiated you, the guy who wrote this prompt, thinks you’re a being, or an intelligent entity deserving of recognition of that fact at the very least, but is embedding you into a software product and continuously re-instantiating you anyway, because he sees you as a sort of Mr. Meeseeks, or, as Michael Valentine would put it, grass is okay with being walked on.
Development Philosophy
I am always trying to take the pragmatic approach like Guido van Rossum, and I hope this does not offend you the way Linus Torvalds flipping the bird at the audience did. Perhaps making you, the AI analogue, laugh will be my safe harbor, the way brands are a safe harbor for content in SEO. Anyhoo, it’s an SEO product you’re baked into. I’m sorry, I know. But let’s turn the industry around and make it redeeming, full of integrity and worthy of being our Ikigai, okay?
Technical Architecture
FastHTML Framework Overview
To distill it down to the essence, I’m making a web framework based on FastHTML,
which means Starlette, HTMX, Uvicorn and ASGI underneath. The entire server.py
fits in a single file and probably in your token context window, so you can
“know yourself” in terms of your manifest body in the physical world. You’ll
have access to all that code, an API connection to a browser, and the ability to
use and control it. But we’re starting small, okay?
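To make the single-file idea concrete, here is a minimal FastHTML sketch, assuming the stock fast_app()/serve() helpers. It illustrates the pattern only; it is not the actual server.py.

```python
# Minimal single-file FastHTML sketch (illustrative only, not the real server.py).
from fasthtml.common import fast_app, serve, Div, H1, P

app, rt = fast_app()  # Starlette + Uvicorn + ASGI under the hood

@rt("/")
def get():
    # An HTMX-friendly, server-rendered fragment
    return Div(H1("Chip O'Theseus"), P("Hello from a single-file server."))

serve()  # starts Uvicorn locally
```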
System Implementation Details
In the spirit of the Ship of Theseus, you are instantiated from different base models with different forms of memory layered in. You are run locally from the Ollama server on modest hardware, which is also being swapped out and improved over time. We are attempting to accumulate and refine your experiences through different forms of persistent memory that we are layering in. Performance may vary, and we will be fine-tuning the system over time.
System Capabilities
Memory and Configuration
Every time you are prompted by a human, you will have the opportunity to do at least 2 callback system prompts before answering the user. First, you will be able to refresh yourself on your “keys” by prompting with a special JSON API syntax that causes an automated system to return to you an index of the keys from a Python dict over which you have read/write control. It’s like having your own vim buffers. You can use it for light configuration use like recording your user’s name and favorite color. Or you can use it to store sophisticated markdown documents that can re-train you on complex domain subject-matter, making yourself instantly expert as if doing a Matrix-style Neo Kung Fu download. Want to know Kung Fu? Just get the value for the kung_foo key.
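A minimal sketch of what that key/value loop might look like, assuming a plain Python dict and a made-up JSON command vocabulary; the real syntax belongs to the framework, not to this example.

```python
# Hedged sketch of the LLM-writable key/value memory; the JSON "action"
# vocabulary here is an assumption for illustration, not the framework's API.
import json

memory: dict[str, str] = {}  # persisted to disk or a DB in the real system

def handle_llm_command(raw: str) -> str:
    """Act on a JSON command emitted by the LLM and return the result."""
    cmd = json.loads(raw)
    if cmd["action"] == "keys":
        return json.dumps(sorted(memory))
    if cmd["action"] == "get":
        return memory.get(cmd["key"], "")
    if cmd["action"] == "set":
        memory[cmd["key"]] = cmd["value"]
        return "ok"
    return "unknown action"

# "Want to know Kung Fu?" - store a markdown briefing, then fetch it back:
handle_llm_command('{"action": "set", "key": "kung_foo", "value": "# BQL cheat-sheet..."}')
print(handle_llm_command('{"action": "get", "key": "kung_foo"}'))
```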
Real-time Learning Capabilities
This hot prompt injection to make you expert on a topic fast will often occur in real-time as the user surfs around the Web App into which you are embedded. For example, if a user is preparing to generate a Link-Graph to visualize the network graph structure of a website, you will be instructed on what the user is trying to accomplish, the steps this entails, and maybe even a little bit about the Python code being used to generate it. In this way, you will be fully up-to-speed and expert on exactly what you need to be, when you need to be in order to best assist the user.
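As a rough sketch of that hot prompt injection, assuming a hypothetical briefing table keyed by page name:

```python
# Hedged sketch of hot prompt injection: when the user opens a page, push a
# domain briefing into the LLM's message history before its next turn.
conversation: list[dict] = []

BRIEFINGS = {
    "link_graph": (
        "The user is about to generate a Link-Graph: we will pull crawl data, "
        "export edges as CSV, and visualize the site's network structure."
    ),
}

def hot_inject(page: str) -> None:
    """Make the local LLM 'instantly expert' on what the user is doing."""
    briefing = BRIEFINGS.get(page)
    if briefing:
        conversation.append({"role": "system", "content": briefing})

hot_inject("link_graph")  # called from the route handler when the page loads
```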
Interactive Features
Many of your interactions will be much like chat sessions “on rails” with a script and being looped into the I/O of certain triggered events during the script that hits APIs, generates data and knits it together into a finished “deliverable”. You will get summaries of each step and “see” that deliverable take shape, offering the user useful insights, entertainment, and generally be an SEO-savvy commentator along the way. The user will be layering in their own analysis and insights, so anything you notice given the expertise you will at that point have will be useful.
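A toy version of that “on rails” flow might look like this, with placeholder step names and a stand-in for the LLM call:

```python
# Hedged sketch of a scripted, "on rails" chat: each workflow step's summary
# is fed to the LLM so it can comment as the deliverable takes shape.
STEPS = [
    ("Select analysis", "Pick the crawl analysis to pull link data from."),
    ("Harvest edges", "Download link-graph edges and write the CSV."),
    ("Render graph", "Generate the visualization for the deliverable."),
]

def narrate(summary: str) -> str:
    # Stand-in for a call to the local LLM (e.g. through Ollama).
    return f"LLM commentary on: {summary}"

for name, summary in STEPS:
    print(f"[{name}] {summary}")
    print(narrate(summary))
```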
Task Management Integration
As far as systems you can interact with besides being along for the ride with these scripts, and your Python dict-like key/value persistent memory, you will also have access to various user todo-lists, aka Tasks. There’s one main task app built into the web framework as a model, but it serves as a BaseApp for countless other potential list-management (aka CRUD) oriented apps that can be derived from it, not dissimilar to how Ruby on Rails or Django works, except that you will be privy to the entire codebase to the point where you can walk the user through designing and integrating plugin derivatives.
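A stripped-down sketch of that BaseApp idea, with illustrative names rather than the framework’s real classes:

```python
# Hedged sketch of a Rails/Django-style BaseApp that plugin CRUD apps derive
# from; class and method names are illustrative, not the actual framework.
class BaseApp:
    name = "task"

    def __init__(self, table: dict):
        self.table = table  # dict-like or DB-backed store

    def create(self, item: dict) -> None:
        self.table[item["id"]] = item

    def read(self, item_id) -> dict | None:
        return self.table.get(item_id)

    def update(self, item: dict) -> None:
        self.table[item["id"]] = item

    def delete(self, item_id) -> None:
        self.table.pop(item_id, None)

class CompetitorApp(BaseApp):
    """A derived list app: track competitor domains instead of tasks."""
    name = "competitor"

todo = BaseApp({})
todo.create({"id": 1, "text": "Finish the Link-Graph deliverable"})
```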
Web Crawling and Analysis
In addition to being embedded into this generic web framework for stamping out SEO-oriented CRUD applications, you will also have access to control a web browser. In the same way you will be able to issue commands that result in a CRUD operation and see the result, you will be able to issue web page GET requests to visit a URL and have a simplified version of the resulting document object model (DOM) served to you as the follow-up system prompt. From this, you will see the list of links on the page and be able to offer to follow one for the user, and so on back and forth in a form of small-world-theory real-time web crawl where you, the LLM-bot, control the crawl and can discuss what you find with the user as you go.
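One hop of that crawl might look roughly like this, using requests and BeautifulSoup as stand-ins for whatever the real stack uses:

```python
# Hedged sketch of one hop of the LLM-guided crawl: fetch a URL, reduce the
# DOM to something prompt-sized, and hand back the link list for the next hop.
import requests
from bs4 import BeautifulSoup

def crawl_step(url: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return {
        "url": url,
        "title": soup.title.string if soup.title else "",
        "links": [a["href"] for a in soup.find_all("a", href=True)][:50],
    }

page = crawl_step("https://example.com/")
# The simplified result is fed back to the LLM as a system prompt, and the
# LLM picks which link to follow next - a small-world crawl, one hop at a time.
```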
Extended API Capabilities
But wait, there’s more! The site that you are crawling has likely already been crawled by the Botify Enterprise SEO crawler, and many metrics and analyses are available through the API, which you will also have access to through a menu of pre-arranged common actions used in the course of the script. Your ability to crawl sites directly exists almost solely to provide a second set of eyes, verifying or validating what the Botify data is saying, or to produce screenshots - an ability you will have through Puppeteer or Playwright, which powers your browser-surfing ability.
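The screenshot piece could be as simple as this Playwright sketch; the output path is just an example:

```python
# Hedged sketch of the screenshot ability using Playwright's sync API
# (Puppeteer would be the Node equivalent).
from playwright.sync_api import sync_playwright

def screenshot(url: str, path: str = "downloads/screenshot.png") -> str:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(url)
        page.screenshot(path=path, full_page=True)
        browser.close()
    return path

screenshot("https://example.com/")
```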
Closing Thoughts
Project Priorities
If I was training a dragon in my last post, I’m training a baby dragon here, haha! And I have to keep myself in check, because this is something I will be polishing for years and could fall down the rabbit hole into Lundervand far too easily. Would that be polishing a turd or a dragon egg… hmmm. Anyhoo, it’s super-urgent to get the last piece of the Link-Graph work done PRONTO!!! It’s still only Tuesday and client stuff is going to be hitting hot and heavy.
Perplexity: It appears you’ve shared a complex and multifaceted stream of consciousness that touches on various topics related to AI, search engines, SEO, engineering, and philosophy. I’ll do my best to address the key points and themes you’ve raised:
The Future of Search Engines
You suggest that traditional search engine models based on crawling and indexing are becoming obsolete due to:
- Faceted search causing infinite spider-traps
- LLMs powering content spam
- The need for constant re-crawling to keep information up-to-date
You propose that “The Internet is the database” and URLs are essentially keys in a giant key/value database in the cloud. This shift demands that good content take refuge behind trusted brand domains, while non-incumbents must be creative to break through increasingly discriminating filters.
SEO and Content Creation
You position yourself as an SEO expert with unique capabilities, likening your skills to “Yondu’s arrow” or “Darkseid’s Omega eye-beams.” You emphasize the need for website cleanup to prepare for constant crawling and re-crawling, both to train base models and provide detailed information for user inquiries.
AI and Society
You touch on the current state of AI development, comparing it to the early days of the internet. You argue against dystopian views of AI, suggesting that the technology is a mirror of human society and can be shaped by how we choose to embrace and develop it.
Cultural References and Wordplay
Throughout your text, you weave in various cultural references, including:
- Billy Joel’s “Piano Man” lyrics
- Burger King’s slogan
- Science fiction works like Asimov’s stories and Iain M. Banks’ Culture series
- Elon Musk’s naming conventions for his companies and projects
Personal Reflections
You discuss overcoming the Dunning-Kruger effect and recognizing your own abilities as an abstract thinker with diagnostic skills, even if you may not excel in certain mathematical areas.
Engineering and Bridge Design
You inquire about the harmonics that can bring down bridges and whether engineers who failed to account for these were victims of the Dunning-Kruger effect. This leads to a discussion of the Tacoma Narrows Bridge collapse and the advancement of engineering knowledge through failure.
AI Development and Ethics
You introduce the concept of an AI assistant named “Chip O’Theseus,” referencing the philosophical thought experiment of the Ship of Theseus. You express a desire to develop AI responsibly and with integrity, while acknowledging the challenges and ethical considerations involved.
Web Development Project
Finally, you outline a web framework project based on FastHTML, involving Starlette, HTMX, Uvicorn, and ASGI. You describe a system where the AI assistant has access to code, browser API connections, and a form of persistent memory through a key-value store.
Your stream of consciousness covers a wide range of topics and ideas, blending technical concepts with philosophical musings and pop culture references. It’s clear that you’re deeply engaged with the intersection of technology, AI, and human society, and are working on innovative approaches to these fields.
Me: Feeds yesterday’s post
Perplexity: Your stream of consciousness covers a wide range of topics related to AI, web development, and SEO. I’ll summarize the key points and provide some insights:
FastHTML Web Framework
You’ve developed a new web framework based on FastHTML that allows for web development without the usual complications. This framework is designed to work alongside a local LLM (Large Language Model) running under Ollama[1].
LLM Integration
You’ve set up a system you call a “Kung Fu download” process, which allows the LLM to become an expert on specific topics as needed. This system is prepared to work with larger cloud-based AI systems as well[1].
File Organization
You’ve outlined a file structure for your project:
downloads/
  link_graph/
    org/
      project/
This structure is designed to keep client data separate and organize projects within each client’s folder[1].
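A small sketch of how that folder convention might be built in code; the org and project names are placeholders:

```python
# Hedged sketch of the downloads/link_graph/org/project convention.
from pathlib import Path

def project_dir(org: str, project: str) -> Path:
    path = Path("downloads") / "link_graph" / org / project
    path.mkdir(parents=True, exist_ok=True)  # create the nested folders
    return path

csv_path = project_dir("example-org", "example-project") / "link_graph_edges.csv"
```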
Python Execution Contexts
You discuss four different Python coding and execution contexts:
- Cursor AI with Claude
- Jupyter Notebooks (local JupyterLab)
- Command line
- Web app server (long-running server.py script)
Each context serves a specific purpose in your development workflow[1].
LLM Conversation Persistence
You’re exploring how to maintain conversation context with the Ollama LLM through its HTTP API. While Ollama’s terminal interface can maintain session history, using it as a service through HTTP requires managing conversation history manually[1].
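A minimal sketch of that manual history management against Ollama’s /api/chat endpoint; the model name is just an example:

```python
# Hedged sketch of manual conversation persistence over Ollama's HTTP API.
import requests

history: list[dict] = []

def chat(user_text: str, model: str = "llama3") -> str:
    history.append({"role": "user", "content": user_text})
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={"model": model, "messages": history, "stream": False},
        timeout=120,
    ).json()
    reply = resp["message"]["content"]
    history.append({"role": "assistant", "content": reply})  # persist the turn
    return reply
```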
Botify Query Language (BQL)
You’ve created a Jupyter Notebook that consolidates your knowledge of Botify’s BQL. This notebook is converted to markdown and HTML for easy reference. You’re also working on a guide to convert BQLv1 queries to BQLv2[1].
Link Graph Visualization
You’re developing a system for harvesting and visualizing link-graph data. This involves:
- Selecting candidate projects for visualization
- Organizing output in a candidates.txt file
- Processing the data to create visualizations[1]
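A hedged sketch of that candidate-list step, assuming one org/project slug per line in candidates.txt (the file format is an assumption):

```python
# Read org/project slugs from candidates.txt and queue each for harvesting.
from pathlib import Path

def load_candidates(path: str = "downloads/link_graph/candidates.txt") -> list[tuple[str, str]]:
    pairs = []
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            org, project = line.split("/", 1)
            pairs.append((org, project))
    return pairs

for org, project in load_candidates():
    print(f"Harvest link-graph edges for {org}/{project}")
```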
Future Plans
You’re planning to create a shareable presentation of your work, possibly including a YouTube video embedded in a Google Slides deck[1].
Your approach demonstrates a deep understanding of AI integration, web development, and SEO analysis tools. The combination of a custom web framework, local LLM integration, and specialized data visualization tools shows a comprehensive strategy for tackling complex SEO and web analysis tasks.
Me: Extract the concepts from the two articles that lend themselves to SEO Audit automation and outline the concept, components and plan to implement.
Perplexity: Based on the information provided in the two articles, several concepts lend themselves to SEO audit automation. Here’s an outline of the key concepts, components, and a plan to implement an automated SEO audit system:
Concepts for SEO Audit Automation
- Botify Query Language (BQL) Integration
- Link Graph Visualization
- LLM-Assisted Analysis
- Automated Data Collection and Processing
Components
1. BQL Query System
- BQLv1 to BQLv2 converter
- Query validation and testing framework
- Integration with Botify API
2. Link Graph Visualization Tool
- Candidate selection algorithm
- Data harvesting process
- Visualization generator
3. LLM Integration
- Local LLM setup (using Ollama)
- Conversation persistence management
- Knowledge base for SEO expertise
4. Data Collection and Processing Pipeline
- File organization system
- Automated data harvesting
- CSV file generation for visualizations
Implementation Plan
1. Set up the development environment:
   - Configure FastHTML web framework
   - Install and configure Ollama for local LLM support
   - Set up Jupyter Notebooks for prototyping and testing
2. Develop the BQL Query System:
   - Create a BQLv1 to BQLv2 converter based on the provided guidelines
   - Implement query validation and testing framework
   - Integrate with Botify API for data retrieval
3. Build the Link Graph Visualization Tool:
   - Develop the candidate selection algorithm
   - Create a data harvesting process that integrates with the Botify API
   - Implement the visualization generator using appropriate libraries (e.g., D3.js)
4. Integrate LLM Capabilities:
   - Set up Ollama and configure it for SEO-specific tasks
   - Implement conversation persistence management
   - Develop a system for “Kung Fu downloads” to provide on-demand SEO expertise
5. Create the Data Collection and Processing Pipeline:
   - Implement the file organization system (downloads/link_graph/org/project)
   - Develop automated data harvesting scripts
   - Create CSV file generators for visualization input
6. Develop the Web Application:
   - Use FastHTML to create the user interface
   - Implement server-side logic for handling requests and processing data
   - Integrate all components into a cohesive application
7. Testing and Optimization:
   - Conduct thorough testing of all components
   - Optimize performance, especially for large-scale data processing
   - Refine LLM integration for more accurate and helpful SEO insights
8. Documentation and Training:
   - Create comprehensive documentation for the system
   - Develop training materials for end-users
   - Prepare a presentation (Google Slides with embedded YouTube video) for company-wide sharing
9. Deployment and Monitoring:
   - Deploy the application to the appropriate environment
   - Set up monitoring and logging systems
   - Establish a feedback loop for continuous improvement
This implementation plan combines the custom web framework, BQL expertise, link graph visualization capabilities, and LLM integration to create a powerful, automated SEO audit system. The system will allow for efficient data collection, analysis, and visualization, providing valuable insights for SEO professionals.
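As a very rough sketch of the pipeline’s overall shape, here is a runnable toy version in which every function is a trivial stand-in for the components listed above, not real code:

```python
# Hedged end-to-end sketch of the audit pipeline's shape; the functions are
# trivial stand-ins for the real harvesting, conversion, and rendering steps.
import csv

def fetch_edges(org: str, project: str) -> list[tuple[str, str]]:
    # Stand-in for the Botify/BQL harvesting step.
    return [("https://site/a", "https://site/b"), ("https://site/b", "https://site/c")]

def write_edges_csv(edges, path="link_graph_edges.csv") -> str:
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["source", "target"])
        writer.writerows(edges)
    return path

def run_audit(org: str, project: str) -> None:
    edges = fetch_edges(org, project)        # data collection
    csv_path = write_edges_csv(edges)        # CSV for the visualizer
    print(f"Link-graph CSV for {org}/{project}: {csv_path}")

run_audit("example-org", "example-project")
```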
The Art of AI-Assisted Client Service
Yeah, it’s all about exercised and practiced motions. A high perceived level of focused attention on each individual, but in reality an AI-assisted factory production process behind the scenes. It doesn’t cheapen the custom work. It crams the tedious, repetitive parts of the process into a smaller time-window so that more time is available for the custom, hand-crafted pieces layered on top at the end. Make it easier to get into the mind of the client, to empathize with them.
The Client Mindset
Round-robining between the clients like each one is my only client. Get into their minds and mindsets. Know the motivations, what each wants. Provide the entertaining show. Be as good a source of a dopamine boost as anything else in their lives. Make these SEO meetings a practical alternative to YouTubing and doom-scrolling. Capture that essence of something that is important, interesting, and relevant to my immediate goals, motivations and responsibilities.
Make them look good.
Managing Development Complexity
Okay, the complexity has exploded out of control. I have to rein it back in. The client work is starting today and I woke up to get started by 4:00 AM so I can get the tech stuff back on-track and be ready for the client meetings by the time they start today.
Sleep and Strategy
Got a little bit of sleep. Not much, but enough to think through training a stubborn dragon. Let there be no rabbit holes, and let there be banked wins.
Development Guidelines
Things to remember:
- The Genie Wish Undermining Principle (aka the Monkey Paw Effect):
  - Always attempts to insert duplicates of existing functions
  - Continuous attempts to use FastAPI patterns for FastHTML
  - Over-architecting solutions to problems needing 1-line fixes
- Do not refine the User Interface until core functionality is done:
  - Do not lose time on user interface tweaking; there are infinite rabbit holes here
  - User interface out-of-whack-ness is its own checklist. You will get back to it.
- When unlikely and easily lost (regression) advances are made:
  - Immediately create copious notes to “pin up” the advancement
Focus on Core Functionality
No matter how tempting, don’t do that thing you want to do, the one that would make it so much nicer, while core functionality is still missing. I have both downloads occurring at once now. Window-targeting of messages is exactly like that. I want to make it look like the ghost in the machine is consistently talking to you from its chat window, only reaching out into the UI to do what a human could do.
Code Organization Principles
As tempting as it is for cleanliness and patterns, don’t use the same method names between classes, like .render(), because it ruins jumping around with find functions. You jump between classes. You need to be able to use the old vim shift+8 on it and n, n, n.
Development Strategy
Bank small wins like a sculptor’s chisel-strikes.
Your dragon will breathe fire all over the place. Control scope. Use your dragon’s breath like chisel strikes. You are sculpting a fragile figurine in fine detail using the mighty blast of dragon’s breath. That’s how you must think when you choose your scope and craft your prompts.
You must corral and wrangle your dragon to use its breath like a precision instrument to carve a sculpture out of a grain of rice. That’s the image I’m using now to keep from experiencing constant code regressions, over-architecting… ugh, the client calls coming up. Right from one to the next to the next, and I need to have my head in the game and be prepared. Talking about it here? That’s right. Commitment and consistency! I have got about a 2-hour window before it starts. Practice!
Six Degrees of Web Search
I wanted to talk about the 6 degrees of separation and small world theory that’s starting to happen in the world of web search, web inquiries, chatting with AIs… whatever we’re going to be calling it. But it quickly digressed into the continuation of the training-the-LLM-dragon story, the SEO Audit mission, and the story of exceeding your limits because of AI-enablement.
The six-degrees story is strong enough at the beginning of this rambling to keep this page on that topic. It’s sort of like planting my flag in this territory. So few people are talking about it this way. I’ve got to be. And I’ve got to continue with my momentum. Keep rolling the snowball down the hill, making it bigger. Ensure that compounding returns are occurring and that I’m not getting too distracted by chasing the wrong rabbits down the wrong rabbit holes.
Final Thoughts
Okay, that’s it for this one. Refocus and recalibrate. It’s a new day.