Idempotent Kafka, Prompt Mania
Straight to the point.
I'm going to get right into what I built over the past week. I did a lot.
Tool Check
In building my practice strategy, I'm taking an inventory of all the tools I use and the categories they fall into. Essentially I'm keeping track of every tool I have used, but more importantly designing the stack I want to use. This is going to use some AI evaluations and include RAG context from my Obsidian collection.
Vault Check
There’s nothing I dread more than a notice from GitGuardian that I have secrets embedded in my code. This mostly happens when I convert old stuff into my new paradigm. So I have made sure that all the necessary secrets (like API keys) are saved in Vault and fetched by a pristine .envrc. The vault check runs from my development workstation’s .zshrc, so Vault will be unsealed and ready to go.
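The unseal check can be sketched against Vault's unauthenticated seal-status endpoint. This is a minimal sketch, assuming the standard VAULT_ADDR environment variable; the actual .zshrc hook probably shells out to the Vault CLI instead.

```python
import json
import os
import urllib.request


def seal_status_url(addr: str) -> str:
    """Build the URL for Vault's unauthenticated /v1/sys/seal-status endpoint."""
    return f"{addr.rstrip('/')}/v1/sys/seal-status"


def vault_is_unsealed(addr: str) -> bool:
    """Return True if the Vault server at `addr` reports it is unsealed."""
    with urllib.request.urlopen(seal_status_url(addr), timeout=3) as resp:
        status = json.load(resp)
    return not status.get("sealed", True)


# Only probe the server when VAULT_ADDR is actually set in the environment.
addr = os.environ.get("VAULT_ADDR")
if addr:
    print("unsealed" if vault_is_unsealed(addr) else "sealed -- go unseal it")
```

The nice thing about seal-status is that it needs no token, so the check can't itself become another secret to leak.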
Local DynamoDB
I’m really starting to feel like a software engineer and data architect, rather than just a data engineer. So now I’m orchestrating unit tests for local connectivity. This is mostly an external logger for data pipeline actions. Previously I kept EDW control tables in the EDW itself.
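The pipeline-action logger against local DynamoDB can be sketched like this. The table name, key schema, and endpoint are all assumptions here, not my actual setup; local DynamoDB accepts any dummy credentials.

```python
import datetime
import os
import uuid

TABLE = "pipeline_actions"  # hypothetical table name


def action_item(pipeline: str, action: str, detail: str) -> dict:
    """Build a pipeline-action log record shaped for a DynamoDB put_item."""
    now = datetime.datetime.now(datetime.timezone.utc).isoformat()
    return {
        "pk": f"PIPELINE#{pipeline}",      # partition key: one pipeline per partition
        "sk": f"TS#{now}#{uuid.uuid4()}",  # sort key: chronological and collision-proof
        "action": action,
        "detail": detail,
    }


# Only touch the local endpoint when one is configured, e.g.
# DYNAMODB_LOCAL=http://localhost:8000
if os.environ.get("DYNAMODB_LOCAL"):
    import boto3  # imported lazily so the pure helper works without it

    ddb = boto3.resource(
        "dynamodb",
        endpoint_url=os.environ["DYNAMODB_LOCAL"],
        region_name="us-east-1",
        aws_access_key_id="local",       # dummy creds are fine for DynamoDB Local
        aws_secret_access_key="local",
    )
    ddb.Table(TABLE).put_item(Item=action_item("tiingo_daily", "load", "42 rows"))
```

Keeping the record builder as a pure function is what makes the local unit tests trivial: you assert on the dict, and only the integration test needs the container-free local endpoint running.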
MetaPrompt
I took a meta-prompt quiz which made me write five different prompts and sub-prompts to make sure I’m doing five right things in my prompt engineering.
- Basic Diagnostic
- Prompt Architecture Patterns
- Context Management & Token Efficiency
- Meta-Prompting & Systems Design
- Advanced Prompt Engineering Assessment
Certainly six months ago I would have considered this both arcane and mostly frivolous, but I have seen the light. Again, it’s about writing didactic essays and expecting non-deterministic results. Hell, I’ve been doing that for 20 years. I can tell it works as a methodology for discovery.
Now I have about 30 boilerplate prompts for this, that, and the other, specifically for data engineering, software engineering, systems analysis and a few other topics. But this has nothing to do with MCP or agentic programming, which I expect to be deterministic.
WWID
WWID is back on the front burner. I’m going to use a couple new prompts to help me generate a corpus of questions. So far I have six sets: Ethics, Aesthetics, Epistemology, Metaphysics, Virtues and Philosophy. Next come Political Philosophy, Human Nature, and The Role of Science & Progress. I also have to get one on Religion, but that’s probably redundant with Metaphysics. We’ll see.
It’s kind of interesting that Borderlands 4, which I just completed, gamifies some of the metaphysical questions about order vs chaos. I should finish the corpus by this time next week.
Prefect
You know, I’ve been vibe coding with Temporal and it hasn’t been of any practical, quick-and-dirty use. So I’ve decided to give Prefect a look. You know what, it’s not horrible. It’s not even bad. It’s certainly a step up from Airflow, so I think I’m going to give it a deeper look. I think I’m going to have to be a bit more practical and take Python a touch more seriously. If this continues, I may even slapdash together some N8N for my Tiingo stack.
The Tiingo Catch All
Tiingo is my experimental full stack thingy. I’ve already integrated the Kafka producer. So now I go directly to Tiingo and then push all the messages into Kafka. So that works. And now the SASL authentication works with Vault, so that’s nice. Next is the outbound consumer for DuckDB and perhaps DuckLake too. The application is becoming non-trivial. I’m probably going to send it off to Superset and Grafana. Actually yeah, Grafana. I’m doing daily candlesticks on 20 securities, which by the way are weighted towards the energy sector. Then I can N8N alerts via SNS. Hmm. That has become complicated. Well that’s the idea. This is going to be the realtime framework for higher volume streaming transactions and alerts. It should scale up to telematics.
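The producer side with SASL creds pulled out of Vault can be sketched roughly like this. The mechanism (SCRAM-SHA-256), topic name, and env-var names are assumptions; in my setup the credentials land in the environment via the Vault-backed .envrc.

```python
import json
import os


def kafka_sasl_config(bootstrap: str, username: str, password: str) -> dict:
    """librdkafka-style client config. SCRAM-SHA-256 is an assumption --
    match whatever mechanism your broker listener actually uses."""
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "SCRAM-SHA-256",
        "sasl.username": username,
        "sasl.password": password,
    }


# Only produce when a broker is configured in the environment.
if os.environ.get("KAFKA_BOOTSTRAP"):
    from confluent_kafka import Producer  # imported lazily; pure helper needs nothing

    conf = kafka_sasl_config(
        os.environ["KAFKA_BOOTSTRAP"],
        os.environ["KAFKA_USER"],      # populated from Vault by .envrc
        os.environ["KAFKA_PASSWORD"],
    )
    p = Producer(conf)
    bar = {"ticker": "XOM", "date": "2025-01-02", "close": 105.1}  # made-up row
    p.produce("tiingo.daily", key=bar["ticker"], value=json.dumps(bar))
    p.flush()
```

Keying messages by ticker keeps each security's bars in order within a partition, which matters once the DuckDB consumer shows up downstream.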
I like the idea that I can use Kafka for idempotency. I know it complicates the stack, but I need to be flexible in practice, not just in theory. This will be the first step; then I’m going to have to mess around with Flink and other CDC options. This seems to be the second most straightforward way. Again, I’m trying to avoid containers.
Intent To Spec
In this week’s progress on prompt engineering I used an outside prompt to work an inside prompt. So this is the first time I’ve done an iteration loop. I expected it to loop about three times but it only went twice. Here’s the context code for the outer loop:
INTENT: You are building an application plan that will direct a small software development project.
OBJECT: The object of this plan is a design specification.
HIGH LEVEL: The object of this planned project is to build a personal RAG system that will take data collected in Obsidian and use a Google API to generate a Notebook in Google NotebookLM.
NOTES:
- There may be hundreds of documents that need to be processed.
- We will generate a program in Go to do the refreshing
- We will use a Google API to generate the Notebook
- Additional notes will be added to refine the Notebook
INTERACTION LOOP:
- You will interactively ask questions about the project to get a clear understanding of the intent.
- Ask 4-5 questions in a structured manner to cover various aspects of the design spec area by area.
- If there is ambiguity you may defer to the next iteration.
GENERATION:
- Generate the design spec as a draft with a version number and a title
- Place this in the drafts folder.
- Notify the user
EVALUATION:
- Using the context/iterate.md prompt, perform an evaluation of the draft design specification.
- Output the evaluation results in a markdown file in the drafts folder with the name “evaluation_results.md” with the same version number as the design specification.
- Follow the recommendations of the evaluation and highlight them. You will iterate.
FINISH:
- Review the final design specification and ensure it meets the requirements.
- Notify the user that the project is complete.
Area Code
The last thing I built this week is an area code lookup. It was almost purely vibe coded, which is a first for my own CLI utilities. I used Gemini to scrape whatever wherever and build me a source table.
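The shape of the utility is roughly this; the table here is a tiny hand-picked excerpt, not the scraped source table the real tool uses.

```python
import sys

# Tiny excerpt of the lookup table; the real one was scraped into a fuller source.
AREA_CODES = {
    "212": "New York, NY",
    "312": "Chicago, IL",
    "415": "San Francisco, CA",
    "713": "Houston, TX",
}


def lookup(code: str) -> str:
    """Return the region for a three-digit area code, or 'unknown'."""
    return AREA_CODES.get(code.strip(), "unknown")


if __name__ == "__main__":
    # Usage: python areacode.py 212 415
    for code in sys.argv[1:]:
        print(f"{code}: {lookup(code)}")
```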


