ETL to QE, Update 55, Nostr AI Chatbot Implementation Details
So for the Nostr Dec 2024 Roadmap item "1. Nostr interface equivalent to Open WebUI", we just need to develop a bot that works with existing clients, has a help command, handles old conversations, and so on. What different kinds of data do we need to deal with?
- Required tags (a rough sketch of these follows the list):
  - CID of the contents, possibly with a salt.
  - AI model / chatbot requested.
  - Conversational messages, which need to include which message is being replied to.
- Tags for other features:
  - Files are for part 2 of the Nostr Dec 2024 Roadmap, "2. Nostr RBAC LDAP via Blossom Upgrade".
  - I want to use my personal data in a RAG chatbot, but that is part 4 of the Nostr Dec 2024 Roadmap, "4. Data ingestion of all my social media".
  - References to a knowledge graph, but that is part 5 of the Nostr Dec 2024 Roadmap, "5. Knowledge Graph all the things".
  - Chatbot DAGs are for part 6 of the Nostr Dec 2024 Roadmap, "6. Agents and ETL Kafka Hamilton DAG style".
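A rough sketch of how those required tags might sit on a prompt event. The `cid`, `salt`, and `model` tag names are my own placeholders rather than anything from a NIP; only the `e` reply tag follows an existing convention (NIP-10):

```ts
// Sketch only: "cid", "salt", and "model" are placeholder tag names.
const promptEventTemplate = {
  kind: 1, // or an encrypted-DM kind for the DM chatbot
  created_at: Math.floor(Date.now() / 1000),
  content: "What is the weather on Mars?",
  tags: [
    ["cid", "<content-identifier-of-attached-context>"],
    ["salt", "<random-salt-if-the-cid-needs-one>"],
    ["model", "llama3"], // AI model / chatbot requested
    ["e", "<id-of-message-being-replied-to>", "", "reply"], // NIP-10 reply marker
  ],
};
```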
So all we want is an AI chatbot that works with existing clients. That is a pretty well-scoped problem. All you need is a little parser so the chatbot can read commands to reset the conversation, set the AI model, and so on. I should just be able to use the Commander npm package to process the commands.
My first objection was that Commander is built around flags, but it supports plain subcommands too, so it should still work here without any flags.
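A minimal sketch of that parser, assuming slash commands like `/model <name>` and `/reset` (the command names are my guesses, nothing is decided yet):

```ts
import { Command } from "commander";

// Parse a slash command out of an incoming DM, e.g. "/model llama3" or "/reset".
function buildParser(state: { model: string }): Command {
  const program = new Command("chatbot");
  program.exitOverride(); // throw on bad input instead of calling process.exit

  program
    .command("model <name>")
    .description("set the LLM model for this conversation")
    .action((name: string) => {
      state.model = name;
    });

  program
    .command("reset")
    .description("start a fresh conversation")
    .action(() => {
      // clear whatever conversation context the bot has stored
    });

  return program;
}

// Usage: strip the leading "/" and hand the tokens straight to Commander.
// Commander also auto-generates a `help` subcommand once subcommands exist.
const state = { model: "default-model" };
const dm = "/model llama3";
try {
  buildParser(state).parse(dm.slice(1).split(/\s+/), { from: "user" });
} catch {
  // unknown command: reply with the help text instead of crashing the bot
}
console.log(state.model); // "llama3"
```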
But the first thing we need to do is write a little framework for running multiple bot filters at once, in our case the DM chatbot and the thread chatbot.
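Something like the following sketch is all I have in mind. The `Filter`/`NostrEvent` shapes are simplified stand-ins, and `subscribe` represents whatever relay client ends up underneath (nostr-tools, NDK, ...):

```ts
// Simplified stand-ins for real Nostr types.
interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  created_at: number;
  tags: string[][];
  content: string;
}

interface Filter {
  kinds?: number[];
  "#p"?: string[];
  since?: number;
  limit?: number;
}

interface Bot {
  name: string;
  filter: Filter;                           // which events this bot cares about
  handle(event: NostrEvent): Promise<void>; // how it responds
}

type Unsubscribe = () => void;
type Subscribe = (filter: Filter, onEvent: (e: NostrEvent) => void) => Unsubscribe;

// Run every bot on its own subscription; one misbehaving bot can't kill the others.
function runBots(subscribe: Subscribe, bots: Bot[]): Unsubscribe {
  const subs = bots.map((bot) =>
    subscribe(bot.filter, (event) => {
      bot.handle(event).catch((err) => console.error(`[${bot.name}]`, err));
    })
  );
  return () => subs.forEach((unsub) => unsub());
}

// Usage sketch: the DM bot listens for encrypted DMs (kind 4) tagged to the bot,
// the thread bot listens for text notes (kind 1) that mention it.
// const stop = runBots(mySubscribe, [
//   { name: "dm", filter: { kinds: [4], "#p": [BOT_PUBKEY] }, handle: handleDm },
//   { name: "thread", filter: { kinds: [1], "#p": [BOT_PUBKEY] }, handle: handleThread },
// ]);
```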
There we go. Is that task well-defined enough for you?
We can throw the Nostr relay filters problem out the window for now and focus on this.
But then how does a user of the thread bot set their model and other settings?
The same CLI logic from the DM bot is accessible in a thread, but when the bot replies it adds a tag to remember which LLM model to use. That way the LLM model is effectively set globally for a user, and nostrGet can fetch it later.
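A rough sketch of that lookup, assuming the bot attaches a custom `model` tag to its replies (the tag name and the `nostrGet` call shape are assumptions, not settled API):

```ts
// Ask for the bot's most recent reply addressed to this user, then read the
// remembered model back out of its tags.
const modelLookupFilter = {
  kinds: [1],                 // the thread bot replies with plain text notes
  authors: ["<bot-pubkey>"],
  "#p": ["<user-pubkey>"],
  limit: 1,                   // newest matching event only
};

// const [latestReply] = await nostrGet(modelLookupFilter); // assumed helper
// const model =
//   latestReply?.tags.find((t) => t[0] === "model")?.[1] ?? "default-model";
```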
But does the Nostr spec actually require relays to return the newest events first when applying filters? It looks like the `limit` key/value pair within a query is what produces the most recent events. I learned this from a commit in nostr-rs-relay, and I found a couple of other references that seem relevant but am skipping for now. The relevant piece of the spec:
"The limit
property of a filter is only valid for the initial query and MUST be ignored afterwards. When limit: n
is present it is assumed that the events returned in the initial query will be the last n
events ordered by the created_at
." -source
Okay, we now know we can get the most recent events from a relay, and how we would go about properly scraping a relay's history.
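For the scraping part, here is a minimal back-pagination sketch built on that `limit` behaviour; `queryRelay` is an assumed one-shot REQ helper, not a real library call:

```ts
// Same simplified event shape as the earlier sketches.
interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  created_at: number;
  tags: string[][];
  content: string;
}

// Assumed helper: send one REQ with the given filter, resolve with the events.
type QueryRelay = (filter: Record<string, unknown>) => Promise<NostrEvent[]>;

// Walk a relay's history backwards by combining `limit` with a moving `until`.
async function scrapeAll(
  queryRelay: QueryRelay,
  baseFilter: Record<string, unknown>,
  pageSize = 500
): Promise<NostrEvent[]> {
  const all: NostrEvent[] = [];
  let until = Math.floor(Date.now() / 1000);

  while (true) {
    // Each page is the newest `pageSize` events created at or before `until`.
    const page = await queryRelay({ ...baseFilter, limit: pageSize, until });
    if (page.length === 0) break;
    all.push(...page);

    // Move the cursor just past the oldest event seen so far. (Events sharing
    // that exact timestamp could be skipped; dedupe by id if that matters.)
    const oldest = Math.min(...page.map((e) => e.created_at));
    until = oldest - 1;

    if (page.length < pageSize) break; // the relay ran out of history
  }
  return all;
}
```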
Cool, now what is stopping us from writing the implementation of this?
Well, we need tickets:
- DDT001 - Run two independent chatbots based on filters at the same time
- DDT002 - Add CLI slash-command functionality to the AI chatbots to change the AI model and get help
Is that it? Is that all we really need to do to get a KISS AI chatbot?
No, then we have additional features such as:
- Threaded/DAG AI conversations
- Dynamic versioned Titles of Conversations
- Dynamic AI model selections
- Chatbot/tool functionality such as searching the web
- RAG functionality
All those additional features require a custom client, and for that we need:
- Local Caching Relay
- Nostr Feeds
- Algotainment
- Profile / User
- Notes
- Reactions
- Follow List
- Nostr Thread Data Structure in JS (see the sketch after this list)
- Component to display the Nostr Thread Data Structure
- Encrypted Conversation Feed Data Structure in JS
- Component to display Encrypted Nostr Conversations
- Component to display Encrypted Nostr Conversation
- Component to fetch, store, and validate Nostr Profiles
- Component to display nostr profiles
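For the thread data structure item above, a rough sketch of the shape it might take (TypeScript; the field names are my guesses, and it only handles NIP-10 marked `e` tags, not the deprecated positional scheme):

```ts
// Sketch of a thread node built from NIP-10 e-tags.
interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  created_at: number;
  tags: string[][];
  content: string;
}

interface ThreadNode {
  event: NostrEvent;
  parentId?: string;      // the e-tag marked "reply" (or "root" for a top-level reply)
  children: ThreadNode[]; // direct replies, sorted by created_at
}

// Build a tree out of a flat list of events that all belong to one thread.
function buildThread(events: NostrEvent[]): ThreadNode[] {
  const nodes = new Map<string, ThreadNode>();
  for (const event of events) {
    const replyTag =
      event.tags.find((t) => t[0] === "e" && t[3] === "reply") ??
      event.tags.find((t) => t[0] === "e" && t[3] === "root");
    nodes.set(event.id, { event, parentId: replyTag?.[1], children: [] });
  }

  const roots: ThreadNode[] = [];
  for (const node of nodes.values()) {
    const parent = node.parentId ? nodes.get(node.parentId) : undefined;
    (parent ? parent.children : roots).push(node);
  }
  for (const node of nodes.values()) {
    node.children.sort((a, b) => a.event.created_at - b.event.created_at);
  }
  return roots;
}
```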
Cool, now we once again have what we perceive as a reasonable TODO list. Here's to praying that we will have the EQ to commit to this simple plan, or at least to the first two items I wrote tickets for.