"A pattern for building personal knowledge bases using LLMs.
This is an idea file, it is designed to be copy pasted to your own LLM Agent (e.g. OpenAI Codex, Claude Code, OpenCode / Pi, or etc.). Its goal is to communicate the high level idea, but your agent will build out the specifics in collaboration with you."
Post by Andreas Horn | LinkedIn
20 million views in 3 days - and still not enough people are talking about what Andrej Karpathy just revealed: most current RAG workflows are solving the wrong problem.
Andrej Karpathy shared how he's been using LLMs lately. He's writing less code with AI and spending most of his tokens building and maintaining a personal knowledge base on whatever he's actively researching.
- Stage 1: Data Ingest – Dumps raw research into a bucket using a Browser Clipper and Hotkeys.
- Stage 2: LLM Compilation – LLMs (like ChatGPT or Claude) read and summarize every incoming source for automated bookkeeping.
- Stage 3: The Wiki – Manages interrelated Markdown files and visualizations through Obsidian.
- Stage 4: Q&A / Querying – Queries a compounded synthesis via Auto-maintained Index Files.
- Stage 5: Output Formats – Generates Marp slide decks, Matplotlib charts, and new pages.
- Stage 6: Linting – Performs health checks and fills gaps using LLM Agents and Web Search.
- Stage 7: Extra Tools – Builds custom search engines using Vibe-coding (coding by natural language interaction with AI).
- Stage 8: Future Direction – Removes context window reliance through Model Fine-tuning.
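The early stages of this pipeline (raw ingest, LLM compilation, a Markdown wiki, and an auto-maintained index) can be sketched in a few dozen lines. This is a minimal illustration, not Karpathy's actual tooling: the directory layout, the `summarize_stub` placeholder, and the `[[wikilink]]` index format are all assumptions, and the summarizer is stubbed with simple truncation where a real setup would call an LLM API.

```python
from pathlib import Path

def summarize_stub(text: str, max_words: int = 25) -> str:
    """Placeholder for the Stage 2 LLM call.

    A real pipeline would send `text` to an LLM and get a summary back;
    here we just truncate to the first `max_words` words.
    """
    return " ".join(text.split()[:max_words])

def ingest(raw_dir: Path, wiki_dir: Path) -> list[Path]:
    """Stages 1-3: read raw clips, summarize each, write Markdown notes."""
    wiki_dir.mkdir(parents=True, exist_ok=True)
    notes = []
    for src in sorted(raw_dir.glob("*.txt")):
        note = wiki_dir / (src.stem + ".md")
        note.write_text(f"# {src.stem}\n\n{summarize_stub(src.read_text())}\n")
        notes.append(note)
    return notes

def build_index(wiki_dir: Path) -> Path:
    """Stage 4 support: regenerate an index file linking every note,
    using Obsidian-style [[wikilinks]]."""
    index = wiki_dir / "INDEX.md"
    links = [
        f"- [[{p.stem}]]"
        for p in sorted(wiki_dir.glob("*.md"))
        if p.name != "INDEX.md"
    ]
    index.write_text("# Index\n\n" + "\n".join(links) + "\n")
    return index
```

Because every artifact is a plain Markdown file on disk, the later stages (querying, linting, output generation) can operate on the same folder without any special database.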
Marp: Markdown Presentation Ecosystem
Create beautiful slide decks using an intuitive Markdown experience
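For readers who have not used Marp: a deck is an ordinary Markdown file with `marp: true` in its YAML frontmatter and `---` separating slides, which makes it a natural Stage 5 output for LLMs that already emit Markdown. A minimal two-slide example (the titles and content here are invented for illustration):

```markdown
---
marp: true
---

# Research Digest

Compiled from this week's clipped sources

---

## Key Takeaway

- An LLM-maintained wiki compounds: each new source improves the synthesis
```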