Show HN: A 100-Line LLM Framework https://ift.tt/ecnOlPH

I've seen a lot of comments about how complex frameworks like LangChain can be. Over the holidays, I wanted to see how minimal an LLM framework could get if we stripped away everything non-essential. The result is an LLM framework in just 100 lines of code.

These 100 lines capture what I see as the core abstraction of most LLM frameworks: a nested directed graph that breaks down tasks into multiple LLM steps, with branching and recursion to enable agent-like decision-making. From there, you can layer on more advanced features like agents, RAG, task decomposition, and more.

I've intentionally avoided bundling vendor-specific wrappers (e.g., for OpenAI) into the framework. That kind of lock-in can be brittle and is easy to recreate on the fly: just feed the vendor's API docs into your favorite LLM to generate a new wrapper. With miniLLMFlow, you only get the fundamentals.

It also works nicely with coding assistants like ChatGPT, Claude, and Cursor.ai. Because the code is so minimal, you can quickly share the entire source code and documentation with an AI assistant, and it can help you build new workflows on the spot.

I'm adding more examples (including multi-agent setups) and would love feedback! If there's a feature or use case you'd like to see, please let me know.

GitHub: https://ift.tt/M8EhKHt

January 6, 2025 at 09:20PM
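
To make the core abstraction concrete, here is a rough Python sketch of a directed graph of LLM steps with labeled edges, branching, and a loop back for retries. This is an illustration under assumed names, not miniLLMFlow's actual API: Node, Flow, Draft, Review, and call_llm are hypothetical, and the LLM call is stubbed out because the framework deliberately leaves vendor wrappers to the user.

```python
# Hypothetical sketch of the "nested directed graph" idea; not miniLLMFlow's real API.

class Node:
    """One step in the graph; subclasses implement run() and return an action label."""
    def __init__(self):
        self.successors = {}  # action label -> next Node

    def then(self, node, action="default"):
        """Connect this node to a successor along an action-labeled edge."""
        self.successors[action] = node
        return node

    def run(self, state):
        raise NotImplementedError


class Flow(Node):
    """A graph of nodes; because Flow is itself a Node, graphs can nest inside graphs."""
    def __init__(self, start):
        super().__init__()
        self.start = start

    def run(self, state):
        node, action = self.start, "default"
        while node is not None:
            action = node.run(state) or "default"
            node = node.successors.get(action)  # follow the edge chosen by the node
        return action


def call_llm(prompt):
    # Stub for a vendor call (OpenAI, Claude, etc.); the user supplies the real wrapper.
    return "yes" if "valid" in prompt else "draft text"


class Draft(Node):
    def run(self, state):
        state["draft"] = call_llm(f"Write an answer to: {state['question']}")
        return "default"


class Review(Node):
    def run(self, state):
        verdict = call_llm(f"Is this draft valid? {state['draft']}")
        return "done" if verdict == "yes" else "retry"  # branching on the LLM's decision


# Wire the graph: draft -> review, with review looping back to draft on "retry".
draft, review = Draft(), Review()
draft.then(review)
review.then(draft, action="retry")
flow = Flow(start=draft)

state = {"question": "What is a directed graph?"}
flow.run(state)
print(state["draft"])
```

Under these assumptions, the retry edge is what gives the graph its agent-like behavior, and nesting a Flow inside another Flow is how larger tasks decompose into sub-workflows.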
