09-11, 09:40–10:20 (Europe/Berlin), Main KiCon Presentation Space
Large Language Models (LLMs) and Vision Language Models (VLMs) are transforming software development, but their application in hardware design is still emerging. To be genuinely useful in electronics design, these models need a structured understanding of the project. This talk presents AmpereBrain, a proof-of-concept that gives a local AI the tools to read and interpret KiCad projects in a structured way. We demonstrate how providing the AI with access to project specifications, schematic diagrams, and component-level details enables it to become a practical assistant. This allows for reliable, AI-powered help with tasks like documentation, component queries, and basic schematic reviews, laying the groundwork for more advanced, structured design modifications.
This session is a practical demonstration of what is possible when AI is given the right tools to understand a hardware project. We will show how this structured approach moves beyond simple chatbots to create a genuinely useful design copilot.
The talk will be structured as follows:
1. Live Demo: An AI Copilot for KiCad (15 min)
We will begin with a live demonstration of the AmpereBrain AI assistant, showcasing its multi-level understanding of a KiCad project:
- High-Level Task (Project Specs): We'll ask the AI to "Update the specifications to require reverse-polarity protection on the main power input." The AI will read the `specifications.md` file and apply a diff to it.
- Structural Analysis (Schematic View): We'll then ask, "Show me the power input circuit and check if it has a protection diode." The AI will generate a Mermaid diagram of the relevant schematic section for review.
- Detailed Query (Component Data): Following up, we'll ask, "What is the maximum reverse voltage of diode D1?" The AI will find the component, access its linked datasheet, and provide the specific parameter.
- Basic Design Review: Finally, we'll ask the AI to "Verify that every IC has a bypass capacitor connected between its power and ground pins," demonstrating its ability to traverse the netlist to perform simple design rule checks.
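The bypass-capacitor review above reduces to a netlist traversal. As a minimal sketch of the idea (the netlist structure, net names, and function names here are illustrative assumptions, not AmpereBrain's actual API):

```python
POWER_NETS = {"VCC", "3V3"}
GND_NETS = {"GND"}

# Hypothetical netlist: net name -> set of (component ref, pin number).
NETLIST = {
    "VCC": {("U1", "8"), ("C1", "1")},
    "3V3": {("U2", "14"), ("C2", "1")},
    "GND": {("U1", "4"), ("C1", "2"), ("U2", "7")},
    "SIG": {("U1", "2"), ("U2", "3"), ("C2", "2")},  # C2 is NOT across 3V3/GND
}

def nets_of(ref: str) -> set[str]:
    """All nets a given component touches."""
    return {net for net, pins in NETLIST.items() for r, _ in pins if r == ref}

def check_ic(ic_ref: str) -> bool:
    """True if some capacitor C* bridges the IC's power and ground nets."""
    nets = nets_of(ic_ref)
    powers, gnds = nets & POWER_NETS, nets & GND_NETS
    caps = {r for pins in NETLIST.values() for r, _ in pins if r.startswith("C")}
    return any(p in nets_of(c) and g in nets_of(c)
               for c in caps for p in powers for g in gnds)

ics = sorted({r for pins in NETLIST.values() for r, _ in pins if r.startswith("U")})
for ic in ics:
    print(ic, "OK" if check_ic(ic) else "MISSING bypass capacitor")
```

A real check would also consider physical proximity of the capacitor to the IC; this toy version only verifies electrical connectivity across the rails.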
2. How It Works: A Structured Approach to AI (10 min)
After the demo, we'll briefly explain the core concepts that make this possible:
- The Foundation: Multi-Level Context: We'll show how the AI builds a holistic understanding by processing the project at three levels:
  - Intent: Reading Markdown files (`requirements.md`, `specifications.md`) to grasp project goals.
  - Structure: Parsing the schematic to create block diagrams and understand circuit topology.
  - Detail: Accessing individual symbol properties and datasheet content.
- Organizing the AI Workforce: We'll introduce our two agent architectures: a hierarchical "Orchestrator" for delegating simple tasks and a collaborative "Swarm" for solving more complex problems. We'll outline the roles of the specialist agents (circuit expert, documentation writer, calculator).
3. The Future: From Proof-of-Concept to Product (5 min)
This proof-of-concept is the first step. We will conclude by discussing the roadmap to a commercial product:
- The Key to Reliable Editing: Schematics as Code. We'll explain our core principle: for an AI to safely edit a design, schematic elements like placement and wiring must be represented as structured, version-controllable text.
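To make the "schematics as code" principle concrete: KiCad's `.kicad_sch` files are already S-expression text, and the point is that an AI edit then becomes a small, reviewable diff on structured data. A hypothetical sketch (the `SymbolPlacement` type and line format are our own illustration, not a KiCad or AmpereBrain format):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class SymbolPlacement:
    """Illustrative structured representation of one placed symbol."""
    ref: str
    lib_id: str
    x_mm: float
    y_mm: float

def to_line(s: SymbolPlacement) -> str:
    # One canonical line per element keeps version-control diffs minimal:
    # moving a part changes exactly one line.
    return f"(symbol {s.ref} {s.lib_id} (at {s.x_mm} {s.y_mm}))"

d1 = SymbolPlacement("D1", "Diode:1N4007", 25.4, 38.1)
moved = replace(d1, x_mm=30.48)  # an AI edit = a structured field change

print(to_line(d1))
print(to_line(moved))
```

Because every edit is a textual change to a canonical representation, it can be diffed, reviewed, and reverted with ordinary version-control tooling before it ever touches the design.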
- The Roadmap: We will outline the path from this POC to a robust design tool. While the core file-parsing engine is open-source, the advanced AI features shown are part of the AmpereBrain commercial offering. Our goal is to build a reliable, professional tool that leverages AI to accelerate the hardware design process.
Founder of the StepUp companies, building wearables, and a close follower of the AI revolution.