|
1 | | -# `atlas.llm`: A Deep Dive Into a Single-Binary, On-Device Coding Companion |
| 1 | +# atlas.llm: A Deep Dive Into a Single-Binary, On-Device Coding Companion |
2 | 2 |
|
3 | 3 | > _A study in systems-level glue code: how ~2,900 lines of Go turn a prebuilt |
4 | 4 | > `llama.cpp` server into a usable local AI coding assistant, and the |
@@ -54,29 +54,19 @@ consequence of these three. |
54 | 54 |
|
55 | 55 | The program is a Go process that performs three roles simultaneously: |
56 | 56 |
|
57 | | -``` |
58 | | -┌────────────────────────────────────────────────────────────────┐ |
59 | | -│ atlas.llm (Go) │ |
60 | | -│ │ |
61 | | -│ ┌───────────────────┐ ┌──────────────────────────┐ │ |
62 | | -│ │ TUI / CLI layer │──────▶│ Orchestration & config │ │ |
63 | | -│ │ (bubbletea + │ │ (download, registry, │ │ |
64 | | -│ │ lipgloss + │ │ chat history, slash │ │ |
65 | | -│ │ glamour) │◀──────│ commands) │ │ |
66 | | -│ └───────────────────┘ └───────────┬──────────────┘ │ |
67 | | -│ │ │ |
68 | | -│ ┌──────────▼──────────┐ │ |
69 | | -│ │ llama-server client │ │ |
70 | | -│ │ (net/http + JSON) │ │ |
71 | | -│ └──────────┬──────────┘ │ |
72 | | -└───────────────────────────────────────────┼────────────────────┘ |
73 | | - │ HTTP on localhost |
74 | | - ▼ |
75 | | - ┌──────────────────────────────┐ |
76 | | - │ llama-server (C++) │ |
77 | | - │ GGUF mmap + KV cache + │ |
78 | | - │ chat template from weights │ |
79 | | - └──────────────────────────────┘ |
| 57 | +```mermaid |
| 58 | +graph TB |
| 59 | + subgraph atlas["atlas.llm (Go)"] |
| 60 | + direction TB |
| 61 | + TUI["TUI / CLI layer<br/>(bubbletea + lipgloss + glamour)"] |
| 62 | + Orch["Orchestration & config<br/>(download, registry,<br/>chat history, slash commands)"] |
| 63 | + Client["llama-server client<br/>(net/http + JSON)"] |
| 64 | + TUI -->|command| Orch |
| 65 | + Orch -->|reply| TUI |
| 66 | + Orch --> Client |
| 67 | + end |
| 68 | + Server["llama-server (C++)<br/>GGUF mmap + KV cache +<br/>chat template from weights"] |
| 69 | + Client -->|HTTP on localhost| Server |
80 | 70 | ``` |
81 | 71 |
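The client edge of the diagram can be sketched as a few lines of `net/http` + `encoding/json` glue. This is a minimal illustration, not atlas.llm's actual types: the field names assume llama-server's OpenAI-compatible `/v1/chat/completions` route, and `buildRequest` is a hypothetical helper.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// Message and ChatRequest mirror the JSON payload the client layer sends.
// The schema follows llama-server's OpenAI-compatible endpoint; treat it
// as an assumption, not atlas.llm's real wire types.
type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type ChatRequest struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   bool      `json:"stream"`
}

// buildRequest marshals the chat payload into an *http.Request aimed at the
// local llama-server. No network I/O happens until a caller sends it with an
// http.Client (which a real client would configure with a timeout).
func buildRequest(baseURL string, msgs []Message) (*http.Request, error) {
	body, err := json.Marshal(ChatRequest{Model: "local", Messages: msgs, Stream: true})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	req, err := buildRequest("http://127.0.0.1:8080", []Message{{Role: "user", Content: "hello"}})
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.Path) // POST /v1/chat/completions
}
```

Because the server binds only to localhost, this client needs no TLS or auth; the process boundary is the loopback interface.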
|
82 | 72 | The Go process **never performs inference itself**. It spawns one long- |
@@ -476,10 +466,20 @@ The terminal UI is built on [`bubbletea`](https://github.com/charmbracelet/bubbl |
476 | 466 | an Elm-Architecture implementation for Go. Every bubbletea program is |
477 | 467 | a triple: |
478 | 468 |
|
479 | | -``` |
480 | | -state ←── Update(state, msg) ──→ (state', cmd) |
481 | | - │ |
482 | | - └──────── View(state) ──────────→ string (full screen, redrawn) |
| 469 | +```mermaid |
| 470 | +graph LR |
| 471 | + state(("state")) |
| 472 | + update["Update(state, msg)"] |
| 473 | + next(("state'")) |
| 474 | + cmd["cmd<br/>(async thunk)"] |
| 475 | + view["View(state)"] |
| 476 | + frame["string<br/>(full screen, redrawn)"] |
| 477 | + state --> update |
| 478 | + update --> next |
| 479 | + update --> cmd |
| 480 | + cmd -.->|msg| update |
| 481 | + state --> view |
| 482 | + view --> frame |
483 | 483 | ``` |
484 | 484 |
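The Update/View loop above can be sketched without the library itself. This is a toy illustration of the Elm pattern, not bubbletea's API: `model`, `msg`, and `cmd` here are stand-in types, and the synchronous loop replaces bubbletea's real event runtime.

```go
package main

import "fmt"

// model is the single source of truth; msg is anything Update can react to;
// cmd is an async thunk that eventually produces the next msg.
type model struct{ count int }
type msg interface{}
type incMsg struct{}
type cmd func() msg

// update is a pure transition: (state, msg) -> (state', optional cmd).
func update(m model, ms msg) (model, cmd) {
	switch ms.(type) {
	case incMsg:
		m.count++
		if m.count < 3 {
			// Schedule more work by returning a thunk that feeds
			// another msg back into the loop.
			return m, func() msg { return incMsg{} }
		}
	}
	return m, nil
}

// view renders the whole frame from state alone, as bubbletea redraws it.
func view(m model) string { return fmt.Sprintf("count: %d", m.count) }

func main() {
	m := model{}
	var c cmd = func() msg { return incMsg{} }
	// Drive the loop until no cmd is pending, mimicking the runtime.
	for c != nil {
		m, c = update(m, c())
	}
	fmt.Println(view(m)) // count: 3
}
```

The important property the sketch preserves is that `update` and `view` are pure: all side effects are pushed into `cmd` thunks, which is exactly how the real program keeps HTTP calls out of the render path.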
|
485 | 485 | Input events (`tea.KeyMsg`, `tea.WindowSizeMsg`, spinner ticks, HTTP |
|