
Commit db20715

feat(blog): convert atlas.llm ASCII diagrams to mermaid
Both the top-level system architecture diagram (atlas.llm Go process + llama-server subprocess) and the bubbletea Elm-triple diagram were rendered as plain ASCII fences. Convert them to mermaid so they render as proper flow diagrams in the blog view.
1 parent cc4e9fc commit db20715

1 file changed: 28 additions & 28 deletions

public/posts/atlas-llm-deep-dive.txt
@@ -1,4 +1,4 @@
-# `atlas.llm`: A Deep Dive Into a Single-Binary, On-Device Coding Companion
+# atlas.llm: A Deep Dive Into a Single-Binary, On-Device Coding Companion
 
 > _A study in systems-level glue code: how ~2,900 lines of Go turn a prebuilt
 > `llama.cpp` server into a usable local AI coding assistant, and the
@@ -54,29 +54,19 @@ consequence of these three.
 
 The program is a Go process that performs three roles simultaneously:
 
-```
-┌────────────────────────────────────────────────────────────────┐
-│                         atlas.llm (Go)                         │
-│                                                                │
-│  ┌───────────────────┐       ┌──────────────────────────┐     │
-│  │ TUI / CLI layer   │──────▶│ Orchestration & config   │     │
-│  │ (bubbletea +      │       │ (download, registry,     │     │
-│  │  lipgloss +       │       │  chat history, slash     │     │
-│  │  glamour)         │◀──────│  commands)               │     │
-│  └───────────────────┘       └───────────┬──────────────┘     │
-│                                          │                    │
-│                              ┌───────────▼─────────┐          │
-│                              │ llama-server client │          │
-│                              │ (net/http + JSON)   │          │
-│                              └───────────┬─────────┘          │
-└──────────────────────────────────────────┼─────────────────────┘
-                                           │ HTTP on localhost
-
-                              ┌──────────────────────────────┐
-                              │     llama-server (C++)       │
-                              │  GGUF mmap + KV cache +      │
-                              │  chat template from weights  │
-                              └──────────────────────────────┘
+```mermaid
+graph TB
+    subgraph atlas["atlas.llm (Go)"]
+        direction TB
+        TUI["TUI / CLI layer<br/>(bubbletea + lipgloss + glamour)"]
+        Orch["Orchestration & config<br/>(download, registry,<br/>chat history, slash commands)"]
+        Client["llama-server client<br/>(net/http + JSON)"]
+        TUI -->|command| Orch
+        Orch -->|reply| TUI
+        Orch --> Client
+    end
+    Server["llama-server (C++)<br/>GGUF mmap + KV cache +<br/>chat template from weights"]
+    Client -->|HTTP on localhost| Server
 ```
 
 The Go process **never performs inference itself**. It spawns one long-
@@ -476,10 +466,20 @@ The terminal UI is built on [`bubbletea`](https://github.com/charmbracelet/bubbl
 an Elm-Architecture implementation for Go. Every bubbletea program is
 a triple:
 
-```
-state ←── Update(state, msg) ──→ (state', cmd)
-
-  └──────── View(state) ──────────→ string (full screen, redrawn)
+```mermaid
+graph LR
+    state(("state"))
+    update["Update(state, msg)"]
+    next(("state'"))
+    cmd["cmd<br/>(async thunk)"]
+    view["View(state)"]
+    frame["string<br/>(full screen, redrawn)"]
+    state --> update
+    update --> next
+    update --> cmd
+    cmd -.->|msg| update
+    state --> view
+    view --> frame
 ```
 
 Input events (`tea.KeyMsg`, `tea.WindowSizeMsg`, spinner ticks, HTTP
