The Simulation Core: Rust and a Discrete Event Scheduler
The heart of the system is a discrete event simulation (DES) engine written in pure Rust. No async runtime. No threads (WASM is single-threaded in browsers). Just a priority queue of timestamped events and a tight loop that processes them.
pub struct Simulator {
    event_queue: BinaryHeap<Reverse<SimEvent>>,
    clock: SimTime,
    nodes: HashMap<NodeId, Node>,
    network: NetworkTopology,
}

impl Simulator {
    pub fn step(&mut self) -> Option<SimEvent> {
        let event = self.event_queue.pop()?.0;
        self.clock = event.time;
        self.dispatch(event.clone());
        Some(event)
    }
}
SimTime is a u64 representing nanoseconds of simulated
time — not wall time. The simulation can run a full 60-second scenario in 200 milliseconds
of real time, or deliberately slow down to let you watch a Raft leader election happen in
human-perceivable steps.
The BinaryHeap<Reverse<SimEvent>> pattern is the standard DES trick —
Reverse flips the max-heap into a min-heap so the earliest event always pops
first. Events carry a type, a target node ID, a payload, and a timestamp.
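That ordering falls out of SimEvent's derived Ord. A minimal, self-contained sketch of the pattern (the field names beyond what the text lists, and the payload type, are illustrative, not the project's actual definitions):

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

type SimTime = u64; // nanoseconds of simulated time

// Sketch of the event type; the real payload and NodeId types differ.
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
struct SimEvent {
    time: SimTime,   // compared first, so the earliest event wins
    target: u32,     // stand-in for NodeId
    payload: String, // stand-in for the real payload enum
}

fn main() {
    let mut queue: BinaryHeap<Reverse<SimEvent>> = BinaryHeap::new();
    for (t, msg) in [(300, "timeout"), (100, "send"), (200, "recv")] {
        queue.push(Reverse(SimEvent { time: t, target: 0, payload: msg.into() }));
    }
    // Reverse flips the max-heap into a min-heap: pops come out in time order.
    let order: Vec<SimTime> =
        std::iter::from_fn(|| queue.pop().map(|Reverse(e)| e.time)).collect();
    assert_eq!(order, vec![100, 200, 300]);
    println!("pop order: {:?}", order);
}
```

Deriving Ord compares fields in declaration order, which is why `time` comes first in the struct.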
Async Rust compiled to WASM introduces runtimes like Tokio that add ~150KB to the binary and — more importantly — introduce their own scheduling logic that fights with the deterministic event ordering required for reproducible simulations. When you replay a scenario with the same seed, you need identical results. Async executors make that guarantee hard to keep.
Determinism as a Feature
Every random value in the simulator comes from a seeded PRNG — specifically
SmallRng from the rand crate, seeded at scenario start.
Network latency jitter, packet loss events, node failure timing — all seeded.
This means a scenario is fully replayable. You can serialize the seed and the event log, send it to a colleague, and they run the exact same simulation. The "share scenario" feature works exactly this way: it's a seed and a config JSON, not a recording of every event.
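The article's simulator uses SmallRng from the rand crate; to keep this sketch dependency-free, a tiny xorshift stands in for it here. The generator, seed values, and jitter helper are illustrative only — the point is the replay property itself:

```rust
// Stand-in for rand::rngs::SmallRng. Any seedable deterministic generator
// gives the same guarantee: same seed, same event timeline.
struct StandInRng(u64);

impl StandInRng {
    fn new(seed: u64) -> Self { StandInRng(seed.max(1)) } // avoid all-zero state
    fn next_u64(&mut self) -> u64 {
        // xorshift64, for illustration only
        let mut x = self.0;
        x ^= x << 13;
        x ^= x >> 7;
        x ^= x << 17;
        self.0 = x;
        x
    }
    /// Latency jitter in nanoseconds: a base plus a seeded random spread.
    fn jitter_ns(&mut self, base: u64, spread: u64) -> u64 {
        base + self.next_u64() % spread
    }
}

fn replay(seed: u64) -> Vec<u64> {
    let mut rng = StandInRng::new(seed);
    (0..5).map(|_| rng.jitter_ns(1_000_000, 500_000)).collect()
}

fn main() {
    // Two runs from the same seed produce identical jitter sequences, which is
    // why "share scenario" only needs a seed and a config JSON.
    assert_eq!(replay(42), replay(42));
    println!("replay deterministic");
}
```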
It also means the test suite can run the simulator and assert on specific event sequences. If a Raft implementation regresses, the test fails deterministically — not flakily.
cargo test — assertions run against exact event sequences, not probabilistic outcomes

Compiling to WASM: The wasm-bindgen Bridge
The Rust core compiles to a .wasm binary via wasm-pack. The
interface surface exposed to the outside world is defined with
#[wasm_bindgen] macros:
#[wasm_bindgen]
pub struct SimulatorHandle {
    inner: Rc<RefCell<Simulator>>,
}

#[wasm_bindgen]
impl SimulatorHandle {
    #[wasm_bindgen(constructor)]
    pub fn new(config_json: &str) -> Result<SimulatorHandle, JsValue> {
        let config: ScenarioConfig = serde_json::from_str(config_json)
            .map_err(|e| JsValue::from_str(&e.to_string()))?;
        Ok(SimulatorHandle {
            inner: Rc::new(RefCell::new(Simulator::from_config(config))),
        })
    }

    pub fn step(&mut self) -> JsValue {
        match self.inner.borrow_mut().step() {
            Some(event) => serde_wasm_bindgen::to_value(&event).unwrap(),
            None => JsValue::NULL,
        }
    }
}
The Rc<RefCell<>> wrapping is necessary because
wasm_bindgen doesn't support mutable references across the FFI boundary in the
way you'd want. You wrap in RefCell and take the borrow at call time. It's
not pretty. It's the standard pattern.
Serialization crosses the boundary as JSON strings (for config) or via
serde_wasm_bindgen (for event data). The latter avoids an intermediate
JSON parse/stringify round-trip for the hot path — the step() call that
fires potentially thousands of times per second.
The compiled .wasm binary is ~380KB uncompressed, ~120KB gzipped. That's
with opt-level = "z" and lto = true in the release profile.
The wasm-opt post-processing step from Binaryen shaves another ~15KB. For
a full simulation engine with several implemented protocols, that's acceptable.
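Those size numbers correspond to a release profile along these lines (a sketch; the project's exact profile may set more):

```toml
# Cargo.toml: release profile tuned for WASM binary size
[profile.release]
opt-level = "z"   # optimize for size rather than speed
lto = true        # whole-program link-time optimization
# often paired with these, though the text doesn't confirm them:
# codegen-units = 1
# panic = "abort"
```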
The Flutter Shell: Why Not Just a Web App?
The question that comes up every time: why Flutter? The honest answer: the node graph canvas. Rendering 500+ nodes with animated edges, real-time packet-in-flight visualization, and smooth 60fps interaction is genuinely difficult in the browser DOM. SVG doesn't scale. Canvas 2D requires careful manual dirty-region tracking. WebGL is correct but means writing a mini rendering engine.
Flutter's CustomPainter with Canvas gives an immediate-mode
drawing API backed by Skia (or Impeller on newer targets). The node graph renderer is a
single CustomPainter subclass, and the Dart side reaches the Rust core through a thin bridge:
class SimulatorBridge {
  late js.JSObject _handle;

  Future<void> initialize(ScenarioConfig config) async {
    final configJson = jsonEncode(config.toJson());
    _handle = SimulatorWasm.create(configJson.toJS);
  }

  SimEvent? step() {
    final result = _handle.callMethod('step'.toJS);
    if (result.isNull) return null;
    return SimEvent.fromJson(result.dartify() as Map);
  }
}
This is deliberately thin. The bridge does type conversion and nothing else. Business logic stays in Rust. Rendering logic stays in Flutter. The bridge is not the place for either.
The node graph renderer transforms world coordinates to screen coordinates using a
Matrix4 maintained by the pan/zoom gesture handler, draws edges first
(z-order), then nodes, then in-flight packet animations — and only repaints when the
simulation emits a new event or the user interacts.
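Under pan/zoom, that Matrix4 reduces to a scale plus a translation. A minimal sketch of the two mappings the renderer needs (forward for drawing, inverse for hit-testing; the names here are illustrative, not the app's API):

```rust
// World -> screen under a pan/zoom camera: screen = world * zoom + pan.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Camera {
    zoom: f64,
    pan: (f64, f64), // screen-space offset
}

impl Camera {
    fn world_to_screen(&self, p: (f64, f64)) -> (f64, f64) {
        (p.0 * self.zoom + self.pan.0, p.1 * self.zoom + self.pan.1)
    }
    // Inverse mapping, used to hit-test pointer events against nodes.
    fn screen_to_world(&self, p: (f64, f64)) -> (f64, f64) {
        ((p.0 - self.pan.0) / self.zoom, (p.1 - self.pan.1) / self.zoom)
    }
}

fn main() {
    let cam = Camera { zoom: 2.0, pan: (100.0, 50.0) };
    let node = (30.0, 40.0);                   // node position in world units
    let on_screen = cam.world_to_screen(node); // where the painter draws it
    assert_eq!(on_screen, (160.0, 130.0));
    assert_eq!(cam.screen_to_world(on_screen), node);
    println!("screen: {:?}", on_screen);
}
```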
The Render Loop
Flutter's animation system drives the simulation step rate. A Ticker fires
every frame (targeting 60fps). Each tick, the Flutter layer calls step() on
the Rust core some number of times — controlled by a "simulation speed" multiplier.
At 1× speed that's one step() call per frame: real time, slow enough to watch Raft elections unfold at a human-perceivable pace.
Events returned by step() flow into a stream that the UI layer subscribes to.
State changes — node status, message queues, leader election outcomes — are applied to
Flutter ChangeNotifier objects. Widgets rebuild only when their specific state
changes. Even at 100× speed, the Dart side isn't doing work proportional to the
number of simulation events — it's batching state deltas and applying them once
per frame.
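The shape of that loop — many simulation steps per tick, one state application per frame — can be sketched like this (the delta type, counts, and function names are illustrative):

```rust
/// Stand-in for one simulation event's effect on UI state.
#[derive(Clone)]
struct StateDelta {
    node: usize,
    // ... status change, queue depth, etc.
}

/// One Ticker frame: run `speed` simulation steps, collect their deltas,
/// and apply them to UI state exactly once. Returns the batch size applied.
fn on_tick(speed: u32, step: &mut impl FnMut() -> Option<StateDelta>) -> usize {
    let mut batch = Vec::new();
    for _ in 0..speed {
        match step() {
            Some(delta) => batch.push(delta),
            None => break, // event queue drained
        }
    }
    // In the app, this is where listeners get notified: once per frame,
    // regardless of how many events the batch contains.
    batch.len()
}

fn main() {
    let mut remaining = 250; // pretend the scenario has 250 events left
    let mut step = || {
        if remaining == 0 { return None; }
        remaining -= 1;
        Some(StateDelta { node: 0 })
    };
    assert_eq!(on_tick(100, &mut step), 100); // 100x speed: 100 steps, 1 apply
    assert_eq!(on_tick(100, &mut step), 100);
    assert_eq!(on_tick(100, &mut step), 50);  // queue drains mid-frame
    println!("batched");
}
```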
What This Architecture Gets You
Deterministic by construction
The Rust core is deterministic and has no side effects visible outside. Test it with
cargo test entirely without a browser — assertions run on exact event
sequences.
Simulation state lives outside Dart heap
Simulation state lives in WASM linear memory. A GC pause pauses rendering; it doesn't corrupt or delay simulation events. The two runtimes don't share memory.
No real concurrency in the core
Events are processed one at a time in timestamp order. The "concurrency" being simulated is modeled, not real — which is the only way to get observable, controllable behavior.
No data leaves your machine
The entire simulation runs in WASM memory in your browser tab. You can model topologies containing real service names and proprietary configurations without any data reaching an external server.
The FFI boundary has a cost. Crossing from Dart into WASM and back is not free. At very high simulation speeds (10,000×), the serialization overhead at the bridge becomes the bottleneck, not the simulation itself. For most use cases — learning, architecture validation, interview prep — it doesn't matter. For genuinely large-scale parameter sweeps, a native Rust binary would be faster.
Specific dependencies worth naming
- wasm-bindgen 0.2.x: The FFI glue. Macro expansion is verbose, but the output is correct.
- serde + serde_wasm_bindgen: Serialization across the boundary. serde_wasm_bindgen avoids the JSON string intermediary for hot-path calls.
- rand (SmallRng): Fast, seedable, portable PRNG. Not cryptographic, which is fine here.
- wasm-pack: Build toolchain that handles wasm-opt post-processing and generates the JS/TS glue code.
- priority-queue: Used where event priorities need to change (Dijkstra-style). The standard BinaryHeap doesn't support priority updates.
- flutter_riverpod: State management on the Flutter side, with ChangeNotifier providers per feature slice.