We rewrote our Rust WASM parser in TypeScript – and it got 3x faster
- #Performance Optimization
- #TypeScript
- #WASM
- The openui-lang parser was originally written in Rust and compiled to WASM for performance, but profiling revealed that the overhead of crossing the WASM boundary, not the parsing itself, was the bottleneck.
- The parser pipeline consists of six stages: Autocloser, Lexer, Splitter, Parser, Resolver, and Mapper.
- Skipping JSON serialization in favor of serde-wasm-bindgen made things roughly 30% slower, because many fine-grained value conversions across the JS–WASM boundary cost more than a single bulk JSON transfer.
- The real solution was porting the parser to TypeScript, eliminating the WASM boundary entirely, resulting in 2.2x to 4.6x faster per-call performance.
- An O(N²) streaming inefficiency was addressed by implementing statement-level incremental caching, reducing total streaming cost by 2.6x to 3.3x.
- WASM remains a good fit for compute-bound tasks with minimal interop and for portable native libraries, but not for frequently called functions on small inputs.
- Key takeaways: profile where time is actually spent, understand the cost of WASM–JS boundary crossings, and prioritize algorithmic improvements over language-level optimizations.
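The six-stage pipeline can be pictured as plain function composition once everything lives in TypeScript. The sketch below is hypothetical: the stage names come from the article, but the types and stage bodies are stand-ins, not the openui-lang parser's actual code. The point it illustrates is that every stage runs in one JS runtime, with no serialization or boundary crossing between stages.

```typescript
// Hypothetical sketch of the six-stage pipeline (Autocloser, Lexer,
// Splitter, Parser, Resolver, Mapper). Types and bodies are placeholders.
type Token = { text: string };
type Statement = Token[];
type Ast = { statements: Statement[] };

const autoclose = (src: string): string => src.trimEnd(); // e.g. close unterminated blocks
const lex = (src: string): Token[] =>
  src.split(/\s+/).filter(Boolean).map((text) => ({ text }));
const split = (tokens: Token[]): Statement[] => {
  // Split the token stream into statements at ";" tokens.
  const out: Statement[] = [[]];
  for (const t of tokens) {
    if (t.text === ";") out.push([]);
    else out[out.length - 1].push(t);
  }
  return out.filter((s) => s.length > 0);
};
const parse = (statements: Statement[]): Ast => ({ statements });
const resolve = (ast: Ast): Ast => ast; // name resolution would happen here
const mapToOutput = (ast: Ast): string[] =>
  ast.statements.map((s) => s.map((t) => t.text).join(" "));

function runPipeline(src: string): string[] {
  // Six ordinary function calls; no WASM boundary anywhere.
  return mapToOutput(resolve(parse(split(lex(autoclose(src))))));
}
```

For example, `runPipeline("let x ; let y")` yields `["let x", "let y"]`.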
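The statement-level caching idea can be sketched as follows. During streaming, the document grows a little on each update, so re-parsing every statement from scratch on every chunk is O(N²) in total. Caching parse results per statement means only new or changed statements do real work. Everything here (`splitStatements`, `parseStatement`, keying the cache on statement text) is an assumed toy implementation, not the parser's actual design.

```typescript
// Hypothetical sketch of statement-level incremental caching for streaming.
type ParsedStatement = { source: string; tokens: string[] };

const cache = new Map<string, ParsedStatement>();
let parseCalls = 0; // instrumentation: counts how often real parsing runs

function parseStatement(source: string): ParsedStatement {
  parseCalls++; // stand-in for the expensive per-statement parse
  return { source, tokens: source.split(/\s+/).filter(Boolean) };
}

function splitStatements(doc: string): string[] {
  // Toy splitter: statements are ";"-separated.
  return doc.split(";").map((s) => s.trim()).filter(Boolean);
}

function parseDocument(doc: string): ParsedStatement[] {
  return splitStatements(doc).map((stmt) => {
    const hit = cache.get(stmt);
    if (hit) return hit; // unchanged statement: reuse the cached parse
    const parsed = parseStatement(stmt);
    cache.set(stmt, parsed);
    return parsed;
  });
}
```

With this sketch, streaming `"let a;"` and then `"let a; let b;"` parses only two statements in total: the second call finds `"let a"` in the cache and does real work only for `"let b"`.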