ruvector-onnx-embeddings-wasm
Portable WASM embedding generation with SIMD and parallel workers - run text embeddings in browsers, Cloudflare Workers, Deno, and Node.js
Unique: Implements streaming tokenization for long documents, processing text in chunks and maintaining state across chunk boundaries to handle word-boundary edge cases. Supports custom tokenization rules via pluggable tokenizer interface, allowing domain-specific vocabulary (e.g., code tokens, medical terminology).
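The chunk-boundary handling and pluggable tokenizer rules described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the package's actual API; the names `WordTokenizer` and `StreamingTokenizer` are hypothetical:

```typescript
// Sketch of streaming tokenization that maintains state across chunk
// boundaries. All names here are illustrative, not the real API.

// Pluggable rule mapping one complete word to tokens, so domain-specific
// vocabularies (code tokens, medical terminology) can be swapped in.
type WordTokenizer = (word: string) => string[];

class StreamingTokenizer {
  private carry = "";

  constructor(private readonly tokenizeWord: WordTokenizer = (w) => [w]) {}

  // Tokenize one chunk; a trailing partial word is buffered so a word
  // split across two chunks is emitted whole on a later call.
  push(chunk: string): string[] {
    const text = this.carry + chunk;
    // Hold back the final run of non-whitespace unless the chunk itself
    // ends exactly at a whitespace boundary.
    const tail = text.match(/\S+$/);
    const cut = tail ? text.length - tail[0].length : text.length;
    this.carry = text.slice(cut);
    return text
      .slice(0, cut)
      .split(/\s+/)
      .filter(Boolean)
      .flatMap((w) => this.tokenizeWord(w));
  }

  // Flush whatever word is still buffered at end of stream.
  finish(): string[] {
    const rest = this.carry;
    this.carry = "";
    return rest ? this.tokenizeWord(rest) : [];
  }
}

// "hel" + "lo world" comes out as ["hello", "world"], not ["hel", "lo", ...].
const tok = new StreamingTokenizer();
const out = [...tok.push("hel"), ...tok.push("lo world"), ...tok.finish()];
console.log(out); // → [ 'hello', 'world' ]
```

A real subword tokenizer would buffer enough context for its merge rules rather than a single word, but the carry-over pattern is the same.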
vs others: Lower latency than calling external tokenization APIs (e.g., the Hugging Face Inference API), since tokenization runs locally with no network round-trips, and more flexible than hardcoded tokenization because the vocabulary is configurable per model.