@wllama/wllama

Open Source · Updated 26d ago

WebAssembly binding for llama.cpp - Enabling on-browser LLM inference

npm install @wllama/wllama

WebAssembly binding for llama.cpp — run large language models in the browser with on-device inference.
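The description above can be illustrated with a minimal browser-side sketch. This is a hedged example, not an authoritative one: the class name `Wllama`, the methods `loadModelFromUrl` and `createCompletion`, and the wasm asset paths follow the project's README and may differ in current releases; the model URL is a placeholder.

```typescript
// Minimal sketch of in-browser inference with @wllama/wllama.
// Assumptions: the `Wllama` class, `loadModelFromUrl`, `createCompletion`,
// and the asset paths below are taken from the project README and may
// differ in the version you install.
import { Wllama } from "@wllama/wllama";

// Map logical wasm names to the files shipped in the npm package
// (serve them from your own origin or a CDN).
const CONFIG_PATHS = {
  "single-thread/wllama.wasm": "/esm/single-thread/wllama.wasm",
  "multi-thread/wllama.wasm": "/esm/multi-thread/wllama.wasm",
};

async function main(): Promise<void> {
  const wllama = new Wllama(CONFIG_PATHS);

  // Any small GGUF model reachable over HTTP; this URL is a placeholder.
  await wllama.loadModelFromUrl(
    "https://example.com/models/tinyllama-q4_k_m.gguf"
  );

  // Run a short completion entirely on-device, with no server round-trip.
  const output = await wllama.createCompletion("Q: What is WebAssembly?\nA:", {
    nPredict: 64,
  });
  console.log(output);
}

main();
```

Because inference runs in the page, the model file must be fetchable by the browser (CORS permitting) and the wasm binaries must be served alongside your app.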

Tags: webassembly · llama · llm · browser · wasm
View on GitHub · ★ 1,027 · TypeScript
Last commit 125d ago · 48 open issues