To use these files in your wllama project, point the `Wllama` constructor at the hosted WASM and JS glue files, then load the model:

```js
import { Wllama } from 'wllama';

// Point wllama at the hosted WASM binary and its JS loader
const wllama = new Wllama({
  wasmUrl: 'https://your-domain.com/wllama.wasm',
  wasmJsUrl: 'https://your-domain.com/wllama.js'
});

// Load the quantized GGUF model
await wllama.loadModelFromUrl('path/to/bonsai-8b-q1_0_g128.gguf');
```
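Once the model is loaded, text generation should work as in upstream wllama. A minimal sketch, assuming this build keeps upstream's `createCompletion` API (the prompt and sampling values below are illustrative, not from the original):

```js
// Sketch: assumes upstream wllama's createCompletion API is unchanged in this build.
// Continues from the snippet above, reusing the `wllama` instance.
const output = await wllama.createCompletion('Explain quantization in one sentence:', {
  nPredict: 64,      // cap on the number of generated tokens
  sampling: {
    temp: 0.7,       // sampling temperature
    top_k: 40,
  },
});
console.log(output);
```

Since this runs in the browser against a WASM backend, the first call may be slow while the model warms up; subsequent calls reuse the loaded weights.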
Built from https://github.com/ngxson/wllama, with its bundled llama.cpp replaced by https://github.com/PrismML-Eng/llama.cpp.