2a2632444b
Refactor Dockerfile and update .dockerignore for wasi-nn tests; adjust map-dir parameters in smoke test script ( #4158 )
2025-04-10 11:59:59 +08:00
b2c7cb2375
Use wasm32-wasip1 instead of wasm32-wasi target for rust code ( #4057 )
...
The Rust compiler previously deprecated, and has now removed, the wasm32-wasi target, replacing it with wasm32-wasip1.
This change updates all occurrences of wasm32-wasi in the context of Rust compilation, which also covers the wasi-nn tests.
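For reference, a minimal sketch of a guest crate built against the renamed target; the rustup/cargo invocations are the standard ones, and the crate itself is only illustrative:

```rust
// Build with the renamed target (formerly `wasm32-wasi`):
//   rustup target add wasm32-wasip1
//   cargo build --target wasm32-wasip1 --release
// The resulting module lands in target/wasm32-wasip1/release/.
fn main() {
    // Any WASI-capable runtime (e.g. iwasm) can run the produced .wasm file.
    println!("hello from wasm32-wasip1");
}
```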
2025-02-05 11:31:49 +08:00
0599351262
wasi-nn: Add a new target for llama.cpp as a wasi-nn backend ( #3709 )
...
Minimum support:
- [x] Accept (WasmEdge) customized model parameters (metadata); a hedged usage sketch follows the lists below.
- [x] Target [wasmedge-ggml examples](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/wasmedge-ggml )
- [x] basic
- [x] chatml
- [x] gemma
- [x] llama
- [x] qwen
---
To be supported in the future, if required:
- [ ] Target [wasmedge-ggml examples](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/wasmedge-ggml )
- [ ] command-r (>70 GB memory requirement)
- [ ] embedding (embedding mode)
- [ ] grammar (use the grammar option to constrain the model to generate JSON output)
- [ ] llama-stream (new APIs `compute_single`, `get_output_single`, `fini_single`)
- [ ] llava (image representation)
- [ ] llava-base64-stream (image representation)
- [ ] multimodel (image representation)
- [ ] Target [llamaedge](https://github.com/LlamaEdge/LlamaEdge )
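For context, a hedged sketch of how such a ggml model is typically driven from a guest, in the style of the wasmedge-ggml examples; the `wasi-nn` crate's GraphBuilder API, the model alias `llama-2-7b-chat`, and the metadata keys shown are assumptions, not part of this commit:

```rust
// Hedged sketch: run a llama.cpp (ggml) model through wasi-nn from a
// wasm32-wasip1 guest. The crate API, model alias, and metadata keys are
// assumptions borrowed from the wasmedge-ggml example style.
use wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding, TensorType};

fn main() {
    // Customized model parameters ("metadata") passed to the backend as JSON.
    let metadata = r#"{"ctx-size": 512, "n-gpu-layers": 0}"#;

    // Load a model preregistered with the runtime under a name (hypothetical alias).
    let graph = GraphBuilder::new(GraphEncoding::Ggml, ExecutionTarget::CPU)
        .config(metadata.to_string())
        .build_from_cache("llama-2-7b-chat")
        .expect("failed to load graph");

    let mut ctx = graph.init_execution_context().expect("failed to create context");

    // The prompt goes in as a U8 tensor at input index 0.
    let prompt = "Once upon a time";
    ctx.set_input(0, TensorType::U8, &[1], prompt.as_bytes())
        .expect("failed to set input");

    ctx.compute().expect("compute failed");

    // Generated text comes back from output index 0.
    let mut out = vec![0u8; 4096];
    let n = ctx.get_output(0, &mut out).expect("failed to get output");
    println!("{}", String::from_utf8_lossy(&out[..n]));
}
```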
2024-09-10 08:45:18 +08:00
140ff25d46
wasi-nn: Apply new architecture ( #3692 )
...
P.S. https://github.com/bytecodealliance/wasm-micro-runtime/issues/3677
2024-08-13 09:14:52 +08:00
058bc47102
[wasi-nn] Add a new wasi-nn backend openvino ( #3603 )
2024-07-22 17:16:41 +08:00
d36160b294
wasi-nn: Add wasmedge-wasinn-example as smoke test ( #3554 )
2024-06-24 12:03:08 +08:00