769d16eaab
wasi-nn: move some host-only things out of wasi_nn_types.h ( #4334 )
...
cf. https://github.com/bytecodealliance/wasm-micro-runtime/issues/4324
2025-06-06 15:07:29 +08:00
79cb4366ae
wasi-nn: remove unused wasi_nn_dump_tensor_dimension prototype ( #4325 )
2025-06-05 09:48:28 +08:00
b20ebc2724
wasi_nn.h: add import_name attribute ( #4328 )
...
this fixes undefined-symbol errors at link time by making it
explicit that these functions are imported from the host.
references:
e2c698c7e8/llvm/lib/MC/WasmObjectWriter.cpp (L1798-L1799)
e2c698c7e8/llvm/lib/Object/WasmObjectFile.cpp (L749-L752)
e2c698c7e8/lld/wasm/Symbols.cpp (L203)
e2c698c7e8/lld/wasm/Relocations.cpp (L36-L40)
2025-06-05 09:48:00 +08:00
ae6e490ad5
fix wasi-nn abi definitions ( #4307 )
...
sync with a more appropriate version of the definitions.
as we use "wasi_ephemeral_nn", which is p1-based, it seems
more appropriate to use the definitions from witx, not wit.
it's a bit unfortunate that the p2-based wasi-nn made gratuitous
changes like this relative to p1.
note: this is an ABI change.
2025-06-03 13:22:48 +08:00
aa1ff778b9
add load_by_name in wasi-nn ( #4298 )
2025-06-03 06:26:58 +08:00
0599351262
wasi-nn: Add a new target for llama.cpp as a wasi-nn backend ( #3709 )
...
Minimum support:
- [x] accept (WasmEdge) customized model parameters as metadata.
- [x] Target [wasmedge-ggml examples](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/wasmedge-ggml )
- [x] basic
- [x] chatml
- [x] gemma
- [x] llama
- [x] qwen
---
To be supported in the future, if required:
- [ ] Target [wasmedge-ggml examples](https://github.com/second-state/WasmEdge-WASINN-examples/tree/master/wasmedge-ggml )
- [ ] command-r. (>70G memory requirement)
- [ ] embedding. (embedding mode)
- [ ] grammar. (use the grammar option to constrain the model to generate JSON output)
- [ ] llama-stream. (new APIs `compute_single`, `get_output_single`, `fini_single`)
- [ ] llava. (image representation)
- [ ] llava-base64-stream. (image representation)
- [ ] multimodel. (image representation)
- [ ] Target [llamaedge](https://github.com/LlamaEdge/LlamaEdge )
2024-09-10 08:45:18 +08:00
140ff25d46
wasi-nn: Apply new architecture ( #3692 )
...
ps.
https://github.com/bytecodealliance/wasm-micro-runtime/issues/3677
2024-08-13 09:14:52 +08:00
058bc47102
[wasi-nn] Add a new wasi-nn backend openvino ( #3603 )
2024-07-22 17:16:41 +08:00
db025e457a
sync up with latest wasi-nn spec ( #3530 )
2024-06-17 14:58:09 +08:00
f844b33b2d
Make wasi-nn backends separate shared libraries ( #3509 )
...
- All files under *core/iwasm/libraries/wasi-nn* are compiled as shared libraries
- *wasi-nn.c* is shared between backends
- Every backend is built as its own shared library
- If the wasi-nn feature is enabled, iwasm depends on the shared library libiwasm.so
instead of linking the static library libvmlib.a
2024-06-14 12:06:56 +08:00
028f43bc18
Fix compilation warnings of wasi-nn ( #3497 )
2024-06-07 10:49:44 +08:00
ac9e789951
wasi-nn: Simplify cmake and headers' location ( #2308 )
...
Major changes:
- Public headers inside `wasi-nn/include`
- Put cmake files in `cmake` folder
- Make the Linux iwasm link with `${WASI_NN_LIBS}` so iwasm can enable wasi-nn
2023-06-26 09:29:05 +08:00