Make wasi-nn backends separate shared libraries (#3509)

- All files under *core/iwasm/libraries/wasi-nn* are compiled as shared libraries
- *wasi-nn.c* is shared between backends
- Every backend is built as a separate shared library
- If the wasi-nn feature is enabled, iwasm will depend on the shared library
  libiwasm.so instead of linking the static library libvmlib.a
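
A simplified sketch of the build layout this implies (CMake target and file names here are illustrative, not the exact CMakeLists.txt):
```cmake
# Illustrative only: one shared runtime library, plus one shared library per
# backend; each backend compiles the shared wasi_nn.c glue.
add_library (iwasm SHARED ${WAMR_RUNTIME_LIB_SOURCE})   # libiwasm.so
add_library (wasi-nn-tflite SHARED src/wasi_nn.c src/wasi_nn_tensorflowlite.cpp)
target_link_libraries (wasi-nn-tflite PRIVATE iwasm tensorflow-lite)
```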
liang.he
2024-06-14 12:06:56 +08:00
committed by GitHub
parent 1434c45283
commit f844b33b2d
20 changed files with 296 additions and 258 deletions


@@ -2,15 +2,28 @@
## How to use
### Host
Enable WASI-NN in WAMR by specifying it in the cmake build configuration as follows,
```cmake
set (WAMR_BUILD_WASI_NN 1)
```
or on the command line
```bash
$ cmake -DWAMR_BUILD_WASI_NN=1 <other options> ...
```
> [!Caution]
> If `WAMR_BUILD_WASI_NN` is enabled, iwasm will link a shared WAMR library instead of a static one. wasi-nn backends will be loaded dynamically at runtime. Users shall specify the path of the backend library and register it with the iwasm runtime via `--native-lib=<path of backend library>`. All shared libraries should be placed in a directory covered by `LD_LIBRARY_PATH`.
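
For example, a run might look like this (paths are illustrative; substitute the backend library you actually built):
```bash
# Illustrative paths: make the shared libraries visible, then register the
# backend with the runtime via --native-lib.
export LD_LIBRARY_PATH=/opt/wamr/lib:$LD_LIBRARY_PATH
iwasm --native-lib=/opt/wamr/lib/libwasi-nn-tflite.so app.wasm
```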
### Wasm
The definition of functions provided by WASI-NN (Wasm imports) is in the header file _core/iwasm/libraries/wasi-nn/wasi_nn.h_.
By only including this file in a WASM application you will bind WASI-NN into your module.
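As an illustration, a minimal inference flow against that header might look like the sketch below. It assumes the classic (non-ephemeral) wasi-nn interface (`load`, `init_execution_context`, `set_input`, `compute`, `get_output`) and the types from `wasi_nn_types.h`; the model path, tensor shape, and output buffer size are placeholders, so check the header for the exact signatures and size semantics.
```c
/* Hedged sketch of the classic WASI-NN flow declared in wasi_nn.h.
   The model path, tensor shape and output size below are hypothetical. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include "wasi_nn.h"

int
main(void)
{
    /* Read the model bytes (e.g. a TFLite flatbuffer) from a preopened dir. */
    FILE *f = fopen("/assets/model.tflite", "rb"); /* hypothetical path */
    if (!f)
        return 1;
    fseek(f, 0, SEEK_END);
    long model_size = ftell(f);
    fseek(f, 0, SEEK_SET);
    uint8_t *model_buf = malloc(model_size);
    if (!model_buf || fread(model_buf, 1, model_size, f) != (size_t)model_size)
        return 1;
    fclose(f);

    /* Build a graph from a single buffer of model bytes. */
    graph_builder gb = { .buf = model_buf, .size = (uint32_t)model_size };
    graph_builder_array gba = { .buf = &gb, .size = 1 };
    graph g;
    if (load(&gba, tensorflowlite, cpu, &g) != success)
        return 1;

    graph_execution_context ctx;
    if (init_execution_context(g, &ctx) != success)
        return 1;

    /* A 1x2 fp32 input tensor; the dimensions are illustrative. */
    float input[2] = { 1.0f, 2.0f };
    uint32_t dims[2] = { 1, 2 };
    tensor_dimensions td = { .buf = dims, .size = 2 };
    tensor t = { .dimensions = &td, .type = fp32, .data = (uint8_t *)input };
    if (set_input(ctx, 0, &t) != success || compute(ctx) != success)
        return 1;

    /* The header defines whether the in/out size is counted in bytes or
       elements; adjust accordingly. */
    float output[4] = { 0 };
    uint32_t output_size = sizeof(output);
    if (get_output(ctx, 0, (uint8_t *)output, &output_size) != success)
        return 1;
    printf("first output element: %f\n", output[0]);

    free(model_buf);
    return 0;
}
```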
## Tests
@@ -27,9 +40,8 @@ Build the runtime image for your execution target type.
- `vx-delegate`
- `tpu`
```bash
EXECUTION_TYPE=cpu
docker build -t wasi-nn-${EXECUTION_TYPE} -f core/iwasm/libraries/wasi-nn/test/Dockerfile.${EXECUTION_TYPE} .
```
### Build wasm app
@@ -50,15 +62,19 @@ If all the tests have run properly you will see the following message in the terminal
Tests: passed!
```
> [!TIP]
> The following commands use _libwasi-nn-tflite.so_ as an example; substitute whichever backend library you have built.
- CPU
```bash
docker run \
-v $PWD/core/iwasm/libraries/wasi-nn/test:/assets \
-v $PWD/core/iwasm/libraries/wasi-nn/test/models:/models \
wasi-nn-cpu \
--dir=/ \
--env="TARGET=cpu" \
--native-lib=/lib/libwasi-nn-tflite.so \
/assets/test_tensorflow.wasm
```
@@ -66,7 +82,7 @@ docker run \
- Requirements:
- [NVIDIA docker](https://github.com/NVIDIA/nvidia-docker).
```bash
docker run \
--runtime=nvidia \
-v $PWD/core/iwasm/libraries/wasi-nn/test:/assets \
@@ -74,17 +90,19 @@ docker run \
wasi-nn-nvidia-gpu \
--dir=/ \
--env="TARGET=gpu" \
--native-lib=/lib/libwasi-nn-tflite.so \
/assets/test_tensorflow.wasm
```
- vx-delegate for NPU (x86 simulator)
```bash
docker run \
-v $PWD/core/iwasm/libraries/wasi-nn/test:/assets \
wasi-nn-vx-delegate \
--dir=/ \
--env="TARGET=gpu" \
--native-lib=/lib/libwasi-nn-tflite.so \
/assets/test_tensorflow_quantized.wasm
```
@@ -92,7 +110,7 @@ docker run \
- Requirements:
- [Coral USB](https://coral.ai/products/accelerator/).
```bash
docker run \
--privileged \
--device=/dev/bus/usb:/dev/bus/usb \
@@ -100,6 +118,7 @@ docker run \
wasi-nn-tpu \
--dir=/ \
--env="TARGET=tpu" \
--native-lib=/lib/libwasi-nn-tflite.so \
/assets/test_tensorflow_quantized.wasm
```
@@ -120,20 +139,20 @@ Use [classification-example](https://github.com/bytecodealliance/wasi-nn/tree/ma
### Prepare the model and the wasm
```bash
$ pwd
/workspaces/wasm-micro-runtime/core/iwasm/libraries/wasi-nn/test
$ docker build -t wasi-nn-example:v1.0 -f Dockerfile.wasi-nn-example .
```
There are model files (_mobilenet*_) and wasm files (_wasi-nn-example.wasm_) in the directory _/workspaces/wasi-nn/rust/examples/classification-example/build_ of the image wasi-nn-example:v1.0.
### Build iwasm and test
_TODO: may need alternative steps to build iwasm and run the test in the container of wasi-nn-example:v1.0_
```bash
$ pwd
/workspaces/wasm-micro-runtime
@@ -143,9 +162,9 @@ $ docker run --rm -it -v $(pwd):/workspaces/wasm-micro-runtime wasi-nn-example:v1.0
> [!Caution]
> The following steps are executed in the container of wasi-nn-example:v1.0.
```bash
$ cd /workspaces/wasm-micro-runtime/product-mini/platforms/linux
$ cmake -S . -B build -DWAMR_BUILD_WASI_NN=1 -DWAMR_BUILD_WASI_EPHEMERAL_NN=1
$ cmake --build build
$ ./build/iwasm -v=5 --map-dir=/workspaces/wasi-nn/rust/examples/classification-example/build/::fixture /workspaces/wasi-nn/rust/examples/classification-example/build/wasi-nn-example.wasm
```