
jest-onnxruntime


Commits

List of commits on branch master:

99e592e3d06c8f11241e7fdbb8183afbc04c0ff7 - Readme for completeness (oo-alexandre-felipe, 3 years ago)
e4f1734e2b4a8eaa6521e931039b1ee7a7b2ce94 - Docker example (oo-alexandre-felipe, 3 years ago)
852425660d6535af10237094386785ba45eb0669 - Initial test (oo-alexandre-felipe, 3 years ago)

README


Jest/onnxruntime bug

This repository demonstrates a problem I experienced when trying to use Jest with onnxjs. onnxjs is a framework for evaluating neural network models in the ONNX format, and it supports different backends. onnxjs-node enables the use of onnxruntime as a backend for onnxjs.

These libraries use tensor objects; a tensor class is a multidimensional view over a typed array. The outputs of a neural network are constructed in N-API as typed arrays [1].

Both the native code and the JavaScript code have typed-array maps [2], [3], and the JavaScript code checks whether the returned value is an array of the expected type [4]. Normally this works, but if we invoke it from a Jest test, that check evaluates to false and an error is thrown [5].

Similar issues have been reported before [6], [7].

Steps to reproduce

Clone this repository and run npm install. The script run.js loads and evaluates double.onnx; it should print 2 * 1.5 = 3, where the 3 is calculated by the onnxruntime code. The script jest.test.js attempts to do the same in a Jest test suite; unfortunately, the test fails at that type check before the (correct) result is returned.

Using Docker

For convenience, and for better reproducibility, I also prepared a Dockerfile; you should be able to see the described results by running docker-compose up --build. The OK service shows the successful run, and the NOK service shows the failing run.
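The compose file itself is not shown here, but based on the description it would look roughly like this (a sketch: the build context and commands are assumptions, only the two service roles come from the text above):

```yaml
# Hypothetical docker-compose.yml matching the described OK / NOK services.
version: "3"
services:
  ok:
    build: .
    command: node run.js   # direct run: prints 2 * 1.5 = 3
  nok:
    build: .
    command: npx jest      # same code under Jest: fails the type check
```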

The underlying ONNX model

The model was generated with PyTorch in the script gen-model.py.