
.. image:: https://circleci.com/gh/sdpython/onnxcustom/tree/master.svg?style=svg
    :target: https://circleci.com/gh/sdpython/onnxcustom/tree/master

.. image:: https://travis-ci.com/sdpython/onnxcustom.svg?branch=master
    :target: https://app.travis-ci.com/github/sdpython/onnxcustom
    :alt: Build status

.. image:: https://ci.appveyor.com/api/projects/status/a3sn45a2fayoxb5q?svg=true
    :target: https://ci.appveyor.com/project/sdpython/onnxcustom
    :alt: Build Status Windows

.. image:: https://codecov.io/gh/sdpython/onnxcustom/branch/master/graph/badge.svg
    :target: https://codecov.io/gh/sdpython/onnxcustom

.. image:: https://badge.fury.io/py/onnxcustom.svg
    :target: http://badge.fury.io/py/onnxcustom

.. image:: http://img.shields.io/github/issues/sdpython/onnxcustom.png
    :alt: GitHub Issues
    :target: https://github.com/sdpython/onnxcustom/issues

.. image:: https://img.shields.io/badge/license-MIT-blue.svg
    :alt: MIT License
    :target: http://opensource.org/licenses/MIT

.. image:: https://pepy.tech/badge/onnxcustom/month
    :target: https://pepy.tech/project/onnxcustom/month
    :alt: Downloads

.. image:: https://img.shields.io/github/forks/sdpython/onnxcustom.svg
    :target: https://github.com/sdpython/onnxcustom/
    :alt: Forks

.. image:: https://img.shields.io/github/stars/sdpython/onnxcustom.svg
    :target: https://github.com/sdpython/onnxcustom/
    :alt: Stars

.. image:: https://img.shields.io/github/repo-size/sdpython/onnxcustom
    :target: https://github.com/sdpython/onnxcustom/
    :alt: size

onnxcustom: custom ONNX
=======================

.. image:: https://raw.githubusercontent.com/sdpython/onnxcustom/master/_doc/sphinxdoc/source/_static/project_ico.png
    :width: 50

`documentation <http://www.xavierdupre.fr/app/onnxcustom/helpsphinx/index.html>`_

Examples and tutorials on how to convert machine learned models into ONNX, implement your own converter or runtime, or even train with ONNX / onnxruntime.

The function *check*, or the command line ``python -m onnxcustom check``, checks that the module is properly installed and returns processing times for a couple of functions; or simply:

::

    import onnxcustom
    onnxcustom.check()

The documentation also introduces *onnx* and *onnxruntime* for inference and training. The tutorial related to *scikit-learn* has been merged into the `sklearn-onnx documentation <http://onnx.ai/sklearn-onnx/index_tutorial.html>`_. Among the tools this package implements, you may find:

* a tool to convert NVidia Profiler logs into a dataframe,
* a SGD optimizer similar to what *scikit-learn* implements but based on *onnxruntime-training* and able to train on CPU and GPU,
* functions to manipulate onnx graphs.

Installation of onnxruntime-training
====================================

*onnxruntime-training* is only available on Linux. The CPU version can be installed with the following command.

::

    pip install onnxruntime-training --extra-index-url https://download.onnxruntime.ai/onnxruntime_nightly_cpu.html

Versions using GPU with CUDA or ROCm are available. Check `download.onnxruntime.ai <https://download.onnxruntime.ai/>`_ to find a specific version. On Windows, you can use it inside WSL (Windows Subsystem for Linux) or compile it for CPU:

::

    python tools\ci_build\build.py --skip_tests --build_dir .\build\Windows --config Release --build_shared_lib --build_wheel --numpy_version= --cmake_generator="Visual Studio 16 2019" --enable_training --enable_training_ops

GPU versions work better on WSL, see `Build onnxruntime on WSL (Windows Linux Subsystem) <http://www.xavierdupre.fr/app/onnxcustom/helpsphinx/blog/2021/2021-12-16_wsl.html>`_. *onnxcustom* itself can be installed from pypi with ``pip install onnxcustom``.