diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
new file mode 100644
index 0000000..b4e1709
--- /dev/null
+++ b/CODE_OF_CONDUCT.md
@@ -0,0 +1,15 @@
+# Code of Conduct
+
+We are a community based on openness, as well as friendly and didactic discussions.
+
+We aspire to treat everybody equally, and value their contributions.
+
+Decisions are made based on technical merit and consensus.
+
+Code is not the only way to help the project. Reviewing pull requests,
+answering questions to help others on mailing lists or issues, organizing and
+teaching tutorials, working on the website, improving the documentation, are
+all priceless contributions.
+
+We abide by the principles of openness, respect, and consideration of others of
+the Python Software Foundation: https://www.python.org/psf/codeofconduct/
diff --git a/README.rst b/README.rst
index b24b73d..fa6cdb7 100644
--- a/README.rst
+++ b/README.rst
@@ -31,6 +31,10 @@ onnx-array-api: APIs to create ONNX Graphs
 
 **onnx-array-api** implements APIs to create custom ONNX graphs.
 The objective is to speed up the implementation of converter libraries.
+
+Numpy API
++++++++++
+
 The first one matches **numpy API**.
 It gives the user the ability to convert functions written
 following the numpy API to convert that function into ONNX as
@@ -113,10 +117,15 @@ It supports eager mode as well:
     l2_loss=[0.002]
     [0.042]
 
+Light API
++++++++++
+
 The second API or **Light API** tends to do every thing in one line.
+It is inspired by the `Reverse Polish Notation
+<https://en.wikipedia.org/wiki/Reverse_Polish_notation>`_.
 The euclidean distance looks like the following:
 
-::
+.. code-block:: python
 
     import numpy as np
     from onnx_array_api.light_api import start
@@ -142,3 +151,30 @@ The library is released on
 `pypi/onnx-array-api <https://pypi.org/project/onnx-array-api/>`_
 and its documentation is published at
 `APIs to create ONNX Graphs <https://sdpython.github.io/doc/onnx-array-api/dev/>`_.
+
+GraphBuilder API
+++++++++++++++++
+
+Almost every library converting a machine learned model into ONNX implements
+its own graph builder and customizes it for its needs.
+It handles frequent tasks such as naming intermediate results
+and loading or saving onnx models. It can also be used to extend an existing graph.
+
+.. code-block:: python
+
+    import numpy as np
+    from onnx_array_api.graph_api import GraphBuilder
+
+    g = GraphBuilder()
+    g.make_tensor_input("X", np.float32, (None, None))
+    g.make_tensor_input("Y", np.float32, (None, None))
+    r1 = g.make_node("Sub", ["X", "Y"])  # the output name is chosen by the class,
+                                         # it ensures the name is unique
+    init = g.make_initializer(np.array([2], dtype=np.int64))  # the class automatically
+                                                              # converts the array to a tensor
+    r2 = g.make_node("Pow", [r1, init])
+    g.make_node("ReduceSum", [r2], outputs=["Z"])  # the output name is given because
+                                                   # the user wants to choose the name
+    g.make_tensor_output("Z", np.float32, (None, None))
+
+    onx = g.to_onnx()  # final conversion to onnx
diff --git a/_doc/index.rst b/_doc/index.rst
index f2f8998..02c4eed 100644
--- a/_doc/index.rst
+++ b/_doc/index.rst
@@ -45,11 +45,83 @@ The objective is to speed up the implementation of converter libraries.
     CHANGELOGS
     license
 
+Sources available on
+`github/onnx-array-api <https://github.com/sdpython/onnx-array-api>`_.
+
+GraphBuilder API
+++++++++++++++++
+
+Almost every library converting a machine learned model into ONNX implements
+its own graph builder and customizes it for its needs.
+It handles frequent tasks such as naming intermediate results
+and loading or saving onnx models. It can also be used to extend an existing graph.
+See :ref:`l-graph-api`.
+
+.. runpython::
+    :showcode:
+
+    import numpy as np
+    from onnx_array_api.graph_api import GraphBuilder
+    from onnx_array_api.plotting.text_plot import onnx_simple_text_plot
+
+    g = GraphBuilder()
+    g.make_tensor_input("X", np.float32, (None, None))
+    g.make_tensor_input("Y", np.float32, (None, None))
+    r1 = g.make_node("Sub", ["X", "Y"])  # the output name is chosen by the class,
+                                         # it ensures the name is unique
+    init = g.make_initializer(np.array([2], dtype=np.int64))  # the class automatically
+                                                              # converts the array to a tensor
+    r2 = g.make_node("Pow", [r1, init])
+    g.make_node("ReduceSum", [r2], outputs=["Z"])  # the output name is given because
+                                                   # the user wants to choose the name
+    g.make_tensor_output("Z", np.float32, (None, None))
+
+    onx = g.to_onnx()  # final conversion to onnx
+
+    print(onnx_simple_text_plot(onx))
+
+Light API
++++++++++
+
+The syntax is inspired by the
+`Reverse Polish Notation <https://en.wikipedia.org/wiki/Reverse_Polish_notation>`_.
+This kind of API makes it easy to build new graphs,
+but less easy to extend an existing graph. See :ref:`l-light-api`.
+
+.. runpython::
+    :showcode:
+
+    import numpy as np
+    from onnx_array_api.light_api import start
+    from onnx_array_api.plotting.text_plot import onnx_simple_text_plot
+
+    model = (
+        start()
+        .vin("X")
+        .vin("Y")
+        .bring("X", "Y")
+        .Sub()
+        .rename("dxy")
+        .cst(np.array([2], dtype=np.int64), "two")
+        .bring("dxy", "two")
+        .Pow()
+        .ReduceSum()
+        .rename("Z")
+        .vout()
+        .to_onnx()
+    )
+
+    print(onnx_simple_text_plot(model))
+
 Numpy API
 +++++++++
 
-Sources available on
-`github/onnx-array-api <https://github.com/sdpython/onnx-array-api>`_.
+Writing ONNX graphs requires knowledge of the ONNX syntax unless
+an existing syntax such as :epkg:`numpy` can be reused.
+That is what this API does.
+This kind of API makes it easy to build new graphs,
+but it is almost impossible to use it to extend existing graphs,
+as that usually requires knowing onnx. See :ref:`l-numpy-api-onnx`.
 
 .. runpython::
     :showcode:
@@ -110,35 +182,6 @@ Sources available on
     res = jitted_myloss(x, y)
     print(to_dot(jitted_myloss.get_onnx()))
 
-Light API
-+++++++++
-
-.. runpython::
-    :showcode:
-
-    import numpy as np
-    from onnx_array_api.light_api import start
-    from onnx_array_api.plotting.text_plot import onnx_simple_text_plot
-
-    model = (
-        start()
-        .vin("X")
-        .vin("Y")
-        .bring("X", "Y")
-        .Sub()
-        .rename("dxy")
-        .cst(np.array([2], dtype=np.int64), "two")
-        .bring("dxy", "two")
-        .Pow()
-        .ReduceSum()
-        .rename("Z")
-        .vout()
-        .to_onnx()
-    )
-
-    print(onnx_simple_text_plot(model))
-
-
 Older versions
 ++++++++++++++
 
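A quick way to check that the graph produced by the ``GraphBuilder`` example is well formed
is to execute the resulting model with the reference evaluator shipped with the ``onnx`` package.
The sketch below reuses the calls shown above; ``ReferenceEvaluator`` comes from ``onnx``
(a recent version is assumed), not from onnx-array-api.

.. code-block:: python

    import numpy as np
    from onnx.reference import ReferenceEvaluator
    from onnx_array_api.graph_api import GraphBuilder

    # Build the same graph as in the README example: Z = ReduceSum((X - Y) ** 2).
    g = GraphBuilder()
    g.make_tensor_input("X", np.float32, (None, None))
    g.make_tensor_input("Y", np.float32, (None, None))
    r1 = g.make_node("Sub", ["X", "Y"])
    two = g.make_initializer(np.array([2], dtype=np.int64))
    r2 = g.make_node("Pow", [r1, two])
    g.make_node("ReduceSum", [r2], outputs=["Z"])
    g.make_tensor_output("Z", np.float32, (None, None))
    onx = g.to_onnx()

    # Run the model on random inputs with onnx's reference evaluator.
    sess = ReferenceEvaluator(onx)
    x = np.random.randn(3, 4).astype(np.float32)
    y = np.random.randn(3, 4).astype(np.float32)
    print(sess.run(None, {"X": x, "Y": y}))

The same check applies to the models returned by the Light API and the numpy API,
since ``to_onnx`` also produces a standard ONNX model.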
