
Commit

refactor README.md
fumihwh committed Oct 9, 2018
1 parent 6b62296 commit 88babe3
Showing 3 changed files with 29 additions and 35 deletions.
39 changes: 4 additions & 35 deletions README.md
@@ -1,55 +1,24 @@
# Tensorflow Backend and Frontend for ONNX
[![Build Status](https://travis-ci.org/onnx/onnx-tensorflow.svg?branch=master)](https://travis-ci.org/onnx/onnx-tensorflow)

## To convert pb between Tensorflow and ONNX:
## To convert models between Tensorflow and ONNX:

@tjingrant (Collaborator) commented on Oct 9, 2018:

Since our package name is onnx-tf, maybe we should be consistent here and say "between ONNX and Tensorflow" (even though logically there's no difference).


### Use CLI:
Tensorflow -> ONNX: `onnx-tf convert -t onnx -i /path/to/input.pb -o /path/to/output.onnx`

@tjingrant (Collaborator) commented on Oct 9, 2018:

Let's not use "->" here, since the meaning is not universally established. Let's explicitly use "From ... To ...".


ONNX -> Tensorflow: `onnx-tf convert -t tf -i /path/to/input.onnx -o /path/to/output.pb`

### Use python:
### Convert programmatically:

Tensorflow -> ONNX:
```
from tensorflow.core.framework import graph_pb2
from onnx_tf.frontend import tensorflow_graph_to_onnx_model

graph_def = graph_pb2.GraphDef()
with open(input_path, "rb") as f:
    graph_def.ParseFromString(f.read())

nodes, node_inputs = set(), set()
for node in graph_def.node:
    nodes.add(node.name)
    node_inputs.update(set(node.input))
output = list(set(nodes) - node_inputs)

model = tensorflow_graph_to_onnx_model(graph_def, output, ignore_unimplemented=True)
with open(output_path, 'wb') as f:
    f.write(model.SerializeToString())
```
[Tensorflow -> ONNX](https://github.com/onnx/onnx-tensorflow/blob/master/example/tf_to_onnx.py)

ONNX -> Tensorflow:
```
import onnx
from onnx_tf.backend import prepare
onnx_model = onnx.load(input_path)
tf_rep = prepare(onnx_model)
tf_rep.export_graph(output_path)
```
[ONNX -> Tensorflow](https://github.com/onnx/onnx-tensorflow/blob/master/example/onnx_to_tf.py)

## To do inference on an ONNX model using the Tensorflow backend:

@tjingrant (Collaborator) commented on Oct 9, 2018:

Maybe a shorter name? Something like "ONNX Model Inference with onnx-tf"?

```
import onnx
from onnx_tf.backend import prepare
output = prepare(onnx.load(input_path)).run(input)
```
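
For a concrete call, the snippet below sketches running a single input through the backend. The model filename, input shape, and dtype are illustrative assumptions, not part of the repository.

```
import numpy as np
import onnx
from onnx_tf.backend import prepare

# Hypothetical model and input shape, chosen only for illustration.
model = onnx.load("mnist.onnx")
tf_rep = prepare(model)  # wrap the ONNX model in a Tensorflow-backed representation
batch = np.random.randn(1, 1, 28, 28).astype(np.float32)  # assumed NCHW float input
outputs = tf_rep.run(batch)  # run inference through the Tensorflow backend
print(outputs)
```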

8 changes: 8 additions & 0 deletions example/onnx_to_tf.py
@@ -0,0 +1,8 @@
import onnx

from onnx_tf.backend import prepare


# Load the ONNX model from disk.
onnx_model = onnx.load("input_path")
# Import the ONNX model into the Tensorflow backend.
tf_rep = prepare(onnx_model)
# Export the Tensorflow graph to a .pb file.
tf_rep.export_graph("output_path")
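
As a quick sanity check, the exported .pb can be read back with plain Tensorflow. This is a minimal sketch assuming Tensorflow 1.x APIs and the same placeholder path as above; it is not part of the example script.

```
import tensorflow as tf

# Read the GraphDef that export_graph wrote to disk.
graph_def = tf.GraphDef()
with tf.gfile.GFile("output_path", "rb") as f:
    graph_def.ParseFromString(f.read())

# Import it into a fresh graph and list the imported operations.
graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name="")
print([op.name for op in graph.get_operations()])
```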
17 changes: 17 additions & 0 deletions example/tf_to_onnx.py
@@ -0,0 +1,17 @@
from tensorflow.core.framework import graph_pb2

from onnx_tf.frontend import tensorflow_graph_to_onnx_model


# Load the Tensorflow GraphDef from the input .pb file.
graph_def = graph_pb2.GraphDef()
with open("input_path", "rb") as f:
    graph_def.ParseFromString(f.read())

# Treat nodes that are never consumed as inputs by other nodes as graph outputs.
nodes, node_inputs = set(), set()
for node in graph_def.node:
    nodes.add(node.name)
    node_inputs.update(set(node.input))
output = list(set(nodes) - node_inputs)

# Convert the Tensorflow graph to an ONNX model and serialize it to disk.
model = tensorflow_graph_to_onnx_model(graph_def, output, ignore_unimplemented=True)
with open("output_path", 'wb') as f:
    f.write(model.SerializeToString())
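
To check that the conversion produced a well-formed model, the onnx package's checker can be run on the written file. A minimal sketch, assuming the onnx Python package and the same placeholder path as above:

```
import onnx

# Load the converted model, validate it, and print a readable graph summary.
model = onnx.load("output_path")
onnx.checker.check_model(model)  # raises if the model is structurally invalid
print(onnx.helper.printable_graph(model.graph))
```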
