
Implement an ONNX to ONNX Script code generator based on libcst #873

Draft
wants to merge 1 commit into main

Conversation

@abock (Contributor) commented on Jul 13, 2023

[WIP] Proposal to use libcst for code generation

This PR is somewhat experimental, but it goes far enough to illustrate using libcst to generate ONNX Script from ONNX. It is nearly as complete as the existing template-based onnxscript.proto2python, and it lays the groundwork for implementing more "raising" passes that produce idiomatic Python code.

I implemented this to learn more about libcst, and I am now strongly inclined to take a full dependency on it. I see adoption happening in three phases:

  1. Continue with the work in this PR by implementing more transformers to produce idiomatic Python ONNX Script from ONNX. This keeps the libcst dependency scoped to just this feature.

  2. Port the existing opgen generator to use libcst. Again, the libcst dependency would be scoped to just this code generator.

  3. Finally, port our existing Python AST support, which uses the builtin ast module, to libcst. I suspect many of the utilities needed for (1) and (2) will be directly reusable here. More importantly, libcst provides scope analysis and qualified name resolution, which form the basis for real semantic analysis work with a CSTTransformer (a brief sketch follows this list).
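
For (3), here is a minimal sketch of what qualified name resolution looks like with libcst's metadata system. The visitor class and the example source are hypothetical; MetadataWrapper, QualifiedNameProvider, and get_metadata are the actual libcst APIs being leaned on:

```python
import libcst as cst
from libcst.metadata import MetadataWrapper, QualifiedNameProvider


class FindOpsetCalls(cst.CSTVisitor):
    # Declare which metadata this visitor needs; libcst computes it lazily.
    METADATA_DEPENDENCIES = (QualifiedNameProvider,)

    def visit_Call(self, node: cst.Call) -> None:
        # Resolve the callee through imports and aliases, e.g. the local
        # alias `op.Add` resolves to `onnxscript.opset18.Add`.
        for qname in self.get_metadata(QualifiedNameProvider, node):
            print(qname.name)


source = "from onnxscript import opset18 as op\ny = op.Add(x, x)\n"
MetadataWrapper(cst.parse_module(source)).visit(FindOpsetCalls())
```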

Also note that if we implement various passes as CSTTransformers, we can begin to offer a very nice IDE UX for "lightbulb fixes": any pass we apply as part of built-in code generation for ONNX to ONNX Script (phase 1) can also be surfaced in IDEs to upgrade user code to more idiomatic Python. PyTorch is also beginning to use libcst to provide fixes for PyTorch code.
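
As a concrete illustration of the kind of pass meant here, below is a minimal, hypothetical CSTTransformer that raises op.Add(a, b) to a + b. The class name and the assumption that the opset is bound to the local name op are mine; this is not the transformer shipped in this PR:

```python
import libcst as cst


class RaiseAddToBinaryOperator(cst.CSTTransformer):
    """Rewrite `op.Add(a, b)` into the idiomatic `a + b`."""

    def leave_Call(
        self, original_node: cst.Call, updated_node: cst.Call
    ) -> cst.BaseExpression:
        func = updated_node.func
        if (
            isinstance(func, cst.Attribute)
            and isinstance(func.value, cst.Name)
            and func.value.value == "op"
            and func.attr.value == "Add"
            and len(updated_node.args) == 2
        ):
            # Replace the call with a binary operation built from its args.
            return cst.BinaryOperation(
                left=updated_node.args[0].value,
                operator=cst.Add(),
                right=updated_node.args[1].value,
            )
        return updated_node


module = cst.parse_module("z = op.Add(x, y)\n")
print(module.visit(RaiseAddToBinaryOperator()).code)  # prints: z = x + y
```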

Depending on libcst

libcst itself does not carry many dependencies and is relatively small (about 10MB). Comparatively, onnx itself is rather large (112MB). Thus, I am not too concerned about the size of the libcst dependency, considering it would become central to ONNX Script across the three identified phases.

Dependency Cost

| Command | Installed site-packages | Disk Usage | Delta |
| --- | --- | --- | --- |
| `python3.10 -m venv venv --upgrade-deps` | setuptools, pip | 19MB | Baseline |
| `pip install onnx` | onnx, google, protobuf, numpy, typing_extensions | 131MB | +112MB |
| `pip install libcst` | libcst, pyyaml, mypy_extensions, typing_inspect | 142MB | +11MB |

Current capabilities of this PR:

  • Adds some general codegen utilities based on libcst

  • Implements an ONNX to ONNX Script generator: the base converter produces ONNX Script that maps nearly 1:1 onto the structure of the ONNX model, and transformers raise the generated code to the more idiomatic Python that ONNX Script supports. This commit adds support for raising to Python binary operators and for raising Constant/make_tensor to supported Python constants; more transformers still need to be implemented, but this commit can serve as a guide (an illustrative before/after follows this list).

  • Adds a new top-level command line interface, allowing the code generator to be invoked:

    python -m onnxscript convert model.onnx
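
To make the raising concrete, here is an illustrative before/after. It is not literal generator output; the imports, function names, and the choice of opset18 are my assumptions:

```python
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import FLOAT, script
from onnxscript import opset18 as op


# Base converter output: close to 1:1 with the ONNX graph.
@script()
def model_raw(X: FLOAT["N"], Y: FLOAT["N"]) -> FLOAT["N"]:
    two = op.Constant(value=make_tensor("two", TensorProto.FLOAT, [], [2.0]))
    Z = op.Add(op.Mul(X, Y), two)
    return Z


# The same function after raising to Python binary operators and constants.
@script()
def model_raised(X: FLOAT["N"], Y: FLOAT["N"]) -> FLOAT["N"]:
    Z = X * Y + 2.0
    return Z
```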


codecov bot commented Jul 13, 2023

Codecov Report

Merging #873 (99a2f51) into main (97604f6) will decrease coverage by 1.94%.
The diff coverage is 0.00%.

@@            Coverage Diff             @@
##             main     #873      +/-   ##
==========================================
- Coverage   76.57%   74.64%   -1.94%     
==========================================
  Files         112      115       +3     
  Lines       13407    13754     +347     
  Branches     1350     1440      +90     
==========================================
  Hits        10267    10267              
- Misses       2805     3151     +346     
- Partials      335      336       +1     
| Impacted Files | Coverage Δ |
| --- | --- |
| `onnxscript/__main__.py` | 0.00% <0.00%> (ø) |
| `onnxscript/codeanalysis/__init__.py` | 0.00% <0.00%> (ø) |
| `onnxscript/codeanalysis/onnx_to_onnxscript.py` | 0.00% <0.00%> (ø) |

@justinchuby self-requested a review on Jul 13, 2023 18:01

numpy_tensor = onnx.numpy_helper.to_array(tensor_proto)

return cst.Call(
Collaborator
This is one example where verbosity makes this harder to read than the existing template-based mechanism. Do you think it might make sense to judiciously mix parsed strings, formatted strings, and cst constructors to get the best of both worlds?
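
For illustration, a node like the one above could also be produced from a formatted, parsed string via libcst's parse_expression. The helper below is a hypothetical sketch of that mix, not code from this PR:

```python
import libcst as cst
import onnx
import onnx.numpy_helper


def make_tensor_expr(tensor_proto: onnx.TensorProto) -> cst.BaseExpression:
    # Format the make_tensor(...) call as source text and let libcst parse it,
    # instead of nesting cst.Call/cst.Arg constructors by hand.
    numpy_tensor = onnx.numpy_helper.to_array(tensor_proto)
    return cst.parse_expression(
        f"make_tensor({tensor_proto.name!r}, {tensor_proto.data_type}, "
        f"{list(tensor_proto.dims)!r}, {numpy_tensor.flatten().tolist()!r})"
    )
```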
