
v1.8.0

Released by @jcwchen on 07 Nov 00:55 · commit 994c618

ONNX v1.8 is now available with exciting new and enhanced features! You can learn more about the project, who is involved, and what tools are available at the onnx.ai site. We would like to thank every community member for contributing to the project!

Key Updates

  • The Windows conda package is available again as of the v1.8.0 release (the last release to provide one was v1.1.1)
  • Training
    • Added Differentiable tags to make the Gradient operator better defined #2723, #2893, #2911, #2954
    • Removed the GraphCall operator, eliminating the need to implement GraphCall #2964
    • Created a tool and example for using TrainingInfoProto for training #3008
  • Shape Inference and Checker
    • Large model (>2GB) support added for checker and shape_inference #2744
    • Graph-level shape inference fixes to patch the IR gap introduced since IR version 4 #3023
    • Node-level shape inference fixes for operators
  • Version Converter
    • More operators supported #2664
  • General Features
    • Added serialization for inputs and outputs of Sequence and Map data types #2581
    • Added programmatic access to the version table and extended make_model #2918
    • Added a size check to make_tensor #2987 (see the sketch after this list)
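
To make a couple of these helpers concrete, here is a minimal sketch using only the public onnx.helper API: it builds a tiny MatMul model whose initializer goes through the new make_tensor size check (#2987) and whose make_model call pins opset_imports explicitly. The graph and tensor names are illustrative, not from this release.

```python
import onnx
from onnx import TensorProto, helper

# Initializer built with make_tensor: as of v1.8.0 the helper checks that the
# number of values matches the declared dims (#2987), so e.g. dims=[2, 2] with
# only three values raises an error instead of producing a malformed tensor.
weight = helper.make_tensor(
    name="W",
    data_type=TensorProto.FLOAT,
    dims=[2, 2],
    vals=[1.0, 1.0, 1.0, 1.0],
)

node = helper.make_node("MatMul", inputs=["X", "W"], outputs=["Y"])
graph = helper.make_graph(
    [node],
    "tiny_matmul",  # illustrative graph name
    inputs=[helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 2])],
    outputs=[helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 2])],
    initializer=[weight],
)

# make_model with explicit opset_imports, matching this release's opset 13.
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
onnx.checker.check_model(model)
```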

Opset version 13

API

  • onnx.shape_inference now accepts a model path and supports >2GB models for shape inference #3012 (see the sketch below)
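
A minimal sketch of the path-based API; the file names below are placeholders.

```python
import onnx
from onnx import shape_inference

# Path-based shape inference (#3012): the model is read from and written back
# to disk, which also covers models above the 2GB protobuf limit.
shape_inference.infer_shapes_path("model.onnx", "model_inferred.onnx")

# The in-memory API remains available for models under 2GB.
model = onnx.load("model_inferred.onnx")
inferred = shape_inference.infer_shapes(model)
```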

Infrastructure

  • CI improvements for reliability
  • Moved to Azure Pipelines to speed up runs
  • Updated pybind11 to 2.6.0 to prevent a segmentation fault on Windows

Bug fixes

  • #2888 Return empty string from ToDataTypeString() when tensor_data_type not found
  • #2946 Resolve segfault on input without tensor data in ConstantOfShape
  • #2950 Add nullptr check to type inference methods to avoid segfaults
  • #2983 Fix type inference issue (scalar initializers and Resize)
  • #3000 Fix ConvTranspose: enhance attribute check
  • #3005 Fix shape inference of scalar ConstantOfShape
  • #3014 Fix shape inference
  • #3023 Fix IR gap issue in checker and shape inference

Installation

You can upgrade via pip install onnx --upgrade, or build from source by following the instructions on GitHub.
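
As a quick, optional sanity check after upgrading, the installed version and default opset can be inspected from Python:

```python
import onnx

print(onnx.__version__)                # expected: 1.8.0
print(onnx.defs.onnx_opset_version())  # expected: 13 for this release
```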

Notes

  • onnx.optimizer is moving to another repo: https://github.com/onnx/optimizer. It will be removed from onnx/onnx in ONNX 1.9.
  • onnx.version_converter has an IR gap issue: it cannot use an input that comes from an initializer: #3007
  • onnx.shape_inference updates both output and value_info; a future update will only update the original output: #3069 (see the sketch below)
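
To make the shape_inference note concrete, here is a small sketch (with a placeholder model path) showing where inferred types currently land:

```python
import onnx
from onnx import shape_inference

model = onnx.load("model.onnx")  # placeholder path
inferred = shape_inference.infer_shapes(model)

# As noted above, inferred shapes currently update both value_info and the
# graph outputs; a future update will only update the original output (#3069).
for vi in inferred.graph.value_info:
    print("value_info:", vi.name, vi.type.tensor_type.shape)
for out in inferred.graph.output:
    print("output:", out.name, out.type.tensor_type.shape)
```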

Contributors

Thanks to these individuals for their contributions to this release:
jcwchen, askhade, wschin, vinitra, prasanthpul, gramalingam, daquexian, rajeevnalawadi, sveta-levitan, ashbhandare, chinhuang007, KsenijaS, shinh, BowenBao, shubhambhokare1, pranav-prakash, prabhat00155, pluradj, matteosal, jackwish, Yukigaru, H1Gdev, 462630221, natke, kevinch-nv, RandySheriffH, souptc, fdwr, HectorSVC, jspisak, codemzs, yuslepukhin, linkerzhang