Commit 3baa2da
Upgrade ONNX from 1.19 to 1.21 (#1207)
### What does this PR do?
Type of change: new feature
Upgrade the ONNX dependency from `~=1.19.0` to `~=1.21.0`. ONNX 1.20+ removed several deprecated helper functions (`float32_to_bfloat16`, `float32_to_float8e4m3`, `pack_float32_to_4bit`) that `onnx_graphsurgeon` 0.5.x still references at import time. This PR adds a compatibility shim (`modelopt/onnx/_onnx_compat.py`) that restores these functions using `ml_dtypes` before any `onnx_graphsurgeon` import occurs. It supersedes the partial inline fix from #1204 by also handling `float32_to_float8e4m3`.
Changes:
- Bump `onnx~=1.19.0` to `onnx~=1.21.0` in `pyproject.toml`
- Add `modelopt/onnx/_onnx_compat.py` compatibility shim for removed
ONNX APIs
- Import shim in `modelopt/onnx/__init__.py` and
`tests/unit/onnx/conftest.py`
- Remove usage of removed `onnx.helper.pack_float32_to_4bit` in
`test_quant_utils.py`
- Update example requirements (`genai_llm`, `whisper`) to `onnx==1.21.0`
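To illustrate the kind of helper the shim restores, here is a minimal, self-contained sketch of `float32_to_bfloat16` using raw bit manipulation (the actual `_onnx_compat.py` uses `ml_dtypes` and covers all three functions; this stand-in is for illustration only):

```python
import struct

def float32_to_bfloat16(fval: float) -> int:
    """Round a float32 value to bfloat16 and return its raw 16-bit encoding.

    Standalone stand-in for the helper removed in ONNX 1.20; the real shim
    attaches an equivalent to onnx.helper only when the attribute is missing.
    """
    # Reinterpret the float32 as its 32-bit integer encoding.
    bits = struct.unpack("<I", struct.pack("<f", fval))[0]
    # Round to nearest even before truncating the low 16 mantissa bits.
    rounding_bias = 0x7FFF + ((bits >> 16) & 1)
    return ((bits + rounding_bias) >> 16) & 0xFFFF
```

In the real shim this function would be assigned to `onnx.helper.float32_to_bfloat16` guarded by a `hasattr` check, so ONNX 1.19 environments are left untouched.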
**TensorRT Compatibility:** TRT 10.16-GA supports opsets 9–24. ModelOpt quantization modes use opsets 19–23, all within range. ONNX 1.21 does not force opset 26.
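The opset claim above can be expressed as a trivial sanity check (the range constants below are illustrative, taken from this PR's notes, and are not part of any ModelOpt API):

```python
# TensorRT 10.16-GA supports default-domain opsets 9-24 (per this PR).
TRT_OPSET_RANGE = range(9, 25)

def opset_in_trt_range(opset_version: int) -> bool:
    """True if a default-domain opset version is within TRT's supported range."""
    return opset_version in TRT_OPSET_RANGE

# ModelOpt quantization modes emit opsets 19-23, all inside the range.
assert all(opset_in_trt_range(v) for v in range(19, 24))
```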
### Usage
```python
# No API changes — the upgrade is transparent to users.
# The compatibility shim is applied automatically on import.
import modelopt.onnx
```
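For context on the other removed helper that `test_quant_utils.py` no longer relies on, here is an illustrative pure-Python reimplementation of 4-bit packing (a sketch of the general technique, not the exact removed ONNX code):

```python
def pack_float32_to_4bit(values, signed: bool) -> bytes:
    """Clamp floats to the 4-bit range and pack two values per byte, low nibble first."""
    lo, hi = (-8, 7) if signed else (0, 15)
    ints = [max(lo, min(hi, round(v))) for v in values]
    if len(ints) % 2:          # pad odd-length input with a zero nibble
        ints.append(0)
    return bytes((ints[i] & 0x0F) | ((ints[i + 1] & 0x0F) << 4)
                 for i in range(0, len(ints), 2))
```

For example, `pack_float32_to_4bit([1.0, 2.0], signed=True)` stores 1 in the low nibble and 2 in the high nibble of a single byte.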
### Testing
- 469/470 ONNX unit tests pass inside
`nvcr.io/nvidia/tensorrt:25.06-py3` (1 pre-existing ORT
`CopyTensorAsync` EP issue, not ONNX-related)
- 6/6 `torch_onnx` integration tests pass (fp8, int8, nvfp4, mxfp8,
int4_awq, auto)
- ViT FP8 quantization via `torch_onnx` → TRT engine build → ImageNet
eval: **85.3% top-1, 97.8% top-5**
- ViT FP8 quantization via `onnx_ptq` → TRT engine build succeeds
- All pre-commit hooks pass (ruff, mypy, bandit, license headers)
### Before your PR is "*Ready for review*"
Make sure you read and follow [Contributor
guidelines](https://github.com/NVIDIA/Model-Optimizer/blob/main/CONTRIBUTING.md)
and your commits are signed (`git commit -s -S`).
Make sure you read and follow the [Security Best
Practices](https://github.com/NVIDIA/Model-Optimizer/blob/main/SECURITY.md#security-coding-practices-for-contributors)
(e.g. avoiding hardcoded `trust_remote_code=True`, `torch.load(...,
weights_only=False)`, `pickle`, etc.).
- Is this change backward compatible?: ✅
- If you copied code from any other sources or added a new PIP
dependency, did you follow guidance in `CONTRIBUTING.md`: ✅
- Did you write any new necessary tests?: ✅ (updated existing tests,
added conftest.py for compat shim)
- Did you update
[Changelog](https://github.com/NVIDIA/Model-Optimizer/blob/main/CHANGELOG.rst)?:
❌ (dependency upgrade, no API change)
### Additional Information
Related: #1204 (partial fix for `float32_to_bfloat16` only — this PR
supersedes it with full coverage)
🤖 Generated with [Claude Code](https://claude.com/claude-code)
<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit
* **Dependencies**
* Removed unpinned ONNX from example requirement files and updated the
ONNX optional dependency to ~=1.21.0.
* **Refactor**
* Centralized an ONNX compatibility shim to restore missing helper APIs
when needed.
* **Tests**
* Added tests for the compatibility shim, adjusted quantization tests to
remove reliance on removed ONNX helpers, and ensured shim runs before
related tests.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
---------
Signed-off-by: ajrasane <131806219+ajrasane@users.noreply.github.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
File tree (5 files changed, +18 −61 lines):
- examples/windows/onnx_ptq/genai_llm
- examples/windows/onnx_ptq/whisper
- modelopt/onnx
- tests/unit/onnx/quantization