Actions: microsoft/onnxruntime

Issue Labeler

3,251 workflow runs

Encryption does not work with trt_dump_ep_context_model
Issue Labeler #6176: Issue #23289 opened by BengtGustafsson
January 8, 2025 14:37 2m 54s
[Build] v1.19.2 abseil_cpp failed: 2 with JP5.1.4 gcc/g++13
Issue Labeler #6175: Issue #23286 opened by lida2003
January 8, 2025 07:41 11s
(run title not captured)
January 8, 2025 02:37 2m 47s
[Performance] model inference in onnxruntime is toooooo slow
Issue Labeler #6173: Issue #23282 edited by Tian14267
January 8, 2025 02:32 2m 27s
(run title not captured)
January 8, 2025 02:29 1m 28s
(run title not captured)
January 8, 2025 02:27 7s
[Performance] model inference in onnxruntime is toooooo slow
Issue Labeler #6170: Issue #23282 edited by Tian14267
January 8, 2025 02:18 50s
[Performance] model inference in onnxruntime is toooooo slow
Issue Labeler #6169: Issue #23282 edited by Tian14267
January 8, 2025 02:18 1m 21s
[Performance] model inference in onnxruntime is toooooo slow
Issue Labeler #6168: Issue #23282 edited by Tian14267
January 8, 2025 02:17 1m 37s
[Performance] model inference in onnxruntime is toooooo slow
Issue Labeler #6167: Issue #23282 edited by Tian14267
January 8, 2025 02:17 1m 51s
[Performance] model inference in onnxruntime is toooooo slow
Issue Labeler #6166: Issue #23282 opened by Tian14267
January 8, 2025 02:05 9s
[js/webgpu] ConvTranspose1D slower on Webgpu than Wasm
Issue Labeler #6165: Issue #23273 opened by gianlourbano
January 7, 2025 15:26 3m 13s
Mismatch between Matmul op in FLOAT16 and pytorch Linear op.
Issue Labeler #6164: Issue #23272 opened by AyoubMDL
January 7, 2025 13:32 44s
Why is the console messed up when using onnxruntime.InferenceSession?
Issue Labeler #6163: Issue #23270 edited by Septemberlemon
January 7, 2025 03:14 1m 48s
Why is the console messed up when using onnxruntime.InferenceSession?
Issue Labeler #6162: Issue #23270 opened by Septemberlemon
January 7, 2025 03:13 1m 14s
Allow disabling colors in the output
Issue Labeler #6161: Issue #23269 opened by mrkam2
January 7, 2025 02:53 41s
(run title not captured)
January 7, 2025 01:17 1m 27s
System.ExecutionEngineException creating Microsoft.ML.OnnxRuntime.SessionOptions
Issue Labeler #6158: Issue #23263 opened by DennyOne
January 6, 2025 19:17 9s
CoreML failed: Unable to get shape for output
Issue Labeler #6157: Issue #23262 opened by thewh1teagle
January 6, 2025 18:11 9s
Memory access violation when execute optimization and inference
Issue Labeler #6155: Issue #23258 edited by Cookiee235
January 6, 2025 14:26 2m 47s
Memory access violation when execute optimization and inference
Issue Labeler #6154: Issue #23258 opened by Cookiee235
January 6, 2025 14:25 2m 51s
The CPU is running normally, but the GPU running results are inconsistent
Issue Labeler #6153: Issue #23201 edited by FFchopon
January 6, 2025 08:55 1m 48s
[Build] Better support for vcpkg
Issue Labeler #6152: Issue #23158 edited by snnn
January 5, 2025 20:09 3m 18s