I ran into this panic during actual usage, then reduced it to these tests (borrowing test code from test_tensor_ops_grad.rs). It looks like the gradient machinery tries to differentiate through a variable or placeholder when it shouldn't have to, since grad_with_default already supplies the output gradient explicitly.
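For context, this is roughly what the placeholder test boils down to. It's a sketch, not the exact code from test_tensor_ops_grad.rs; the shape and the ones-seed passed to grad_with_default are my own approximations.

```rust
use autograd as ag;
use ag::tensor_ops as T;

fn main() {
    // Rough reproduction of the add_n_single_placeholder case
    // (shape and seed-gradient construction approximated).
    ag::run(|ctx: &mut ag::Context<f64>| {
        let x = ctx.placeholder("x", &[3]);
        // add_n over a single input.
        let y = T::add_n(&[x]);
        // The output gradient is passed in explicitly, so Placeholder::grad
        // should never be reached -- yet this call panics with
        // "internal error: entered unreachable code".
        let _gx = T::grad_with_default(&[y], &[x], &[T::ones(&[3], ctx)]);
    });
}
```

The panic fires inside grad_with_default itself (mod.rs:142 in the backtrace below), before anything is evaluated.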
---- test_tensor_ops_grad::add_n_single_placeholder stdout ----
thread 'test_tensor_ops_grad::add_n_single_placeholder' panicked at 'internal error: entered unreachable code', /n/rust-autograd/src/tensor_ops/basic_source_ops.rs:23:1
stack backtrace:
0: rust_begin_unwind
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:584:5
1: core::panicking::panic_fmt
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/panicking.rs:142:14
2: core::panicking::panic
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/panicking.rs:48:5
3: <autograd::tensor_ops::basic_source_ops::Placeholder as autograd::op::Op<T>>::grad
at ./src/tensor_ops/basic_source_ops.rs:15:17
4: autograd::op::GradientContext<T>::compute_input_grads
at ./src/op.rs:371:9
5: autograd::gradient::compute_gradients
at ./src/gradient.rs:60:23
6: autograd::tensor_ops::grad_with_default
at ./src/tensor_ops/mod.rs:142:21
7: lib::test_tensor_ops_grad::add_n_single_placeholder::{{closure}}
at ./tests/test_tensor_ops_grad.rs:83:17
8: autograd::graph::run
at ./src/graph.rs:99:5
9: lib::test_tensor_ops_grad::add_n_single_placeholder
at ./tests/test_tensor_ops_grad.rs:78:5
10: lib::test_tensor_ops_grad::add_n_single_placeholder::{{closure}}
at ./tests/test_tensor_ops_grad.rs:77:1
11: core::ops::function::FnOnce::call_once
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/ops/function.rs:248:5
12: core::ops::function::FnOnce::call_once
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/ops/function.rs:248:5
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
---- test_tensor_ops_grad::add_n_single_variable stdout ----
thread 'test_tensor_ops_grad::add_n_single_variable' panicked at 'internal error: entered unreachable code', /n/rust-autograd/src/tensor_ops/basic_source_ops.rs:21:1
stack backtrace:
0: rust_begin_unwind
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:584:5
1: core::panicking::panic_fmt
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/panicking.rs:142:14
2: core::panicking::panic
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/panicking.rs:48:5
3: <autograd::tensor_ops::basic_source_ops::Variable as autograd::op::Op<T>>::grad
at ./src/tensor_ops/basic_source_ops.rs:15:17
4: autograd::op::GradientContext<T>::compute_input_grads
at ./src/op.rs:371:9
5: autograd::gradient::compute_gradients
at ./src/gradient.rs:60:23
6: autograd::tensor_ops::grad_with_default
at ./src/tensor_ops/mod.rs:142:21
7: lib::test_tensor_ops_grad::add_n_single_variable::{{closure}}
at ./tests/test_tensor_ops_grad.rs:63:17
8: autograd::variable::VariableEnvironment<F>::run
at ./src/variable.rs:697:9
9: lib::test_tensor_ops_grad::add_n_single_variable
at ./tests/test_tensor_ops_grad.rs:61:5
10: lib::test_tensor_ops_grad::add_n_single_variable::{{closure}}
at ./tests/test_tensor_ops_grad.rs:58:1
11: core::ops::function::FnOnce::call_once
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/ops/function.rs:248:5
12: core::ops::function::FnOnce::call_once
at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/ops/function.rs:248:5
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
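The variable case is the same failure, just going through VariableEnvironment::run instead of ag::run. Again a rough sketch with approximated registration and shapes, not the exact test code:

```rust
use autograd as ag;
use ag::tensor_ops as T;

fn main() {
    // Rough reproduction of the add_n_single_variable case
    // (variable registration and shapes approximated).
    let mut env = ag::VariableEnvironment::<f64>::new();
    let v_id = env.slot().set(ag::ndarray_ext::zeros(&[3]));
    env.run(|ctx| {
        let v = ctx.variable(v_id);
        let y = T::add_n(&[v]);
        // The output gradient is again supplied explicitly, so Variable::grad
        // should be unreachable -- yet grad_with_default panics the same way.
        let _gv = T::grad_with_default(&[y], &[v], &[T::ones(&[3], ctx)]);
    });
}
```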