I ran the example convolutional net on the MNIST data and everything worked well.
Then I made one change: I plugged in the solver method `Adam()` instead of `SGD()` to probe the system.
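For reference, the change was a single line in my script (`method` is just my local variable name; everything else is unchanged from the tutorial):

```julia
method = Adam()                  # previously: method = SGD()
solver = Solver(method, params)  # params as in the MNIST example
```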
This resulted in an error, so I reset `Adam()` to `SGD()` and restarted.
This gave the following output:
```
[2017-11-27T08:43:37 | info | Mocha]:
ERROR: LoadError: update is not implemented for the solver type Mocha.Solver{Mocha.SGD}, net type Mocha.Net{Mocha.CPUBackend}, and state type Mocha.SolverState{Mocha.AdamSolverState}
Stacktrace:
 [1] onestep_solve(::Mocha.Solver{Mocha.SGD}, ::Mocha.Net{Mocha.CPUBackend}, ::Mocha.SolverState{Mocha.AdamSolverState}) at /home/colin/.julia/v0.6/Mocha/src/solvers.jl:210
 [2] do_solve_loop(::Mocha.Solver{Mocha.SGD}, ::Mocha.Net{Mocha.CPUBackend}, ::Mocha.SolverState{Mocha.AdamSolverState}) at /home/colin/.julia/v0.6/Mocha/src/solvers.jl:242
 [3] solve(::Mocha.Solver{Mocha.SGD}, ::Mocha.Net{Mocha.CPUBackend}) at /home/colin/.julia/v0.6/Mocha/src/solvers.jl:235
 [4] include_from_node1(::String) at ./loading.jl:569
 [5] include(::String) at ./sysimg.jl:14
 [6] process_options(::Base.JLOptions) at ./client.jl:305
 [7] _start() at ./client.jl:371
while loading /home/colin/courses/deeplearn/mochatest.jl, in expression starting on line 40
```
It looks like the solver state was retained from the failed Adam attempt, and is now breaking the plain `SGD` run. I was able to get past this by removing the `load_from` option in `params`.
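For context, my solver block now looks roughly like this (adapted from the MNIST tutorial; `exp_dir` is the snapshot directory I had been saving to):

```julia
method = SGD()
params = make_solver_parameters(method,
    max_iter=10000,
    regu_coef=0.0005,
    mom_policy=MomPolicy.Fixed(0.9),
    lr_policy=LRPolicy.Inv(0.01, 0.0001, 0.75),
    # load_from=exp_dir,  # removed: the snapshot carried the stale AdamSolverState
)
solver = Solver(method, params)
```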
When re-attempting `Adam` I get:
```
ERROR: LoadError: MethodError: no method matching axpy!(::Int64, ::Float32, ::Ptr{Float64}, ::Int64, ::Ptr{Float32}, ::Int64)
Closest candidates are:
  axpy!(::Integer, ::Float32, !Matched::Union{DenseArray{Float32,N} where N, Ptr{Float32}}, ::Integer, ::Union{DenseArray{Float32,N} where N, Ptr{Float32}}, ::Integer) at linalg/blas.jl:434
  axpy!(::Integer, !Matched::Float64, ::Union{DenseArray{Float64,N} where N, Ptr{Float64}}, ::Integer, !Matched::Union{DenseArray{Float64,N} where N, Ptr{Float64}}, ::Integer) at linalg/blas.jl:434
  axpy!(::Integer, !Matched::Complex{Float64}, !Matched::Union{DenseArray{Complex{Float64},N} where N, Ptr{Complex{Float64}}}, ::Integer, !Matched::Union{DenseArray{Complex{Float64},N} where N, Ptr{Complex{Float64}}}, ::Integer) at linalg/blas.jl:434
  ...
Stacktrace:
 [1] update_parameters!(::Mocha.Net{Mocha.CPUBackend}, ::Mocha.Adam, ::Float64, ::Float64, ::Float64, ::Float64, ::Mocha.CPUBlob{Float32,4}, ::Mocha.CPUBlob{Float32,4}, ::Mocha.CPUBlob{Float32,4}, ::Mocha.CPUBlob{Float32,4}, ::Float64, ::Type{T} where T) at /home/colin/.julia/v0.6/Mocha/src/solvers/adam.jl:144
 [2] update(::Mocha.Solver{Mocha.Adam}, ::Mocha.Net{Mocha.CPUBackend}, ::Mocha.SolverState{Mocha.AdamSolverState}) at /home/colin/.julia/v0.6/Mocha/src/solvers/adam.jl:110
 [3] onestep_solve(::Mocha.Solver{Mocha.Adam}, ::Mocha.Net{Mocha.CPUBackend}, ::Mocha.SolverState{Mocha.AdamSolverState}) at /home/colin/.julia/v0.6/Mocha/src/solvers.jl:210
 [4] do_solve_loop(::Mocha.Solver{Mocha.Adam}, ::Mocha.Net{Mocha.CPUBackend}, ::Mocha.SolverState{Mocha.AdamSolverState}) at /home/colin/.julia/v0.6/Mocha/src/solvers.jl:242
 [5] solve(::Mocha.Solver{Mocha.Adam}, ::Mocha.Net{Mocha.CPUBackend}) at /home/colin/.julia/v0.6/Mocha/src/solvers.jl:235
 [6] include_from_node1(::String) at ./loading.jl:569
 [7] include(::String) at ./sysimg.jl:14
 [8] process_options(::Base.JLOptions) at ./client.jl:305
 [9] _start() at ./client.jl:371
while loading /home/colin/courses/deeplearn/mochatest.jl, in expression starting on line 41
```
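Reading the `MethodError`: the call mixes a `Float32` scalar and a `Ptr{Float32}` buffer with a `Ptr{Float64}` buffer, which suggests the Adam history state is being allocated as `Float64` while the net's parameter blobs are `Float32`. BLAS `axpy!` requires one element type throughout, so the dispatch fails. A minimal sketch of the same mismatch outside Mocha (Julia 0.6 syntax, using the array variant rather than the pointer variant in the trace):

```julia
import Base.LinAlg.BLAS: axpy!

x = rand(Float64, 4)    # stand-in for Adam's Float64 internal state
y = zeros(Float32, 4)   # stand-in for the net's Float32 parameter blobs
axpy!(1.0, x, y)        # MethodError: BLAS axpy! needs matching element types
```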
In the Adam paper the authors mention the usefulness of Adam in a convolutional context. Am I missing an option somewhere, or is this a feature waiting for its turn on the todo list?