TypeError: can't serialize tensor(62, device='cuda:0') #46

Open
ltc576935585 opened this issue Jun 7, 2020 · 5 comments

ltc576935585 commented Jun 7, 2020

I do not know why this happened.

/home/lts/.conda/envs/PCL/bin/python /home/lts/PycharmProject/mean-teacher/pytorch/main.py
INFO:main:=> creating model 'cifar_shakeshake26'
INFO:main:=> creating EMA model 'cifar_shakeshake26'
INFO:main:
List of model parameters:

module.conv1.weight 16 * 3 * 3 * 3 = 432
module.layer1.0.conv_a1.weight 96 * 16 * 3 * 3 = 13,824
module.layer1.0.bn_a1.weight 96 = 96
module.layer1.0.bn_a1.bias 96 = 96
module.layer1.0.conv_a2.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.0.bn_a2.weight 96 = 96
module.layer1.0.bn_a2.bias 96 = 96
module.layer1.0.conv_b1.weight 96 * 16 * 3 * 3 = 13,824
module.layer1.0.bn_b1.weight 96 = 96
module.layer1.0.bn_b1.bias 96 = 96
module.layer1.0.conv_b2.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.0.bn_b2.weight 96 = 96
module.layer1.0.bn_b2.bias 96 = 96
module.layer1.0.downsample.0.weight 96 * 16 * 1 * 1 = 1,536
module.layer1.0.downsample.1.weight 96 = 96
module.layer1.0.downsample.1.bias 96 = 96
module.layer1.1.conv_a1.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.1.bn_a1.weight 96 = 96
module.layer1.1.bn_a1.bias 96 = 96
module.layer1.1.conv_a2.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.1.bn_a2.weight 96 = 96
module.layer1.1.bn_a2.bias 96 = 96
module.layer1.1.conv_b1.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.1.bn_b1.weight 96 = 96
module.layer1.1.bn_b1.bias 96 = 96
module.layer1.1.conv_b2.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.1.bn_b2.weight 96 = 96
module.layer1.1.bn_b2.bias 96 = 96
module.layer1.2.conv_a1.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.2.bn_a1.weight 96 = 96
module.layer1.2.bn_a1.bias 96 = 96
module.layer1.2.conv_a2.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.2.bn_a2.weight 96 = 96
module.layer1.2.bn_a2.bias 96 = 96
module.layer1.2.conv_b1.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.2.bn_b1.weight 96 = 96
module.layer1.2.bn_b1.bias 96 = 96
module.layer1.2.conv_b2.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.2.bn_b2.weight 96 = 96
module.layer1.2.bn_b2.bias 96 = 96
module.layer1.3.conv_a1.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.3.bn_a1.weight 96 = 96
module.layer1.3.bn_a1.bias 96 = 96
module.layer1.3.conv_a2.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.3.bn_a2.weight 96 = 96
module.layer1.3.bn_a2.bias 96 = 96
module.layer1.3.conv_b1.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.3.bn_b1.weight 96 = 96
module.layer1.3.bn_b1.bias 96 = 96
module.layer1.3.conv_b2.weight 96 * 96 * 3 * 3 = 82,944
module.layer1.3.bn_b2.weight 96 = 96
module.layer1.3.bn_b2.bias 96 = 96
module.layer2.0.conv_a1.weight 192 * 96 * 3 * 3 = 165,888
module.layer2.0.bn_a1.weight 192 = 192
module.layer2.0.bn_a1.bias 192 = 192
module.layer2.0.conv_a2.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.0.bn_a2.weight 192 = 192
module.layer2.0.bn_a2.bias 192 = 192
module.layer2.0.conv_b1.weight 192 * 96 * 3 * 3 = 165,888
module.layer2.0.bn_b1.weight 192 = 192
module.layer2.0.bn_b1.bias 192 = 192
module.layer2.0.conv_b2.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.0.bn_b2.weight 192 = 192
module.layer2.0.bn_b2.bias 192 = 192
module.layer2.0.downsample.conv.weight 192 * 96 * 1 * 1 = 18,432
module.layer2.0.downsample.conv.bias 192 = 192
module.layer2.0.downsample.bn.weight 192 = 192
module.layer2.0.downsample.bn.bias 192 = 192
module.layer2.1.conv_a1.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.1.bn_a1.weight 192 = 192
module.layer2.1.bn_a1.bias 192 = 192
module.layer2.1.conv_a2.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.1.bn_a2.weight 192 = 192
module.layer2.1.bn_a2.bias 192 = 192
module.layer2.1.conv_b1.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.1.bn_b1.weight 192 = 192
module.layer2.1.bn_b1.bias 192 = 192
module.layer2.1.conv_b2.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.1.bn_b2.weight 192 = 192
module.layer2.1.bn_b2.bias 192 = 192
module.layer2.2.conv_a1.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.2.bn_a1.weight 192 = 192
module.layer2.2.bn_a1.bias 192 = 192
module.layer2.2.conv_a2.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.2.bn_a2.weight 192 = 192
module.layer2.2.bn_a2.bias 192 = 192
module.layer2.2.conv_b1.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.2.bn_b1.weight 192 = 192
module.layer2.2.bn_b1.bias 192 = 192
module.layer2.2.conv_b2.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.2.bn_b2.weight 192 = 192
module.layer2.2.bn_b2.bias 192 = 192
module.layer2.3.conv_a1.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.3.bn_a1.weight 192 = 192
module.layer2.3.bn_a1.bias 192 = 192
module.layer2.3.conv_a2.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.3.bn_a2.weight 192 = 192
module.layer2.3.bn_a2.bias 192 = 192
module.layer2.3.conv_b1.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.3.bn_b1.weight 192 = 192
module.layer2.3.bn_b1.bias 192 = 192
module.layer2.3.conv_b2.weight 192 * 192 * 3 * 3 = 331,776
module.layer2.3.bn_b2.weight 192 = 192
module.layer2.3.bn_b2.bias 192 = 192
module.layer3.0.conv_a1.weight 384 * 192 * 3 * 3 = 663,552
module.layer3.0.bn_a1.weight 384 = 384
module.layer3.0.bn_a1.bias 384 = 384
module.layer3.0.conv_a2.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.0.bn_a2.weight 384 = 384
module.layer3.0.bn_a2.bias 384 = 384
module.layer3.0.conv_b1.weight 384 * 192 * 3 * 3 = 663,552
module.layer3.0.bn_b1.weight 384 = 384
module.layer3.0.bn_b1.bias 384 = 384
module.layer3.0.conv_b2.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.0.bn_b2.weight 384 = 384
module.layer3.0.bn_b2.bias 384 = 384
module.layer3.0.downsample.conv.weight 384 * 192 * 1 * 1 = 73,728
module.layer3.0.downsample.conv.bias 384 = 384
module.layer3.0.downsample.bn.weight 384 = 384
module.layer3.0.downsample.bn.bias 384 = 384
module.layer3.1.conv_a1.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.1.bn_a1.weight 384 = 384
module.layer3.1.bn_a1.bias 384 = 384
module.layer3.1.conv_a2.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.1.bn_a2.weight 384 = 384
module.layer3.1.bn_a2.bias 384 = 384
module.layer3.1.conv_b1.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.1.bn_b1.weight 384 = 384
module.layer3.1.bn_b1.bias 384 = 384
module.layer3.1.conv_b2.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.1.bn_b2.weight 384 = 384
module.layer3.1.bn_b2.bias 384 = 384
module.layer3.2.conv_a1.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.2.bn_a1.weight 384 = 384
module.layer3.2.bn_a1.bias 384 = 384
module.layer3.2.conv_a2.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.2.bn_a2.weight 384 = 384
module.layer3.2.bn_a2.bias 384 = 384
module.layer3.2.conv_b1.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.2.bn_b1.weight 384 = 384
module.layer3.2.bn_b1.bias 384 = 384
module.layer3.2.conv_b2.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.2.bn_b2.weight 384 = 384
module.layer3.2.bn_b2.bias 384 = 384
module.layer3.3.conv_a1.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.3.bn_a1.weight 384 = 384
module.layer3.3.bn_a1.bias 384 = 384
module.layer3.3.conv_a2.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.3.bn_a2.weight 384 = 384
module.layer3.3.bn_a2.bias 384 = 384
module.layer3.3.conv_b1.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.3.bn_b1.weight 384 = 384
module.layer3.3.bn_b1.bias 384 = 384
module.layer3.3.conv_b2.weight 384 * 384 * 3 * 3 = 1,327,104
module.layer3.3.bn_b2.weight 384 = 384
module.layer3.3.bn_b2.bias 384 = 384
module.fc1.weight 10 * 384 = 3,840
module.fc1.bias 10 = 10
module.fc2.weight 10 * 384 = 3,840
module.fc2.bias 10 = 10

all parameters sum of above = 26,197,316

/home/lts/.conda/envs/PCL/lib/python3.6/site-packages/torch/nn/_reduction.py:49: UserWarning: size_average and reduce args will be deprecated, please use reduction='sum' instead.
warnings.warn(warning.format(ret))
/home/lts/PycharmProject/mean-teacher/pytorch/main.py:224: UserWarning: volatile was removed and now has no effect. Use with torch.no_grad(): instead.
ema_input_var = torch.autograd.Variable(ema_input, volatile=True)
INFO:main:Epoch: [0][0/22000] Time 10.247 (10.247) Data 3.054 (3.054) Class 2.2228 (2.2228) Cons 0.0015 (0.0015) Prec@1 3.000 (3.000) Prec@5 51.000 (51.000)
Traceback (most recent call last):
File "/home/lts/PycharmProject/mean-teacher/pytorch/main.py", line 426, in `<module>`
main(RunContext(`__file__`, 0))
File "/home/lts/PycharmProject/mean-teacher/pytorch/main.py", line 105, in main
train(train_loader, model, ema_model, optimizer, epoch, training_log)
File "/home/lts/PycharmProject/mean-teacher/pytorch/main.py", line 311, in train
**meters.sums()
File "/home/lts/PycharmProject/mean-teacher/pytorch/mean_teacher/run_context.py", line 34, in record
self._record(step, col_val_dict)
File "/home/lts/PycharmProject/mean-teacher/pytorch/mean_teacher/run_context.py", line 45, in _record
self.save()
File "/home/lts/PycharmProject/mean-teacher/pytorch/mean_teacher/run_context.py", line 38, in save
df.to_msgpack(self.log_file_path, compress='zlib')
File "/home/lts/.conda/envs/PCL/lib/python3.6/site-packages/pandas/core/generic.py", line 1320, in to_msgpack
**kwargs)
File "/home/lts/.conda/envs/PCL/lib/python3.6/site-packages/pandas/io/packers.py", line 154, in to_msgpack
writer(fh)
File "/home/lts/.conda/envs/PCL/lib/python3.6/site-packages/pandas/io/packers.py", line 150, in writer
fh.write(pack(a, **kwargs))
File "/home/lts/.conda/envs/PCL/lib/python3.6/site-packages/pandas/io/packers.py", line 691, in pack
use_bin_type=use_bin_type).pack(o)
File "pandas/io/msgpack/_packer.pyx", line 230, in pandas.io.msgpack._packer.Packer.pack (pandas/io/msgpack/_packer.cpp:3642)
File "pandas/io/msgpack/_packer.pyx", line 232, in pandas.io.msgpack._packer.Packer.pack (pandas/io/msgpack/_packer.cpp:3484)
File "pandas/io/msgpack/_packer.pyx", line 191, in pandas.io.msgpack._packer.Packer._pack (pandas/io/msgpack/_packer.cpp:2605)
File "pandas/io/msgpack/_packer.pyx", line 220, in pandas.io.msgpack._packer.Packer._pack (pandas/io/msgpack/_packer.cpp:3178)
File "pandas/io/msgpack/_packer.pyx", line 191, in pandas.io.msgpack._packer.Packer._pack (pandas/io/msgpack/_packer.cpp:2605)
File "pandas/io/msgpack/_packer.pyx", line 220, in pandas.io.msgpack._packer.Packer._pack (pandas/io/msgpack/_packer.cpp:3178)
File "pandas/io/msgpack/_packer.pyx", line 227, in pandas.io.msgpack._packer.Packer._pack (pandas/io/msgpack/_packer.cpp:3348)
TypeError: can't serialize tensor(62, device='cuda:0')

Process finished with exit code 1

ghost commented Aug 20, 2020

That's because `labeled_minibatch_size` is a Tensor object.
Using the code in https://github.com/bl0/mean-teacher/blob/master/pytorch/main.py should solve this problem.
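A minimal sketch of that idea: unwrap any 0-dim tensors in the metrics dict into plain Python numbers before handing them to the logger. The helper name `to_python_scalars` and the example metric values are illustrative, not from the repo.

```python
import torch

def to_python_scalars(metrics):
    """Unwrap any 0-dim tensors in a metrics dict into plain Python numbers
    so that serializers such as pandas' msgpack writer can handle them."""
    return {k: (v.item() if isinstance(v, torch.Tensor) else v)
            for k, v in metrics.items()}

# Example metrics as they might come out of the training loop:
# labeled_minibatch_size arrives as a 0-dim tensor (CUDA or CPU), which is
# exactly what the msgpack packer chokes on in the traceback above.
meters = {"class_loss": 2.2228, "labeled_minibatch_size": torch.tensor(62)}
clean = to_python_scalars(meters)
```

After this conversion, `clean["labeled_minibatch_size"]` is a plain `int` and serializes without the `TypeError`.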

@panzhengo1
I ran into the same problem. Do you know how to solve it yet?

@panzhengo1
> That's because `labeled_minibatch_size` is a Tensor object.
> Using the code in https://github.com/bl0/mean-teacher/blob/master/pytorch/main.py should solve this problem.

The link is 404, do you have another one?

@NamlessM
> > That's because `labeled_minibatch_size` is a Tensor object.
> > Using the code in https://github.com/bl0/mean-teacher/blob/master/pytorch/main.py should solve this problem.
>
> The link is 404, do you have another one?

You could just change the code in the main function: call the `.item()` method to convert the tensor into a plain `int`.

@yingqing0317

Change `labeled_minibatch_size = target_var.data.ne(NO_LABEL).sum()` to `labeled_minibatch_size = target_var.data.ne(NO_LABEL).sum().item()`.
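A small self-contained check of why this one-character-ish change matters. The toy `target` batch and the `NO_LABEL = -1` sentinel are assumptions for illustration (the repo uses a `NO_LABEL` constant for unlabeled targets):

```python
import torch

NO_LABEL = -1  # sentinel for unlabeled examples (assumption for this sketch)

# A toy target batch: three labeled entries, two unlabeled.
target = torch.tensor([NO_LABEL, 3, NO_LABEL, 7, 1])

# Without .item(): the count is a 0-dim Tensor, which later fails to
# serialize and produces the "can't serialize tensor(...)" TypeError.
as_tensor = target.ne(NO_LABEL).sum()

# With .item(): a plain Python int that pandas/msgpack can handle.
labeled_minibatch_size = target.ne(NO_LABEL).sum().item()
```

On a CUDA run, the tensor version additionally lives on the GPU (hence `tensor(62, device='cuda:0')` in the traceback); `.item()` copies the scalar to host memory as a side effect.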
