Hi there, I downloaded this code and adapted it for my semi-supervised segmentation task (PyTorch version). Thanks for the great code you provided!
But I have a question. I understand that the mean-teacher model has two losses: a supervised loss between the predictions on labeled data and the ground truth, and a consistency loss between the student and teacher outputs.
And here is how I calculate the consistency loss:
get the consistency weight as 10 * sigmoid_rampup(epoch, 5) at each epoch
compute the loss from the student's and teacher's output logits
With this, the consistency loss goes up into the thousands, which doesn't seem right. Is that normal, or is there a bug in my code?
Could you give me some advice if you have any ideas? Thanks!
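For concreteness, here is a minimal sketch of the computation I described above (not my exact code: the model outputs are random placeholders, the ramp-up only mirrors the shape of the repo's sigmoid_rampup, and I use a mean-reduced MSE on the softmax outputs just to show the structure):

```python
import numpy as np
import torch
import torch.nn.functional as F


def sigmoid_rampup(current, rampup_length):
    # Exponential ramp-up in [0, 1]; follows the shape of the repo's helper.
    if rampup_length == 0:
        return 1.0
    current = np.clip(current, 0.0, rampup_length)
    phase = 1.0 - current / rampup_length
    return float(np.exp(-5.0 * phase * phase))


def consistency_loss(student_logits, teacher_logits, epoch):
    # Ramp the consistency weight up to 10 over the first 5 epochs.
    weight = 10.0 * sigmoid_rampup(epoch, 5)
    # Compare the two predictions: MSE between softmax outputs, averaged
    # over every pixel and class of the segmentation map.
    student_softmax = F.softmax(student_logits, dim=1)
    teacher_softmax = F.softmax(teacher_logits.detach(), dim=1)
    return weight * F.mse_loss(student_softmax, teacher_softmax)


# Placeholder usage: (batch, classes, H, W) logits from both models.
student_logits = torch.randn(4, 2, 128, 128, requires_grad=True)
teacher_logits = torch.randn(4, 2, 128, 128)
loss = consistency_loss(student_logits, teacher_logits, epoch=3)
```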
Hi there, I'm back to update my problem. While debugging I found that the issue comes from softmax_mse_loss; after changing it to nn.MSELoss(), the loss no longer blows up into the thousands.
But one thing still confuses me: is the consistency loss meant as a regularizer between the student and the teacher model, so that it neither decreases very low nor increases too much, but just oscillates around 1e-4 or so?
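For reference, a tiny sketch of why the scale changed, assuming the old loss used a sum-style reduction over all elements (which would explain values in the thousands for a full segmentation map) while nn.MSELoss() averages over every element; the shapes below are arbitrary placeholders:

```python
import torch
import torch.nn.functional as F

# Hypothetical segmentation-sized logits: batch 4, 2 classes, 256x256 pixels.
student_logits = torch.randn(4, 2, 256, 256)
teacher_logits = torch.randn(4, 2, 256, 256)

student_softmax = F.softmax(student_logits, dim=1)
teacher_softmax = F.softmax(teacher_logits, dim=1)

num_classes = student_logits.size(1)

# Sum-style reduction (what I assume softmax_mse_loss does): the value grows
# with batch size and image resolution, so it easily reaches the thousands.
sum_style = F.mse_loss(student_softmax, teacher_softmax, reduction='sum') / num_classes

# Mean reduction, i.e. nn.MSELoss() on the softmax outputs: independent of
# resolution, typically a small number.
mean_style = F.mse_loss(student_softmax, teacher_softmax, reduction='mean')

# sum_style is batch * H * W times larger than mean_style (here 4 * 256 * 256).
print(sum_style.item(), mean_style.item())
```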