In the code given below, I notice that we are not specifying weights for the ResNet layers. If so, do we expect that the weights will be updated through the optimizer rather than through the HyperNetwork?
```python
class ResNetBlock(nn.Module):
    def __init__(self, in_size=16, out_size=16, downsample=False):
        super(ResNetBlock, self).__init__()
        self.out_size = out_size
        self.in_size = in_size
        if downsample:
            self.stride1 = 2
            # 1x1 strided conv so the residual matches the downsampled output
            self.reslayer = nn.Conv2d(in_channels=self.in_size,
                                      out_channels=self.out_size,
                                      stride=2, kernel_size=1)
        else:
            self.stride1 = 1
            self.reslayer = IdentityLayer()
        self.bn1 = nn.BatchNorm2d(out_size)
        self.bn2 = nn.BatchNorm2d(out_size)

    def forward(self, x, conv1_w, conv2_w):
        # conv1_w and conv2_w are passed in at call time (generated by the
        # HyperNetwork); the block itself holds no conv weight parameters.
        residual = self.reslayer(x)
        out = F.relu(self.bn1(F.conv2d(x, conv1_w, stride=self.stride1, padding=1)),
                     inplace=True)
        out = self.bn2(F.conv2d(out, conv2_w, padding=1))
        out += residual
        out = F.relu(out)
        return out
```
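(`IdentityLayer` isn't shown in this snippet; presumably it is a simple pass-through module defined elsewhere in the repo, something like the sketch below. The name matches the code above, but this exact definition is an assumption.)

```python
import torch
import torch.nn as nn

# Assumed definition: a no-op module that returns its input unchanged,
# equivalent to nn.Identity in modern PyTorch.
class IdentityLayer(nn.Module):
    def forward(self, x):
        return x
```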
In this setup, the weights for the ResNet layers are dynamically generated by the HyperNetwork rather than being directly optimized by the standard optimizer. During training, the ResNet block weights change based on the outputs of the HyperNetwork, while the optimizer updates only the HyperNetwork's own parameters. So, even though we are not explicitly specifying weights for the ResNet layers, they are learned indirectly through the HyperNetwork: gradients flow through the generated weight tensors back into the HyperNetwork, which is what the optimizer actually updates.
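A minimal, self-contained sketch of this mechanism (the `TinyHyperNetwork` class, its embedding `z`, and all sizes here are illustrative assumptions, not the repo's actual HyperNetwork): only the hypernetwork's parameters go to the optimizer, yet the generated conv weights change after a step because they are a differentiable function of those parameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyHyperNetwork(nn.Module):
    """Maps a learned embedding z to a conv weight tensor (hypothetical sketch)."""
    def __init__(self, z_dim=8, out_ch=4, in_ch=4, k=3):
        super().__init__()
        self.z = nn.Parameter(torch.randn(z_dim))             # learned embedding
        self.proj = nn.Linear(z_dim, out_ch * in_ch * k * k)  # weight generator
        self.shape = (out_ch, in_ch, k, k)

    def forward(self):
        return self.proj(self.z).view(self.shape)

torch.manual_seed(0)
hyper = TinyHyperNetwork()
x = torch.randn(2, 4, 8, 8)

# Only the hypernetwork's parameters are registered with the optimizer;
# the conv weights themselves are intermediate tensors on the autograd graph.
opt = torch.optim.SGD(hyper.parameters(), lr=0.1)

conv_w = hyper()                        # generated conv weights
out = F.conv2d(x, conv_w, padding=1)
loss = out.pow(2).mean()
opt.zero_grad()
loss.backward()                         # gradients flow through conv_w into hyper
opt.step()

# Regenerating the weights now gives different values: conv_w changed
# only because the hypernetwork's parameters changed.
new_w = hyper()
print(torch.allclose(conv_w, new_w))    # → False
```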