I have looked over the code in the repository and still have not found a difference in my code that would cause it to break. I have reworked the entire chapter up to this point and still cannot figure it out. Below is the error message; my code in its entirety is attached.
```
In [121]: with tqdm.trange(3000) as t:
     ...:     for epoch in t:
     ...:         epoch_loss = 0.0
     ...:         for x, y in zip(xs, ys):
     ...:             predicted = net.forward(x)
     ...:             epoch_loss += loss.loss(predicted, y)
     ...:             gradient = loss.gradient(predicted, y)
     ...:             net.backward(gradient)
     ...:             optimizer.step(net)
     ...:         t.set_description(f"xor loss {epoch_loss:.3f}")
     ...:
  0%|          | 0/3000 [00:00<?, ?it/s]
NotImplementedError                       Traceback (most recent call last)
<ipython-input-...> in <module>
      3             epoch_loss = 0.0
      4             for x, y in zip(xs, ys):
----> 5                 predicted = net.forward(x)
      6                 epoch_loss += loss.loss(predicted, y)
      7                 gradient = loss.gradient(predicted, y)

<ipython-input-...> in forward(self, input)
     10         """Just forward the input through the layers in order."""
     11         for layer in self.layers:
---> 12             input = layer.forward(input)
     13         return input
     14     def backward(self, gradient):

<ipython-input-...> in forward(self, input)
      6         Note the lack of types. We're not going to be prescriptive about what kinds of inputs layers can take and what kinds of outputs they can return
      7         """
----> 8         raise NotImplementedError
      9     def backward(self, gradient):
     10         """
NotImplementedError:
```
dsfs_deeplearningissue.txt
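The NotImplementedError is raised by the base Layer.forward, so it looks like at least one entry in net.layers is still a bare Layer, or a subclass whose forward never actually overrides the base method (for example, because of an indentation slip that leaves forward outside the class body). For reference, here is a minimal sketch of the Layer / Sequential structure I am working from, assuming the chapter's interface; the Tensor alias here is just a placeholder for the one in the attached file.

```python
from typing import Iterable, List

Tensor = list  # placeholder alias: the chapter represents tensors as nested lists of floats


class Layer:
    """Abstract base class: every concrete layer must override forward and backward."""
    def forward(self, input):
        # Reaching this line is exactly what the traceback shows:
        # some layer in the network never supplied its own forward.
        raise NotImplementedError

    def backward(self, gradient):
        raise NotImplementedError

    def params(self) -> Iterable[Tensor]:
        return ()

    def grads(self) -> Iterable[Tensor]:
        return ()


class Sequential(Layer):
    """A layer made of a sequence of other layers, applied in order."""
    def __init__(self, layers: List[Layer]) -> None:
        self.layers = layers

    def forward(self, input):
        """Just forward the input through the layers in order."""
        for layer in self.layers:
            input = layer.forward(input)  # fails here if `layer` doesn't override forward
        return input

    def backward(self, gradient):
        """Back-propagate the gradient through the layers in reverse order."""
        for layer in reversed(self.layers):
            gradient = layer.backward(gradient)
        return gradient
```

If the layers in the attached file follow this pattern and the error persists, it may be that net was built from an earlier definition of those classes in the same session, so re-running the layer definitions and rebuilding net before the training loop would be worth trying.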