This is a very simple RNN built on top of Theano; a rough sketch of the
forward step follows the list below.
- No learning methods.
- No error functions.
- Only input, hidden and output layers, plus a bias for the hidden layer.
- Recurrent weights only from hidden to hidden.
- Only tanh, sig and identity transfer functions.
- Long short-term memory (LSTM) cells are available.
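In code, the forward step amounts to roughly this (a sketch only, not the
library's actual implementation; the names step, W_in, W_rec and b are made up
for illustration):

>>> import scipy
>>> def step(x_t, h_prev, W_in, W_rec, b):
...     # New hidden state from the current input and the previous hidden
...     # state; scipy.tanh stands in for whichever transfer function is used.
...     return scipy.tanh(scipy.dot(x_t, W_in) + scipy.dot(h_prev, W_rec) + b)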
Will be extended, depending on what Theano offers (they plan to add BPTT) and
how much time I have.
Performance: Quick tests show that this implementation is roughly 20% as fast as
PyBrain with arac.
Usage:
>>> import scipy
>>> from rnn import RecurrentNetwork, LstmNetwork
Create a network with one input, three hidden units and one output. The
transfer functions can be specified as 'sig', 'tanh' or 'id'.
>>> net = RecurrentNetwork(1, 3, 1, hiddenfunc='tanh', outfunc='id')
If you want to use LSTM cells, call
>>> net = LstmNetwork(1, 3, 1, outfunc='id')
Generate a random input sequence of length 3.
>>> inpt = scipy.random.random((3, 1))
Activate the sequence in the network.
>>> net(inpt)
[array([[-0.86715716, -0.38825977, -0.8660872 ],
[-0.15247899, 0.787628 , -0.88322586],
[-0.95365858, -0.55350506, -0.89969933]], dtype=float32), array([[ 1.33543777],
[-0.95650125],
[ 1.68285382]], dtype=float32)]
The first list item is the hidden activations, the second is the outputs. In
the case of LSTMs, there are three arrays (state, hidden, output).
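Assuming that ordering, the return value can be unpacked directly. For a
RecurrentNetwork:

>>> hidden, output = net(inpt)

and for an LstmNetwork:

>>> state, hidden, output = net(inpt)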