
[Repost] MATLAB code for RNN and LSTM


% implementation of LSTM

clc

clear

close all

%% training dataset generation

binary_dim  = 8;

largest_number = 2^binary_dim - 1;

binary = cell(largest_number + 1, 1);   % one entry per value 0 .. largest_number

for i = 1:largest_number + 1
    binary{i}     = dec2bin(i-1, binary_dim);   % fixed-width 8-bit string
    int2binary{i} = binary{i};                  % lookup table: value + 1 -> bit string
end
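% Quick illustration (not in the original script): the lookup table is
% indexed by value + 1, since MATLAB cell indices start at 1.
disp(int2binary{5 + 1})    % prints '00000101', the 8-bit form of 5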

%% input variables

alpha  = 0.1;

input_dim  = 2;

hidden_dim = 32;

output_dim = 1;
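% The dataset above suggests the usual binary-addition toy task: at each
% time step the network presumably sees one bit from each of two addends
% (hence input_dim = 2) and predicts the matching bit of their sum
% (hence output_dim = 1); alpha is the learning rate. A hypothetical
% illustration of how one time-step input could be assembled:
a_bits = int2binary{23 + 1};                     % '00010111'
b_bits = int2binary{45 + 1};                     % '00101101'
pos    = binary_dim;                             % least-significant bit
X_demo = [a_bits(pos) - '0', b_bits(pos) - '0']; % 1-by-2 input for this step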

%% initialize neural network

weights

% in_gate  = sigmoid(X(t) * U_i + H(t-1) * W_i)

------- (1)

U_i = 2 * rand(input_dim, hidden_dim) - 1;

W_i = 2 * rand(hidden_dim, hidden_dim) - 1;

U_i_update = zeros(size(U_i));

W_i_update = zeros(size(W_i));

% forget_gate = sigmoid(X(t) * U_f + H(t-1) * W_f)    ------- (2)

U_f = 2 * rand(input_dim, hidden_dim) - 1;

W_f = 2 * rand(hidden_dim, hidden_dim) - 1;

U_f_update = zeros(size(U_f));

W_f_update = zeros(size(W_f));

% out_gate = sigmoid(X(t) * U_o + H(t-1) * W_o)    ------- (3)

U_o = 2 * rand(input_dim, hidden_dim) - 1;

W_o = 2 * rand(hidden_dim, hidden_dim) - 1;

U_o_update = zeros(size(U_o));

W_o_update = zeros(size(W_o));

% g_gate = tanh(X(t) * U_g + H(t-1) * W_g)    ------- (4)

U_g = 2 * rand(input_dim, hidden_dim) - 1;

W_g = 2 * rand(hidden_dim, hidden_dim) - 1;

U_g_update = zeros(size(U_g));

W_g_update = zeros(size(W_g));

out_para = 2 * rand(hidden_dim, output_dim) - 1;

out_para_update = zeros(size(out_para));
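% Note: 2 * rand(n, m) - 1 draws every weight uniformly from [-1, 1];
% out_para is the hidden-to-output matrix used in equation (7) below.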

% C(t) = C(t-1) .* forget_gate + g_gate .* in_gate    ------- (5)
% S(t) = tanh(C(t)) .* out_gate                        ------- (6)
% Out  = sigmoid(S(t) * out_para)                      ------- (7)

% Note: Equations (1)-(6) are the core of the LSTM forward pass, and
% equation (7) is used to map the hidden layer to the predicted output,
% i.e., the output layer. (Sometimes a softmax is used for equation (7).)
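% For reference, a minimal sketch (not part of the original post) of one
% forward time step exactly as written in equations (1)-(7). X, H_prev and
% C_prev are hypothetical placeholder row vectors here, and sigmoid is
% assumed to be a plain elementwise logistic function, since base MATLAB
% does not ship one for ordinary double arrays.
sigmoid = @(x) 1 ./ (1 + exp(-x));       % assumed elementwise logistic
X      = zeros(1, input_dim);            % placeholder input at time t
H_prev = zeros(1, hidden_dim);           % placeholder H(t-1)
C_prev = zeros(1, hidden_dim);           % placeholder C(t-1)

in_gate     = sigmoid(X * U_i + H_prev * W_i);    % (1)
forget_gate = sigmoid(X * U_f + H_prev * W_f);    % (2)
out_gate    = sigmoid(X * U_o + H_prev * W_o);    % (3)
g_gate      = tanh(X * U_g + H_prev * W_g);       % (4)
C   = C_prev .* forget_gate + g_gate .* in_gate;  % (5)
S   = tanh(C) .* out_gate;                        % (6)
Out = sigmoid(S * out_para);                      % (7)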

%% train

iter = 99999;   % number of training iterations
