Code:
- import torch as t
- from torch import nn
- from torch.autograd import Variable as V
-
- t.manual_seed(1000)
-
- # Input: batch_size=3, each sequence has length 2, and each element in a sequence is 4-dimensional
-
- input=V(t.randn(2,3,4))
- print(input)
-
- # LSTM: 4-dimensional input vectors, 3 hidden units, 1 layer
-
- lstm=nn.LSTM(4,3,1)
-
- # Initial state: 1 layer, batch_size=3, 3 hidden units
-
- h0=V(t.randn(1,3,3))
- c0=V(t.randn(1,3,3))
-
- out,(hn,cn)=lstm((input,h0),c0)   # wrong: raises AttributeError (see traceback below)
- out,(hn,cn)=lstm(input,(h0,c0))   # correct: the hidden state is passed as one tuple (h0, c0)
- print(out)
- print("hn=\n",hn)
- print("cn=\n",cn)
-
- out,hn=lstm(input,(h0,c0))        # here hn is the whole (hn, cn) tuple
- print(out)
- print("hn=\n",hn)
Written this way, out,(hn,cn)=lstm((input,h0),c0) raises an error (a small standalone check after the traceback shows why):
- /home/wangbin/anaconda3/envs/deep_learning/bin/python3.7 /home/wangbin/anaconda3/envs/deep_learning/project/main.py
- Traceback (most recent call last):
- File "/home/wangbin/anaconda3/envs/deep_learning/project/main.py", line 21, in <module>
- out,(hn,cn)=lstm((input,h0),c0)
- File "/home/wangbin/anaconda3/envs/deep_learning/lib/python3.7/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
- result = self.forward(*input, **kwargs)
- File "/home/wangbin/anaconda3/envs/deep_learning/lib/python3.7/site-packages/torch/nn/modules/rnn.py", line 564, in forward
- max_batch_size = input.size(0) if self.batch_first else input.size(1)
- AttributeError: 'tuple' object has no attribute 'size'
- tensor([[[-0.5306, -1.1300, -0.6734, -0.7669],
- [-0.7029, 0.9896, -0.4482, 0.8927],
- [-0.6043, 1.0726, 1.0481, 1.0527]],
-
- [[-0.6424, -1.2234, -1.0794, -0.6037],
- [-0.7926, -0.1414, -1.0225, -0.0482],
- [ 0.6610, -0.8908, 1.4793, -0.3934]]])
-
- Process finished with exit code 1
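The direct cause is that the tuple (input, h0) is handed to forward() as its input argument, so input.size(1) is called on a Python tuple. A tiny standalone check (a hypothetical snippet, only to reproduce the failure) confirms this:
- import torch
-
- # (input, h0) bundled into one tuple, exactly what lstm((input, h0), c0) passes as `input`
- pair = (torch.randn(2, 3, 4), torch.randn(1, 3, 3))
- try:
-     pair.size(1)               # what rnn.py does when batch_first is False
- except AttributeError as e:
-     print(e)                   # 'tuple' object has no attribute 'size'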
Looking at the source pointed to by the error message:
We can see that LSTM's forward has three parameters, self, input and hx, and hx can itself be split further (into h0 and c0). A sketch of the call forms forward accepts follows the source listing below.
- def forward(self, input, hx=None): # noqa: F811
- orig_input = input
- # xxx: isinstance check needs to be in conditional for TorchScript to compile
- if isinstance(orig_input, PackedSequence):
- input, batch_sizes, sorted_indices, unsorted_indices = input
- max_batch_size = batch_sizes[0]
- max_batch_size = int(max_batch_size)
- else:
- batch_sizes = None
- max_batch_size = input.size(0) if self.batch_first else input.size(1)
- sorted_indices = None
- unsorted_indices = None
-
- if hx is None:
- num_directions = 2 if self.bidirectional else 1
- zeros = torch.zeros(self.num_layers * num_directions,
- max_batch_size, self.hidden_size,
- dtype=input.dtype, device=input.device)
- hx = (zeros, zeros)
- else:
- # Each batch of the hidden state should match the input sequence that
- # the user believes he/she is passing in.
- hx = self.permute_hidden(hx, sorted_indices)
-
- self.check_forward_args(input, hx, batch_sizes)
- if batch_sizes is None:
- result = _VF.lstm(input, hx, self._flat_weights, self.bias, self.num_layers,
- self.dropout, self.training, self.bidirectional, self.batch_first)
- else:
- result = _VF.lstm(input, batch_sizes, hx, self._flat_weights, self.bias,
- self.num_layers, self.dropout, self.training, self.bidirectional)
- output = result[0]
- hidden = result[1:]
- # xxx: isinstance check needs to be in conditional for TorchScript to compile
- if isinstance(orig_input, PackedSequence):
- output_packed = PackedSequence(output, batch_sizes, sorted_indices, unsorted_indices)
- return output_packed, self.permute_hidden(hidden, unsorted_indices)
- else:
- return output, self.permute_hidden(hidden, unsorted_indices)
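Based on this signature, input must be the input tensor itself (or a PackedSequence), and the whole hidden state goes in as the single optional argument hx=(h0, c0). Below is a minimal sketch of the two call forms forward accepts, assuming the shapes from the example program above:
- import torch
- from torch import nn
-
- torch.manual_seed(1000)
-
- lstm = nn.LSTM(4, 3, 1)        # input_size=4, hidden_size=3, num_layers=1
- x = torch.randn(2, 3, 4)       # (seq_len=2, batch_size=3, input_size=4)
- h0 = torch.randn(1, 3, 3)      # (num_layers * num_directions, batch_size, hidden_size)
- c0 = torch.randn(1, 3, 3)
-
- # 1) hx omitted: the `if hx is None` branch above fills in zero-valued h0/c0
- out, (hn, cn) = lstm(x)
-
- # 2) hx passed explicitly, but always as one tuple (h0, c0)
- out, (hn, cn) = lstm(x, (h0, c0))
-
- print(out.shape)               # torch.Size([2, 3, 3])
- print(hn.shape, cn.shape)      # torch.Size([1, 3, 3]) torch.Size([1, 3, 3])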
If it is written as out,(hn,cn)=lstm(input,h0,c0) instead, it still fails:
- /home/wangbin/anaconda3/envs/deep_learning/bin/python3.7 /home/wangbin/anaconda3/envs/deep_learning/project/main.py
- tensor([[[-0.5306, -1.1300, -0.6734, -0.7669],
- [-0.7029, 0.9896, -0.4482, 0.8927],
- [-0.6043, 1.0726, 1.0481, 1.0527]],
-
- [[-0.6424, -1.2234, -1.0794, -0.6037],
- [-0.7926, -0.1414, -1.0225, -0.0482],
- [ 0.6610, -0.8908, 1.4793, -0.3934]]])
- Traceback (most recent call last):
- File "/home/wangbin/anaconda3/envs/deep_learning/project/main.py", line 21, in <module>
- out,(hn,cn)=lstm(input,h0,c0)
- File "/home/wangbin/anaconda3/envs/deep_learning/lib/python3.7/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
- result = self.forward(*input, **kwargs)
- TypeError: forward() takes from 2 to 3 positional arguments but 4 were given
-
- Process finished with exit code 1
The cause of this error is that self is an implicit parameter: it is not written at the call site, but it is already part of forward()'s signature and is passed automatically, so the extra c0 brings the call to four positional arguments while forward accepts at most three.
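As a quick illustration of how self counts toward the reported argument total, here is a toy class (hypothetical, only for illustration, not PyTorch code) whose __call__ dispatches to forward() the same way nn.Module._call_impl does:
- class Toy:
-     # same shape of signature as nn.LSTM.forward: self + input + optional hx
-     def forward(self, input, hx=None):
-         return input, hx
-
-     def __call__(self, *args, **kwargs):
-         # mirrors _call_impl: result = self.forward(*input, **kwargs)
-         return self.forward(*args, **kwargs)
-
- m = Toy()
- m(1)             # ok: forward(self, 1)          -> 2 positional arguments
- m(1, (2, 3))     # ok: forward(self, 1, (2, 3))  -> 3 positional arguments
- try:
-     m(1, 2, 3)   # forward(self, 1, 2, 3)        -> 4 positional arguments
- except TypeError as e:
-     print(e)     # forward() takes from 2 to 3 positional arguments but 4 were given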