Declaring a simple Linear Regression model
import torch
import torch.nn as nn

class LR(nn.Module):
    def __init__(self, in_size, output_size):
        super(LR, self).__init__()
        self.linear = nn.Linear(in_size, output_size)

    def forward(self, x):
        out = self.linear(x)
        return out
# Create a Linear Regression model
model = LR(1, 1)
# Inspect the Linear Regression weight and bias
model.state_dict()
"""
OrderedDict([('linear.weight', tensor([[-0.3027]])),
('linear.bias', tensor([0.4232]))])
"""
list(model.parameters())  # parameters() returns a generator, so wrap it in list() to print
"""
[Parameter containing:
tensor([[-0.3027]], requires_grad=True), Parameter containing:
tensor([0.4232], requires_grad=True)]
"""
# To modify the weight and bias values
model.state_dict()['linear.weight'].data[0] = torch.tensor([0.5133])
model.state_dict()['linear.bias'].data[0] = torch.tensor([-0.4414])
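Writing through `state_dict()` works because it returns references to the live parameter tensors, but a more explicit pattern (a sketch, equivalent in effect) is to copy into the parameters under `torch.no_grad()` so autograd does not track the in-place update:

```python
import torch
import torch.nn as nn

model = nn.Linear(1, 1)

# Assign the same values as above, but via the parameters directly.
# no_grad() keeps the in-place copy out of the autograd graph.
with torch.no_grad():
    model.weight.copy_(torch.tensor([[0.5133]]))
    model.bias.copy_(torch.tensor([-0.4414]))
```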
x = torch.tensor([1.0])
yhat = model(x)
yhat  # tensor([[0.0719]]), i.e. 0.5133 * 1.0 - 0.4414
x = torch.tensor([[1.0], [2.0], [3.0]])
yhat = model(x)
yhat
"""
tensor([[0.0719],
        [0.5852],
        [1.0985]], grad_fn=<AddmmBackward>)
"""
nn.Linear(in_size, output_size)
in_size and output_size determine the parameter shapes: the weight has shape (output_size, in_size) and the bias has shape (output_size,).
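For example, with larger sizes the shapes follow directly (the 3 and 2 below are arbitrary illustration values):

```python
import torch.nn as nn

layer = nn.Linear(3, 2)  # in_size=3, output_size=2
layer.weight.shape       # torch.Size([2, 3]) -- (output_size, in_size)
layer.bias.shape         # torch.Size([2])    -- (output_size,)
```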