How are backward and step associated with model parameter updates?

Posted by yaohong on Wednesday, November 17, 2021



The optimizer is constructed with the model's parameters, so it can update them. But how does the loss function get associated with those parameters?

The link is the autograd graph and each parameter's `.grad` attribute. When the loss is computed from the model's output, autograd records every operation involving the parameters. Calling

loss.backward()

walks that graph backward and accumulates each parameter's gradient into its `.grad` attribute. Then calling

optimizer.step()

makes the optimizer iterate over the parameters it was given at construction and update each one in place using its `.grad`. Nothing is passed between the two calls; they communicate through the shared parameter tensors.
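A minimal sketch of this linkage, using a toy linear model (the model, data, and learning rate here are illustrative, not from any particular tutorial):

```python
import torch

# Toy model: one linear layer. The optimizer receives the SAME parameter
# tensors the model holds, so the update is visible to both.
model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(4, 2)
y = torch.randn(4, 1)
loss = torch.nn.functional.mse_loss(model(x), y)

# Before backward(), no gradient has been computed yet.
assert model.weight.grad is None

loss.backward()           # autograd fills in each parameter's .grad

assert model.weight.grad is not None

before = model.weight.detach().clone()
optimizer.step()          # reads each registered parameter's .grad and updates it in place

assert not torch.equal(before, model.weight)

optimizer.zero_grad()     # clear gradients before the next iteration
```

Note that `backward()` *accumulates* into `.grad`, which is why `optimizer.zero_grad()` is normally called once per iteration.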

REFERENCES:

1. pytorch - connection between loss.backward() and optimizer.step()

2. https://pytorch.org/tutorials/beginner/former_torchies/nnft_tutorial.html#forward-and-backward-function-hooks
