PyTorch 12. Hook usage
2022-04-23 07:28:00 【DCGJ666】
hook
- Because PyTorch automatically discards the intermediate results of the computation graph, you need hook functions if you want to keep these values. There are hooks for Variables (Tensors) and hooks for nn.Module, and their usage is similar.
- A hook function should not modify its input, but it may return a new gradient that replaces the current one; that is, whatever the hook returns is used as the gradient from then on.
register_hook
A hook function registered on a Tensor (Variable).
import torch

grad_list = []

def print_grad(grad):
    # Called with the gradient of y during the backward pass
    grad_list.append(grad)

x = torch.randn(2, 1, requires_grad=True)   # the leaf tensor must require grad
y = x + 2
y.register_hook(print_grad)
y.backward(torch.ones_like(y))              # y is not a scalar, so pass an explicit gradient
When the whole network runs its backward pass and reaches the tensor on which the hook is registered, the gradient of that tensor is passed to print_grad, which appends it to grad_list.
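As noted above, a hook may also return a new gradient to replace the current one. A minimal sketch, assuming a simple scaling hook (the factor of 2 and the variable names are illustrative, not from the original post):

import torch

def double_grad(grad):
    # Returning a tensor from the hook replaces the gradient that continues to flow backward
    return grad * 2

x = torch.randn(2, 1, requires_grad=True)
y = x + 2
y.register_hook(double_grad)
y.sum().backward()
print(x.grad)   # 2 * d(y.sum())/dx, i.e. a (2, 1) tensor of twos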
register_forward_hook
A hook function for a network layer (nn.Module). For a concrete example of visualizing the output of specific layers, see my other post.
A hook should not modify its input or output, and it should be removed promptly after use to avoid the overhead of running it on every forward pass. Hooks are mainly used to obtain intermediate results, such as the output of an intermediate layer or the gradient of a layer.
import torch

model = VGG()                 # VGG and layer8 below are placeholders for your own model and layer
features = torch.Tensor()

def hook(module, input, output):
    # Copy this layer's output into features
    # (resize the empty tensor first so copy_ sees matching shapes)
    features.resize_(output.data.size()).copy_(output.data)

handle = model.layer8.register_forward_hook(hook)
_ = model(input)              # input is a placeholder for your own data
# Remove the hook once it has done its job
handle.remove()
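The snippet above is schematic: VGG, layer8, and input stand in for your own model, layer, and data. A self-contained sketch with a small made-up model, assuming you only want to capture one layer's output:

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 8),
    nn.ReLU(),
    nn.Linear(8, 4),
)

features = []

def hook(module, input, output):
    # Detach so the stored activation does not keep the autograd graph alive
    features.append(output.detach())

handle = model[1].register_forward_hook(hook)   # hook on the ReLU layer
_ = model(torch.randn(3, 10))
handle.remove()

print(features[0].shape)   # torch.Size([3, 8])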
register_backward_hook
First, the notion of a Container: when a Module's forward function involves only a single Function, it is called a Module; if a Module contains other Modules, it is called a Container.
register_backward_hook registers a backward hook on a module. It should only be used on a Module, not on a Container.
The hook is called every time the gradients with respect to the module's inputs are computed.
hook(module, grad_input, grad_output) -> Tensor or None
It retrieves the gradients of the module, similarly to register_hook on a tensor.
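A minimal sketch, assuming a single nn.Linear module (the shapes and names are illustrative; note that with the old-style register_backward_hook the contents of grad_input are not guaranteed to be intuitive, which is why recent PyTorch versions recommend register_full_backward_hook instead):

import torch
import torch.nn as nn

linear = nn.Linear(4, 2)

def backward_hook(module, grad_input, grad_output):
    # grad_output: gradients w.r.t. the module's outputs
    # grad_input: gradients w.r.t. the inputs of the module's backward (entries may be None)
    print("grad_output:", [None if g is None else tuple(g.shape) for g in grad_output])
    print("grad_input: ", [None if g is None else tuple(g.shape) for g in grad_input])

handle = linear.register_backward_hook(backward_hook)
out = linear(torch.randn(3, 4))
out.sum().backward()
handle.remove()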
Copyright notice
This article was created by [DCGJ666]. Please include a link to the original when reposting. Thanks.
https://yzsam.com/2022/04/202204230611343899.html
边栏推荐
猜你喜欢

GIS实战应用案例100篇(五十一)-ArcGIS中根据指定的范围计算nc文件逐时次空间平均值的方法

AUTOSAR从入门到精通100讲(八十四)-UDS之时间参数总结篇

Use originpro express for free

北峰油气田自组网无线通信对讲系统解决方案

【点云系列】Pointfilter: Point Cloud Filtering via Encoder-Decoder Modeling

RISCV MMU 概述

PyTorch 10. 学习率

【期刊会议系列】IEEE系列模板下载指南

使用proteus仿真STM32超声波SRF04测距!Code+Proteus

基于openmv的无人机Apriltag动态追踪降落完整项目资料(labview+openmv+apriltag+正点原子四轴)
随机推荐
机器学习——PCA与LDA
enforce fail at inline_container.cc:222
Modifying a column with the 'identity' pattern is not supported
多机多卡训练时的错误
网络层重要知识(面试、复试、期末)
AUTOSAR从入门到精通100讲(五十一)-AUTOSAR网络管理
Unable to determine the device handle for GPU 0000:02:00.0: GPU is lost.
Mysql database installation and configuration details
【51单片机交通灯仿真】
GIS实战应用案例100篇(三十四)-拼接2020globeland30
armv8m(cortex m33) MPU实战
x509解析
关于短视频平台框架搭建与技术选型探讨
Solution to slow compilation speed of Xcode
PyTorch 18. torch. backends. cudnn
torch.where能否传递梯度
AUTOSAR从入门到精通100讲(五十二)-诊断和通信管理功能单元
F. The wonderful use of pad
By onnx checker. check_ Common errors detected by model
MySQL installation and configuration - detailed tutorial