PyTorch Deep Learning Practice, Lecture 11: Advanced Convolutional Neural Networks
2022-04-23 05:32:00 【Muxi dare】
Notes on Lecture 11 of Mr. Liu Er's Bilibili course "PyTorch Deep Learning Practice": GoogLeNet + Deep Residual Learning
Lecture 11: Advanced Convolutional Neural Networks
GoogLeNet
Look for repeated structures in a complex network and factor them into a reusable function or class → the Inception Module
Inception Module
We don't know in advance which kernel size works best, so the Inception Module stacks several convolution branches in parallel; training then increases the weights of the useful branches and decreases the weights of the poor ones.
In effect, this enumerates the candidate hyper-parameters and lets gradient descent pick the best combination automatically.
Note that every branch must produce the same spatial output size, so that the branches can be concatenated.
1x1 convolution
Channel fusion: a 1x1 convolution mixes the information from all input channels at each spatial position, and can also change the number of channels.
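A minimal sketch of this idea (the 192/16/32 channel counts and 28x28 size are illustrative, not from the lecture): a 1x1 convolution fuses 192 channels down to 16, and inserting it before a 5x5 convolution cuts the multiplication count by roughly a factor of ten.

```python
import torch
import torch.nn as nn

# A 1x1 convolution is a per-pixel linear mix of the input channels:
# each output channel is a weighted sum of all input channels.
x = torch.randn(1, 192, 28, 28)             # batch, channels, H, W
reduce = nn.Conv2d(192, 16, kernel_size=1)  # channel fusion: 192 -> 16
y = reduce(x)
print(y.shape)                              # torch.Size([1, 16, 28, 28])

# Why it helps: a direct 5x5 conv (192 -> 32) costs far more multiplications
# than a 1x1 reduction (192 -> 16) followed by a 5x5 conv (16 -> 32).
direct = 5 * 5 * 192 * 32 * 28 * 28
reduced = 1 * 1 * 192 * 16 * 28 * 28 + 5 * 5 * 16 * 32 * 28 * 28
print(direct, reduced)  # the reduced path is roughly a tenth of the cost
```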
Implementation of Inception Module
import torch
import torch.nn as nn
import torch.nn.functional as F

class InceptionA(nn.Module):
    """Inception module with four parallel branches."""
    def __init__(self, in_channels):
        super(InceptionA, self).__init__()
        # Branch 1: a single 1x1 convolution
        self.branch1x1 = nn.Conv2d(in_channels, 16, kernel_size=1)
        # Branch 2: 1x1 reduction followed by a 5x5 convolution
        self.branch5x5_1 = nn.Conv2d(in_channels, 16, kernel_size=1)
        self.branch5x5_2 = nn.Conv2d(16, 24, kernel_size=5, padding=2)
        # Branch 3: 1x1 reduction followed by two 3x3 convolutions
        self.branch3x3_1 = nn.Conv2d(in_channels, 16, kernel_size=1)
        self.branch3x3_2 = nn.Conv2d(16, 24, kernel_size=3, padding=1)
        self.branch3x3_3 = nn.Conv2d(24, 24, kernel_size=3, padding=1)
        # Branch 4: average pooling followed by a 1x1 convolution
        self.branch_pool = nn.Conv2d(in_channels, 24, kernel_size=1)

    def forward(self, x):
        branch1x1 = self.branch1x1(x)

        branch5x5 = self.branch5x5_1(x)
        branch5x5 = self.branch5x5_2(branch5x5)

        branch3x3 = self.branch3x3_1(x)
        branch3x3 = self.branch3x3_2(branch3x3)
        branch3x3 = self.branch3x3_3(branch3x3)

        branch_pool = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1)
        branch_pool = self.branch_pool(branch_pool)

        # 16 + 24 + 24 + 24 = 88 output channels in total
        outputs = [branch1x1, branch5x5, branch3x3, branch_pool]
        return torch.cat(outputs, dim=1)  # concatenate along dim=1 (channels)
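A quick sanity check of the concatenation (dummy tensors stand in for the four branch outputs; the 12x12 spatial size is illustrative): the branch widths 16 + 24 + 24 + 24 sum to 88, which is where conv2's 88 input channels come from.

```python
import torch

# Dummy outputs matching the four branch widths in InceptionA:
# 1x1 branch -> 16, 5x5 branch -> 24, 3x3 branch -> 24, pool branch -> 24.
n, h, w = 1, 12, 12
branches = [torch.randn(n, c, h, w) for c in (16, 24, 24, 24)]

out = torch.cat(branches, dim=1)  # concatenate along the channel axis
print(out.shape)  # torch.Size([1, 88, 12, 12])
```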
Using the Inception Module
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(88, 20, kernel_size=5)  # 88 = InceptionA output channels
        self.incep1 = InceptionA(in_channels=10)
        self.incep2 = InceptionA(in_channels=20)
        self.mp = nn.MaxPool2d(2)
        self.fc = nn.Linear(1408, 10)  # 88 channels * 4 * 4 spatial = 1408

    def forward(self, x):
        in_size = x.size(0)
        x = F.relu(self.mp(self.conv1(x)))
        x = self.incep1(x)
        x = F.relu(self.mp(self.conv2(x)))
        x = self.incep2(x)
        x = x.view(in_size, -1)  # flatten for the fully connected layer
        x = self.fc(x)
        return x
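Where does 1408 come from? A small bookkeeping sketch, assuming the 28x28 MNIST input used in this course: each 5x5 convolution (no padding) shrinks the side by 4, each max-pool halves it, and the Inception module preserves the spatial size because all of its branches are padded.

```python
# Spatial-size bookkeeping for a 28x28 MNIST input (assumed by the lecture).
def conv_out(size, kernel, padding=0, stride=1):
    return (size + 2 * padding - kernel) // stride + 1

s = 28
s = conv_out(s, 5) // 2      # conv1 (5x5, no padding) -> 24, maxpool /2 -> 12
# InceptionA keeps the spatial size; channels become 88
s = conv_out(s, 5) // 2      # conv2 (5x5) -> 8, maxpool /2 -> 4
features = 88 * s * s        # 88 channels * 4 * 4 = 1408
print(features)              # 1408 -- the in_features of self.fc
```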
Reproduced results (accuracy curve)
Watch the test accuracy to decide how many epochs to train; whenever the test-set accuracy reaches a new high, save the model's parameters.
Stacking too many layers can make the gradient vanish!
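A minimal sketch of that "save on a new high" pattern. The tiny model and the hard-coded per-epoch accuracies here are stand-ins for the real training loop and its evaluation step.

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # stand-in for the real network
best_acc = 0.0
for epoch, acc in enumerate([0.90, 0.95, 0.93, 0.97]):  # dummy test accuracies
    if acc > best_acc:  # new high on the test set
        best_acc = acc
        torch.save(model.state_dict(), "best_model.pt")  # keep the best weights
print(best_acc)  # 0.97

# Later: model.load_state_dict(torch.load("best_model.pt"))
```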
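An illustration of why (a toy chain of sigmoids, not the lecture's network): each layer multiplies the backward gradient by the local derivative sigmoid'(x) <= 0.25, so the gradient shrinks geometrically with depth.

```python
import torch

# Gradient through a chain of sigmoids: each step scales the gradient
# by sigmoid'(x) <= 0.25, so 20 stacked layers shrink it to almost nothing.
x = torch.tensor(0.5, requires_grad=True)
y = x
for _ in range(20):        # 20 stacked sigmoid "layers"
    y = torch.sigmoid(y)
y.backward()
print(x.grad)              # a vanishingly small positive number
```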
Deep Residual Learning (residual networks)
Residual Block
class ResidualBlock(nn.Module):
    """Residual block: two 3x3 convolutions plus a skip connection."""
    def __init__(self, channels):
        super(ResidualBlock, self).__init__()
        self.channels = channels
        # Same in/out channels and padding=1 keep the shape unchanged,
        # so x and y can be added element-wise.
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        y = F.relu(self.conv1(x))
        y = self.conv2(y)
        return F.relu(x + y)  # add the skip connection before the final ReLU
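Why the skip connection fixes vanishing gradients: the block computes y = x + f(x), whose gradient is 1 + f'(x), so even when f'(x) is nearly zero the identity path still carries a gradient of about 1. A toy autograd check (tanh stands in for the convolutional branch):

```python
import torch

# y = x + f(x) has gradient 1 + f'(x): the identity path passes the
# gradient through even when the branch's local gradient is tiny.
x = torch.tensor(0.5, requires_grad=True)
f = torch.tanh(x * 1e-3)   # a branch whose local gradient is ~1e-3
y = x + f                  # residual connection
y.backward()
print(x.grad)              # ~1.001: the gradient stays alive
```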
Implementation of Simple Residual Network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=5)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=5)
        self.mp = nn.MaxPool2d(2)
        self.rblock1 = ResidualBlock(16)
        self.rblock2 = ResidualBlock(32)
        self.fc = nn.Linear(512, 10)  # 32 channels * 4 * 4 spatial = 512

    def forward(self, x):
        in_size = x.size(0)
        x = self.mp(F.relu(self.conv1(x)))
        x = self.rblock1(x)
        x = self.mp(F.relu(self.conv2(x)))
        x = self.rblock2(x)
        x = x.view(in_size, -1)  # flatten for the fully connected layer
        x = self.fc(x)
        return x
Copyright notice
This article was written by [Muxi dare]; please include the original link when reposting:
https://yzsam.com/2022/04/202204220535577592.html