Notes: PyTorch Learning, Part 4
2022-08-08 17:33:00 【安联之夜】
1. Convolution
import torch
import torch.nn as nn
from torch.nn import functional as F

# Method 1: the nn.Conv2d layer
layer = nn.Conv2d(1, 3, kernel_size=3, stride=1, padding=0)
x = torch.rand(1, 1, 28, 28)
out = layer(x)  # calling the layer invokes forward(); output shape (1, 3, 26, 26)
layer = nn.Conv2d(1, 3, kernel_size=3, stride=1, padding=1)
out = layer(x)  # (1, 3, 28, 28)
layer = nn.Conv2d(1, 3, kernel_size=3, stride=2, padding=1)
out = layer(x)  # (1, 3, 14, 14)
out.shape
layer.weight  # shape (3, 1, 3, 3): 3 output channels, 1 input channel, 3x3 kernel
layer.bias    # one bias per output channel

# Method 2: the functional interface
w = torch.rand(16, 3, 5, 5)  # 16 kernels, 3 input channels, 5x5 kernel
b = torch.rand(16)
x = torch.randn(1, 3, 28, 28)
out = F.conv2d(x, w, b, stride=1, padding=1)
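The output shapes noted above all follow the standard convolution size formula, out = floor((in + 2·padding − kernel) / stride) + 1. A small sketch (the helper `conv_out_size` is mine, not from the original notes) verifying the three configurations used here:

```python
import torch
import torch.nn as nn

def conv_out_size(n, k, s, p):
    # standard Conv2d output-size formula for one spatial dimension
    return (n + 2 * p - k) // s + 1

x = torch.rand(1, 1, 28, 28)
for stride, padding in [(1, 0), (1, 1), (2, 1)]:
    layer = nn.Conv2d(1, 3, kernel_size=3, stride=stride, padding=padding)
    out = layer(x)
    expected = conv_out_size(28, 3, stride, padding)
    assert out.shape == (1, 3, expected, expected)
```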
2. Downsampling and Upsampling
# Pooling (downsampling)
x = torch.randn(1, 16, 14, 14)
layer = nn.MaxPool2d(2, stride=2)   # max pooling
out = layer(x)                      # (1, 16, 7, 7)
out = F.avg_pool2d(x, 2, stride=2)  # average pooling

# Upsampling
x = out
out = F.interpolate(x, scale_factor=2, mode='nearest')  # (1, 16, 14, 14)
out = F.interpolate(x, scale_factor=3, mode='nearest')  # (1, 16, 21, 21)
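A minimal sketch showing that 2× pooling followed by 2× nearest-neighbour upsampling restores the spatial size (only the size — pooling discards information, so the values are not recovered):

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 16, 14, 14)
down = F.avg_pool2d(x, 2, stride=2)                       # halves H and W
up = F.interpolate(down, scale_factor=2, mode='nearest')  # doubles them back
assert down.shape == (1, 16, 7, 7)
assert up.shape == (1, 16, 14, 14)
```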
3. Normalization
# Image normalization (requires torchvision)
from torchvision import transforms
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])

# Batch Normalization
x = torch.rand(100, 16, 784)
layer = nn.BatchNorm1d(16)  # normalizes each of the 16 channels
out = layer(x)
layer.running_mean  # running estimate of the per-channel mean
layer.running_var   # running estimate of the per-channel variance
x = torch.rand(1, 16, 7, 7)
layer = nn.BatchNorm2d(16)
out = layer(x)
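The `running_mean` / `running_var` buffers behave differently in train and eval mode: in training, each forward pass updates them toward the batch statistics (with momentum 0.1 by default); in eval mode they are used as fixed statistics. A small sketch:

```python
import torch
import torch.nn as nn

layer = nn.BatchNorm1d(16)
x = torch.rand(100, 16, 784)

layer.train()      # batch stats are used; running stats are updated
out = layer(x)
# running_mean has now moved from its initial zeros toward the batch mean
assert not torch.allclose(layer.running_mean, torch.zeros(16))

layer.eval()       # the stored running stats are used instead of batch stats
out_eval = layer(x)
assert out.shape == x.shape
```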
4. ResNet
class ResBlk(nn.Module):
    def __init__(self, ch_in, ch_out):
        super(ResBlk, self).__init__()
        self.conv1 = nn.Conv2d(ch_in, ch_out, kernel_size=3, stride=1, padding=1)
        self.bn1 = nn.BatchNorm2d(ch_out)
        self.conv2 = nn.Conv2d(ch_out, ch_out, kernel_size=3, stride=1, padding=1)
        self.bn2 = nn.BatchNorm2d(ch_out)
        # shortcut branch: identity when the channel counts match,
        # otherwise a 1x1 convolution to map ch_in -> ch_out
        self.extra = nn.Sequential()
        if ch_out != ch_in:
            self.extra = nn.Sequential(
                nn.Conv2d(ch_in, ch_out, kernel_size=1, stride=1),
                nn.BatchNorm2d(ch_out))

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = self.extra(x) + out  # residual connection
        return out
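A quick shape check of the residual-block pattern. The class is restated here in compact form so the sketch runs on its own; it mirrors the block above (two 3x3 convs plus a 1x1 shortcut when channels change):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResBlk(nn.Module):
    def __init__(self, ch_in, ch_out):
        super().__init__()
        self.conv1 = nn.Conv2d(ch_in, ch_out, 3, stride=1, padding=1)
        self.bn1 = nn.BatchNorm2d(ch_out)
        self.conv2 = nn.Conv2d(ch_out, ch_out, 3, stride=1, padding=1)
        self.bn2 = nn.BatchNorm2d(ch_out)
        # 1x1 conv on the shortcut only when channel counts differ
        self.extra = nn.Sequential()
        if ch_in != ch_out:
            self.extra = nn.Sequential(
                nn.Conv2d(ch_in, ch_out, 1, stride=1),
                nn.BatchNorm2d(ch_out))

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.extra(x) + out

blk = ResBlk(16, 32)
x = torch.randn(2, 16, 7, 7)
out = blk(x)
assert out.shape == (2, 32, 7, 7)  # spatial size preserved, channels 16 -> 32
```

The shortcut must produce the same shape as the main branch, which is why it is a 1x1 convolution: it changes only the channel count, never the spatial size.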
5. Classes
# Network structure
class MLP(nn.Module):
    def __init__(self):
        super(MLP, self).__init__()
        self.model = nn.Sequential(
            # nn.Linear(784, 200),
            Mylinear(784, 200),  # Mylinear: custom Linear layer from an earlier part
            nn.BatchNorm1d(200, eps=1e-8),
            nn.LeakyReLU(inplace=True),
            # nn.Linear(200, 200),
            Mylinear(200, 200),
            nn.BatchNorm1d(200, eps=1e-8),
            nn.LeakyReLU(inplace=True),
            # nn.Linear(200, 10),
            Mylinear(200, 10),
            nn.LeakyReLU(inplace=True)
        )
        # Container: a small conv net built with nn.Sequential
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 5, 1, 1),
            nn.MaxPool2d(2, 2),
            nn.ReLU(True),
            nn.BatchNorm2d(32),
            nn.Conv2d(32, 64, 3, 1, 1),
            nn.ReLU(True),
            nn.BatchNorm2d(64),
            nn.Conv2d(64, 64, 3, 1, 1),
            nn.MaxPool2d(2, 2),
            nn.ReLU(True),
            nn.BatchNorm2d(64),
            nn.Conv2d(64, 128, 3, 1, 1),
            nn.ReLU(True),
            nn.BatchNorm2d(128))
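`Mylinear` is not defined in this part. A minimal sketch of what such a custom layer typically looks like (this exact implementation is my assumption, not taken from the original notes): trainable weight and bias wrapped in `nn.Parameter`, applied as y = x·wᵀ + b in `forward`:

```python
import torch
import torch.nn as nn

class Mylinear(nn.Module):
    """Hypothetical custom Linear layer: y = x @ w.t() + b."""
    def __init__(self, inp, outp):
        super().__init__()
        # nn.Parameter registers the tensors as trainable parameters
        self.w = nn.Parameter(torch.randn(outp, inp))
        self.b = nn.Parameter(torch.randn(outp))

    def forward(self, x):
        return x @ self.w.t() + self.b

layer = Mylinear(784, 200)
x = torch.randn(4, 784)
out = layer(x)
assert out.shape == (4, 200)
```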