Little things that happen when writing DMF
2022-04-21 08:21:00 【m0_52414007】
Actually, I only started taking the notes below halfway through the process.
- The error said the `nn.Linear` parameters didn't match, so the `matmul` failed. After staring at the model parameters for a long time and finding nothing wrong there, I looked at the error again and realized the problem was in the incoming arguments.
- Tensor indexing complained that index tensors can only be `byte`, `long`, or `bool`, so starting from `float` doesn't work either. At first I thought I had to convert the tensor to `byte` inside `autograd.Variable` and then wrap it back into a `Variable` (I wasn't sure about this); in the end it turned out I could simply drop the `Variable` and pass the array in directly.
- Confirmed: the method above doesn't work either; it complains that `mask[256]` doesn't match something (a boolean mask has to be the same length as the dimension it indexes).
- "float() argument must be a string or a number, not 'NoneType'"
  This came up while I was rewriting the loss. I tried casting to `float` and got an error; force-converting to a tensor also failed. Then I realized that `NoneType` means there is no value here at all, so my result never made it in?? Then I discovered my model actually produced no result: `output` was simply `None`.
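The first two errors in the list above are easy to reproduce in isolation. This is a minimal sketch (assuming a reasonably recent PyTorch, 1.2 or later for `.bool()`; the shapes are made up for illustration):

```python
import torch

# Error 1: nn.Linear's in_features must match the last dim of the input,
# otherwise the underlying matmul fails with a RuntimeError.
lin = torch.nn.Linear(in_features=8, out_features=2)
bad_input = torch.randn(5, 6)          # last dim is 6, not 8
try:
    lin(bad_input)
except RuntimeError as e:
    print("shape mismatch:", e)
good_input = torch.randn(5, 8)
out = lin(good_input)                  # (5, 8) @ (8, 2) -> (5, 2)

# Error 2: index tensors must be long, byte, or bool, never float.
x = torch.randn(4, 3)
float_mask = torch.tensor([1.0, 0.0, 1.0, 0.0])
try:
    x[float_mask]
except IndexError as e:
    print("bad index dtype:", e)
rows = x[float_mask.bool()]            # cast first; no Variable wrapper needed
# Note: a bool mask must have the same length as the dimension it indexes,
# which is what the "mask[256] doesn't match" complaint is about.
```

Casting the mask with `.bool()` replaces the old `Variable` round trip entirely; since PyTorch 0.4, `Variable` is merged into `Tensor` and is no longer needed.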

- It turned out I had forgotten the return statement in my model's `forward`.
- This `output` still felt wrong, but I didn't know how to fix it: after all, I return a single value while two other values go in. Looking at the downstream results, I figured that if nothing else worked I would just take the third tensor out of `output`. Sure enough, the losses were all `nan`.
- Suddenly realized there weren't three tensors at all??? Back to the all-`nan` loss problem: the gradient had vanished, so there had to be something wrong with the loss function itself; it hit `-inf` on the very first step.
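The root cause in the log above, a `forward` with no `return`, can be shown without PyTorch at all, since it is just how Python functions work. A minimal pure-Python sketch of the same mistake (class and method names are made up for illustration):

```python
class BrokenModel:
    def forward(self, x):
        y = x * 2          # computes a result but forgets to return it

class FixedModel:
    def forward(self, x):
        return x * 2

output = BrokenModel().forward(3)
print(output)              # None: a function without `return` returns None

try:
    float(output)          # raises TypeError, and the message names 'NoneType'
except TypeError as e:
    print(e)

print(float(FixedModel().forward(3)))
```

This is why the `NoneType` error surfaced far away from the bug, inside the loss computation: `None` propagates silently until something finally tries to use it as a number.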

- Took another look: `x` and `y` were written backwards.
- Ugh, the gradient vanished in the very first training step. I have no idea why this matrix keeps growing and shrinking, so I decided to try a normalization step and squeeze the values into the range 0~5.
- Found that won't do: the loss function is designed for `x` distributed between 0 and 1, so I can't scale up. Besides, even min-max normalization didn't help.
- After more experimenting: this loss is just too large. Now I know why it kept going: I had copied the loss function wrong and dropped the minus sign, so of course it kept getting smaller and smaller.
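The last two bugs in the log interact: a log-based loss needs its inputs clamped away from 0 (or the very first step yields `-inf` and then `nan` gradients), and it needs its minus sign (or the optimizer happily drives it toward negative infinity). A pure-Python sketch, assuming a pointwise binary-cross-entropy-style loss on predictions in (0, 1); the DMF loss in the original code may differ in detail:

```python
import math

EPS = 1e-8  # clamp predictions to keep log() away from -inf

def bce(y_true, y_pred):
    """Binary cross-entropy for one prediction, with the correct minus sign."""
    p = min(max(y_pred, EPS), 1.0 - EPS)
    return -(y_true * math.log(p) + (1.0 - y_true) * math.log(1.0 - p))

def bce_missing_sign(y_true, y_pred):
    """The buggy version: the same formula without the leading minus."""
    p = min(max(y_pred, EPS), 1.0 - EPS)
    return y_true * math.log(p) + (1.0 - y_true) * math.log(1.0 - p)

# The correct loss is >= 0 and shrinks as the prediction improves.
print(bce(1.0, 0.9), bce(1.0, 0.5))

# The buggy loss is <= 0 and "improves" (decreases) as predictions get
# WORSE, so gradient descent pushes it toward -inf: "smaller and smaller".
print(bce_missing_sign(1.0, 0.01))
```

This also explains why normalizing the ratings into 0~5 could not work: with `y_pred` outside (0, 1), the `log(1 - p)` term is undefined, which is why the loss only makes sense on values scaled into 0~1.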
Copyright notice
This article was written by [m0_52414007]. Please include a link to the original when reposting. Thanks.
https://yzsam.com/2022/04/202204210754059503.html