Tensorflow uses keras to create neural networks
2022-04-23 11:28:00 【A gentleman reads the river】
Create a simple neural network
Using the keras.Model functional API directly
import tensorflow as tf
from tensorflow.keras import layers

def line_fit_model():
    """Build the network structure and return the model."""
    # Input layer
    inputs = tf.keras.Input(shape=(1,), name="inputs")
    # Hidden layer 1
    layer1 = layers.Dense(10, activation="relu", name="layer1")(inputs)
    # Hidden layer 2
    layer2 = layers.Dense(15, activation="relu", name="layer2")(layer1)
    # Output layer
    outputs = layers.Dense(5, activation="softmax", name="outputs")(layer2)
    # Instantiate the model
    model = tf.keras.Model(inputs=inputs, outputs=outputs)
    # Show the network structure
    model.summary()
    return model
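As a minimal sketch (assuming TensorFlow 2.x; the random training data and hyperparameters here are illustrative, not from the original article), the functional model above can be compiled and trained like this:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Rebuild the same functional model as a standalone example
inputs = tf.keras.Input(shape=(1,), name="inputs")
x = layers.Dense(10, activation="relu", name="layer1")(inputs)
x = layers.Dense(15, activation="relu", name="layer2")(x)
outputs = layers.Dense(5, activation="softmax", name="outputs")(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy data: 32 scalar samples, integer labels in [0, 5)
x_train = np.random.rand(32, 1).astype("float32")
y_train = np.random.randint(0, 5, size=(32,))
model.fit(x_train, y_train, epochs=1, verbose=0)
preds = model.predict(x_train, verbose=0)  # shape (32, 5), rows sum to 1
```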
Subclassing keras.Model
class LiftModel(tf.keras.Model):
    """Subclass keras.Model and override the call() method."""

    def __init__(self):
        super(LiftModel, self).__init__()
        self.layer1 = layers.Dense(10, activation=tf.nn.relu, name="layer1")
        self.layer2 = layers.Dense(15, activation=tf.nn.relu, name="layer2")
        # Avoid the attribute name "outputs", which is reserved on keras.Model
        self.out_layer = layers.Dense(5, activation=tf.nn.softmax, name="outputs")

    def call(self, inputs):
        layer1 = self.layer1(inputs)
        layer2 = self.layer2(layer1)
        return self.out_layer(layer2)

if __name__ == "__main__":
    # Dense layers expect float inputs, so use a float constant
    inputs = tf.constant([[1.0]])
    lift = LiftModel()
    lift(inputs)  # calling the model once builds it, so summary() can run
    lift.summary()
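A subclassed model has no defined input shape until it is called, which is why the demo above feeds it a tensor before summary(). An alternative sketch (assuming TensorFlow 2.x) is to declare the shape with build() instead of real data:

```python
import tensorflow as tf
from tensorflow.keras import layers

class LiftModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.layer1 = layers.Dense(10, activation=tf.nn.relu, name="layer1")
        self.layer2 = layers.Dense(15, activation=tf.nn.relu, name="layer2")
        self.out_layer = layers.Dense(5, activation=tf.nn.softmax, name="outputs")

    def call(self, inputs):
        return self.out_layer(self.layer2(self.layer1(inputs)))

model = LiftModel()
model.build(input_shape=(None, 1))  # declare the input shape without real data
model.summary()
# Parameter count: (1*10+10) + (10*15+15) + (15*5+5) = 20 + 165 + 80 = 265
n_params = model.count_params()
```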
Using the built-in keras.Sequential method
def line_fit_sequential():
    model = tf.keras.Sequential([
        layers.Dense(10, activation="relu", input_shape=(1,), name="layer1"),
        layers.Dense(15, activation="relu", name="layer2"),
        layers.Dense(5, activation="softmax", name="outputs")
    ])
    model.summary()
    return model
Using Sequential() with add()
def outline_fit_sequential():
    model = tf.keras.Sequential()
    model.add(layers.Dense(10, activation="relu", input_shape=(1,), name="layer1"))
    model.add(layers.Dense(15, activation="relu", name="layer2"))
    model.add(layers.Dense(5, activation="softmax", name="output"))
    model.summary()
    return model
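All four approaches describe the same architecture; a quick sketch (assuming TensorFlow 2.x) to convince yourself is to compare parameter counts between the functional and Sequential builds:

```python
import tensorflow as tf
from tensorflow.keras import layers

def functional():
    inputs = tf.keras.Input(shape=(1,))
    x = layers.Dense(10, activation="relu")(inputs)
    x = layers.Dense(15, activation="relu")(x)
    return tf.keras.Model(inputs, layers.Dense(5, activation="softmax")(x))

def sequential():
    return tf.keras.Sequential([
        layers.Dense(10, activation="relu", input_shape=(1,)),
        layers.Dense(15, activation="relu"),
        layers.Dense(5, activation="softmax"),
    ])

a, b = functional(), sequential()
# Both should total 265 trainable parameters: 20 + 165 + 80
equal = a.count_params() == b.count_params() == 265
```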
Create a convolutional neural network
Using the built-in Sequential method
def cnn_sequential():
    model = tf.keras.Sequential([
        # Convolution layer 1
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 3), name="conv-1"),
        # Pooling layer 1
        layers.MaxPooling2D((2, 2), name="pool-1"),
        # Convolution layer 2
        layers.Conv2D(64, (3, 3), activation="relu", name="conv-2"),
        # Pooling layer 2
        layers.MaxPooling2D((2, 2), name="pool-2"),
        # Convolution layer 3
        layers.Conv2D(64, (3, 3), activation="relu", name="conv-3"),
        # Flatten to a 1-D vector
        layers.Flatten(),
        # Fully connected layer 1
        layers.Dense(64, activation="relu", name="full-1"),
        # Softmax output layer
        layers.Dense(64, activation="softmax", name="softmax-1")
    ])
    model.summary()
    return model
Using Sequential with add()
def outline_cnn_sequential():
    model = tf.keras.Sequential()
    model.add(layers.Conv2D(32, (3, 3), activation="relu", input_shape=(84, 84, 3), name="conv-1"))
    model.add(layers.MaxPooling2D((2, 2), name="pool-1"))
    model.add(layers.Conv2D(64, (3, 3), activation="relu", name="conv-2"))
    model.add(layers.MaxPooling2D((2, 2), name="pool-2"))
    model.add(layers.Conv2D(64, (3, 3), activation="relu", name="conv-3"))
    model.add(layers.MaxPooling2D((2, 2), name="pool-3"))
    model.add(layers.Flatten())
    model.add(layers.Dense(64, activation="relu", name="full-1"))
    model.add(layers.Dense(64, activation="softmax", name="softmax-1"))
    model.summary()
    return model
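To see how the spatial dimensions shrink through this stack, a sketch (assuming TensorFlow 2.x; the random input batch is illustrative) that runs a batch of fake 84x84 RGB images through the same architecture:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(84, 84, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="softmax"),
])

# Spatial size per layer: 84 -> 82 -> 41 -> 39 -> 19 -> 17 -> 8,
# so Flatten produces 8 * 8 * 64 = 4096 features per image.
batch = np.random.rand(4, 84, 84, 3).astype("float32")
out = model(batch)  # shape (4, 64)
```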
Copyright notice: this article was written by [A gentleman reads the river]. Please include the original link when reposting.
https://yzsam.com/2022/04/202204231122283595.html