
Keras: loading your own trained model and removing the fully connected layers

June 18, 2020


It is actually quite simple. First, load the saved model and print its structure:

from keras.models import load_model

base_model = load_model('model_resenet.h5')  # load the saved model
base_model.summary()  # print the network structure (summary() prints directly and returns None)

Here is the output for my network, which is essentially its structure diagram:

__________________________________________________________________________________________________
layer (type)          output shape     param #   connected to           
==================================================================================================
input_1 (inputlayer)      (none, 227, 227, 1) 0                      
__________________________________________________________________________________________________
conv2d_1 (conv2d)        (none, 225, 225, 32) 320     input_1[0][0]          
__________________________________________________________________________________________________
batch_normalization_1 (batchnor (none, 225, 225, 32) 128     conv2d_1[0][0]          
__________________________________________________________________________________________________
activation_1 (activation)    (none, 225, 225, 32) 0      batch_normalization_1[0][0]   
__________________________________________________________________________________________________
conv2d_2 (conv2d)        (none, 225, 225, 32) 9248    activation_1[0][0]        
__________________________________________________________________________________________________
batch_normalization_2 (batchnor (none, 225, 225, 32) 128     conv2d_2[0][0]          
__________________________________________________________________________________________________
activation_2 (activation)    (none, 225, 225, 32) 0      batch_normalization_2[0][0]   
__________________________________________________________________________________________________
conv2d_3 (conv2d)        (none, 225, 225, 32) 9248    activation_2[0][0]        
__________________________________________________________________________________________________
batch_normalization_3 (batchnor (none, 225, 225, 32) 128     conv2d_3[0][0]          
__________________________________________________________________________________________________
merge_1 (merge)         (none, 225, 225, 32) 0      batch_normalization_3[0][0]   
                                 activation_1[0][0]        
__________________________________________________________________________________________________
activation_3 (activation)    (none, 225, 225, 32) 0      merge_1[0][0]          
__________________________________________________________________________________________________
conv2d_4 (conv2d)        (none, 225, 225, 32) 9248    activation_3[0][0]        
__________________________________________________________________________________________________
batch_normalization_4 (batchnor (none, 225, 225, 32) 128     conv2d_4[0][0]          
__________________________________________________________________________________________________
activation_4 (activation)    (none, 225, 225, 32) 0      batch_normalization_4[0][0]   
__________________________________________________________________________________________________
conv2d_5 (conv2d)        (none, 225, 225, 32) 9248    activation_4[0][0]        
__________________________________________________________________________________________________
batch_normalization_5 (batchnor (none, 225, 225, 32) 128     conv2d_5[0][0]          
__________________________________________________________________________________________________
merge_2 (merge)         (none, 225, 225, 32) 0      batch_normalization_5[0][0]   
                                 activation_3[0][0]        
__________________________________________________________________________________________________
activation_5 (activation)    (none, 225, 225, 32) 0      merge_2[0][0]          
__________________________________________________________________________________________________
max_pooling2d_1 (maxpooling2d) (none, 112, 112, 32) 0      activation_5[0][0]        
__________________________________________________________________________________________________
conv2d_6 (conv2d)        (none, 110, 110, 64) 18496    max_pooling2d_1[0][0]      
__________________________________________________________________________________________________
batch_normalization_6 (batchnor (none, 110, 110, 64) 256     conv2d_6[0][0]          
__________________________________________________________________________________________________
activation_6 (activation)    (none, 110, 110, 64) 0      batch_normalization_6[0][0]   
__________________________________________________________________________________________________
conv2d_7 (conv2d)        (none, 110, 110, 64) 36928    activation_6[0][0]        
__________________________________________________________________________________________________
batch_normalization_7 (batchnor (none, 110, 110, 64) 256     conv2d_7[0][0]          
__________________________________________________________________________________________________
activation_7 (activation)    (none, 110, 110, 64) 0      batch_normalization_7[0][0]   
__________________________________________________________________________________________________
conv2d_8 (conv2d)        (none, 110, 110, 64) 36928    activation_7[0][0]        
__________________________________________________________________________________________________
batch_normalization_8 (batchnor (none, 110, 110, 64) 256     conv2d_8[0][0]          
__________________________________________________________________________________________________
merge_3 (merge)         (none, 110, 110, 64) 0      batch_normalization_8[0][0]   
                                 activation_6[0][0]        
__________________________________________________________________________________________________
activation_8 (activation)    (none, 110, 110, 64) 0      merge_3[0][0]          
__________________________________________________________________________________________________
conv2d_9 (conv2d)        (none, 110, 110, 64) 36928    activation_8[0][0]        
__________________________________________________________________________________________________
batch_normalization_9 (batchnor (none, 110, 110, 64) 256     conv2d_9[0][0]          
__________________________________________________________________________________________________
activation_9 (activation)    (none, 110, 110, 64) 0      batch_normalization_9[0][0]   
__________________________________________________________________________________________________
conv2d_10 (conv2d)       (none, 110, 110, 64) 36928    activation_9[0][0]        
__________________________________________________________________________________________________
batch_normalization_10 (batchno (none, 110, 110, 64) 256     conv2d_10[0][0]         
__________________________________________________________________________________________________
merge_4 (merge)         (none, 110, 110, 64) 0      batch_normalization_10[0][0]   
                                 activation_8[0][0]        
__________________________________________________________________________________________________
activation_10 (activation)   (none, 110, 110, 64) 0      merge_4[0][0]          
__________________________________________________________________________________________________
max_pooling2d_2 (maxpooling2d) (none, 55, 55, 64)  0      activation_10[0][0]       
__________________________________________________________________________________________________
conv2d_11 (conv2d)       (none, 53, 53, 64)  36928    max_pooling2d_2[0][0]      
__________________________________________________________________________________________________
batch_normalization_11 (batchno (none, 53, 53, 64)  256     conv2d_11[0][0]         
__________________________________________________________________________________________________
activation_11 (activation)   (none, 53, 53, 64)  0      batch_normalization_11[0][0]   
__________________________________________________________________________________________________
max_pooling2d_3 (maxpooling2d) (none, 26, 26, 64)  0      activation_11[0][0]       
__________________________________________________________________________________________________
conv2d_12 (conv2d)       (none, 26, 26, 64)  36928    max_pooling2d_3[0][0]      
__________________________________________________________________________________________________
batch_normalization_12 (batchno (none, 26, 26, 64)  256     conv2d_12[0][0]         
__________________________________________________________________________________________________
activation_12 (activation)   (none, 26, 26, 64)  0      batch_normalization_12[0][0]   
__________________________________________________________________________________________________
conv2d_13 (conv2d)       (none, 26, 26, 64)  36928    activation_12[0][0]       
__________________________________________________________________________________________________
batch_normalization_13 (batchno (none, 26, 26, 64)  256     conv2d_13[0][0]         
__________________________________________________________________________________________________
merge_5 (merge)         (none, 26, 26, 64)  0      batch_normalization_13[0][0]   
                                 max_pooling2d_3[0][0]      
__________________________________________________________________________________________________
activation_13 (activation)   (none, 26, 26, 64)  0      merge_5[0][0]          
__________________________________________________________________________________________________
conv2d_14 (conv2d)       (none, 26, 26, 64)  36928    activation_13[0][0]       
__________________________________________________________________________________________________
batch_normalization_14 (batchno (none, 26, 26, 64)  256     conv2d_14[0][0]         
__________________________________________________________________________________________________
activation_14 (activation)   (none, 26, 26, 64)  0      batch_normalization_14[0][0]   
__________________________________________________________________________________________________
conv2d_15 (conv2d)       (none, 26, 26, 64)  36928    activation_14[0][0]       
__________________________________________________________________________________________________
batch_normalization_15 (batchno (none, 26, 26, 64)  256     conv2d_15[0][0]         
__________________________________________________________________________________________________
merge_6 (merge)         (none, 26, 26, 64)  0      batch_normalization_15[0][0]   
                                 activation_13[0][0]       
__________________________________________________________________________________________________
activation_15 (activation)   (none, 26, 26, 64)  0      merge_6[0][0]          
__________________________________________________________________________________________________
max_pooling2d_4 (maxpooling2d) (none, 13, 13, 64)  0      activation_15[0][0]       
__________________________________________________________________________________________________
conv2d_16 (conv2d)       (none, 11, 11, 32)  18464    max_pooling2d_4[0][0]      
__________________________________________________________________________________________________
batch_normalization_16 (batchno (none, 11, 11, 32)  128     conv2d_16[0][0]         
__________________________________________________________________________________________________
activation_16 (activation)   (none, 11, 11, 32)  0      batch_normalization_16[0][0]   
__________________________________________________________________________________________________
conv2d_17 (conv2d)       (none, 11, 11, 32)  9248    activation_16[0][0]       
__________________________________________________________________________________________________
batch_normalization_17 (batchno (none, 11, 11, 32)  128     conv2d_17[0][0]         
__________________________________________________________________________________________________
activation_17 (activation)   (none, 11, 11, 32)  0      batch_normalization_17[0][0]   
__________________________________________________________________________________________________
conv2d_18 (conv2d)       (none, 11, 11, 32)  9248    activation_17[0][0]       
__________________________________________________________________________________________________
batch_normalization_18 (batchno (none, 11, 11, 32)  128     conv2d_18[0][0]         
__________________________________________________________________________________________________
merge_7 (merge)         (none, 11, 11, 32)  0      batch_normalization_18[0][0]   
                                 activation_16[0][0]       
__________________________________________________________________________________________________
activation_18 (activation)   (none, 11, 11, 32)  0      merge_7[0][0]          
__________________________________________________________________________________________________
conv2d_19 (conv2d)       (none, 11, 11, 32)  9248    activation_18[0][0]       
__________________________________________________________________________________________________
batch_normalization_19 (batchno (none, 11, 11, 32)  128     conv2d_19[0][0]         
__________________________________________________________________________________________________
activation_19 (activation)   (none, 11, 11, 32)  0      batch_normalization_19[0][0]   
__________________________________________________________________________________________________
conv2d_20 (conv2d)       (none, 11, 11, 32)  9248    activation_19[0][0]       
__________________________________________________________________________________________________
batch_normalization_20 (batchno (none, 11, 11, 32)  128     conv2d_20[0][0]         
__________________________________________________________________________________________________
merge_8 (merge)         (none, 11, 11, 32)  0      batch_normalization_20[0][0]   
                                 activation_18[0][0]       
__________________________________________________________________________________________________
activation_20 (activation)   (none, 11, 11, 32)  0      merge_8[0][0]          
__________________________________________________________________________________________________
max_pooling2d_5 (maxpooling2d) (none, 5, 5, 32)   0      activation_20[0][0]       
__________________________________________________________________________________________________
conv2d_21 (conv2d)       (none, 3, 3, 64)   18496    max_pooling2d_5[0][0]      
__________________________________________________________________________________________________
batch_normalization_21 (batchno (none, 3, 3, 64)   256     conv2d_21[0][0]         
__________________________________________________________________________________________________
activation_21 (activation)   (none, 3, 3, 64)   0      batch_normalization_21[0][0]   
__________________________________________________________________________________________________
conv2d_22 (conv2d)       (none, 3, 3, 64)   36928    activation_21[0][0]       
__________________________________________________________________________________________________
batch_normalization_22 (batchno (none, 3, 3, 64)   256     conv2d_22[0][0]         
__________________________________________________________________________________________________
activation_22 (activation)   (none, 3, 3, 64)   0      batch_normalization_22[0][0]   
__________________________________________________________________________________________________
conv2d_23 (conv2d)       (none, 3, 3, 64)   36928    activation_22[0][0]       
__________________________________________________________________________________________________
batch_normalization_23 (batchno (none, 3, 3, 64)   256     conv2d_23[0][0]         
__________________________________________________________________________________________________
merge_9 (merge)         (none, 3, 3, 64)   0      batch_normalization_23[0][0]   
                                 activation_21[0][0]       
__________________________________________________________________________________________________
activation_23 (activation)   (none, 3, 3, 64)   0      merge_9[0][0]          
__________________________________________________________________________________________________
conv2d_24 (conv2d)       (none, 3, 3, 64)   36928    activation_23[0][0]       
__________________________________________________________________________________________________
batch_normalization_24 (batchno (none, 3, 3, 64)   256     conv2d_24[0][0]         
__________________________________________________________________________________________________
activation_24 (activation)   (none, 3, 3, 64)   0      batch_normalization_24[0][0]   
__________________________________________________________________________________________________
conv2d_25 (conv2d)       (none, 3, 3, 64)   36928    activation_24[0][0]       
__________________________________________________________________________________________________
batch_normalization_25 (batchno (none, 3, 3, 64)   256     conv2d_25[0][0]         
__________________________________________________________________________________________________
merge_10 (merge)        (none, 3, 3, 64)   0      batch_normalization_25[0][0]   
                                 activation_23[0][0]       
__________________________________________________________________________________________________
activation_25 (activation)   (none, 3, 3, 64)   0      merge_10[0][0]          
__________________________________________________________________________________________________
max_pooling2d_6 (maxpooling2d) (none, 1, 1, 64)   0      activation_25[0][0]       
__________________________________________________________________________________________________
flatten_1 (flatten)       (none, 64)      0      max_pooling2d_6[0][0]      
__________________________________________________________________________________________________
dense_1 (dense)         (none, 256)     16640    flatten_1[0][0]         
__________________________________________________________________________________________________
dropout_1 (dropout)       (none, 256)     0      dense_1[0][0]          
__________________________________________________________________________________________________
dense_2 (dense)         (none, 2)      514     dropout_1[0][0]         
==================================================================================================
total params: 632,098
trainable params: 629,538
non-trainable params: 2,560
__________________________________________________________________________________________________
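Incidentally, if you only want the layer names (you will need one to pick the truncation point below), you can also list them programmatically instead of scanning the full table. A minimal sketch using the same model file:

from keras.models import load_model

base_model = load_model('model_resenet.h5')

# Print each layer's name and output shape; the last few entries show where
# the fully connected head (flatten / dense / dropout) begins.
for layer in base_model.layers:
    print(layer.name, layer.output_shape)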

Removing the model's fully connected layers

from keras.models import load_model, Model

base_model = load_model('model_resenet.h5')
# 'max_pooling2d_6' is the layer just before the fully connected layers in the network above.
# You can truncate at any other layer by substituting that layer's name here; printing the
# full summary first is what makes it easy to look up each layer's name.
resnet_model = Model(inputs=base_model.input,
                     outputs=base_model.get_layer('max_pooling2d_6').output)
resnet_model.summary()

The structure of the truncated network:

__________________________________________________________________________________________________
layer (type)          output shape     param #   connected to           
==================================================================================================
input_1 (inputlayer)      (none, 227, 227, 1) 0                      
__________________________________________________________________________________________________
conv2d_1 (conv2d)        (none, 225, 225, 32) 320     input_1[0][0]          
__________________________________________________________________________________________________
batch_normalization_1 (batchnor (none, 225, 225, 32) 128     conv2d_1[0][0]          
__________________________________________________________________________________________________
activation_1 (activation)    (none, 225, 225, 32) 0      batch_normalization_1[0][0]   
__________________________________________________________________________________________________
conv2d_2 (conv2d)        (none, 225, 225, 32) 9248    activation_1[0][0]        
__________________________________________________________________________________________________
batch_normalization_2 (batchnor (none, 225, 225, 32) 128     conv2d_2[0][0]          
__________________________________________________________________________________________________
activation_2 (activation)    (none, 225, 225, 32) 0      batch_normalization_2[0][0]   
__________________________________________________________________________________________________
conv2d_3 (conv2d)        (none, 225, 225, 32) 9248    activation_2[0][0]        
__________________________________________________________________________________________________
batch_normalization_3 (batchnor (none, 225, 225, 32) 128     conv2d_3[0][0]          
__________________________________________________________________________________________________
merge_1 (merge)         (none, 225, 225, 32) 0      batch_normalization_3[0][0]   
                                 activation_1[0][0]        
__________________________________________________________________________________________________
activation_3 (activation)    (none, 225, 225, 32) 0      merge_1[0][0]          
__________________________________________________________________________________________________
conv2d_4 (conv2d)        (none, 225, 225, 32) 9248    activation_3[0][0]        
__________________________________________________________________________________________________
batch_normalization_4 (batchnor (none, 225, 225, 32) 128     conv2d_4[0][0]          
__________________________________________________________________________________________________
activation_4 (activation)    (none, 225, 225, 32) 0      batch_normalization_4[0][0]   
__________________________________________________________________________________________________
conv2d_5 (conv2d)        (none, 225, 225, 32) 9248    activation_4[0][0]        
__________________________________________________________________________________________________
batch_normalization_5 (batchnor (none, 225, 225, 32) 128     conv2d_5[0][0]          
__________________________________________________________________________________________________
merge_2 (merge)         (none, 225, 225, 32) 0      batch_normalization_5[0][0]   
                                 activation_3[0][0]        
__________________________________________________________________________________________________
activation_5 (activation)    (none, 225, 225, 32) 0      merge_2[0][0]          
__________________________________________________________________________________________________
max_pooling2d_1 (maxpooling2d) (none, 112, 112, 32) 0      activation_5[0][0]        
__________________________________________________________________________________________________
conv2d_6 (conv2d)        (none, 110, 110, 64) 18496    max_pooling2d_1[0][0]      
__________________________________________________________________________________________________
batch_normalization_6 (batchnor (none, 110, 110, 64) 256     conv2d_6[0][0]          
__________________________________________________________________________________________________
activation_6 (activation)    (none, 110, 110, 64) 0      batch_normalization_6[0][0]   
__________________________________________________________________________________________________
conv2d_7 (conv2d)        (none, 110, 110, 64) 36928    activation_6[0][0]        
__________________________________________________________________________________________________
batch_normalization_7 (batchnor (none, 110, 110, 64) 256     conv2d_7[0][0]          
__________________________________________________________________________________________________
activation_7 (activation)    (none, 110, 110, 64) 0      batch_normalization_7[0][0]   
__________________________________________________________________________________________________
conv2d_8 (conv2d)        (none, 110, 110, 64) 36928    activation_7[0][0]        
__________________________________________________________________________________________________
batch_normalization_8 (batchnor (none, 110, 110, 64) 256     conv2d_8[0][0]          
__________________________________________________________________________________________________
merge_3 (merge)         (none, 110, 110, 64) 0      batch_normalization_8[0][0]   
                                 activation_6[0][0]        
__________________________________________________________________________________________________
activation_8 (activation)    (none, 110, 110, 64) 0      merge_3[0][0]          
__________________________________________________________________________________________________
conv2d_9 (conv2d)        (none, 110, 110, 64) 36928    activation_8[0][0]        
__________________________________________________________________________________________________
batch_normalization_9 (batchnor (none, 110, 110, 64) 256     conv2d_9[0][0]          
__________________________________________________________________________________________________
activation_9 (activation)    (none, 110, 110, 64) 0      batch_normalization_9[0][0]   
__________________________________________________________________________________________________
conv2d_10 (conv2d)       (none, 110, 110, 64) 36928    activation_9[0][0]        
__________________________________________________________________________________________________
batch_normalization_10 (batchno (none, 110, 110, 64) 256     conv2d_10[0][0]         
__________________________________________________________________________________________________
merge_4 (merge)         (none, 110, 110, 64) 0      batch_normalization_10[0][0]   
                                 activation_8[0][0]        
__________________________________________________________________________________________________
activation_10 (activation)   (none, 110, 110, 64) 0      merge_4[0][0]          
__________________________________________________________________________________________________
max_pooling2d_2 (maxpooling2d) (none, 55, 55, 64)  0      activation_10[0][0]       
__________________________________________________________________________________________________
conv2d_11 (conv2d)       (none, 53, 53, 64)  36928    max_pooling2d_2[0][0]      
__________________________________________________________________________________________________
batch_normalization_11 (batchno (none, 53, 53, 64)  256     conv2d_11[0][0]         
__________________________________________________________________________________________________
activation_11 (activation)   (none, 53, 53, 64)  0      batch_normalization_11[0][0]   
__________________________________________________________________________________________________
max_pooling2d_3 (maxpooling2d) (none, 26, 26, 64)  0      activation_11[0][0]       
__________________________________________________________________________________________________
conv2d_12 (conv2d)       (none, 26, 26, 64)  36928    max_pooling2d_3[0][0]      
__________________________________________________________________________________________________
batch_normalization_12 (batchno (none, 26, 26, 64)  256     conv2d_12[0][0]         
__________________________________________________________________________________________________
activation_12 (activation)   (none, 26, 26, 64)  0      batch_normalization_12[0][0]   
__________________________________________________________________________________________________
conv2d_13 (conv2d)       (none, 26, 26, 64)  36928    activation_12[0][0]       
__________________________________________________________________________________________________
batch_normalization_13 (batchno (none, 26, 26, 64)  256     conv2d_13[0][0]         
__________________________________________________________________________________________________
merge_5 (merge)         (none, 26, 26, 64)  0      batch_normalization_13[0][0]   
                                 max_pooling2d_3[0][0]      
__________________________________________________________________________________________________
activation_13 (activation)   (none, 26, 26, 64)  0      merge_5[0][0]          
__________________________________________________________________________________________________
conv2d_14 (conv2d)       (none, 26, 26, 64)  36928    activation_13[0][0]       
__________________________________________________________________________________________________
batch_normalization_14 (batchno (none, 26, 26, 64)  256     conv2d_14[0][0]         
__________________________________________________________________________________________________
activation_14 (activation)   (none, 26, 26, 64)  0      batch_normalization_14[0][0]   
__________________________________________________________________________________________________
conv2d_15 (conv2d)       (none, 26, 26, 64)  36928    activation_14[0][0]       
__________________________________________________________________________________________________
batch_normalization_15 (batchno (none, 26, 26, 64)  256     conv2d_15[0][0]         
__________________________________________________________________________________________________
merge_6 (merge)         (none, 26, 26, 64)  0      batch_normalization_15[0][0]   
                                 activation_13[0][0]       
__________________________________________________________________________________________________
activation_15 (activation)   (none, 26, 26, 64)  0      merge_6[0][0]          
__________________________________________________________________________________________________
max_pooling2d_4 (maxpooling2d) (none, 13, 13, 64)  0      activation_15[0][0]       
__________________________________________________________________________________________________
conv2d_16 (conv2d)       (none, 11, 11, 32)  18464    max_pooling2d_4[0][0]      
__________________________________________________________________________________________________
batch_normalization_16 (batchno (none, 11, 11, 32)  128     conv2d_16[0][0]         
__________________________________________________________________________________________________
activation_16 (activation)   (none, 11, 11, 32)  0      batch_normalization_16[0][0]   
__________________________________________________________________________________________________
conv2d_17 (conv2d)       (none, 11, 11, 32)  9248    activation_16[0][0]       
__________________________________________________________________________________________________
batch_normalization_17 (batchno (none, 11, 11, 32)  128     conv2d_17[0][0]         
__________________________________________________________________________________________________
activation_17 (activation)   (none, 11, 11, 32)  0      batch_normalization_17[0][0]   
__________________________________________________________________________________________________
conv2d_18 (conv2d)       (none, 11, 11, 32)  9248    activation_17[0][0]       
__________________________________________________________________________________________________
batch_normalization_18 (batchno (none, 11, 11, 32)  128     conv2d_18[0][0]         
__________________________________________________________________________________________________
merge_7 (merge)         (none, 11, 11, 32)  0      batch_normalization_18[0][0]   
                                 activation_16[0][0]       
__________________________________________________________________________________________________
activation_18 (activation)   (none, 11, 11, 32)  0      merge_7[0][0]          
__________________________________________________________________________________________________
conv2d_19 (conv2d)       (none, 11, 11, 32)  9248    activation_18[0][0]       
__________________________________________________________________________________________________
batch_normalization_19 (batchno (none, 11, 11, 32)  128     conv2d_19[0][0]         
__________________________________________________________________________________________________
activation_19 (activation)   (none, 11, 11, 32)  0      batch_normalization_19[0][0]   
__________________________________________________________________________________________________
conv2d_20 (conv2d)       (none, 11, 11, 32)  9248    activation_19[0][0]       
__________________________________________________________________________________________________
batch_normalization_20 (batchno (none, 11, 11, 32)  128     conv2d_20[0][0]         
__________________________________________________________________________________________________
merge_8 (merge)         (none, 11, 11, 32)  0      batch_normalization_20[0][0]   
                                 activation_18[0][0]       
__________________________________________________________________________________________________
activation_20 (activation)   (none, 11, 11, 32)  0      merge_8[0][0]          
__________________________________________________________________________________________________
max_pooling2d_5 (maxpooling2d) (none, 5, 5, 32)   0      activation_20[0][0]       
__________________________________________________________________________________________________
conv2d_21 (conv2d)       (none, 3, 3, 64)   18496    max_pooling2d_5[0][0]      
__________________________________________________________________________________________________
batch_normalization_21 (batchno (none, 3, 3, 64)   256     conv2d_21[0][0]         
__________________________________________________________________________________________________
activation_21 (activation)   (none, 3, 3, 64)   0      batch_normalization_21[0][0]   
__________________________________________________________________________________________________
conv2d_22 (conv2d)       (none, 3, 3, 64)   36928    activation_21[0][0]       
__________________________________________________________________________________________________
batch_normalization_22 (batchno (none, 3, 3, 64)   256     conv2d_22[0][0]         
__________________________________________________________________________________________________
activation_22 (activation)   (none, 3, 3, 64)   0      batch_normalization_22[0][0]   
__________________________________________________________________________________________________
conv2d_23 (conv2d)       (none, 3, 3, 64)   36928    activation_22[0][0]       
__________________________________________________________________________________________________
batch_normalization_23 (batchno (none, 3, 3, 64)   256     conv2d_23[0][0]         
__________________________________________________________________________________________________
merge_9 (merge)         (none, 3, 3, 64)   0      batch_normalization_23[0][0]   
                                 activation_21[0][0]       
__________________________________________________________________________________________________
activation_23 (activation)   (none, 3, 3, 64)   0      merge_9[0][0]          
__________________________________________________________________________________________________
conv2d_24 (conv2d)       (none, 3, 3, 64)   36928    activation_23[0][0]       
__________________________________________________________________________________________________
batch_normalization_24 (batchno (none, 3, 3, 64)   256     conv2d_24[0][0]         
__________________________________________________________________________________________________
activation_24 (activation)   (none, 3, 3, 64)   0      batch_normalization_24[0][0]   
__________________________________________________________________________________________________
conv2d_25 (conv2d)       (none, 3, 3, 64)   36928    activation_24[0][0]       
__________________________________________________________________________________________________
batch_normalization_25 (batchno (none, 3, 3, 64)   256     conv2d_25[0][0]         
__________________________________________________________________________________________________
merge_10 (merge)        (none, 3, 3, 64)   0      batch_normalization_25[0][0]   
                                 activation_23[0][0]       
__________________________________________________________________________________________________
activation_25 (activation)   (none, 3, 3, 64)   0      merge_10[0][0]          
__________________________________________________________________________________________________
max_pooling2d_6 (maxpooling2d) (none, 1, 1, 64)   0      activation_25[0][0]       
==================================================================================================
total params: 614,944
trainable params: 612,384
non-trainable params: 2,560
__________________________________________________________________________________________________
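With the fully connected head removed, the truncated network can be used directly as a feature extractor. Here is a minimal sketch; the random batch is only a placeholder for your own grayscale images, which should be preprocessed exactly as they were during training:

import numpy as np
from keras.models import load_model, Model

# Rebuild the truncated model exactly as above.
base_model = load_model('model_resenet.h5')
resnet_model = Model(inputs=base_model.input,
                     outputs=base_model.get_layer('max_pooling2d_6').output)

# Placeholder batch of four 227x227 single-channel images.
x = np.random.rand(4, 227, 227, 1).astype('float32')

features = resnet_model.predict(x)                   # shape (4, 1, 1, 64) for this network
features = features.reshape(features.shape[0], -1)   # flatten to (4, 64) feature vectors
print(features.shape)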

That is all for this article on loading your own trained Keras model and removing its fully connected layers. I hope it gives you a useful reference.
