
Dimension mismatch when adding an attention mechanism to U-Net

Why does adding an attention mechanism always produce a channel mismatch?

Keras's add() layer performs elementwise addition, so the two input tensors must have exactly the same shape: the same spatial dimensions and the same number of channels.

concat_x = add([theta_x, phi_g])  # fails unless theta_x and phi_g have identical shapes
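The shape rule behind the error can be illustrated without Keras. A minimal sketch that mimics the shape check add() performs (the example shapes below are hypothetical, not taken from the original model):

```python
def add_output_shape(shape_a, shape_b):
    """Mimic Keras add(): elementwise addition demands identical input shapes."""
    if shape_a != shape_b:
        raise ValueError(
            f"add() requires identical input shapes, got {shape_a} and {shape_b}"
        )
    return shape_a

# Matching shapes: addition is well defined.
ok = add_output_shape((None, 32, 32, 128), (None, 32, 32, 128))

# Mismatched spatial dimensions, as in the traceback below: raises ValueError.
try:
    add_output_shape((None, 64, 64, 128), (None, 32, 32, 128))
except ValueError as err:
    mismatch_message = str(err)
```

Note that a mismatch in the channel axis fails just as readily as one in the spatial axes, which is why the error so often surfaces as a "channel" problem.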

    Error:
    Traceback (most recent call last):
      File "I:/Retina-OD-Unet-master+unet-layer+attention/OD-training.py", line 478, in <module>
        model = att_unet(patch_height, patch_width, n_ch)  # the U-net model
      File "I:/Retina-OD-Unet-master+unet-layer+attention/OD-training.py", line 232, in att_unet
        up1 = attention_up_and_concate(conv4, conv3, data_format=data_format)
      File "I:/Retina-OD-Unet-master+unet-layer+attention/OD-training.py", line 161, in attention_up_and_concate
        layer = attention_block_
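A common way to resolve this is to make theta_x and phi_g agree before the add(). In many attention U-Net implementations, theta_x is a strided Conv2D on the skip connection x, while phi_g is a 1x1 Conv2D on the gating signal g, which already sits one encoder level lower (half the spatial size). A sketch of the shape bookkeeping (the sizes, strides, and variable names are illustrative assumptions, not from the original code):

```python
import math

def conv2d_size(size, stride, padding="same", kernel=1):
    """Spatial output size of a Conv2D layer for one axis."""
    if padding == "same":
        return math.ceil(size / stride)
    return (size - kernel) // stride + 1

x_size, g_size = 64, 32  # skip connection x vs gating signal g, one level apart

# Mismatch: a stride-1 conv on x leaves 64, but phi_g stays at 32 -> add() fails.
theta_bad = conv2d_size(x_size, stride=1)
phi_g = conv2d_size(g_size, stride=1)

# Fix 1: a stride-2 conv on x halves it to match phi_g.
theta_ok = conv2d_size(x_size, stride=2)

# Fix 2: keep stride 1 on x and upsample phi_g by a factor of 2 instead.
phi_up = phi_g * 2
```

Whichever fix is used, both convolutions must also project to the same number of filters (a shared intermediate channel count), otherwise the channel axis still mismatches even when the spatial sizes agree.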
