Question: training accuracy does not improve #3

Open
YDS-FOREVER opened this issue Apr 25, 2021 · 4 comments


YDS-FOREVER commented Apr 25, 2021

After training for 53 epochs the test accuracy is still only 0.39%, and it was already a little above 0.4 in the first epoch, which suggests the whole network has not learned anything. Is this repository the official code for the paper? Could you share your training results?
The training data is the bird dataset; the training log is as follows:

epoch train_acc train_loss
3 0.3504 5.3009
4 0.317 5.3007
51 0.367 5.2985
52 0.4171 5.2985

What I find curious is why the network discards part of a convolution layer's output. The location is in Bottleneck:

    out = self.conv1(x)
    out = self.bn1(out)
    out = self.relu(out)
    out1 = out

    out = self.conv2(out)
    out = self.bn2(out)
    out = self.relu(out)

    if self.use_bp:
        # All channels of `out` beyond the first 32 are discarded here.
        # Is the model really built this way? Why not have self.conv2 output 32 channels directly?
        out2 = out[:, :32, :, :]
        bilinear_features = self.bilinear_pooling(out1, out2)
        return out1, out2, bilinear_features
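
For reference, `bilinear_pooling` in this kind of model is usually the spatially pooled outer product of the two feature maps. The following is a minimal sketch, assuming `out1` has shape (N, C1, H, W) and `out2` is the sliced (N, 32, H, W) tensor; it is an illustration, not this repository's actual implementation.

    import torch
    import torch.nn.functional as F

    def bilinear_pooling(out1: torch.Tensor, out2: torch.Tensor) -> torch.Tensor:
        """Illustrative bilinear pooling sketch (not the repo's exact code).

        out1: (N, C1, H, W) feature map kept in full
        out2: (N, C2, H, W) feature map (here C2 = 32, the sliced channels)
        Returns an (N, C1 * C2) bilinear feature vector.
        """
        n, c1, h, w = out1.shape
        c2 = out2.shape[1]
        x1 = out1.reshape(n, c1, h * w)                         # (N, C1, HW)
        x2 = out2.reshape(n, c2, h * w)                         # (N, C2, HW)
        phi = torch.bmm(x1, x2.transpose(1, 2)) / (h * w)       # (N, C1, C2): pooled outer product
        phi = phi.reshape(n, c1 * c2)
        phi = torch.sign(phi) * torch.sqrt(phi.abs() + 1e-12)   # signed square root, common in bilinear CNNs
        return F.normalize(phi, dim=1)                          # L2 normalization
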
TIan1874 (Member) commented Jun 8, 2021

> (quoting YDS-FOREVER's original post above)

1) We have repeated the experiments several times and the code is correct; please check whether your dataset or environment is set up incorrectly;
2) for training results, please refer to the paper;
3) outputting only the first 32 channels here follows other code; see WS_DAN for details.
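
For readers who do not know that reference: in WS-DAN-style bilinear attention pooling, a small number of channels (32 here) are interpreted as attention maps, and each map weights the full feature map before spatial pooling, which is presumably why the code slices channels instead of adding a dedicated 32-channel convolution. A rough sketch of that idea (an assumption about the design, not this repository's code):

    import torch

    def bilinear_attention_pooling(features: torch.Tensor, attentions: torch.Tensor) -> torch.Tensor:
        """WS-DAN-style bilinear attention pooling, shown for illustration only.

        features:   (N, C, H, W) full feature map (out1 in the snippet above)
        attentions: (N, M, H, W) attention maps (the 32 sliced channels, out2)
        Returns (N, M, C) part features, one C-dimensional vector per attention map.
        """
        n, c, h, w = features.shape
        m = attentions.shape[1]
        f = features.reshape(n, c, h * w)                    # (N, C, HW)
        a = attentions.reshape(n, m, h * w)                  # (N, M, HW)
        parts = torch.bmm(a, f.transpose(1, 2)) / (h * w)    # (N, M, C): each map pools the features it attends to
        return parts
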

wangshouxu commented:

> (quoting YDS-FOREVER's original post above)

I am seeing the same thing. My training set is Cars: the first epoch gives 0.6%, and at epoch 30 the accuracy is 0.83% with the loss stuck at about 6.12, so it also feels like nothing is being learned. I have checked the dataset and it is fine, and I have trained other code on the same dataset. Have you solved your problem?

TIan1874 (Member) commented:

> (quoting wangshouxu's comment above)

1) We have repeated the experiments several times and the code is correct; please check whether your dataset or environment is set up incorrectly;
2) test results on bird are as follows:
epoch, test_acc, test_loss
0, 59.6479,2.4731
1, 75.5264,1.8175
2, 79.6686,1.5639
3, 82.4991,1.4433
4, 83.2758,1.3779
...
91, 87.6079,0.9748
92, 88.2637,0.9728
93, 87.8840,0.9739

xiaoweidao commented:

Hi, when I try to reproduce the results the accuracy also fails to improve. Have you solved this?
