
Effect of outScaling for prediction quality #5

Open

codingS3b opened this issue Aug 7, 2020 · 4 comments
codingS3b commented Aug 7, 2020

After experimenting with the outScaling parameter defined here (mainly because I did not really understand its purpose, since it apparently just multiplies the predictions of the network, at least that is how I read this part), I observed large changes in PSNR when changing the value from its default of 10 to e.g. 1 (which would mean the predictions are left unaltered).
This effect is reproducible, e.g. in this notebook example, by changing the line

means = prediction.tiledPredict(im, net, ps=256, overlap=48, device=device, noiseModel=None)

which gives an Avg PSNR MMSE of ~36, to this

means = prediction.tiledPredict(im, net, ps=256, overlap=48, device=device, noiseModel=None, outScaling=1.0)

which for me produced an Avg PSNR MMSE of ~20.

Do you have an idea why this happens, and why a simple scaling of the prediction affects the PSNR so strongly? Or does the outScaling parameter do something different from what I think it does?
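
For reference, here is how I currently picture the parameter's effect, as a minimal sketch (the function and variable names are mine, not the repo's actual API, and the exact point where the factor is applied is my assumption):

import torch

def predict_patch(net, patch, mean, std, outScaling=10.0):
    # normalize the input, run the network, scale its output,
    # then map back to the original intensity range
    out = net((patch - mean) / std)  # normalized prediction
    out = out * outScaling           # the scaling I am asking about
    return out * std + mean          # denormalize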

lmanan (Contributor) commented Aug 7, 2020

Hello @codingS3b
Thank you for looking into this and also for your pull request (I will get to it hopefully next week!).

Regarding your question: did you also train the network with outScaling = 1, by modifying this line of code so that it becomes this instead:

outputs = net((inputs_raw-meanTorch)/stdTorch) 

If both training and prediction (on the Convallaria data) use the same value for outScaling, I would expect the results to be comparable and close to Avg PSNR MMSE ~ 36 (although we haven't rigorously checked how the results fare with different outScaling values).

Let me know in case you observe a different behavior. Thank you!
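
To make the consistency requirement concrete, here is a small sketch of the idea (the helper names are hypothetical; in the repo the factor is applied inline rather than through such helpers):

import torch

OUT_SCALING = 10.0  # must be the same value at training and inference time

def training_forward(net, inputs_raw, meanTorch, stdTorch):
    # the loss sees the scaled output, so the network effectively
    # learns to predict target / OUT_SCALING
    return net((inputs_raw - meanTorch) / stdTorch) * OUT_SCALING

def inference_forward(net, inputs_raw, meanTorch, stdTorch, outScaling=OUT_SCALING):
    # the same factor has to be re-applied here; a mismatched outScaling
    # leaves the prediction off by that ratio, which shows up as a PSNR drop
    return net((inputs_raw - meanTorch) / stdTorch) * outScaling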

codingS3b (Author) commented Aug 7, 2020

Thanks for your response, @MLbyML, and for pointing me to the training function.
I was not aware of the factor being multiplied in there as well (since it is not exposed as a parameter of that function) and had not changed it accordingly, so that explains the problem. My model was trained with predictions multiplied by 10, so the same factor has to be applied during inference; I get that now :)

psteinb commented Aug 7, 2020

Needless to say, if outScaling is a parameter of a function used for inference but also has to be matched in the training code, this should be documented somewhere.

alex-krull commented Aug 7, 2020 via email
