
I met batch-size problem. #117

Open
edwardcho opened this issue Dec 22, 2021 · 3 comments

Comments

@edwardcho

Hello Sir,

When I try training on my dataset (512x512, 3-channel PNG images),
I get a batch-size error.
My GPU is a GTX 1080 Ti (11 GB).
I can only train with batch size 1; if I set it to 2 or more, training fails.
Is this normal?

Thanks,
Edward Cho.

@taesungp
Owner

Do you get an out-of-memory error? You are training at a higher resolution (4 times more pixels than a 256x256 image), so you can't fit 2 or more samples in a batch.

I actually recommend training with batch size 1, which seems to produce the best result given the same number of epochs.
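
For intuition on the "4 times more pixels" point, here is a back-of-envelope sketch (not code from this repo; the absolute numbers are hypothetical, only the ratios matter). For a fully convolutional model, activation memory grows roughly linearly with `batch_size * H * W`:

```python
# Rough activation-memory scaling, relative to one 256x256 sample.
# Assumption: memory ~ batch_size * H * W for a fully convolutional net.
base = 256 * 256  # pixels in one 256x256 sample

for batch_size, side in [(1, 256), (1, 512), (2, 512)]:
    rel = batch_size * side * side / base
    print(f"batch {batch_size} @ {side}x{side}: ~{rel:.0f}x the activations")

# batch 1 @ 256x256: ~1x
# batch 1 @ 512x512: ~4x
# batch 2 @ 512x512: ~8x  -> easily past 11 GB if 256x256 already used a few GB
```

So if a 256x256 run at batch size 1 already uses a few gigabytes, a 512x512 run at batch size 2 plausibly needs on the order of 8x that, which would not fit on an 11 GB card.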

@edwardcho
Author

Hello Sir,
Yes, I got an out-of-memory error on the GPU with batch size 2.

Thanks,
Edward Cho.

@yichuan-huang

@taesungp Hello, sir! I hope this comment reaches you.
You mentioned that you "actually recommend training with batch size 1, which seems to produce the best result given the same number of epochs." However, my tutor and I concluded that setting the batch size to 1 may cause serious overfitting or a failure to converge, and the gradient updates could become chaotic. Could you explain clearly and in detail why you recommend a batch size of 1?
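
Not an answer from the author, but for context: if the concern is noisy single-sample gradient updates, one standard workaround that keeps the memory footprint of batch size 1 is gradient accumulation. A minimal PyTorch sketch, with a stand-in model, optimizer, and dummy data (none of this is code from this repo):

```python
import torch
import torch.nn as nn

# Hypothetical sketch: emulate an effective batch of 4 while only ever
# holding one sample's activations in GPU memory. The model and loss here
# are placeholders, not the ones used by this repo.
model = nn.Conv2d(3, 3, kernel_size=3, padding=1)
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)
accum_steps = 4  # effective batch = accum_steps * physical batch (1)

optimizer.zero_grad()
for i in range(8):
    x = torch.randn(1, 3, 512, 512)            # physical batch of 1
    loss = (model(x) - x).pow(2).mean()        # dummy reconstruction loss
    (loss / accum_steps).backward()            # scale so accumulated grads
                                               # equal the mean over the window
    if (i + 1) % accum_steps == 0:
        optimizer.step()                       # one update per accum_steps samples
        optimizer.zero_grad()
```

Whether this actually helps here is an open question; the author's observation above suggests batch size 1 already produces the best result for the same number of epochs.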
