Memory problems with custom dataset #85

Open
Mikey-E opened this issue Sep 24, 2024 · 4 comments
Mikey-E commented Sep 24, 2024

It appears that grads and grads_abs may not be getting cleared correctly between calls to densify_and_prune(). They seem to double in size each time, eventually leading to out-of-memory errors; simply changing memory parameters or environment variables therefore does little to help. What is the best way to manage their size during training?
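
One way to confirm this kind of growth is to log the accumulator sizes around each densification call. A minimal sketch, assuming the stock 3DGS-style attribute names (`get_xyz`, `xyz_gradient_accum`); the abs-gradient buffer in gaussian-opacity-fields may be named differently, so adjust to match your checkout:

```python
import torch

def log_densify_stats(gaussians, iteration):
    """Log buffer sizes around densification to spot runaway growth.

    Hypothetical helper; `get_xyz` and `xyz_gradient_accum` follow the
    stock 3DGS GaussianModel layout and may differ in this repo.
    """
    n_points = gaussians.get_xyz.shape[0]
    n_grad = gaussians.xyz_gradient_accum.shape[0]
    mem_mb = torch.cuda.memory_allocated() / 1024**2
    print(f"[iter {iteration}] gaussians={n_points} "
          f"grad_accum={n_grad} cuda_mem={mem_mb:.0f} MiB")
```

Calling this before and after each `densify_and_prune()` in the training loop should show whether the gradient accumulators keep doubling while the point count stays flat, which would indicate they are not being reset.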

niujinshuchong (Member) commented

Hi, the number of Gaussians depends on your scene. If your initial point cloud is very sparse, more Gaussians will be added at the beginning. We grow grads and grads_abs by the same number of Gaussians as the point set; see here: https://github.com/autonomousvision/gaussian-opacity-fields/blob/main/scene/gaussian_model.py#L637-L640. That is why there are more Gaussians at the beginning. You can reduce the growth by using a smaller ratio.
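
As a rough sketch of what that ratio controls (simplified and hypothetical; the actual selection in gaussian_model.py also applies scale and screen-size checks), a smaller ratio raises the effective gradient threshold so fewer Gaussians are cloned or split per densification step:

```python
import torch

def select_densify_candidates(grads_abs: torch.Tensor, ratio: float = 0.05):
    """Select the top `ratio` fraction of Gaussians by accumulated
    absolute view-space gradient. Sketch only, not the repo's exact code.
    """
    # Threshold at the (1 - ratio) quantile: ratio=0.05 keeps the top 5%.
    threshold = torch.quantile(grads_abs.reshape(-1), 1.0 - ratio)
    return grads_abs.squeeze() >= threshold
```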

wjy-666 commented Nov 28, 2024

Same problem: even a very small dataset can result in out-of-memory errors. Have you solved this problem?

Mikey-E (Author) commented Nov 29, 2024

@wjy-666 I have not.

niujinshuchong (Member) commented

Hi, maybe you can increase the threshold for Gaussian allocation. Could you share a plot showing how the number of Gaussians changes during training?
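
A minimal sketch for producing such a plot, assuming you append `(iteration, gaussians.get_xyz.shape[0])` to a list at each densification step in the training loop (`history` is a hypothetical name; `get_xyz` follows the stock 3DGS model):

```python
import matplotlib.pyplot as plt

def plot_gaussian_count(history):
    """Plot the number of Gaussians against training iteration.

    `history` is a list of (iteration, point_count) pairs collected
    during training.
    """
    iters, counts = zip(*history)
    plt.plot(iters, counts)
    plt.xlabel("iteration")
    plt.ylabel("# Gaussians")
    plt.title("Gaussian count during training")
    plt.savefig("gaussian_count.png")
```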
