Memory problems with custom dataset #85
Hi, the number of Gaussians depends on your scene. If you have very sparse initial points, more Gaussians will be added at the beginning. We densify the same number of Gaussians for grads and grads_abs; see here: https://github.com/autonomousvision/gaussian-opacity-fields/blob/main/scene/gaussian_model.py#L637-L640. Therefore, it will have more Gaussians at the beginning. You can decrease this by using a smaller ratio.
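For context, my reading of the linked lines (attribute names such as xyz_gradient_accum, xyz_gradient_accum_abs, and denom are assumed from the 3DGS codebase, so treat this as a sketch rather than the exact implementation): the fraction of points selected by the standard gradient threshold is reused as a quantile cutoff for the absolute gradients, so both criteria densify about the same number of Gaussians.

```python
import torch

def abs_grad_cutoff(xyz_gradient_accum, xyz_gradient_accum_abs, denom, max_grad):
    """Sketch of the selection logic at the linked lines (names assumed)."""
    grads = xyz_gradient_accum / denom
    grads[grads.isnan()] = 0.0
    grads_abs = xyz_gradient_accum_abs / denom
    grads_abs[grads_abs.isnan()] = 0.0
    # Fraction of Gaussians exceeding the standard gradient threshold...
    ratio = (torch.norm(grads, dim=-1) >= max_grad).float().mean()
    # ...is reused as the quantile cutoff for the absolute gradients, so the
    # grads_abs criterion densifies roughly as many Gaussians as grads does.
    # A smaller ratio here would densify fewer points and save memory.
    return torch.quantile(grads_abs.reshape(-1), 1.0 - ratio)
```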
Same problem here: even a very small dataset can result in out-of-memory errors. Have you solved this problem?
@wjy-666 I have not.
Hi, maybe you can increase the threshold for Gaussian allocation. Could you share a plot showing how the number of Gaussians changes during training?
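For anyone trying this, a minimal sketch of what raising that threshold could look like in a 3DGS-style training loop (the wrapper, its argument names, and the 0.005 opacity floor are assumptions based on the original 3DGS code; check GOF's train.py for the real interface):

```python
def densify_step(gaussians, scene, size_threshold,
                 densify_grad_threshold=0.0004):  # raised from 3DGS's 0.0002 default
    """Hypothetical wrapper around the densification call; a larger gradient
    threshold means fewer Gaussians qualify for clone/split, which caps
    memory growth on sparse custom datasets."""
    gaussians.densify_and_prune(densify_grad_threshold,
                                0.005,  # min opacity before pruning
                                scene.cameras_extent,
                                size_threshold)
```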
It appears that grads and grads_abs may not be getting cleared correctly between calls to densify_and_prune(): they seem to double in size each time, eventually leading to out-of-memory errors. As such, simply changing memory parameters or environment variables does little to help. What is the best way to manage this size during training?
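One way to narrow this down is to log the accumulator sizes around each densification call and force-reset them if they drift from the point count. A defensive sketch, again assuming the 3DGS-style buffer names (xyz_gradient_accum, xyz_gradient_accum_abs, denom), not the repo's exact API:

```python
import torch

def densify_with_check(gaussians, opt, scene, size_threshold):
    """Hypothetical wrapper: densify, then verify that the per-point
    gradient accumulators match the current number of Gaussians."""
    n_before = gaussians.get_xyz.shape[0]
    gaussians.densify_and_prune(opt.densify_grad_threshold, 0.005,
                                scene.cameras_extent, size_threshold)
    n_after = gaussians.get_xyz.shape[0]
    print(f"Gaussians: {n_before} -> {n_after}, "
          f"accum rows: {gaussians.xyz_gradient_accum.shape[0]}")
    # Defensive reset: if the accumulators no longer match the point count,
    # rebuild them at the correct size (buffer names are assumptions).
    if gaussians.xyz_gradient_accum.shape[0] != n_after:
        device = gaussians.get_xyz.device
        gaussians.xyz_gradient_accum = torch.zeros((n_after, 1), device=device)
        gaussians.xyz_gradient_accum_abs = torch.zeros((n_after, 1), device=device)
        gaussians.denom = torch.zeros((n_after, 1), device=device)
```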