Confusion about palette output dimensions #8

Open
BCL123456-BAL opened this issue May 23, 2023 · 1 comment

Comments


BCL123456-BAL commented May 23, 2023

models/palette_tensoRF.py:

class PLTRender(torch.nn.Module):
    def __init__(self, ...):
        self.n_dim = 3 + len_palette
        ......
        layer1 = torch.nn.Linear(self.in_mlpC, featureC)
        layer2 = torch.nn.Linear(featureC, featureC)
        layer3 = torch.nn.Linear(featureC, len_palette - 1)
        torch.nn.init.constant_(layer3.bias, 0)
        self.mlp = torch.nn.Sequential(
            layer1, torch.nn.LeakyReLU(inplace=True),
            layer2, torch.nn.LeakyReLU(inplace=True),
            layer3)
        self.n_dim += 1
I recently tried to integrate this work of yours into instant-ngp, but I do not understand the role of self.n_dim, or why the output dimension of layer3 is len_palette - 1. Besides, why is the activation function after each layer LeakyReLU, when TensoRF uses ReLU?

Hope you can help me.

Best wishes!

@yuehaowang (Owner) commented

Thanks for your question.

  1. n_dim is the total number of output channels, including color, palette weights, sparsity, etc.
  2. Since the weight of the last palette color is computed from the other weights through alpha blending (see the sketch after this list), the MLP layers only need to output len_palette - 1 channels.
  3. I think both LeakyReLU and ReLU could work.
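
For concreteness, here is a minimal sketch of how the last palette weight can be recovered from the MLP's len_palette - 1 outputs, assuming the standard alpha-compositing convention w_i = o_i * prod_{j<i}(1 - o_j). The function name blend_weights and the sigmoid activation are hypothetical illustrations, not code from this repo:

import torch

def blend_weights(opacities):
    # Hypothetical sketch (not from this repo): turn the MLP's
    # `len_palette - 1` raw outputs into `len_palette` weights that
    # sum to 1 via alpha compositing.
    # opacities: (..., P - 1) values in [0, 1], e.g. after a sigmoid.
    one_minus = 1.0 - opacities
    # Transmittance in front of each palette color: prod_{j<i} (1 - o_j).
    trans = torch.cumprod(
        torch.cat([torch.ones_like(one_minus[..., :1]), one_minus], dim=-1),
        dim=-1)                               # (..., P)
    w_front = opacities * trans[..., :-1]     # weights of the first P - 1 colors
    w_last = trans[..., -1:]                  # leftover mass goes to the last color
    return torch.cat([w_front, w_last], dim=-1)

# Sanity check: the P weights are non-negative and sum to 1.
o = torch.sigmoid(torch.randn(4, 3))          # P - 1 = 3 raw MLP outputs
w = blend_weights(o)
print(w.sum(dim=-1))                          # ~= tensor([1., 1., 1., 1.])

Because the last weight is just the leftover transmittance, the weights sum to 1 by construction, which is why only len_palette - 1 channels need to be predicted.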
