models/palette_tensoRF.py
class PLTRender(torch.nn.Module):
    def __init__(self, ...):
        self.n_dim = 3 + len_palette
        ...
        layer1 = torch.nn.Linear(self.in_mlpC, featureC)
        layer2 = torch.nn.Linear(featureC, featureC)
        layer3 = torch.nn.Linear(featureC, len_palette - 1)
        torch.nn.init.constant_(layer3.bias, 0)
        self.mlp = torch.nn.Sequential(
            layer1, torch.nn.LeakyReLU(inplace=True),
            layer2, torch.nn.LeakyReLU(inplace=True),
            layer3)
        self.n_dim += 1
I recently tried to integrate this work of yours into instant-ngp, but I do not understand the role of self.n_dim, or why the output dimension of layer3 is len_palette - 1.
Also, why is the activation function after each layer LeakyReLU, when TensoRF uses ReLU?
Hope you can help me.
Best wishes!
n_dim is the total number of output channels, including color, palette weights, sparsity, etc.
Since the weight of the last palette color is computed from the other weights through alpha blending, the MLP only needs to output len_palette - 1 channels.
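Concretely, here is a minimal sketch of one standard alpha-blending construction that turns len_palette - 1 raw MLP outputs into len_palette weights summing to 1. The helper name blend_weights and the sigmoid activation are illustrative assumptions, not necessarily the exact code in this repo:

import torch

def blend_weights(raw):
    # Hypothetical helper (not from the repo): map (len_palette - 1) raw MLP
    # outputs to len_palette palette weights that sum to 1, via sequential
    # alpha blending. raw has shape (..., len_palette - 1).
    a = torch.sigmoid(raw)                      # blending factors in (0, 1)
    keep = torch.cumprod(1.0 - a, dim=-1)       # prod_{j<=i} (1 - a_j)
    # weight_i = a_i * prod_{j<i} (1 - a_j): shift the cumulative product right by one
    prev = torch.cat([torch.ones_like(a[..., :1]), keep[..., :-1]], dim=-1)
    w = a * prev
    w_last = keep[..., -1:]                     # leftover mass goes to the last palette color
    return torch.cat([w, w_last], dim=-1)       # shape (..., len_palette), rows sum to 1

raw = torch.randn(2, 3)                         # e.g. len_palette = 4
w = blend_weights(raw)
assert torch.allclose(w.sum(dim=-1), torch.ones(2))

By the telescoping product the weights always sum to 1, so only len_palette - 1 channels need to be predicted; the last weight is whatever mass is left over, exactly like transmittance in alpha compositing along a ray.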