
Question about the position encoding of 'pe = 0.' code #19

Open
lishensuo opened this issue Nov 15, 2024 · 1 comment

@lishensuo

https://github.com/OmicsML/CellPLM/blob/b59ee7688b1bd2a745856a610248b46f15019a22/CellPLM/embedder/omics.py#L88C1-L101C17

In the forward method of CellPLM/CellPLM/embedder/omics.py, I notice that the pe variable is always initialized to 0 and the call to self.pe_enc(pe_input) is commented out, so pe is never related to the coordinate encoding. Does this mean that x only includes the expression embedding and the batch embedding? Could you explain the reason? Thanks.

    def forward(self, x_dict, input_gene_list=None):
        x = self.feat_enc(x_dict, input_gene_list)#self.act(self.feat_enc(x_dict, input_gene_list))
        if self.pe_enc is not None:
            pe_input = x_dict[self.pe_enc.pe_key]
            pe = 0.#self.pe_enc(pe_input)
            if self.inject_covariate:
                pe = pe + self.cov_enc(x_dict['batch'])
            if self.cat_pe:
                x = torch.cat([x, pe], 1)
            else:
                x = x + pe
        x = self.extra_linear(x)
        # x = self.norm0(self.dropout(x))
        return x
@wehos
Contributor

wehos commented Jan 23, 2025

Sorry for the late response. This is a bug for sure, and I'm very sad that I didn't see this issue until today. The `pe = 0.` placeholder should definitely be removed and the commented-out `self.pe_enc(pe_input)` call restored, so that the spatial coordinates are actually encoded for spatial transcriptomics. I'll fix it in an update soon.

Thank you very much for this good catch!
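
For readers landing here before the fix ships, here is a minimal sketch of what the corrected forward presumably looks like, assuming the intent is simply to restore the commented-out self.pe_enc call (this is an inference from the thread, not the official patch):

    def forward(self, x_dict, input_gene_list=None):
        # Expression (and gene-set) embedding
        x = self.feat_enc(x_dict, input_gene_list)
        if self.pe_enc is not None:
            pe_input = x_dict[self.pe_enc.pe_key]
            # Restore the positional encoding of the spatial coordinates
            pe = self.pe_enc(pe_input)
            if self.inject_covariate:
                # Optionally add the batch covariate embedding
                pe = pe + self.cov_enc(x_dict['batch'])
            if self.cat_pe:
                # Concatenate expression and positional embeddings along the feature dim
                x = torch.cat([x, pe], 1)
            else:
                # Or sum them
                x = x + pe
        x = self.extra_linear(x)
        return x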
