
You may have made a mistake in your forward(). #1

@ghost

Description

Your implementation:

```python
def forward(self, inp):
    out = self.embeddings(inp).view(1, -1)
    out = out.view(1, -1)
    out = self.lin1(out)
    out = F.relu(out)
    out = self.lin2(out)
    out = F.log_softmax(out, dim=1)
    return out
```

However, you need to sum the embeddings over the context words, per the formula:

$\log p(w_i \mid C) = \text{logSoftmax}\big(A\big(\sum_{w \in C} q_w\big) + b\big)$

i.e.:

```python
def forward(self, inp):
    out = torch.sum(self.embeddings(inp), dim=0).view(1, -1)
    # ... rest of forward() unchanged
```
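
For reference, here is a minimal self-contained sketch of how the summed-embedding `forward()` fits into a full module. The class name, layer sizes, and constructor arguments (`vocab_size`, `embedding_dim`, `hidden_dim`) are illustrative assumptions, not taken from this repository:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CBOW(nn.Module):
    """Minimal CBOW sketch: sum context embeddings, then two linear layers."""

    def __init__(self, vocab_size, embedding_dim=100, hidden_dim=128):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embedding_dim)
        self.lin1 = nn.Linear(embedding_dim, hidden_dim)
        self.lin2 = nn.Linear(hidden_dim, vocab_size)

    def forward(self, inp):
        # inp: LongTensor of context word indices, shape (context_size,)
        # Sum over the context dimension first, matching
        # log p(w_i | C) = logSoftmax(A (sum_{w in C} q_w) + b)
        out = torch.sum(self.embeddings(inp), dim=0).view(1, -1)
        out = F.relu(self.lin1(out))
        out = self.lin2(out)
        return F.log_softmax(out, dim=1)

# Hypothetical usage:
model = CBOW(vocab_size=50)
context = torch.tensor([3, 7, 12, 9])  # four made-up context word indices
log_probs = model(context)             # shape (1, 50): log-probs over the vocabulary
```

Summing before `lin1` is what makes the input to the linear layer a fixed-size vector regardless of context size; flattening with `.view(1, -1)` alone would instead make the input width depend on the number of context words.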
