
No need to use forward method? #11

@nkkbr

Description


pytorch-llama/model.py

Lines 230 to 235 in 067f8a3

# (B, Seq_Len, Dim) + (B, Seq_Len, Dim) --> (B, Seq_Len, Dim)
h = x + self.attention.forward(
    self.attention_norm(x), start_pos, freqs_complex
)
# (B, Seq_Len, Dim) + (B, Seq_Len, Dim) --> (B, Seq_Len, Dim)
out = h + self.feed_forward.forward(self.ffn_norm(h))

Is there any need to call the forward method explicitly here?
I mean, we could call the nn.Module instances directly:

h = x + self.attention(self.attention_norm(x), start_pos, freqs_complex)
out = h + self.feed_forward(self.ffn_norm(h))  
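
For context, calling an nn.Module instance goes through __call__, which dispatches to forward but also runs any registered hooks, whereas module.forward(x) bypasses them. Here is a minimal sketch of that difference; it is not taken from this repository, and the Residual class and the hook are purely illustrative:

import torch
import torch.nn as nn

class Residual(nn.Module):
    """Toy residual block used only to illustrate module calling conventions."""
    def __init__(self, dim: int = 8):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.proj(x)

block = Residual()
# Register a forward hook; it only fires when the module is called via __call__.
block.register_forward_hook(lambda module, inputs, output: print("hook ran"))

x = torch.randn(2, 8)
y1 = block(x)          # goes through __call__: prints "hook ran"
y2 = block.forward(x)  # direct forward call: hook is skipped

assert torch.equal(y1, y2)  # the returned tensors are still identical

So the outputs are the same either way, but calling the module directly is the idiomatic form and keeps hooks (and related machinery) working as expected.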
