Conversation

kevin314 (Collaborator)

TODO:

  • Add DistributedAttention for multi-gpu

Comment on lines +11 to +13
_class_name: str = "AutoencoderKLWan"
_diffusers_version: str = "0.34.0.dev0"
_name_or_path: str = ""

Collaborator

These fields are popped/removed in the loader, so they can be removed here. You can refer to how Wan's VAE config is defined.
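For reference, a rough sketch of what stripping these keys in the loader could look like (load_vae_config and its raw dict argument are hypothetical names for illustration, not the actual loader):

def load_vae_config(raw_config: dict) -> dict:
    # Sketch only: diffusers writes these bookkeeping keys into config.json;
    # they are not model hyperparameters, so pop them before constructing the
    # config dataclass instead of declaring them as fields.
    cleaned = dict(raw_config)
    for key in ("_class_name", "_diffusers_version", "_name_or_path"):
        cleaned.pop(key, None)
    return cleaned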


def __post_init__(self):
    self.blend_num_frames = (self.tile_sample_min_num_frames -
                             self.tile_sample_stride_num_frames) * 2
(No newline at end of file)

Collaborator

Add a trailing newline character at the end of the file.



@dataclass
class CosmosVideoConfigFixed(CosmosVideoConfig):

Collaborator

Why is this needed?

class CosmosVideoConfigFixed(CosmosVideoConfig):
    """Fixed Cosmos Video Config that matches original Cosmos2 Video2World configuration."""

    def update_model_arch(self, config: dict) -> None:

Collaborator

Did you align this against diffusers or the original Cosmos implementation?



@dataclass
class CosmosTeaCacheParams(CacheParams):

Collaborator

This can be removed for now, but you should add defaults for the Cosmos sampling params. Refer to Wan.
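As a rough sketch of the kind of defaults meant here (every field name and value below is an assumption for illustration, modeled on typical video sampling params rather than the verified Wan or Cosmos2 defaults):

from dataclasses import dataclass

@dataclass
class CosmosSamplingParams:
    # Hypothetical defaults; replace with the values the Cosmos2 Video2World
    # pipeline actually expects.
    num_inference_steps: int = 35
    guidance_scale: float = 7.0
    height: int = 704
    width: int = 1280
    num_frames: int = 93
    fps: int = 16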

if self.has_weight:
    self.weight = nn.Parameter(self.weight)

def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:

Collaborator

Can you rename this to forward_diffusers?
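Something like the following, keeping the body unchanged (the signature is copied from the diff above; the body is elided here):

def forward_diffusers(self, hidden_states: torch.Tensor) -> torch.Tensor:
    # Same implementation as the current forward(); only the name changes so it
    # is clear this path mirrors the diffusers forward pass.
    ...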
