
Release Linear-MoE checkpoints on Hugging Face #18

@NielsRogge


Hi @weigao266 🤗

Niels here from the open-source team at Hugging Face. I discovered your work on Linear-MoE through our daily papers feature: https://huggingface.co/papers/2503.05447. The paper page lets the community discuss the paper and discover its artifacts, and you can claim authorship for greater visibility on your HF profile.

Your Linear-MoE system looks very promising! We noticed your GitHub repository (https://github.com/OpenSparseLLMs/Linear-MoE) contains training scripts and configurations for various model sizes and architectures, suggesting the existence of pretrained checkpoints. Would you be interested in releasing these checkpoints on the Hugging Face Hub? This would significantly boost their discoverability and allow the community to easily access and build upon your work.

We can add appropriate tags to the model cards to improve searchability and link them directly to your paper page. If the models are PyTorch-based, the `PyTorchModelHubMixin` class makes uploading straightforward by adding `push_to_hub` (and `from_pretrained`) to any `nn.Module`. Alternatively, you can upload the raw checkpoint files and let users fetch them with the `hf_hub_download` one-liner.
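For illustration, here's a minimal sketch of both routes. The class name, placeholder layer, config values, and repo ids below are hypothetical, not the actual Linear-MoE architecture or namespace:

```python
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin, hf_hub_download

# Mixin route: inheriting from PyTorchModelHubMixin adds
# save_pretrained / push_to_hub / from_pretrained to any nn.Module,
# and serializes the __init__ kwargs to config.json automatically.
class LinearMoEModel(nn.Module, PyTorchModelHubMixin):
    def __init__(self, hidden_size: int = 1024, num_experts: int = 8):
        super().__init__()
        self.proj = nn.Linear(hidden_size, hidden_size)  # placeholder layer

model = LinearMoEModel()
model.push_to_hub("OpenSparseLLMs/linear-moe-1b")  # hypothetical repo id

# Anyone can then reload it, with the config restored from the hub:
model = LinearMoEModel.from_pretrained("OpenSparseLLMs/linear-moe-1b")

# Download route: fetch a single checkpoint file from an existing repo.
ckpt_path = hf_hub_download(
    repo_id="OpenSparseLLMs/linear-moe-1b",  # hypothetical repo id
    filename="model.safetensors",
)
```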

We encourage pushing each checkpoint to a separate model repository for better tracking and download stats. Let me know if you're interested and if you need any assistance with the process!
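If it helps, here is a sketch of what the per-checkpoint upload could look like with `HfApi`; the model sizes and local folder layout are assumptions on my end:

```python
from huggingface_hub import HfApi

api = HfApi()

# One repo per checkpoint keeps versioning and download stats separate.
# Sizes and local paths below are hypothetical placeholders.
for size in ["A0.3B-2B", "A1B-7B"]:
    repo_id = f"OpenSparseLLMs/Linear-MoE-{size}"
    api.create_repo(repo_id, repo_type="model", exist_ok=True)
    api.upload_folder(folder_path=f"checkpoints/{size}", repo_id=repo_id)
```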

Cheers,

Niels
ML Engineer @ HF 🤗
