
Change pre-trained model? #70

@ggnicolau

Description


I'm trying to create a spell checker proof-of-concept (POC) for an e-commerce search engine. We're already using the Transformers architecture for other tasks, and I thought about trying it for spell checking as well.

I've come across this beautiful API and I want to give it a try. I've seen that it uses the classic BERT pre-trained model, but I need to use a Portuguese pre-trained model (such as 'BERTimbau') or a multilingual/cross-lingual one (such as 'miniLM').

It would be good if we could pass the desired pre-trained model as a parameter to the function.

I may be wrong and this may already be implemented; correct me if so. Is there an easy solution, or a place where I can choose my own pre-trained model without going low-level?
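For illustration, here is a rough sketch of the kind of interface I have in mind. The `SpellChecker` class and its parameter are hypothetical (I don't know the library's actual signature); the model identifier is the real Hugging Face ID for BERTimbau, and loading it via `transformers` works today:

```python
# Sketch: the backbone I would like to plug in is just a Hugging Face model name,
# e.g. BERTimbau (Portuguese BERT).
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "neuralmind/bert-base-portuguese-cased"  # BERTimbau base
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Hypothetical API I am asking about -- not necessarily the library's actual signature:
# checker = SpellChecker(pretrained_model_name_or_path=model_name)
# checker.correct("celular samsumg galaxi")  # e-commerce query with typos
```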

Labels: documentation (Improvements or additions to documentation)
