Open
Labels
documentation (Improvements or additions to documentation)
Description
I'm trying to build a spell-checker proof of concept (POC) for an e-commerce search engine. We're already using the Transformer architecture for other tasks, and I thought I'd try it for spell checking as well.
I came across this beautiful API and want to give it a try. I've seen that it uses the classic pre-trained BERT model, but I need a model pre-trained in Portuguese (such as BERTimbau) or a multilingual one (such as MiniLM).
It would be good if we could pass the desired pre-trained model as a parameter to the function.
I may be wrong and this may already be implemented; correct me if so. Is there an easy way to choose my own pre-trained model without going low-level?
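To make the request concrete, here is a minimal sketch of the kind of interface I have in mind. The `SpellChecker` class and its `pretrained_model_name` parameter are hypothetical (not this library's actual API), and the model loading is stubbed out; in a real implementation the parameter would be forwarded to something like Hugging Face's `AutoModelForMaskedLM.from_pretrained`. The Hub ids for BERTimbau and multilingual MiniLM below are my best guesses at the relevant checkpoints.

```python
# Hypothetical sketch, NOT the library's actual API: a spell checker that
# accepts any Hugging Face Hub model id as a parameter instead of
# hard-coding English BERT. Model loading is stubbed for illustration.

class SpellChecker:
    def __init__(self, pretrained_model_name: str = "bert-base-uncased"):
        # A real implementation would load the masked-LM here, e.g.:
        #   AutoModelForMaskedLM.from_pretrained(pretrained_model_name)
        self.pretrained_model_name = pretrained_model_name

    def describe(self) -> str:
        # Report which backbone this checker would use.
        return f"SpellChecker backed by {self.pretrained_model_name}"


# Default English BERT vs. Portuguese BERTimbau vs. multilingual MiniLM
# (the two non-default ids are assumed Hub checkpoint names):
default_checker = SpellChecker()
pt_checker = SpellChecker("neuralmind/bert-base-portuguese-cased")
multi_checker = SpellChecker("microsoft/Multilingual-MiniLM-L12-H384")

print(pt_checker.describe())
```

With this shape, switching languages is a one-argument change rather than a low-level rewrite of the model-loading code.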