
Commit cc78866

[Fix] SPTS readme (#1761)
1 parent f250ea2 commit cc78866

File tree: 1 file changed (+3 −3 lines)

projects/SPTS/README.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -36,7 +36,7 @@ $env:PYTHONPATH=Get-Location
 
 ### Dataset
 
-As of now, the implementation uses datasets provided by SPTS for pre-training, and uses MMOCR's datasets for fine-tuning and testing. It's because the test split of SPTS's datasets does not contain enough information for e2e evaluation; and MMOCR's dataset preparer has not yet supported all the datasets used in SPTS. *We are working on this issue, and they will be available in MMOCR's dataset preparer very soon.*
+As of now, the implementation uses datasets provided by SPTS for **pre-training**, and uses MMOCR's datasets for **fine-tuning and testing**. It's because the test split of SPTS's datasets does not contain enough information for e2e evaluation; and MMOCR's dataset preparer has not yet supported all the datasets used in SPTS. *We are working on this issue, and they will be available in MMOCR's dataset preparer very soon.*
 
 Please follow these steps to prepare the datasets:
 
@@ -62,13 +62,13 @@ In the current directory, run the following command to train the model:
 #### Pretrain
 
 ```bash
-mim train mmocr config/spts/spts_resnet50_150e_pretrain-spts.py --work-dir work_dirs/ --amp
+mim train mmocr config/spts/spts_resnet50_8xb8-150e_pretrain-spts.py --work-dir work_dirs/ --amp
 ```
 
 To train on multiple GPUs, e.g. 8 GPUs, run the following command:
 
 ```bash
-mim train mmocr config/spts/spts_resnet50_150e_pretrain-spts.py --work-dir work_dirs/ --launcher pytorch --gpus 8 --amp
+mim train mmocr config/spts/spts_resnet50_8xb8-150e_pretrain-spts.py --work-dir work_dirs/ --launcher pytorch --gpus 8 --amp
 ```
 
 #### Finetune
````
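The dataset note in the first hunk points readers at MMOCR's dataset preparer for the fine-tuning and testing datasets. As a minimal sketch of how one of those datasets would be fetched, assuming MMOCR 1.x's `tools/dataset_converters/prepare_dataset.py` entry point and that `icdar2015` is among the supported spotting datasets:

```bash
# Sketch only, not part of this commit: fetch and convert one dataset
# with MMOCR's dataset preparer. The dataset name and --task value are
# assumptions; run the script with --help to see what your checkout supports.
python tools/dataset_converters/prepare_dataset.py icdar2015 --task textspotting
```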
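The second hunk is the actual fix: the README previously pointed at `spts_resnet50_150e_pretrain-spts.py`, while the config file is named `spts_resnet50_8xb8-150e_pretrain-spts.py`, following the OpenMMLab convention in which `8xb8` encodes the reference schedule (8 GPUs × batch size 8 per GPU). With the corrected path, the multi-GPU command from the patched README reads:

```bash
# Pre-training on 8 GPUs with the corrected config path, as in the patched README.
mim train mmocr config/spts/spts_resnet50_8xb8-150e_pretrain-spts.py \
    --work-dir work_dirs/ --launcher pytorch --gpus 8 --amp
```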
