diff --git a/.github/workflows/publish.yml b/.github/workflows/publish.yml index 0c9bbc0bb..16463708d 100644 --- a/.github/workflows/publish.yml +++ b/.github/workflows/publish.yml @@ -18,14 +18,12 @@ jobs: - name: Set up Python uses: actions/setup-python@v4 with: - python-version: "3.8" + python-version: "3.9" - name: Install dependencies run: | python -m pip install --upgrade pip pip install -r requirements/dev.txt - pip install "Pillow==9.1.1" - pip install "mindspore>=1.8,<=1.10" pip install build twine - name: Build package run: | diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index ccd7526d2..fa2d01522 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -17,9 +17,9 @@ Report bugs at https://github.com/mindspore-lab/mindcv/issues. If you are reporting a bug, please include: -* Your operating system name and version. -* Any details about your local setup that might be helpful in troubleshooting. -* Detailed steps to reproduce the bug. +- Your operating system name and version. +- Any details about your local setup that might be helpful in troubleshooting. +- Detailed steps to reproduce the bug. ### Fix Bugs @@ -43,9 +43,9 @@ The best way to send feedback is to file an issue at https://github.com/mindspor If you are proposing a feature: -* Explain in detail how it would work. -* Keep the scope as narrow as possible, to make it easier to implement. -* Remember that this is a volunteer-driven project, and that contributions are welcome :) +- Explain in detail how it would work. +- Keep the scope as narrow as possible, to make it easier to implement. +- Remember that this is a volunteer-driven project, and that contributions are welcome :) ## Getting Started diff --git a/LICENSE.md b/LICENSE.md index 261eeb9e9..c61b66391 100644 --- a/LICENSE.md +++ b/LICENSE.md @@ -2,180 +2,180 @@ Version 2.0, January 2004 http://www.apache.org/licenses/ - TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION - - 1. Definitions. 
- - "License" shall mean the terms and conditions for use, reproduction, - and distribution as defined by Sections 1 through 9 of this document. - - "Licensor" shall mean the copyright owner or entity authorized by - the copyright owner that is granting the License. - - "Legal Entity" shall mean the union of the acting entity and all - other entities that control, are controlled by, or are under common - control with that entity. For the purposes of this definition, - "control" means (i) the power, direct or indirect, to cause the - direction or management of such entity, whether by contract or - otherwise, or (ii) ownership of fifty percent (50%) or more of the - outstanding shares, or (iii) beneficial ownership of such entity. - - "You" (or "Your") shall mean an individual or Legal Entity - exercising permissions granted by this License. - - "Source" form shall mean the preferred form for making modifications, - including but not limited to software source code, documentation - source, and configuration files. - - "Object" form shall mean any form resulting from mechanical - transformation or translation of a Source form, including but - not limited to compiled object code, generated documentation, - and conversions to other media types. - - "Work" shall mean the work of authorship, whether in Source or - Object form, made available under the License, as indicated by a - copyright notice that is included in or attached to the work - (an example is provided in the Appendix below). - - "Derivative Works" shall mean any work, whether in Source or Object - form, that is based on (or derived from) the Work and for which the - editorial revisions, annotations, elaborations, or other modifications - represent, as a whole, an original work of authorship. For the purposes - of this License, Derivative Works shall not include works that remain - separable from, or merely link (or bind by name) to the interfaces of, - the Work and Derivative Works thereof. 
- - "Contribution" shall mean any work of authorship, including - the original version of the Work and any modifications or additions - to that Work or Derivative Works thereof, that is intentionally - submitted to Licensor for inclusion in the Work by the copyright owner - or by an individual or Legal Entity authorized to submit on behalf of - the copyright owner. For the purposes of this definition, "submitted" - means any form of electronic, verbal, or written communication sent - to the Licensor or its representatives, including but not limited to - communication on electronic mailing lists, source code control systems, - and issue tracking systems that are managed by, or on behalf of, the - Licensor for the purpose of discussing and improving the Work, but - excluding communication that is conspicuously marked or otherwise - designated in writing by the copyright owner as "Not a Contribution." - - "Contributor" shall mean Licensor and any individual or Legal Entity - on behalf of whom a Contribution has been received by Licensor and - subsequently incorporated within the Work. - - 2. Grant of Copyright License. Subject to the terms and conditions of - this License, each Contributor hereby grants to You a perpetual, - worldwide, non-exclusive, no-charge, royalty-free, irrevocable - copyright license to reproduce, prepare Derivative Works of, - publicly display, publicly perform, sublicense, and distribute the - Work and such Derivative Works in Source or Object form. - - 3. Grant of Patent License. 
Subject to the terms and conditions of - this License, each Contributor hereby grants to You a perpetual, - worldwide, non-exclusive, no-charge, royalty-free, irrevocable - (except as stated in this section) patent license to make, have made, - use, offer to sell, sell, import, and otherwise transfer the Work, - where such license applies only to those patent claims licensable - by such Contributor that are necessarily infringed by their - Contribution(s) alone or by combination of their Contribution(s) - with the Work to which such Contribution(s) was submitted. If You - institute patent litigation against any entity (including a - cross-claim or counterclaim in a lawsuit) alleging that the Work - or a Contribution incorporated within the Work constitutes direct - or contributory patent infringement, then any patent licenses - granted to You under this License for that Work shall terminate - as of the date such litigation is filed. - - 4. Redistribution. You may reproduce and distribute copies of the - Work or Derivative Works thereof in any medium, with or without - modifications, and in Source or Object form, provided that You - meet the following conditions: - - (a) You must give any other recipients of the Work or - Derivative Works a copy of this License; and - - (b) You must cause any modified files to carry prominent notices - stating that You changed the files; and - - (c) You must retain, in the Source form of any Derivative Works - that You distribute, all copyright, patent, trademark, and - attribution notices from the Source form of the Work, - excluding those notices that do not pertain to any part of - the Derivative Works; and - - (d) If the Work includes a "NOTICE" text file as part of its - distribution, then any Derivative Works that You distribute must - include a readable copy of the attribution notices contained - within such NOTICE file, excluding those notices that do not - pertain to any part of the Derivative Works, in at least one - of 
the following places: within a NOTICE text file distributed - as part of the Derivative Works; within the Source form or - documentation, if provided along with the Derivative Works; or, - within a display generated by the Derivative Works, if and - wherever such third-party notices normally appear. The contents - of the NOTICE file are for informational purposes only and - do not modify the License. You may add Your own attribution - notices within Derivative Works that You distribute, alongside - or as an addendum to the NOTICE text from the Work, provided - that such additional attribution notices cannot be construed - as modifying the License. - - You may add Your own copyright statement to Your modifications and - may provide additional or different license terms and conditions - for use, reproduction, or distribution of Your modifications, or - for any such Derivative Works as a whole, provided Your use, - reproduction, and distribution of the Work otherwise complies with - the conditions stated in this License. - - 5. Submission of Contributions. Unless You explicitly state otherwise, - any Contribution intentionally submitted for inclusion in the Work - by You to the Licensor shall be under the terms and conditions of - this License, without any additional terms or conditions. - Notwithstanding the above, nothing herein shall supersede or modify - the terms of any separate license agreement you may have executed - with Licensor regarding such Contributions. - - 6. Trademarks. This License does not grant permission to use the trade - names, trademarks, service marks, or product names of the Licensor, - except as required for reasonable and customary use in describing the - origin of the Work and reproducing the content of the NOTICE file. - - 7. Disclaimer of Warranty. 
Unless required by applicable law or - agreed to in writing, Licensor provides the Work (and each - Contributor provides its Contributions) on an "AS IS" BASIS, - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or - implied, including, without limitation, any warranties or conditions - of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A - PARTICULAR PURPOSE. You are solely responsible for determining the - appropriateness of using or redistributing the Work and assume any - risks associated with Your exercise of permissions under this License. - - 8. Limitation of Liability. In no event and under no legal theory, - whether in tort (including negligence), contract, or otherwise, - unless required by applicable law (such as deliberate and grossly - negligent acts) or agreed to in writing, shall any Contributor be - liable to You for damages, including any direct, indirect, special, - incidental, or consequential damages of any character arising as a - result of this License or out of the use or inability to use the - Work (including but not limited to damages for loss of goodwill, - work stoppage, computer failure or malfunction, or any and all - other commercial damages or losses), even if such Contributor - has been advised of the possibility of such damages. - - 9. Accepting Warranty or Additional Liability. While redistributing - the Work or Derivative Works thereof, You may choose to offer, - and charge a fee for, acceptance of support, warranty, indemnity, - or other liability obligations and/or rights consistent with this - License. However, in accepting such obligations, You may act only - on Your own behalf and on Your sole responsibility, not on behalf - of any other Contributor, and only if You agree to indemnify, - defend, and hold each Contributor harmless for any liability - incurred by, or claims asserted against, such Contributor by reason - of your accepting any such warranty or additional liability. 
- - END OF TERMS AND CONDITIONS - - APPENDIX: How to apply the Apache License to your work. +TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + +1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. 
For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + +2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + +3. Grant of Patent License. 
Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + +4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the 
following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + +5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + +6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + +7. Disclaimer of Warranty. 
Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + +8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + +9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. 
+ +END OF TERMS AND CONDITIONS + +APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" @@ -186,16 +186,16 @@ same "printed page" as the copyright notice for easier identification within third-party archives. - Copyright [yyyy] [name of copyright owner] +Copyright [yyyy] [name of copyright owner] - Licensed under the Apache License, Version 2.0 (the "License"); - you may not use this file except in compliance with the License. - You may obtain a copy of the License at +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 - Unless required by applicable law or agreed to in writing, software - distributed under the License is distributed on an "AS IS" BASIS, - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - See the License for the specific language governing permissions and - limitations under the License. +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. diff --git a/README.md b/README.md index f4a9cdca2..4f860811d 100644 --- a/README.md +++ b/README.md @@ -30,33 +30,32 @@ MindCV is an open-source toolbox for computer vision research and development ba The following is the corresponding `mindcv` versions and supported `mindspore` versions. 
| mindcv | mindspore | -|:------:|:-----------:| +| :----: | :---------: | | main | master | -| 0.5 | 2.5.0 | +| 0.5 | 2.5.0 | | 0.4 | 2.3.0/2.3.1 | | 0.3 | 2.2.10 | | 0.2 | 2.0 | | 0.1 | 1.8 | - ### Major Features - **Easy-to-Use.** MindCV decomposes the vision framework into various configurable components. It is easy to customize your data pipeline, models, and learning pipeline with MindCV: - ```pycon - >>> import mindcv - # create a dataset - >>> dataset = mindcv.create_dataset('cifar10', download=True) - # create a model - >>> network = mindcv.create_model('resnet50', pretrained=True) - ``` + ```pycon + >>> import mindcv + # create a dataset + >>> dataset = mindcv.create_dataset('cifar10', download=True) + # create a model + >>> network = mindcv.create_model('resnet50', pretrained=True) + ``` - Users can customize and launch their transfer learning or training task in one command line. + Users can customize and launch their transfer learning or training task in one command line. - ```shell - # transfer learning in one command line - python train.py --model=swin_tiny --pretrained --opt=adamw --lr=0.001 --data_dir=/path/to/data - ``` + ```shell + # transfer learning in one command line + python train.py --model=swin_tiny --pretrained --opt=adamw --lr=0.001 --data_dir=/path/to/data + ``` - **State-of-The-Art.** MindCV provides various CNN-based and Transformer-based vision models including SwinTransformer. Their pretrained weights and performance reports are provided to help users select and reuse the right model: @@ -88,6 +87,7 @@ Below are a few code snippets for your taste. 
# Create the model object
 >>> network = mindcv.create_model('swin_tiny', pretrained=True)
 ```
+
 ```shell
 # Validate its accuracy
 python validate.py --model=swin_tiny --pretrained --dataset=imagenet --val_split=validation
@@ -108,6 +108,7 @@ Classify the downloaded image with a pretrained SoTA model:
 python infer.py --model=swin_tiny --image_path='./dog.jpg'
 # {'Labrador retriever': 0.5700152, 'golden retriever': 0.034551315, 'kelpie': 0.010108651, 'Chesapeake Bay retriever': 0.008229004, 'Walker hound, Walker foxhound': 0.007791956}
 ```
+
 The top-1 prediction result is Labrador retriever, which is the breed of this cute dog.

 ### Training

@@ -116,61 +117,61 @@ It is easy to train your model on a standard or customized dataset using `train.

 - Standalone Training

-  ```shell
-  # standalone training
-  python train.py --model=resnet50 --dataset=cifar10 --dataset_download
-  ```
+  ```shell
+  # standalone training
+  python train.py --model=resnet50 --dataset=cifar10 --dataset_download
+  ```

-  Above is an example for training ResNet50 on CIFAR10 dataset on a CPU/GPU/Ascend device
+  Above is an example for training ResNet50 on CIFAR10 dataset on a CPU/GPU/Ascend device

- Distributed Training

-  For large datasets like ImageNet, it is necessary to do training in distributed mode on multiple devices. This can be achieved with `msrun` and parallel features supported by MindSpore.
+  For large datasets like ImageNet, it is necessary to do training in distributed mode on multiple devices. This can be achieved with `msrun` and parallel features supported by MindSpore.
-  ```shell
-  # distributed training
-  # assume you have 4 NPUs
-  msrun --bind_core=True --worker_num 4 python train.py --distribute \
-  --model=densenet121 --dataset=imagenet --data_dir=/path/to/imagenet
-  ```
+  ```shell
+  # distributed training
+  # assume you have 4 NPUs
+  msrun --bind_core=True --worker_num 4 python train.py --distribute \
+  --model=densenet121 --dataset=imagenet --data_dir=/path/to/imagenet
+  ```

-  Notice that if you are using msrun startup with 2 devices, please add `--bind_core=True` to improve performance. For example:
+  Notice that if you are using msrun startup with 2 devices, please add `--bind_core=True` to improve performance. For example:

-  ```shell
-  msrun --bind_core=True --worker_num=2--local_worker_num=2 --master_port=8118 \
-  --log_dir=msrun_log --join=True --cluster_time_out=300 \
-  python train.py --distribute --model=densenet121 --dataset=imagenet --data_dir=/path/to/imagenet
-  ```
+  ```shell
+  msrun --bind_core=True --worker_num=2 --local_worker_num=2 --master_port=8118 \
+  --log_dir=msrun_log --join=True --cluster_time_out=300 \
+  python train.py --distribute --model=densenet121 --dataset=imagenet --data_dir=/path/to/imagenet
+  ```

-  > For more information, please refer to https://www.mindspore.cn/docs/en/r2.5.0/model_train/parallel/startup_method.html
+  > For more information, please refer to https://www.mindspore.cn/docs/en/r2.5.0/model_train/parallel/startup_method.html

-  Detailed parameter definitions can be seen in `config.py` and checked by running `python train.py --help'.
+  Detailed parameter definitions can be seen in `config.py` and checked by running `python train.py --help`.

-  To resume training, please set the `--ckpt_path` and `--ckpt_save_dir` arguments. The optimizer state including the learning rate of the last stopped epoch will also be recovered.
+  To resume training, please set the `--ckpt_path` and `--ckpt_save_dir` arguments. 
The optimizer state including the learning rate of the last stopped epoch will also be recovered. - Config and Training Strategy - You can configure your model and other components either by specifying external parameters or by writing a yaml config file. Here is an example of training using a preset yaml file. + You can configure your model and other components either by specifying external parameters or by writing a yaml config file. Here is an example of training using a preset yaml file. - ```shell - msrun --bind_core=True --worker_num 4 python train.py -c configs/squeezenet/squeezenet_1.0_ascend.yaml - ``` + ```shell + msrun --bind_core=True --worker_num 4 python train.py -c configs/squeezenet/squeezenet_1.0_ascend.yaml + ``` - **Pre-defined Training Strategies:** - We provide more than 20 training recipes that achieve SoTA results on ImageNet currently. - Please look into the [`configs`](configs) folder for details. - Please feel free to adapt these training strategies to your own model for performance improvement, which can be easily done by modifying the yaml file. + **Pre-defined Training Strategies:** + We provide more than 20 training recipes that achieve SoTA results on ImageNet currently. + Please look into the [`configs`](configs) folder for details. + Please feel free to adapt these training strategies to your own model for performance improvement, which can be easily done by modifying the yaml file. - Train on ModelArts/OpenI Platform - To run training on the [ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html) or [OpenI](https://openi.pcl.ac.cn/) cloud platform: + To run training on the [ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html) or [OpenI](https://openi.pcl.ac.cn/) cloud platform: - ```text - 1. Create a new training task on the cloud platform. - 2. Add run parameter `config` and specify the path to the yaml config file on the website UI interface. - 3. 
Add run parameter `enable_modelarts` and set True on the website UI interface. - 4. Fill in other blanks on the website and launch the training task. - ``` + ```text + 1. Create a new training task on the cloud platform. + 2. Add run parameter `config` and specify the path to the yaml config file on the website UI interface. + 3. Add run parameter `enable_modelarts` and set True on the website UI interface. + 4. Fill in other blanks on the website and launch the training task. + ``` **Graph Mode and PyNative Mode**: @@ -226,41 +227,41 @@ Currently, MindCV supports the model families listed below. More models with pre
Supported models -* Big Transfer ResNetV2 (BiT) - https://arxiv.org/abs/1912.11370 -* ConvNeXt - https://arxiv.org/abs/2201.03545 -* ConViT (Soft Convolutional Inductive Biases Vision Transformers)- https://arxiv.org/abs/2103.10697 -* DenseNet - https://arxiv.org/abs/1608.06993 -* DPN (Dual-Path Network) - https://arxiv.org/abs/1707.01629 -* EfficientNet (MBConvNet Family) https://arxiv.org/abs/1905.11946 -* EfficientNet V2 - https://arxiv.org/abs/2104.00298 -* GhostNet - https://arxiv.org/abs/1911.11907 -* GoogLeNet - https://arxiv.org/abs/1409.4842 -* Inception-V3 - https://arxiv.org/abs/1512.00567 -* Inception-ResNet-V2 and Inception-V4 - https://arxiv.org/abs/1602.07261 -* MNASNet - https://arxiv.org/abs/1807.11626 -* MobileNet-V1 - https://arxiv.org/abs/1704.04861 -* MobileNet-V2 - https://arxiv.org/abs/1801.04381 -* MobileNet-V3 (MBConvNet w/ Efficient Head) - https://arxiv.org/abs/1905.02244 -* NASNet - https://arxiv.org/abs/1707.07012 -* PNasNet - https://arxiv.org/abs/1712.00559 -* PVT (Pyramid Vision Transformer) - https://arxiv.org/abs/2102.12122 -* PoolFormer models - https://github.com/sail-sg/poolformer -* RegNet - https://arxiv.org/abs/2003.13678 -* RepMLP https://arxiv.org/abs/2105.01883 -* RepVGG - https://arxiv.org/abs/2101.03697 -* ResNet (v1b/v1.5) - https://arxiv.org/abs/1512.03385 -* ResNeXt - https://arxiv.org/abs/1611.05431 -* Res2Net - https://arxiv.org/abs/1904.01169 -* ReXNet - https://arxiv.org/abs/2007.00992 -* ShuffleNet v1 - https://arxiv.org/abs/1707.01083 -* ShuffleNet v2 - https://arxiv.org/abs/1807.11164 -* SKNet - https://arxiv.org/abs/1903.06586 -* SqueezeNet - https://arxiv.org/abs/1602.07360 -* Swin Transformer - https://arxiv.org/abs/2103.14030 -* VGG - https://arxiv.org/abs/1409.1556 -* Visformer - https://arxiv.org/abs/2104.12533 -* Vision Transformer (ViT) - https://arxiv.org/abs/2010.11929 -* Xception - https://arxiv.org/abs/1610.02357 +- Big Transfer ResNetV2 (BiT) - https://arxiv.org/abs/1912.11370 +- ConvNeXt - 
https://arxiv.org/abs/2201.03545 +- ConViT (Soft Convolutional Inductive Biases Vision Transformers) - https://arxiv.org/abs/2103.10697 +- DenseNet - https://arxiv.org/abs/1608.06993 +- DPN (Dual-Path Network) - https://arxiv.org/abs/1707.01629 +- EfficientNet (MBConvNet Family) - https://arxiv.org/abs/1905.11946 +- EfficientNet V2 - https://arxiv.org/abs/2104.00298 +- GhostNet - https://arxiv.org/abs/1911.11907 +- GoogLeNet - https://arxiv.org/abs/1409.4842 +- Inception-V3 - https://arxiv.org/abs/1512.00567 +- Inception-ResNet-V2 and Inception-V4 - https://arxiv.org/abs/1602.07261 +- MNASNet - https://arxiv.org/abs/1807.11626 +- MobileNet-V1 - https://arxiv.org/abs/1704.04861 +- MobileNet-V2 - https://arxiv.org/abs/1801.04381 +- MobileNet-V3 (MBConvNet w/ Efficient Head) - https://arxiv.org/abs/1905.02244 +- NASNet - https://arxiv.org/abs/1707.07012 +- PNasNet - https://arxiv.org/abs/1712.00559 +- PVT (Pyramid Vision Transformer) - https://arxiv.org/abs/2102.12122 +- PoolFormer models - https://github.com/sail-sg/poolformer +- RegNet - https://arxiv.org/abs/2003.13678 +- RepMLP - https://arxiv.org/abs/2105.01883 +- RepVGG - https://arxiv.org/abs/2101.03697 +- ResNet (v1b/v1.5) - https://arxiv.org/abs/1512.03385 +- ResNeXt - https://arxiv.org/abs/1611.05431 +- Res2Net - https://arxiv.org/abs/1904.01169 +- ReXNet - https://arxiv.org/abs/2007.00992 +- ShuffleNet v1 - https://arxiv.org/abs/1707.01083 +- ShuffleNet v2 - https://arxiv.org/abs/1807.11164 +- SKNet - https://arxiv.org/abs/1903.06586 +- SqueezeNet - https://arxiv.org/abs/1602.07360 +- Swin Transformer - https://arxiv.org/abs/2103.14030 +- VGG - https://arxiv.org/abs/1409.1556 +- Visformer - https://arxiv.org/abs/2104.12533 +- Vision Transformer (ViT) - https://arxiv.org/abs/2010.11929 +- Xception - https://arxiv.org/abs/1610.02357 Please see [configs](./configs) for the details about model performance and pretrained weights.
@@ -271,43 +272,43 @@ Please see [configs](./configs) for the details about model performance and pret
Supported algorithms -* Augmentation - * [AutoAugment](https://arxiv.org/abs/1805.09501) - * [RandAugment](https://arxiv.org/abs/1909.13719) - * [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf) - * RandErasing (Cutout) - * CutMix - * MixUp - * RandomResizeCrop - * Color Jitter, Flip, etc -* Optimizer - * Adam - * AdamW - * [Lion](https://arxiv.org/abs/2302.06675) - * Adan (experimental) - * AdaGrad - * LAMB - * Momentum - * RMSProp - * SGD - * NAdam -* LR Scheduler - * Warmup Cosine Decay - * Step LR - * Polynomial Decay - * Exponential Decay -* Regularization - * Weight Decay - * Label Smoothing - * Stochastic Depth (depends on networks) - * Dropout (depends on networks) -* Loss - * Cross Entropy (w/ class weight and auxiliary logit support) - * Binary Cross Entropy (w/ class weight and auxiliary logit support) - * Soft Cross Entropy Loss (automatically enabled if mixup or label smoothing is used) - * Soft Binary Cross Entropy Loss (automatically enabled if mixup or label smoothing is used) -* Ensemble - * Warmup EMA (Exponential Moving Average) +- Augmentation + - [AutoAugment](https://arxiv.org/abs/1805.09501) + - [RandAugment](https://arxiv.org/abs/1909.13719) + - [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf) + - RandErasing (Cutout) + - CutMix + - MixUp + - RandomResizeCrop + - Color Jitter, Flip, etc +- Optimizer + - Adam + - AdamW + - [Lion](https://arxiv.org/abs/2302.06675) + - Adan (experimental) + - AdaGrad + - LAMB + - Momentum + - RMSProp + - SGD + - NAdam +- LR Scheduler + - Warmup Cosine Decay + - Step LR + - Polynomial Decay + - Exponential Decay +- Regularization + - Weight Decay + - Label Smoothing + - Stochastic Depth (depends on networks) + - Dropout (depends on networks) +- Loss + - 
Cross Entropy (w/ class weight and auxiliary logit support) + - Binary Cross Entropy (w/ class weight and auxiliary logit support) + - Soft Cross Entropy Loss (automatically enabled if mixup or label smoothing is used) + - Soft Binary Cross Entropy Loss (automatically enabled if mixup or label smoothing is used) +- Ensemble + - Warmup EMA (Exponential Moving Average)
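Several of the schedulers listed above pair a linear warmup with cosine decay. The shape of that schedule can be sketched in a few lines of plain Python; the function name, signature, and defaults below are illustrative assumptions, not MindCV's `create_scheduler` API:

```python
import math

def warmup_cosine_lr(step, total_steps, warmup_steps, base_lr, min_lr=0.0):
    """Illustrative warmup-cosine schedule: linear ramp to base_lr, then cosine decay to min_lr."""
    if step < warmup_steps:
        # Linear warmup: ramp the LR from ~0 up to base_lr over warmup_steps.
        return base_lr * (step + 1) / warmup_steps
    # Cosine decay over the remaining steps: cos goes 1 -> -1, so LR goes base_lr -> min_lr.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))

# Example: 1000 total steps, 100 warmup steps, peak LR 0.1.
lrs = [warmup_cosine_lr(s, 1000, 100, 0.1) for s in range(1000)]
```

The curve rises linearly to the peak at the end of warmup, then decays smoothly, which is why this family of schedules is a common default for the recipes in this repository.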
@@ -342,9 +343,9 @@ Release `0.3.0` is published. We will drop MindSpore 1.x in the future release. 6. BREAKING CHANGES: - We are going to drop support of MindSpore 1.x for it's EOL. - Configuration `filter_bias_and_bn` will be removed and renamed as `weight_decay_filter`, - due to a prolonged misunderstanding of the MindSpore optimizer. - We will migrate the existing training recipes, but the signature change of function `create_optimizer` will be incompatible - and the old version training recipes will also be incompatible. See [PR/752](https://github.com/mindspore-lab/mindcv/pull/752) for details. + due to a prolonged misunderstanding of the MindSpore optimizer. + We will migrate the existing training recipes, but the signature change of function `create_optimizer` will be incompatible + and the old version training recipes will also be incompatible. See [PR/752](https://github.com/mindspore-lab/mindcv/pull/752) for details. See [RELEASE](RELEASE.md) for detailed history. diff --git a/README_CN.md b/README_CN.md index 1347e652a..03a27a10e 100644 --- a/README_CN.md +++ b/README_CN.md @@ -33,20 +33,20 @@ MindCV是一个基于 [MindSpore](https://www.mindspore.cn/) 开发的,致力 - **高易用性** MindCV将视觉任务分解为各种可配置的组件,用户可以轻松地构建自己的数据处理和模型训练流程。 - ```pycon - >>> import mindcv - # 创建数据集 - >>> dataset = mindcv.create_dataset('cifar10', download=True) - # 创建模型 - >>> network = mindcv.create_model('resnet50', pretrained=True) - ``` + ```pycon + >>> import mindcv + # 创建数据集 + >>> dataset = mindcv.create_dataset('cifar10', download=True) + # 创建模型 + >>> network = mindcv.create_model('resnet50', pretrained=True) + ``` - 用户可通过预定义的训练和微调脚本,快速配置并完成训练或迁移学习任务。 + 用户可通过预定义的训练和微调脚本,快速配置并完成训练或迁移学习任务。 - ```shell - # 配置和启动迁移学习任务 - python train.py --model swin_tiny --pretrained --opt=adamw --lr=0.001 --data_dir=/path/to/dataset - ``` + ```shell + # 配置和启动迁移学习任务 + python train.py --model swin_tiny --pretrained --opt=adamw --lr=0.001 --data_dir=/path/to/dataset + ``` - **高性能** MindCV集成了大量基于CNN和Transformer的高性能模型, 
如SwinTransformer,并提供预训练权重、训练策略和性能报告,帮助用户快速选型并将其应用于视觉模型。 @@ -108,61 +108,61 @@ python infer.py --model=swin_tiny --image_path='./dog.jpg' - 单卡训练 - ```shell - # 单卡训练 - python train.py --model resnet50 --dataset cifar10 --dataset_download - ``` + ```shell + # 单卡训练 + python train.py --model resnet50 --dataset cifar10 --dataset_download + ``` - 以上代码是在CIFAR10数据集上单卡(CPU/GPU/Ascend)训练ResNet的示例,通过`model`和`dataset`参数分别指定需要训练的模型和数据集。 + 以上代码是在CIFAR10数据集上单卡(CPU/GPU/Ascend)训练ResNet的示例,通过`model`和`dataset`参数分别指定需要训练的模型和数据集。 - 分布式训练 - 对于像ImageNet这样的大型数据集,有必要在多个设备上以分布式模式进行训练。基于MindSpore对分布式相关功能的良好支持,用户可以使用`msrun`来进行模型的分布式训练。 + 对于像ImageNet这样的大型数据集,有必要在多个设备上以分布式模式进行训练。基于MindSpore对分布式相关功能的良好支持,用户可以使用`msrun`来进行模型的分布式训练。 - ```shell - # 分布式训练 - # 假设你有4张NPU卡 - msrun --bind_core=True --worker_num 4 python train.py --distribute \ - --model densenet121 --dataset imagenet --data_dir ./datasets/imagenet - ``` + ```shell + # 分布式训练 + # 假设你有4张NPU卡 + msrun --bind_core=True --worker_num 4 python train.py --distribute \ + --model densenet121 --dataset imagenet --data_dir ./datasets/imagenet + ``` - 注意,如果在两卡环境下选用msrun作为启动方式,请添加配置项 `--bind_core=True` 增加绑核操作以优化两卡性能,范例代码如下: + 注意,如果在两卡环境下选用msrun作为启动方式,请添加配置项 `--bind_core=True` 增加绑核操作以优化两卡性能,范例代码如下: - ```shell - msrun --bind_core=True --worker_num=2--local_worker_num=2 --master_port=8118 \ - --log_dir=msrun_log --join=True --cluster_time_out=300 \ - python train.py --distribute --model=densenet121 --dataset=imagenet --data_dir=/path/to/imagenet - ``` + ```shell + msrun --bind_core=True --worker_num=2 --local_worker_num=2 --master_port=8118 \ + --log_dir=msrun_log --join=True --cluster_time_out=300 \ + python train.py --distribute --model=densenet121 --dataset=imagenet --data_dir=/path/to/imagenet + ``` - > 如需更多操作指导,请参考 https://www.mindspore.cn/docs/zh-CN/r2.5.0/model_train/parallel/startup_method.html + > 如需更多操作指导,请参考 https://www.mindspore.cn/docs/zh-CN/r2.5.0/model_train/parallel/startup_method.html - 完整的参数列表及说明在`config.py`中定义,可运行`python train.py
--help`快速查看。 + 完整的参数列表及说明在`config.py`中定义,可运行`python train.py --help`快速查看。 - 如需恢复训练,请指定`--ckpt_path`和`--ckpt_save_dir`参数,脚本将加载路径中的模型权重和优化器状态,并恢复中断的训练进程。 + 如需恢复训练,请指定`--ckpt_path`和`--ckpt_save_dir`参数,脚本将加载路径中的模型权重和优化器状态,并恢复中断的训练进程。 - 超参配置和预训练策略 - 您可以编写yaml文件或设置外部参数来指定配置数据、模型、优化器等组件及其超参数。以下是使用预设的训练策略(yaml文件)进行模型训练的示例。 + 您可以编写yaml文件或设置外部参数来指定配置数据、模型、优化器等组件及其超参数。以下是使用预设的训练策略(yaml文件)进行模型训练的示例。 - ```shell - msrun --bind_core=True --worker_num 4 python train.py -c configs/squeezenet/squeezenet_1.0_ascend.yaml - ``` + ```shell + msrun --bind_core=True --worker_num 4 python train.py -c configs/squeezenet/squeezenet_1.0_ascend.yaml + ``` - **预定义的训练策略** - MindCV目前提供了超过20种模型训练策略,在ImageNet取得SoTA性能。 - 具体的参数配置和详细精度性能汇总请见[`configs`](configs)文件夹。 - 您可以便捷地将这些训练策略用于您的模型训练中以提高性能(复用或修改相应的yaml文件即可)。 + **预定义的训练策略** + MindCV目前提供了超过20种模型训练策略,在ImageNet取得SoTA性能。 + 具体的参数配置和详细精度性能汇总请见[`configs`](configs)文件夹。 + 您可以便捷地将这些训练策略用于您的模型训练中以提高性能(复用或修改相应的yaml文件即可)。 - 在ModelArts/OpenI平台上训练 - 在[ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html)或[OpenI](https://openi.pcl.ac.cn/)云平台上进行训练,需要执行以下操作: + 在[ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html)或[OpenI](https://openi.pcl.ac.cn/)云平台上进行训练,需要执行以下操作: - ```text - 1、在云平台上创建新的训练任务。 - 2、在网站UI界面添加运行参数`config`,并指定yaml配置文件的路径。 - 3、在网站UI界面添加运行参数`enable_modelarts`并设置为True。 - 4、在网站上填写其他训练信息并启动训练任务。 - ``` + ```text + 1、在云平台上创建新的训练任务。 + 2、在网站UI界面添加运行参数`config`,并指定yaml配置文件的路径。 + 3、在网站UI界面添加运行参数`enable_modelarts`并设置为True。 + 4、在网站上填写其他训练信息并启动训练任务。 + ``` **静态图和动态图模式** @@ -219,41 +219,41 @@ python train.py --model=resnet50 --dataset=cifar10 \
支持模型 -* Big Transfer ResNetV2 (BiT) - https://arxiv.org/abs/1912.11370 -* ConvNeXt - https://arxiv.org/abs/2201.03545 -* ConViT (Soft Convolutional Inductive Biases Vision Transformers)- https://arxiv.org/abs/2103.10697 -* DenseNet - https://arxiv.org/abs/1608.06993 -* DPN (Dual-Path Network) - https://arxiv.org/abs/1707.01629 -* EfficientNet (MBConvNet Family) https://arxiv.org/abs/1905.11946 -* EfficientNet V2 - https://arxiv.org/abs/2104.00298 -* GhostNet - https://arxiv.org/abs/1911.11907 -* GoogLeNet - https://arxiv.org/abs/1409.4842 -* Inception-V3 - https://arxiv.org/abs/1512.00567 -* Inception-ResNet-V2 and Inception-V4 - https://arxiv.org/abs/1602.07261 -* MNASNet - https://arxiv.org/abs/1807.11626 -* MobileNet-V1 - https://arxiv.org/abs/1704.04861 -* MobileNet-V2 - https://arxiv.org/abs/1801.04381 -* MobileNet-V3 (MBConvNet w/ Efficient Head) - https://arxiv.org/abs/1905.02244 -* NASNet - https://arxiv.org/abs/1707.07012 -* PNasNet - https://arxiv.org/abs/1712.00559 -* PVT (Pyramid Vision Transformer) - https://arxiv.org/abs/2102.12122 -* PoolFormer models - https://github.com/sail-sg/poolformer -* RegNet - https://arxiv.org/abs/2003.13678 -* RepMLP https://arxiv.org/abs/2105.01883 -* RepVGG - https://arxiv.org/abs/2101.03697 -* ResNet (v1b/v1.5) - https://arxiv.org/abs/1512.03385 -* ResNeXt - https://arxiv.org/abs/1611.05431 -* Res2Net - https://arxiv.org/abs/1904.01169 -* ReXNet - https://arxiv.org/abs/2007.00992 -* ShuffleNet v1 - https://arxiv.org/abs/1707.01083 -* ShuffleNet v2 - https://arxiv.org/abs/1807.11164 -* SKNet - https://arxiv.org/abs/1903.06586 -* SqueezeNet - https://arxiv.org/abs/1602.07360 -* Swin Transformer - https://arxiv.org/abs/2103.14030 -* VGG - https://arxiv.org/abs/1409.1556 -* Visformer - https://arxiv.org/abs/2104.12533 -* Vision Transformer (ViT) - https://arxiv.org/abs/2010.11929 -* Xception - https://arxiv.org/abs/1610.02357 +- Big Transfer ResNetV2 (BiT) - https://arxiv.org/abs/1912.11370 +- ConvNeXt - 
https://arxiv.org/abs/2201.03545 +- ConViT (Soft Convolutional Inductive Biases Vision Transformers) - https://arxiv.org/abs/2103.10697 +- DenseNet - https://arxiv.org/abs/1608.06993 +- DPN (Dual-Path Network) - https://arxiv.org/abs/1707.01629 +- EfficientNet (MBConvNet Family) - https://arxiv.org/abs/1905.11946 +- EfficientNet V2 - https://arxiv.org/abs/2104.00298 +- GhostNet - https://arxiv.org/abs/1911.11907 +- GoogLeNet - https://arxiv.org/abs/1409.4842 +- Inception-V3 - https://arxiv.org/abs/1512.00567 +- Inception-ResNet-V2 and Inception-V4 - https://arxiv.org/abs/1602.07261 +- MNASNet - https://arxiv.org/abs/1807.11626 +- MobileNet-V1 - https://arxiv.org/abs/1704.04861 +- MobileNet-V2 - https://arxiv.org/abs/1801.04381 +- MobileNet-V3 (MBConvNet w/ Efficient Head) - https://arxiv.org/abs/1905.02244 +- NASNet - https://arxiv.org/abs/1707.07012 +- PNasNet - https://arxiv.org/abs/1712.00559 +- PVT (Pyramid Vision Transformer) - https://arxiv.org/abs/2102.12122 +- PoolFormer models - https://github.com/sail-sg/poolformer +- RegNet - https://arxiv.org/abs/2003.13678 +- RepMLP - https://arxiv.org/abs/2105.01883 +- RepVGG - https://arxiv.org/abs/2101.03697 +- ResNet (v1b/v1.5) - https://arxiv.org/abs/1512.03385 +- ResNeXt - https://arxiv.org/abs/1611.05431 +- Res2Net - https://arxiv.org/abs/1904.01169 +- ReXNet - https://arxiv.org/abs/2007.00992 +- ShuffleNet v1 - https://arxiv.org/abs/1707.01083 +- ShuffleNet v2 - https://arxiv.org/abs/1807.11164 +- SKNet - https://arxiv.org/abs/1903.06586 +- SqueezeNet - https://arxiv.org/abs/1602.07360 +- Swin Transformer - https://arxiv.org/abs/2103.14030 +- VGG - https://arxiv.org/abs/1409.1556 +- Visformer - https://arxiv.org/abs/2104.12533 +- Vision Transformer (ViT) - https://arxiv.org/abs/2010.11929 +- Xception - https://arxiv.org/abs/1610.02357 关于模型性能和预训练权重的信息请查看 [configs](./configs) 文件夹。 @@ -266,43 +266,43 @@ python train.py --model=resnet50 --dataset=cifar10 \
支持算法列表 -* 数据增强 - * [AutoAugment](https://arxiv.org/abs/1805.09501) - * [RandAugment](https://arxiv.org/abs/1909.13719) - * [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf) - * RandErasing (Cutout) - * CutMix - * MixUp - * RandomResizeCrop - * Color Jitter, Flip, etc -* 优化器 - * Adam - * AdamW - * [Lion](https://arxiv.org/abs/2302.06675) - * Adan (experimental) - * AdaGrad - * LAMB - * Momentum - * RMSProp - * SGD - * NAdam -* 学习率调度器 - * Warmup Cosine Decay - * Step LR - * Polynomial Decay - * Exponential Decay -* 正则化 - * Weight Decay - * Label Smoothing - * Stochastic Depth (depends on networks) - * Dropout (depends on networks) -* 损失函数 - * Cross Entropy (w/ class weight and auxiliary logit support) - * Binary Cross Entropy (w/ class weight and auxiliary logit support) - * Soft Cross Entropy Loss (automatically enabled if mixup or label smoothing is used) - * Soft Binary Cross Entropy Loss (automatically enabled if mixup or label smoothing is used) -* 模型融合 - * Warmup EMA (Exponential Moving Average) +- 数据增强 + - [AutoAugment](https://arxiv.org/abs/1805.09501) + - [RandAugment](https://arxiv.org/abs/1909.13719) + - [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf) + - RandErasing (Cutout) + - CutMix + - MixUp + - RandomResizeCrop + - Color Jitter, Flip, etc +- 优化器 + - Adam + - AdamW + - [Lion](https://arxiv.org/abs/2302.06675) + - Adan (experimental) + - AdaGrad + - LAMB + - Momentum + - RMSProp + - SGD + - NAdam +- 学习率调度器 + - Warmup Cosine Decay + - Step LR + - Polynomial Decay + - Exponential Decay +- 正则化 + - Weight Decay + - Label Smoothing + - Stochastic Depth (depends on networks) + - Dropout (depends on networks) +- 损失函数 + - Cross Entropy (w/ class weight and auxiliary logit support) + - Binary Cross Entropy 
(w/ class weight and auxiliary logit support) + - Soft Cross Entropy Loss (automatically enabled if mixup or label smoothing is used) + - Soft Binary Cross Entropy Loss (automatically enabled if mixup or label smoothing is used) +- 模型融合 + - Warmup EMA (Exponential Moving Average)
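The Warmup EMA listed above keeps an exponential moving average of the model weights, warming up the decay rate so that the average can track the fast-changing weights early in training. A minimal sketch in plain Python — the class name and the `(1 + n) / (10 + n)` warmup rule are illustrative assumptions, not MindCV's implementation:

```python
class WarmupEMA:
    """Illustrative EMA of named parameters with a warmed-up decay rate."""

    def __init__(self, decay=0.9999):
        self.decay = decay      # target decay once fully warmed up
        self.updates = 0        # number of update() calls so far
        self.shadow = {}        # name -> averaged value

    def update(self, params):
        self.updates += 1
        # Warm up the decay: small early on, approaching self.decay over time,
        # so the first averages are not dominated by the random initialization.
        d = min(self.decay, (1 + self.updates) / (10 + self.updates))
        for name, value in params.items():
            if name not in self.shadow:
                self.shadow[name] = value  # first sighting: just store
            else:
                self.shadow[name] = d * self.shadow[name] + (1 - d) * value

# Toy usage with scalar "weights":
ema = WarmupEMA(decay=0.999)
ema.update({"w": 1.0})
ema.update({"w": 0.0})
```

At evaluation time the shadow values would be swapped in place of the raw weights, which is what makes EMA an ensembling technique.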
@@ -337,9 +337,10 @@ python train.py --model=resnet50 --dataset=cifar10 \ 6. BREAKING CHANGES: - 我们将在此小版本的未来发布中丢弃对MindSpore1.x的支持。 - 配置项`filter_bias_and_bn`将被移除并更名为`weight_decay_filter`。 - 我们会对已有训练策略进行迁移,但函数`create_optimizer`的签名变更将是不兼容的,且未迁移旧版本的训练策略也将变得不兼容。详见[PR/752](https://github.com/mindspore-lab/mindcv/pull/752)。 + 我们会对已有训练策略进行迁移,但函数`create_optimizer`的签名变更将是不兼容的,且未迁移旧版本的训练策略也将变得不兼容。详见[PR/752](https://github.com/mindspore-lab/mindcv/pull/752)。 - 2023/6/16 + 1. 新版本 `0.2.2` 发布啦!我们将`MindSpore`升级到了2.0版本,同时保持了对1.8版本的兼容 2. 新模型: - [ConvNextV2](configs/convnextv2) @@ -357,90 +358,101 @@ python train.py --model=resnet50 --dataset=cifar10 \ - 文档网站上的损坏链接 - 2023/6/2 + 1. 新版本:`0.2.1` 发布 2. 新[文档](https://mindspore-lab.github.io/mindcv/zh/)上线 - 2023/5/30 + 1. 新模型: - - [VGG](configs/vgg)混合精度(O2)版本 - - [GhostNet](configs/ghostnet) - - [MobileNetV2](configs/mobilenetv2) 和 [MobileNetV3](configs/mobilenetv3)混合精度(O3)版本 - - [RegNet](configs/regnet)的(x,y)_(200,400,600,800)mf版本 - - [RepVGG](configs/repvgg)的b1g2, b1g4 & b2g4版本 - - [MnasNet](configs/mnasnet)的0.5版本 - - [PVTv2](configs/pvtv2)的b3 & b4版本 + - [VGG](configs/vgg)混合精度(O2)版本 + - [GhostNet](configs/ghostnet) + - [MobileNetV2](configs/mobilenetv2) 和 [MobileNetV3](configs/mobilenetv3)混合精度(O3)版本 + - [RegNet](configs/regnet)的(x,y)\_(200,400,600,800)mf版本 + - [RepVGG](configs/repvgg)的b1g2, b1g4 & b2g4版本 + - [MnasNet](configs/mnasnet)的0.5版本 + - [PVTv2](configs/pvtv2)的b3 & b4版本 2. 新特性: - - 3-Augment, Augmix, TrivialAugmentWide + - 3-Augment, Augmix, TrivialAugmentWide 3. 错误修复: - - ViT 池化模式 + - ViT 池化模式 - 2023/04/28 + 1. 
增添了一些新模型,列出如下: - - [VGG](configs/vgg) - - [DPN](configs/dpn) - - [ResNet v2](configs/resnetv2) - - [MnasNet](configs/mnasnet) - - [MixNet](configs/mixnet) - - [RepVGG](configs/repvgg) - - [ConvNeXt](configs/convnext) - - [Swin Transformer](configs/swintransformer) - - [EdgeNeXt](configs/edgenext) - - [CrossViT](configs/crossvit) - - [XCiT](configs/xcit) - - [CoAT](configs/coat) - - [PiT](configs/pit) - - [PVT v2](configs/pvtv2) - - [MobileViT](configs/mobilevit) + - [VGG](configs/vgg) + - [DPN](configs/dpn) + - [ResNet v2](configs/resnetv2) + - [MnasNet](configs/mnasnet) + - [MixNet](configs/mixnet) + - [RepVGG](configs/repvgg) + - [ConvNeXt](configs/convnext) + - [Swin Transformer](configs/swintransformer) + - [EdgeNeXt](configs/edgenext) + - [CrossViT](configs/crossvit) + - [XCiT](configs/xcit) + - [CoAT](configs/coat) + - [PiT](configs/pit) + - [PVT v2](configs/pvtv2) + - [MobileViT](configs/mobilevit) 2. 错误修正: - - 分布式训练时,需对每个进程设置相同的随机数种子 - - 检查YAML配置文件中的选项是否存在于命令行解析器 - - 修正了优化器`Adan`中标志变量不为`Tensor`的错误 + - 分布式训练时,需对每个进程设置相同的随机数种子 + - 检查YAML配置文件中的选项是否存在于命令行解析器 + - 修正了优化器`Adan`中标志变量不为`Tensor`的错误 - 2023/03/25 + 1. 更新ResNet网络预训练权重,现在预训练权重有更高Top1精度 - - ResNet18精度从70.09提升到70.31 - - ResNet34精度从73.69提升到74.15 - - ResNet50精度从76.64提升到76.69 - - ResNet101精度从77.63提升到78.24 - - ResNet152精度从78.63提升到78.72 + - ResNet18精度从70.09提升到70.31 + - ResNet34精度从73.69提升到74.15 + - ResNet50精度从76.64提升到76.69 + - ResNet101精度从77.63提升到78.24 + - ResNet152精度从78.63提升到78.72 2. 按照规则(model_scale-sha256sum.ckpt)更新预训练权重名字和相应下载URL链接 - 2023/03/05 + 1. 增加Lion (EvoLved Sign Momentum)优化器,论文 https://arxiv.org/abs/2302.06675 - - Lion所使用的学习率一般比Adamw小3到10倍,而权重衰减(weigt_decay)要大3到10倍 + - Lion所使用的学习率一般比Adamw小3到10倍,而权重衰减(weight_decay)要大3到10倍 2.
增加6个模型及其训练策略、预训练权重: - - [HRNet](configs/hrnet) - - [SENet](configs/senet) - - [GoogLeNet](configs/googlenet) - - [Inception V3](configs/inceptionv3) - - [Inception V4](configs/inceptionv4) - - [Xception](configs/xception) + - [HRNet](configs/hrnet) + - [SENet](configs/senet) + - [GoogLeNet](configs/googlenet) + - [Inception V3](configs/inceptionv3) + - [Inception V4](configs/inceptionv4) + - [Xception](configs/xception) 3. 支持梯度裁剪 - 2023/01/10 + 1. MindCV v0.1发布! 支持通过PyPI安装 (`pip install mindcv`) 2. 新增4个模型的预训练权重及其策略: googlenet, inception_v3, inception_v4, xception - 2022/12/09 + 1. 支持在所有学习率策略中添加学习率预热操作,除cosine decay策略外 2. 支持`Repeated Augmentation`操作,可以通过`--aug_repeats`对其进行设置,设置值应大于1(通常为3或4) 3. 支持EMA 4. 通过支持mixup和cutmix操作进一步优化BCE损失函数 - 2022/11/21 + 1. 支持模型损失和正确率的可视化 2. 支持轮次维度的cosine decay策略的学习率预热操作(之前仅支持步维度) - 2022/11/09 + 1. 支持2个ViT预训练模型 2. 支持RandAugment augmentation操作 3. 提高了CutMix操作的可用性,CutMix和Mixup目前可以一起使用 4. 解决了学习率画图的bug - 2022/10/12 + 1. BCE和CE损失函数目前都支持class-weight config操作、label smoothing操作、auxiliary logit input操作(适用于类似Inception模型) - 2022/09/13 + 1. 支持Adan优化器(试用版) ## 贡献方式 diff --git a/RELEASE.md b/RELEASE.md index a1b3c72f6..d36d68e71 100644 --- a/RELEASE.md +++ b/RELEASE.md @@ -29,9 +29,9 @@ Release `0.3.0` is published. We will drop MindSpore 1.x in the future release. 6. BREAKING CHANGES: - We are going to drop support of MindSpore 1.x for it's EOL. - Configuration `filter_bias_and_bn` will be removed and renamed as `weight_decay_filter`, - due to a prolonged misunderstanding of the MindSpore optimizer. - We will migrate the existing training recipes, but the signature change of function a will be incompatible - and the old version training recipes will also be incompatible. See [PR/752](https://github.com/mindspore-lab/mindcv/pull/752) for details. + due to a prolonged misunderstanding of the MindSpore optimizer.
+ We will migrate the existing training recipes, but the signature change of function `create_optimizer` will be incompatible + and the old version training recipes will also be incompatible. See [PR/752](https://github.com/mindspore-lab/mindcv/pull/752) for details. ## 0.2.2 (2023/6/16) @@ -54,99 +54,110 @@ Release `0.3.0` is published. We will drop MindSpore 1.x in the future release. ## 0.2.1 - 2023/6/2 + 1. New version: `0.2.1` is released! 2. New [documents](https://mindspore-lab.github.io/mindcv/) are online! - 2023/5/30 + 1. New Models: - - AMP(O2) version of [VGG](configs/vgg) - - [GhostNet](configs/ghostnet) - - AMP(O3) version of [MobileNetV2](configs/mobilenetv2) and [MobileNetV3](configs/mobilenetv3) - - (x,y)_(200,400,600,800)mf of [RegNet](configs/regnet) - - b1g2, b1g4 & b2g4 of [RepVGG](configs/repvgg) - - 0.5 of [MnasNet](configs/mnasnet) - - b3 & b4 of [PVTv2](configs/pvtv2) + - AMP(O2) version of [VGG](configs/vgg) + - [GhostNet](configs/ghostnet) + - AMP(O3) version of [MobileNetV2](configs/mobilenetv2) and [MobileNetV3](configs/mobilenetv3) + - (x,y)\_(200,400,600,800)mf of [RegNet](configs/regnet) + - b1g2, b1g4 & b2g4 of [RepVGG](configs/repvgg) + - 0.5 of [MnasNet](configs/mnasnet) + - b3 & b4 of [PVTv2](configs/pvtv2) 2. New Features: - - 3-Augment, Augmix, TrivialAugmentWide + - 3-Augment, Augmix, TrivialAugmentWide 3. Bug Fixes: - - ViT pooling mode + - ViT pooling mode - 2023/04/28 + 1.
Add some new models, listed as follows - - [VGG](configs/vgg) - - [DPN](configs/dpn) - - [ResNet v2](configs/resnetv2) - - [MnasNet](configs/mnasnet) - - [MixNet](configs/mixnet) - - [RepVGG](configs/repvgg) - - [ConvNeXt](configs/convnext) - - [Swin Transformer](configs/swintransformer) - - [EdgeNeXt](configs/edgenext) - - [CrossViT](configs/crossvit) - - [XCiT](configs/xcit) - - [CoAT](configs/coat) - - [PiT](configs/pit) - - [PVT v2](configs/pvtv2) - - [MobileViT](configs/mobilevit) + - [VGG](configs/vgg) + - [DPN](configs/dpn) + - [ResNet v2](configs/resnetv2) + - [MnasNet](configs/mnasnet) + - [MixNet](configs/mixnet) + - [RepVGG](configs/repvgg) + - [ConvNeXt](configs/convnext) + - [Swin Transformer](configs/swintransformer) + - [EdgeNeXt](configs/edgenext) + - [CrossViT](configs/crossvit) + - [XCiT](configs/xcit) + - [CoAT](configs/coat) + - [PiT](configs/pit) + - [PVT v2](configs/pvtv2) + - [MobileViT](configs/mobilevit) 2. Bug fix: - - Setting the same random seed for each rank - - Checking if options from yaml config exist in argument parser - - Initializing flag variable as `Tensor` in Optimizer `Adan` + - Setting the same random seed for each rank + - Checking if options from yaml config exist in argument parser + - Initializing flag variable as `Tensor` in Optimizer `Adan` ## 0.2.0 - 2023/03/25 + 1. Update checkpoints for pretrained ResNet for better accuracy - - ResNet18 (from 70.09 to 70.31 @Top1 accuracy) - - ResNet34 (from 73.69 to 74.15 @Top1 accuracy) - - ResNet50 (from 76.64 to 76.69 @Top1 accuracy) - - ResNet101 (from 77.63 to 78.24 @Top1 accuracy) - - ResNet152 (from 78.63 to 78.72 @Top1 accuracy) + - ResNet18 (from 70.09 to 70.31 @Top1 accuracy) + - ResNet34 (from 73.69 to 74.15 @Top1 accuracy) + - ResNet50 (from 76.64 to 76.69 @Top1 accuracy) + - ResNet101 (from 77.63 to 78.24 @Top1 accuracy) + - ResNet152 (from 78.63 to 78.72 @Top1 accuracy) 2.
Rename checkpoint file name to follow naming rule ({model_scale-sha256sum.ckpt}) and update download URLs. - 2023/03/05 + 1. Add Lion (EvoLved Sign Momentum) optimizer from paper https://arxiv.org/abs/2302.06675 - - To replace adamw with lion, LR is usually 3-10x smaller, and weight decay is usually 3-10x larger than adamw. + - To replace adamw with lion, LR is usually 3-10x smaller, and weight decay is usually 3-10x larger than adamw. 2. Add 6 new models with training recipes and pretrained weights for - - [HRNet](configs/hrnet) - - [SENet](configs/senet) - - [GoogLeNet](configs/googlenet) - - [Inception V3](configs/inceptionv3) - - [Inception V4](configs/inceptionv4) - - [Xception](configs/xception) + - [HRNet](configs/hrnet) + - [SENet](configs/senet) + - [GoogLeNet](configs/googlenet) + - [Inception V3](configs/inceptionv3) + - [Inception V4](configs/inceptionv4) + - [Xception](configs/xception) 3. Support gradient clip 4. Arg name `use_ema` changed to **`ema`**, add `ema: True` in yaml to enable EMA. ## 0.1.1 - 2023/01/10 + 1. MindCV v0.1 released! It can be installed via PyPI `pip install mindcv` now. 2. Add training recipe and trained weights of googlenet, inception_v3, inception_v4, xception ## 0.1.0 - 2022/12/09 + 1. Support lr warmup for all lr scheduling algorithms besides cosine decay. 2. Add repeated augmentation, which can be enabled by setting `--aug_repeats` to be a value larger than 1 (typically, 3 or 4 is a common choice). 3. Add EMA. 4. Improve BCE loss to support mixup/cutmix. - 2022/11/21 + 1. Add visualization for loss and acc curves 2. Support epochwise lr warmup cosine decay (previous is stepwise) - 2022/11/09 + 1. Add 7 pretrained ViT models. 2. Add RandAugment augmentation. 3. Fix CutMix efficiency issue and CutMix and Mixup can be used together. 4. Fix lr plot and scheduling bug. - 2022/10/12 + 1. Both BCE and CE loss now support class-weight config, label smoothing, and auxiliary logit input (for networks like inception). 
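The label-smoothing behavior this entry describes — cross entropy computed against a smoothed target distribution instead of a hard one-hot label — can be sketched in plain Python. This is an illustrative reimplementation under stated assumptions, not the repository's loss code, and it omits the class-weight and auxiliary-logit handling:

```python
import math

def soft_cross_entropy(logits, target_index, num_classes, smoothing=0.1):
    """Cross entropy against a label-smoothed one-hot target (illustrative sketch)."""
    # Smoothed target: spread `smoothing` uniformly, keep the rest on the true class.
    off = smoothing / num_classes
    on = 1.0 - smoothing + off
    # Log-softmax with max subtraction for numerical stability.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    log_probs = [x - log_z for x in logits]
    target = [on if i == target_index else off for i in range(num_classes)]
    # Negative dot product of the soft target with the log-probabilities.
    return -sum(t * lp for t, lp in zip(target, log_probs))
```

With `smoothing=0.0` this reduces to ordinary cross entropy; with mixup the hard target would simply be replaced by the mixed soft target, which is why the soft losses are auto-enabled in those cases.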
## 0.0.1-beta - 2022/09/13 + 1. Add Adan optimizer (experimental) ## MindSpore Computer Vision 0.0.1 diff --git a/benchmark_results.md b/benchmark_results.md index ae930e38c..ef87c29ba 100644 --- a/benchmark_results.md +++ b/benchmark_results.md @@ -1,8 +1,6 @@ -
performance tested on Ascend 910(8p) with graph mode - | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | | ---------------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------- | | bit_resnet50 | 25.55 | 8 | 32 | 224x224 | O2 | 146s | 74.52 | 3413.33 | 76.81 | 93.17 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/bit/bit_resnet50_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/bit/BiT_resnet50-1e4795a4.ckpt) | @@ -63,9 +61,6 @@
performance tested on Ascend 910*(8p) with graph mode - - - | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | | ---------------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------- | | convit_tiny | 5.71 | 8 | 256 | 224x224 | O2 | 153s | 226.51 | 9022.03 | 73.79 | 91.70 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convit/convit_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/convit/convit_tiny-1961717e-910v2.ckpt) | @@ -115,4 +110,5 @@
### Notes + - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/README.md b/configs/README.md index 79e2d5d98..465737abf 100644 --- a/configs/README.md +++ b/configs/README.md @@ -1,6 +1,6 @@ ### File Structure and Naming -This folder contains training recipes and model readme files for each model. The folder structure and naming rule of model configurations are as follows. +This folder contains training recipes and model readme files for each model. The folder structure and naming rule of model configurations are as follows. ``` ├── configs @@ -25,22 +25,20 @@ This folder contains training recipes and model readme files for each model. The > generalization across different hardware. ### Model Readme Writing Guideline + The model readme file in each sub-folder provides the introduction, reproduced results, and running guideline for each model. Please follow the outline structure and **table format** shown in [densenet/README.md](https://github.com/mindspore-lab/mindcv/blob/main/configs/densenet/README.md) when contributing your models :) #### Table Format - - | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | | ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------- | | densenet121 | 8.06 | 8 | 32 | 224x224 | O2 | 300s | 47.34 | 5446.81 | 75.67 | 92.77 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/densenet/densenet_121_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/densenet/densenet121-bf4ab27f-910v2.ckpt) | - - Illustration: -- model name: model name in lower case with _ seperator.
+ +- model name: model name in lower case with \_ separator. - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. Keep 2 digits after the decimal point. - params(M): # of model parameters in millions (10^6). Keep **2 digits** after the decimal point - batch size: Training batch size @@ -51,27 +49,28 @@ Illustration: - weight: url of the pretrained model weights ### Model Checkpoint Format - The checkpoint (i.e., model weight) name should follow this format: **{model_name}_{specification}-{sha256sum}.ckpt**, e.g., `poolformer_s12-5be5c4e4.ckpt`. - You can run the following command and take the first 8 characters of the computing result as the sha256sum value in the checkpoint name. +The checkpoint (i.e., model weight) name should follow this format: **{model\_name}\_{specification}-{sha256sum}.ckpt**, e.g., `poolformer_s12-5be5c4e4.ckpt`. +You can run the following command and take the first 8 characters of the computing result as the sha256sum value in the checkpoint name. - ```shell - sha256sum your_model.ckpt - ``` +```shell +sha256sum your_model.ckpt +``` #### Training Script Format For consistency, it is recommended to provide distributed training commands based on `msrun --bind_core=True --worker_num {num_devices} python train.py`, instead of using shell script such as `distributed_train.sh`.
- ```shell
- # standalone training on single NPU device
- python train.py --config configs/densenet/densenet_121_gpu.yaml --data_dir /path/to/dataset --distribute False
+```shell
+# standalone training on a single NPU device
+python train.py --config configs/densenet/densenet_121_gpu.yaml --data_dir /path/to/dataset --distribute False

- # distributed training on NPU divices
- msrun --bind_core=True --worker_num 8 python train.py --config configs/densenet/densenet_121_ascend.yaml --data_dir /path/to/imagenet
+# distributed training on NPU devices
+msrun --bind_core=True --worker_num 8 python train.py --config configs/densenet/densenet_121_ascend.yaml --data_dir /path/to/imagenet

- ```
+```

#### URL and Hyperlink Format
+
Please use an **absolute path** in the hyperlink or URL for linking the target resource in the readme file and table.
diff --git a/configs/bit/README.md b/configs/bit/README.md
index ab287b379..5f6b6d219 100644
--- a/configs/bit/README.md
+++ b/configs/bit/README.md
@@ -69,16 +69,9 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
-*coming soon*
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
- - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- | -| bit_resnet50 | 25.55 | 8 | 32 | 224x224 | O2 | 146s | 74.52 | 3413.33 | 76.81 | 93.17 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/bit/bit_resnet50_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/bit/BiT_resnet50-1e4795a4.ckpt) | - - +| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 | +| ----------- | ----- | ---------- | ---------- |-----------| ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------- | -------- | -------- | +| bit | 8 | 32 | 224x224 | O2 | 171s | 60.48 | 4232.80 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/bit/bit_resnet50_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/bit/BiT_resnet50_best_v2.ckpt) | 76.72 | 93.25 | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/cmt/README.md b/configs/cmt/README.md index 3e2cd97c7..af94335a9 100644 --- a/configs/cmt/README.md +++ b/configs/cmt/README.md @@ -67,13 +67,10 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
-*coming soon* +| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 | +| ----------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------------- | -------- | -------- | +| cmt | 8 | 128 | 224x224 | O2 | 1210s | 324.95 | 3151.25 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/cmt/cmt_small_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/cmt/cmt_small-184_1251_v2.ckpt) | 83.15 | 96.48 | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------ | -| cmt_small | 26.09 | 8 | 128 | 224x224 | O2 | 1268s | 500.64 | 2048.01 | 83.24 | 96.41 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/cmt/cmt_small_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/cmt/cmt_small-6858ee22.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/coat/README.md b/configs/coat/README.md index c9cbc9eab..edd8770c4 100644 --- a/configs/coat/README.md +++ b/configs/coat/README.md @@ -63,17 +63,9 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
-*coming soon* - - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - - - - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | -------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------- | -| coat_tiny | 5.50 | 8 | 32 | 224x224 | O2 | 543s | 254.95 | 1003.92 | 79.67 | 94.88 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/coat/coat_tiny_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/coat/coat_tiny-071cb792.ckpt) | +| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 | +| ----------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | ---------------------------------------------------------------------------------------------- | -------- | -------- | +| coat | 8 | 32 | 224x224 | O2 | 644s | 373.00 | 686.33 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/coat/coat_lite_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/coat/coat_tiny-dcca16b1-910v2.ckpt) | 79.27 | 94.29 | diff --git a/configs/convit/README.md b/configs/convit/README.md index 3ba9dbfc6..a493a44d0 100644 --- a/configs/convit/README.md +++ b/configs/convit/README.md @@ -77,15 +77,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------- | -| convit_tiny | 5.71 | 8 | 256 | 224x224 | O2 | 153s | 226.51 | 9022.03 | 73.79 | 91.70 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convit/convit_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/convit/convit_tiny-1961717e-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------- | -| convit_tiny | 5.71 | 8 | 256 | 224x224 | O2 | 133s | 231.62 | 8827.59 | 73.66 | 91.72 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convit/convit_tiny_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/convit/convit_tiny-e31023f2.ckpt) | - +| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------- | +| convit_tiny | 5.71 | 8 | 256 | 224x224 | O2 | 
153s | 221.21 | 9258.17 | 73.79 | 91.70 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convit/convit_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/convit/convit_tiny-1961717e-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/convnext/README.md b/configs/convnext/README.md index 781150b92..9d4cd8fd4 100644 --- a/configs/convnext/README.md +++ b/configs/convnext/README.md @@ -75,14 +75,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------- | -| convnext_tiny | 28.59 | 8 | 16 | 224x224 | O2 | 137s | 48.7 | 2612.24 | 81.28 | 95.61 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convnext/convnext_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/convnext/convnext_tiny-db11dc82-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------- | -| convnext_tiny | 28.59 | 8 | 16 | 224x224 | O2 | 127s | 66.79 | 1910.45 | 81.91 | 95.79 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convnext/convnext_tiny_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/convnext/convnext_tiny-ae5ff8d7.ckpt) | +| ------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------- | +| convnext_tiny | 28.59 | 8 | 16 | 224x224 | O2 | 137s | 36.08 | 3547.67 | 81.28 | 95.61 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convnext/convnext_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/convnext/convnext_tiny-db11dc82-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/convnextv2/README.md b/configs/convnextv2/README.md index 4ef7f251b..41bdaca27 100644 --- a/configs/convnextv2/README.md +++ b/configs/convnextv2/README.md @@ -74,14 +74,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | -------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- | -| convnextv2_tiny | 28.64 | 8 | 128 | 224x224 | O2 | 268s | 257.2 | 3984.44 | 82.39 | 95.95 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convnextv2/convnextv2_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/convnextv2/convnextv2_tiny-a35b79ce-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | -------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------- | -| convnextv2_tiny | 28.64 | 8 | 128 | 224x224 | O2 | 237s | 400.20 | 2560.00 | 82.43 | 95.98 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convnextv2/convnextv2_tiny_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/convnextv2/convnextv2_tiny-d441ba2c.ckpt) | +| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | -------------------------------------------------------------------------------------------------------- | 
--------------------------------------------------------------------------------------------------------------- | +| convnextv2_tiny | 28.64 | 8 | 128 | 224x224 | O2 | 268s | 280.47 | 3651.01 | 82.39 | 95.95 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/convnextv2/convnextv2_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/convnextv2/convnextv2_tiny-a35b79ce-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/crossvit/README.md b/configs/crossvit/README.md index 1e19361c2..9f792cddf 100644 --- a/configs/crossvit/README.md +++ b/configs/crossvit/README.md @@ -70,14 +70,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------- | -| crossvit_9 | 8.55 | 8 | 256 | 240x240 | O2 | 221s | 514.36 | 3984.44 | 73.38 | 91.51 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/crossvit/crossvit_9_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/crossvit/crossvit_9-32c69c96-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------ | -| crossvit_9 | 8.55 | 8 | 256 | 240x240 | O2 | 206s | 550.79 | 3719.30 | 73.56 | 91.79 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/crossvit/crossvit_9_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/crossvit/crossvit_9-e74c8e18.ckpt) | +| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------- | +| crossvit_9 | 8.55 | 8 | 256 | 240x240 | O2 | 221s | 498.96 | 4104.53 | 73.38 | 91.51 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/crossvit/crossvit_9_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/crossvit/crossvit_9-32c69c96-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/densenet/README.md b/configs/densenet/README.md index 77f4402b0..f5d6ed835 100644 --- a/configs/densenet/README.md +++ b/configs/densenet/README.md @@ -78,15 +78,9 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------- | -| densenet121 | 8.06 | 8 | 32 | 224x224 | O2 | 300s | 47,34 | 5446.81 | 75.67 | 92.77 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/densenet/densenet_121_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/densenet/densenet121-bf4ab27f-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------- | -| densenet121 | 8.06 | 8 | 32 | 224x224 | O2 | 191s | 43.28 | 5914.97 | 75.64 | 92.84 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/densenet/densenet_121_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/densenet/densenet121-120_5004_Ascend.ckpt) | +| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | +| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| --- | -------- | -------- | 
--------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------- | +| densenet121 | 8.06 | 8 | 32 | 224x224 | O2 | 300s | 50.01 | 5118.97 | 75.67 | 92.77 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/densenet/densenet_121_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/densenet/densenet121-bf4ab27f-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/dpn/README.md b/configs/dpn/README.md index a7b58c5d9..8d724a17f 100644 --- a/configs/dpn/README.md +++ b/configs/dpn/README.md @@ -75,14 +75,10 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. -*coming soon* +| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 | +| ----------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------------- | -------- | -------- | +| dpn | 8 | 32 | 224x224 | O2 | 336s | 76.23 | 3358.26 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/dpn/dpn131_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/dpn/dpn92-189_5004_v2.ckpt) | 76.00 | 92.45 | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------- | -| dpn92 | 37.79 | 8 | 32 | 224x224 | O2 | 293s | 78.22 | 3272.82 | 79.46 | 94.49 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/dpn/dpn92_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/dpn/dpn92-e3e0fca.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/edgenext/README.md b/configs/edgenext/README.md index a91d50d8f..ca49d8c30 100644 --- a/configs/edgenext/README.md +++ b/configs/edgenext/README.md @@ -76,14 +76,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ----------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | -------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- | -| edgenext_xx_small | 1.33 | 8 | 256 | 256x256 | O2 | 389s | 239.38 | 8555.43 | 70.64 | 89.75 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/edgenext/edgenext_xx_small_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/edgenext/edgenext_xx_small-cad13d2c-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ----------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | -------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------- | -| edgenext_xx_small | 1.33 | 8 | 256 | 256x256 | O2 | 311s | 191.24 | 10709.06 | 71.02 | 89.99 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/edgenext/edgenext_xx_small_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/edgenext/edgenext_xx_small-afc971fb.ckpt) | +| ----------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |--------| ------- | -------- | -------- | -------------------------------------------------------------------------------------------------------- | 
--------------------------------------------------------------------------------------------------------------- | +| edgenext_xx_small | 1.33 | 8 | 256 | 256x256 | O2 | 389s | 225.09 | 9098.58 | 70.64 | 89.75 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/edgenext/edgenext_xx_small_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/edgenext/edgenext_xx_small-cad13d2c-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/efficientnet/README.md b/configs/efficientnet/README.md index e05800fc9..ea1cba257 100644 --- a/configs/efficientnet/README.md +++ b/configs/efficientnet/README.md @@ -76,14 +76,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------- | -| efficientnet_b0 | 5.33 | 8 | 128 | 224x224 | O2 | 353s | 172.64 | 5931.42 | 76.88 | 93.28 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/efficientnet/efficientnet_b0_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/efficientnet/efficientnet_b0-f8d7aa2a-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------- | -| efficientnet_b0 | 5.33 | 8 | 128 | 224x224 | O2 | 203s | 172.78 | 5926.61 | 76.89 | 93.16 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/efficientnet/efficientnet_b0_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/efficientnet/efficientnet_b0-103ec70c.ckpt) | +| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------- | +| efficientnet_b0 | 5.33 | 8 | 128 | 224x224 | O2 | 353s | 173.47 | 5903.03| 76.88 | 93.28 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/efficientnet/efficientnet_b0_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/efficientnet/efficientnet_b0-f8d7aa2a-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/ghostnet/README.md b/configs/ghostnet/README.md index afe4055db..da8e781fc 100644 --- a/configs/ghostnet/README.md +++ b/configs/ghostnet/README.md @@ -77,13 +77,10 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
-*coming soon* +| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 | +| ----------- | ----- | ---------- | ---------- |-----------| ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------ | -------- | -------- | +| ghostnet | 8 | 128 | 224x224 | O2 | 125s | 201.46 | 5082.89 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/ghostnet/ghostnet_050_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/ghostnet/ghostnet_050-ae7771cb-910v2.ckpt) | 65.93 | 86.65 | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------- | -| ghostnet_050 | 2.60 | 8 | 128 | 224x224 | O2 | 383s | 211.13 | 4850.09 | 66.03 | 86.64 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/ghostnet/ghostnet_050_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/ghostnet/ghostnet_050-85b91860.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/googlenet/README.md b/configs/googlenet/README.md index d76ec1dc6..b75607c05 100644 --- a/configs/googlenet/README.md +++ b/configs/googlenet/README.md @@ -77,12 +77,6 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------- | | googlenet | 6.99 | 8 | 32 | 224x224 | O2 | 113s | 23.5 | 10893.62 | 72.89 | 90.89 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/googlenet/googlenet_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/googlenet/googlenet-de74c31d-910v2.ckpt) | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------ | -| googlenet | 6.99 | 8 | 32 | 224x224 | O2 | 72s | 21.40 | 11962.62 | 72.68 | 90.89 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/googlenet/googlenet_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/googlenet/googlenet-5552fcd3.ckpt) | - ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/halonet/README.md b/configs/halonet/README.md index 86fa5a24d..86e563e6c 100644 --- a/configs/halonet/README.md +++ b/configs/halonet/README.md @@ -82,11 +82,10 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
-*coming soon* +| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 | +| ----------- | ----- | ---------- | ---------- |-----------| ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------- | -------- | -------- | +| halonet | 8 | 64 | 224x224 | O2 | 351s | 134.72 | 3800.48 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/halonet/halonet_50t_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/halonet/halonet_50t-533da6be.ckpt) | 0.10 | 0.10 | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -*coming soon* ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/hrnet/README.md b/configs/hrnet/README.md index 89027f29a..5d8747fe3 100644 --- a/configs/hrnet/README.md +++ b/configs/hrnet/README.md @@ -74,14 +74,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------- | -| hrnet_w32 | 41.30 | 8 | 128 | 224x224 | O2 | 1069s | 238.03 | 4301.98 | 80.66 | 95.30 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/hrnet/hrnet_w32_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/hrnet/hrnet_w32-e616cdcb-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- | -| hrnet_w32 | 41.30 | 128 | 8 | 224x224 | O2 | 1312s | 279.10 | 3668.94 | 80.64 | 95.44 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/hrnet/hrnet_w32_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/hrnet/hrnet_w32-cc4fbd91.ckpt) | +| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------- | +| hrnet_w32 | 41.30 | 8 | 128 | 224x224 | O2 | 1069s | 244.02 | 4196.37 | 80.66 | 
95.30 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/hrnet/hrnet_w32_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/hrnet/hrnet_w32-e616cdcb-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/inceptionv3/README.md b/configs/inceptionv3/README.md index 3245a0481..a763c0bb0 100644 --- a/configs/inceptionv3/README.md +++ b/configs/inceptionv3/README.md @@ -77,14 +77,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------------------------- | -| inception_v3 | 27.20 | 8 | 32 | 299x299 | O2 | 172s | 70.83 | 3614.29 | 79.25 | 94.47 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/inceptionv3/inception_v3_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/inception_v3/inception_v3-61a8e9ed-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------ | -| inception_v3 | 27.20 | 8 | 32 | 299x299 | O2 | 120s | 76.42 | 3349.91 | 79.11 | 94.40 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/inceptionv3/inception_v3_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/inception_v3/inception_v3-38f67890.ckpt) | +| ------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------------------------- | +| inception_v3 | 27.20 | 8 | 32 | 299x299 | O2 | 172s | 73.43 | 3486.31 | 79.25 | 94.47 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/inceptionv3/inception_v3_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/inception_v3/inception_v3-61a8e9ed-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/inceptionv4/README.md b/configs/inceptionv4/README.md index d2ed27bb1..4ee9430d3 100644 --- a/configs/inceptionv4/README.md +++ b/configs/inceptionv4/README.md @@ -73,16 +73,10 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------ | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------------------------- | -| inception_v4 | 42.74 | 8 | 32 | 299x299 | O2 | 263s | 80.97 | 3161.66 | 80.98 | 95.25 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/inceptionv4/inception_v4_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/inception_v4/inception_v4-56e798fc-910v2.ckpt) | +| ------------ | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------------------------- | +| inception_v4 | 42.74 | 8 | 32 | 299x299 | O2 | 263s | 83.24 | 3075.44 | 80.98 | 95.25 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/inceptionv4/inception_v4_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/inception_v4/inception_v4-56e798fc-910v2.ckpt) | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------ | -| inception_v4 | 42.74 | 8 | 32 | 299x299 | O2 | 177s | 76.19 | 3360.02 | 80.88 | 95.34 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/inceptionv4/inception_v4_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/inception_v4/inception_v4-db9c45b3.ckpt) | - ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/mixnet/README.md b/configs/mixnet/README.md index 5f825b80b..23464810d 100644 --- a/configs/mixnet/README.md +++ b/configs/mixnet/README.md @@ -77,15 +77,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------- | -| mixnet_s | 4.17 | 8 | 128 | 224x224 | O2 | 706s | 228.03 | 4490.64 | 75.58 | 95.54 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mixnet/mixnet_s_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mixnet/mixnet_s-fe4fcc63-910v2.ckpt) | - - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- | -| mixnet_s | 4.17 | 8 | 128 | 224x224 | O2 | 556s | 252.49 | 4055.61 | 75.52 | 92.52 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mixnet/mixnet_s_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/mixnet/mixnet_s-2a5ef3a3.ckpt) | +| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------- | +| mixnet_s | 4.17 | 8 | 128 | 224x224 | O2 | 706s | 223.55 | 4580.63 | 75.58 | 95.54 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mixnet/mixnet_s_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mixnet/mixnet_s-fe4fcc63-910v2.ckpt) | ### Notes diff --git a/configs/mnasnet/README.md b/configs/mnasnet/README.md index edc2877b6..a99f05923 100644 --- a/configs/mnasnet/README.md +++ b/configs/mnasnet/README.md @@ -43,8 +43,6 @@ msrun --bind_core=True --worker_num 8 python train.py --config configs/mnasnet/m ``` - - For detailed illustration of all hyper-parameters, please refer to [config.py](https://github.com/mindspore-lab/mindcv/blob/main/config.py). 
**Note:** As the global batch size (batch_size x num_devices) is an important hyper-parameter, it is recommended to keep the global batch size unchanged for reproduction or adjust the learning rate linearly to a new global batch size. @@ -74,16 +72,9 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | -------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------- | -| mnasnet_075 | 3.20 | 8 | 256 | 224x224 | O2 | 144s | 175.85 | 11646.29 | 71.77 | 90.52 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mnasnet/mnasnet_0.75_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mnasnet/mnasnet_075-083b2bc4-910v2.ckpt) | - - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- +| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| -------- | -------- | -------- | -------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------- | +| mnasnet_075 | 3.20 | 8 | 256 | 224x224 | O2 | 144s | 175.6 | 11662.87 | 71.77 | 90.52 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mnasnet/mnasnet_0.75_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mnasnet/mnasnet_075-083b2bc4-910v2.ckpt) | -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | -------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------ | -| mnasnet_075 | 3.20 | 8 | 256 | 224x224 | O2 | 140s | 165.43 | 12379.86 | 71.81 | 90.53 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mnasnet/mnasnet_0.75_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/mnasnet/mnasnet_075-465d366d.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/mobilenetv1/README.md b/configs/mobilenetv1/README.md index c3a615392..be327d4cd 100644 --- a/configs/mobilenetv1/README.md +++ b/configs/mobilenetv1/README.md @@ -70,14 +70,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows. Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ----------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------- | -| mobilenet_v1_025 | 0.47 | 8 | 64 | 224x224 | O2 | 195s | 47.47 | 10785.76 | 54.05 | 77.74 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv1/mobilenet_v1_0.25_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilenet/mobilenetv1/mobilenet_v1_025-cbe3d3b3-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ----------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------- | -| mobilenet_v1_025 | 0.47 | 8 | 64 | 224x224 | O2 | 89s | 42.43 | 12066.93 | 53.87 | 77.66 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv1/mobilenet_v1_0.25_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/mobilenet/mobilenetv1/mobilenet_v1_025-d3377fba.ckpt) | +| ---------------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| -------- | -------- | -------- | ----------------------------------------------------------------------------------------------------------- 
| --------------------------------------------------------------------------------------------------------------------------- | +| mobilenet_v1_025 | 0.47 | 8 | 64 | 224x224 | O2 | 195s | 46.4 | 11034.48 | 54.05 | 77.74 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv1/mobilenet_v1_0.25_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilenet/mobilenetv1/mobilenet_v1_025-cbe3d3b3-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/mobilenetv2/README.md b/configs/mobilenetv2/README.md index 226cd6b73..35fb585e7 100644 --- a/configs/mobilenetv2/README.md +++ b/configs/mobilenetv2/README.md @@ -74,16 +74,10 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ----------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------- | -| mobilenet_v2_075 | 2.66 | 8 | 256 | 224x224 | O2 | 233s | 174.65 | 11726.31 | 69.73 | 89.35 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv2/mobilenet_v2_0.75_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilenet/mobilenetv2/mobilenet_v2_075-755932c4-910v2.ckpt) | +| ---------------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| -------- | -------- | -------- | ----------------------------------------------------------------------------------------------------------- | 
--------------------------------------------------------------------------------------------------------------------------- | +| mobilenet_v2_075 | 2.66 | 8 | 256 | 224x224 | O2 | 233s | 175.15 | 11692.83 | 69.73 | 89.35 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv2/mobilenet_v2_0.75_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilenet/mobilenetv2/mobilenet_v2_075-755932c4-910v2.ckpt) | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ----------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------- | -| mobilenet_v2_075 | 2.66 | 8 | 256 | 224x224 | O2 | 164s | 155.94 | 13133.26 | 69.98 | 89.32 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv2/mobilenet_v2_0.75_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/mobilenet/mobilenetv2/mobilenet_v2_075-bd7bd4c4.ckpt) | - ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/mobilenetv3/README.md b/configs/mobilenetv3/README.md index 1a7b98d2a..f958699b5 100644 --- a/configs/mobilenetv3/README.md +++ b/configs/mobilenetv3/README.md @@ -76,17 +76,9 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | --------------------------------------------------------------------------------------------------------------------------------- | -| mobilenet_v3_small_100 | 2.55 | 8 | 75 | 224x224 | O2 | 184s | 52.38 | 11454.75 | 68.07 | 87.77 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv3/mobilenet_v3_small_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilenet/mobilenetv3/mobilenet_v3_small_100-6fa3c17d-910v2.ckpt) | -| mobilenet_v3_large_100 | 5.51 | 8 | 75 | 224x224 | O2 | 354s | 55.89 | 10735.37 | 75.59 | 92.57 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv3/mobilenet_v3_large_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilenet/mobilenetv3/mobilenet_v3_large_100-bd4e7bdc-910v2.ckpt) | - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------- | -| mobilenet_v3_small_100 | 2.55 | 8 | 75 | 224x224 | O2 | 145s | 48.14 | 12463.65 | 68.10 | 87.86 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv3/mobilenet_v3_small_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/mobilenet/mobilenetv3/mobilenet_v3_small_100-509c6047.ckpt) | -| mobilenet_v3_large_100 | 5.51 | 8 | 75 | 224x224 | O2 | 271s | 47.49 | 12634.24 | 75.23 | 92.31 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv3/mobilenet_v3_large_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/mobilenet/mobilenetv3/mobilenet_v3_large_100-1279ad5f.ckpt) | +| ---------------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | --------------------------------------------------------------------------------------------------------------------------------- | +| mobilenet_v3_small_100 | 2.55 | 8 | 75 | 224x224 | O2 | 184s | 54.21 | 11068.06 | 68.07 | 87.77 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv3/mobilenet_v3_small_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilenet/mobilenetv3/mobilenet_v3_small_100-6fa3c17d-910v2.ckpt) | +| mobilenet_v3_large_100 | 5.51 | 8 | 75 | 224x224 | O2 | 354s | 56.87 | 
10550.37 | 75.59 | 92.57 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilenetv3/mobilenet_v3_large_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilenet/mobilenetv3/mobilenet_v3_large_100-bd4e7bdc-910v2.ckpt) | ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/mobilevit/README.md b/configs/mobilevit/README.md index 2daf7cf84..aae42ef67 100644 --- a/configs/mobilevit/README.md +++ b/configs/mobilevit/README.md @@ -72,16 +72,9 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------- | -| mobilevit_xx_small | 1.27 | 8 | 64 | 256x256 | O2 | 437s | 67.24 | 7614.52 | 67.11 | 87.85 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilevit/mobilevit_xx_small_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilevit/mobilevit_xx_small-6f2745c3-910v2.ckpt) | +| ------------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------- | +| mobilevit_xx_small | 1.27 | 8 | 64 | 256x256 | O2 | 437s | 57.34 | 8929.19 | 67.11 | 87.85 | 
[yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilevit/mobilevit_xx_small_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/mobilevit/mobilevit_xx_small-6f2745c3-910v2.ckpt) | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. - - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ------------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------- | -| mobilevit_xx_small | 1.27 | 64 | 8 | 256x256 | O2 | 301s | 53.52 | 9566.52 | 68.91 | 88.91 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/mobilevit/mobilevit_xx_small_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/mobilevit/mobilevit_xx_small-af9da8a0.ckpt) | - ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/nasnet/README.md b/configs/nasnet/README.md index 612313679..57f4bb49f 100644 --- a/configs/nasnet/README.md +++ b/configs/nasnet/README.md @@ -79,17 +79,11 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------ | -| nasnet_a_4x1056 | 5.33 | 8 | 256 | 224x224 | O2 | 800s | 364.35 | 5620.97 | 74.12 | 91.36 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/nasnet/nasnet_a_4x1056_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/nasnet/nasnet_a_4x1056-015ba575c-910v2.ckpt) | +| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------ | +| nasnet_a_4x1056 | 5.33 | 8 | 256 | 224x224 | O2 | 800s | 364.55 | 5617.88 | 74.12 | 91.36 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/nasnet/nasnet_a_4x1056_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/nasnet/nasnet_a_4x1056-015ba575c-910v2.ckpt) | -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------- | -| nasnet_a_4x1056 | 5.33 | 8 | 256 | 224x224 | O2 | 656s | 330.89 | 6189.37 | 73.65 | 91.25 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/nasnet/nasnet_a_4x1056_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/nasnet/nasnet_a_4x1056-0fbb5cdd.ckpt) | - ### Notes - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K. diff --git a/configs/pit/README.md b/configs/pit/README.md index 765a66e14..cc531df1b 100644 --- a/configs/pit/README.md +++ b/configs/pit/README.md @@ -76,20 +76,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- | -| pit_ti | 4.85 | 8 | 128 | 224x224 | O2 | 212s | 266.47 | 3842.83 | 73.26 | 91.57 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pit/pit_ti_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/pit/pit_ti-33466a0d-910v2.ckpt) | - - - - -Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode. 
- - - - -| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight | -| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------- | -| pit_ti | 4.85 | 8 | 128 | 224x224 | O2 | 192s | 271.50 | 3771.64 | 72.96 | 91.33 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pit/pit_ti_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/pit/pit_ti-e647a593.ckpt) | +| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ---------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- | +| pit_ti | 4.85 | 8 | 128 | 224x224 | O2 | 212s | 257.31 | 3979.63 | 73.26 | 91.57 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pit/pit_ti_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/pit/pit_ti-33466a0d-910v2.ckpt) | diff --git a/configs/poolformer/README.md b/configs/poolformer/README.md index ad4046a23..ce4246f23 100644 --- a/configs/poolformer/README.md +++ b/configs/poolformer/README.md @@ -73,15 +73,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode. 
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| -------------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------- |
-| poolformer_s12 | 11.92 | 8 | 128 | 224x224 | O2 | 177s | 211.81 | 4834.52 | 77.49 | 93.55 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/poolformer/poolformer_s12_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/poolformer/poolformer_s12-c7e14eea-910v2.ckpt) |
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| -------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------ |
-| poolformer_s12 | 11.92 | 8 | 128 | 224x224 | O2 | 118s | 220.13 | 4651.80 | 77.33 | 93.34 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/poolformer/poolformer_s12_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/poolformer/poolformer_s12-5be5c4e4.ckpt) |
+| -------------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------- |
+| poolformer_s12 | 11.92 | 8 | 128 | 224x224 | O2 | 177s | 222.46 | 4603.07 | 77.49 | 93.55 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/poolformer/poolformer_s12_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/poolformer/poolformer_s12-c7e14eea-910v2.ckpt) |

 ### Notes

 - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K.
diff --git a/configs/pvt/README.md b/configs/pvt/README.md
index 704746139..cbc10f6c8 100644
--- a/configs/pvt/README.md
+++ b/configs/pvt/README.md
@@ -81,16 +81,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------- |
-| pvt_tiny | 13.23 | 8 | 128 | 224x224 | O2 | 212s | 237.5 | 4311.58 | 74.88 | 92.12 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pvt/pvt_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/pvt/pvt_tiny-6676051f-910v2.ckpt) |
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------- |
-| pvt_tiny | 13.23 | 8 | 128 | 224x224 | O2 | 192s | 229.63 | 4459.35 | 74.81 | 92.18 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pvt/pvt_tiny_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/pvt/pvt_tiny-6abb953d.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------- |
+| pvt_tiny | 13.23 | 8 | 128 | 224x224 | O2 | 212s | 231.81 | 4417.41 | 74.88 | 92.12 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pvt/pvt_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/pvt/pvt_tiny-6676051f-910v2.ckpt) |

 ### Notes
diff --git a/configs/pvtv2/README.md b/configs/pvtv2/README.md
index 37e74eefd..2d5bf8cce 100644
--- a/configs/pvtv2/README.md
+++ b/configs/pvtv2/README.md
@@ -82,16 +82,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
-| pvt_v2_b0 | 3.67 | 8 | 128 | 224x224 | O2 | 323s | 255.76 | 4003.75 | 71.25 | 90.50 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pvtv2/pvt_v2_b0_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/pvt_v2/pvt_v2_b0-d9cd9d6a-910v2.ckpt) |
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- |
-| pvt_v2_b0 | 3.67 | 8 | 128 | 224x224 | O2 | 269s | 269.38 | 3801.32 | 71.50 | 90.60 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pvtv2/pvt_v2_b0_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/pvt_v2/pvt_v2_b0-1c4f6683.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | --------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
+| pvt_v2_b0 | 3.67 | 8 | 128 | 224x224 | O2 | 323s | 264.24 | 3875.26 | 71.25 | 90.50 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/pvtv2/pvt_v2_b0_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/pvt_v2/pvt_v2_b0-d9cd9d6a-910v2.ckpt) |

 ### Notes
diff --git a/configs/regnet/README.md b/configs/regnet/README.md
index 4aeca99b7..6d0cd5598 100644
--- a/configs/regnet/README.md
+++ b/configs/regnet/README.md
@@ -83,16 +83,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| -------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | --------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------- |
-| regnet_x_800mf | 7.26 | 8 | 64 | 224x224 | O2 | 228s | 50.74 | 10090.66 | 76.11 | 93.00 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/regnet/regnet_x_800mf_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/regnet/regnet_x_800mf-68fe1cca-910v2.ckpt) |
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| -------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | --------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------- |
-| regnet_x_800mf | 7.26 | 8 | 64 | 224x224 | O2 | 99s | 42.49 | 12049.89 | 76.04 | 92.97 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/regnet/regnet_x_800mf_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/regnet/regnet_x_800mf-617227f4.ckpt) |
+| -------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| -------- | -------- | -------- | --------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------- |
+| regnet_x_800mf | 7.26 | 8 | 64 | 224x224 | O2 | 228s | 52.48 | 9756.09 | 76.11 | 93.00 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/regnet/regnet_x_800mf_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/regnet/regnet_x_800mf-68fe1cca-910v2.ckpt) |

 ### Notes
diff --git a/configs/repmlp/README.md b/configs/repmlp/README.md
index 578c5457e..137ba95ce 100644
--- a/configs/repmlp/README.md
+++ b/configs/repmlp/README.md
@@ -89,13 +89,6 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

 *coming soon*

-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------- |
-| repmlp_t224 | 38.30 | 8 | 128 | 224x224 | O2 | 289s | 578.23 | 1770.92 | 76.71 | 93.30 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/repmlp/repmlp_t224_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/repmlp/repmlp_t224-8dbedd00.ckpt) |
-
 ### Notes

 - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K.
diff --git a/configs/repvgg/README.md b/configs/repvgg/README.md
index 7eb5623a6..3997e06c7 100644
--- a/configs/repvgg/README.md
+++ b/configs/repvgg/README.md
@@ -93,19 +93,11 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
-| repvgg_a0 | 9.13 | 8 | 32 | 224x224 | O2 | 76s | 24.12 | 10613.60 | 72.29 | 90.78 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/repvgg/repvgg_a0_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/repvgg/repvgg_a0-b67a9f15-910v2.ckpt) |
-| repvgg_a1 | 14.12 | 8 | 32 | 224x224 | O2 | 81s | 28.29 | 9096.13 | 73.68 | 91.51 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/repvgg/repvgg_a1_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/repvgg/repvgg_a1-a40aa623-910v2.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
+| repvgg_a0 | 9.13 | 8 | 32 | 224x224 | O2 | 76s | 25.2 | 10158.73 | 72.29 | 90.78 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/repvgg/repvgg_a0_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/repvgg/repvgg_a0-b67a9f15-910v2.ckpt) |
+| repvgg_a1 | 14.12 | 8 | 32 | 224x224 | O2 | 81s | 28.31 | 9042.74 | 73.68 | 91.51 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/repvgg/repvgg_a1_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/repvgg/repvgg_a1-a40aa623-910v2.ckpt) |

-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- |
-| repvgg_a0 | 9.13 | 8 | 32 | 224x224 | O2 | 50s | 20.58 | 12439.26 | 72.19 | 90.75 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/repvgg/repvgg_a0_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/repvgg/repvgg_a0-6e71139d.ckpt) |
-| repvgg_a1 | 14.12 | 8 | 32 | 224x224 | O2 | 29s | 20.70 | 12367.15 | 74.19 | 91.89 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/repvgg/repvgg_a1_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/repvgg/repvgg_a1-539513ac.ckpt) |
-
 ### Notes

 - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K.
diff --git a/configs/res2net/README.md b/configs/res2net/README.md
index af3a839f3..3d7a2b366 100644
--- a/configs/res2net/README.md
+++ b/configs/res2net/README.md
@@ -89,19 +89,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------ |
-| res2net50 | 25.76 | 8 | 32 | 224x224 | O2 | 174s | 39.6 | 6464.65 | 79.33 | 94.64 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/res2net/res2net_50_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/res2net/res2net50-aa758355-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------ | ---------------------------------------------------------------------------------------- |
-| res2net50 | 25.76 | 8 | 32 | 224x224 | O2 | 119s | 39.68 | 6451.61 | 79.35 | 94.64 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/res2net/res2net_50_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/res2net/res2net50-f42cf71b.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------|---------| -------- | -------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------ |
+| res2net50 | 25.76 | 8 | 32 | 224x224 | O2 | 174s | 40.63 | 6300.76 | 79.33 | 94.64 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/res2net/res2net_50_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/res2net/res2net50-aa758355-910v2.ckpt) |
diff --git a/configs/resnest/README.md b/configs/resnest/README.md
index 55285f134..31afbf330 100644
--- a/configs/resnest/README.md
+++ b/configs/resnest/README.md
@@ -84,17 +84,9 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

-*coming soon*
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ----------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------- |
-| resnest50 | 27.55 | 8 | 128 | 224x224 | O2 | 83s | 244.92 | 4552.73 | 80.81 | 95.16 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnest/resnest50_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/resnest/resnest50-f2e7fc9c.ckpt) |
-
+| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 |
+| ----------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- |-------------------------------------------------------------------------------------------------| ------------------------------------------------------------------------------------------------------- | -------- | -------- |
+| resnest | 8 | 128 | 224x224 | O2 | 279s | 243.77 | 4200.68 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnest/resnest50_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/resnest/resnest50-180_1251_v2.ckpt) | 80.88 | 95.32 |

 ### Notes
diff --git a/configs/resnet/README.md b/configs/resnet/README.md
index d1906dac8..a071bd4c2 100644
--- a/configs/resnet/README.md
+++ b/configs/resnet/README.md
@@ -87,20 +87,11 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------- |
-| resnet50 | 25.61 | 8 | 32 | 224x224 | O2 | 77s | 31.9 | 8025.08 | 76.76 | 93.31 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnet/resnet_50_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/resnet/resnet50-f369a08d-910v2.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------- |
+| resnet50 | 25.61 | 8 | 32 | 224x224 | O2 | 77s | 33.09 | 7736.47 | 76.76 | 93.31 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnet/resnet_50_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/resnet/resnet50-f369a08d-910v2.ckpt) |

-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------- |
-| resnet50 | 25.61 | 8 | 32 | 224x224 | O2 | 43s | 31.41 | 8150.27 | 76.69 | 93.50 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnet/resnet_50_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/resnet/resnet50-e0733ab8.ckpt) |
-
 ### Notes

 - top-1 and top-5: Accuracy reported on the validation set of ImageNet-1K.
diff --git a/configs/resnetv2/README.md b/configs/resnetv2/README.md
index 10e1dc463..b2a1b672e 100644
--- a/configs/resnetv2/README.md
+++ b/configs/resnetv2/README.md
@@ -85,20 +85,9 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ----------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | ------- | -------- | -------- | -------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------- |
-| resnetv2_50 | 25.60 | 8 | 32 | 224x224 | O2 | 120s | 32.19 | 7781.16 | 77.03 | 93.29 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnetv2/resnetv2_50_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/resnetv2/resnetv2_50-a0b9f7f8-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ----------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | -------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- |
-| resnetv2_50 | 25.60 | 8 | 32 | 224x224 | O2 | 52s | 32.66 | 7838.33 | 76.90 | 93.37 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnetv2/resnetv2_50_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/resnetv2/resnetv2_50-3c2f143b.ckpt) |
+| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
+| ----------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| ---- | -------- | -------- | -------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------- |
+| resnetv2_50 | 25.60 | 8 | 32 | 224x224 | O2 | 120s | 33.2 | 7710.84 | 77.03 | 93.29 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnetv2/resnetv2_50_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/resnetv2/resnetv2_50-a0b9f7f8-910v2.ckpt) |
diff --git a/configs/resnext/README.md b/configs/resnext/README.md
index ac62a9e03..f2ecf3e34 100644
--- a/configs/resnext/README.md
+++ b/configs/resnext/README.md
@@ -90,19 +90,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ----------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------ |
-| resnext50_32x4d | 25.10 | 8 | 32 | 224x224 | O2 | 156s | 44.61 | 5738.62 | 78.64 | 94.18 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnext/resnext50_32x4d_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/resnext/resnext50_32x4d-988f75bc-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ----------------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------- |
-| resnext50_32x4d | 25.10 | 8 | 32 | 224x224 | O2 | 49s | 37.22 | 6878.02 | 78.53 | 94.10 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnext/resnext50_32x4d_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/resnext/resnext50_32x4d-af8aba16.ckpt) |
+| --------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ----------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------ |
+| resnext50_32x4d | 25.10 | 8 | 32 | 224x224 | O2 | 156s | 45.49 | 5627.61 | 78.64 | 94.18 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/resnext/resnext50_32x4d_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/resnext/resnext50_32x4d-988f75bc-910v2.ckpt) |
diff --git a/configs/rexnet/README.md b/configs/rexnet/README.md
index 86c96592c..132016b91 100644
--- a/configs/rexnet/README.md
+++ b/configs/rexnet/README.md
@@ -76,11 +76,10 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

-*coming soon*
+| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 |
+| ----------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------- | -------- | -------- |
+| rexnet | 8 | 64 | 224x224 | O2 | 463s | 122.56 | 4177.55 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x09_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/rexnet/rexnet_09-00223eb4-910v2.ckpt) | 76.15 | 92.89 |

-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-*coming soon*

 ### Notes
diff --git a/configs/senet/README.md b/configs/senet/README.md
index 3b852d18c..3aa6bc35d 100644
--- a/configs/senet/README.md
+++ b/configs/senet/README.md
@@ -89,19 +89,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
-| seresnet18 | 11.80 | 8 | 64 | 224x224 | O2 | 90s | 51.09 | 10021.53 | 72.05 | 90.59 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/senet/seresnet18_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/senet/seresnet18-7b971c78-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- |
-| seresnet18 | 11.80 | 8 | 64 | 224x224 | O2 | 43s | 44.40 | 11531.53 | 71.81 | 90.49 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/senet/seresnet18_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/senet/seresnet18-7880643b.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
+| seresnet18 | 11.80 | 8 | 64 | 224x224 | O2 | 90s | 50.43 | 10152.68 | 72.05 | 90.59 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/senet/seresnet18_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/senet/seresnet18-7b971c78-910v2.ckpt) |
diff --git a/configs/shufflenetv1/README.md b/configs/shufflenetv1/README.md
index a45290e56..3dd2f9996 100644
--- a/configs/shufflenetv1/README.md
+++ b/configs/shufflenetv1/README.md
@@ -88,19 +88,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ------------------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------------------------------------------- |
-| shufflenet_v1_g3_05 | 0.73 | 8 | 64 | 224x224 | O2 | 191s | 47.77 | 10718.02 | 57.08 | 79.89 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/shufflenetv1/shufflenet_v1_0.5_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/shufflenet/shufflenetv1/shufflenet_v1_g3_05-56209ef3-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ------------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------ |
-| shufflenet_v1_g3_05 | 0.73 | 8 | 64 | 224x224 | O2 | 169s | 40.62 | 12604.63 | 57.05 | 79.73 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/shufflenetv1/shufflenet_v1_0.5_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/shufflenet/shufflenetv1/shufflenet_v1_g3_05-42cfe109.ckpt) |
+| ------------------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------------------------------------------- |
+| shufflenet_v1_g3_05 | 0.73 | 8 | 64 | 224x224 | O2 | 191s | 45.46 | 11262.64 | 57.08 | 79.89 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/shufflenetv1/shufflenet_v1_0.5_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/shufflenet/shufflenetv1/shufflenet_v1_g3_05-56209ef3-910v2.ckpt) |
diff --git a/configs/shufflenetv2/README.md b/configs/shufflenetv2/README.md
index 820e962ba..cb47f141d 100644
--- a/configs/shufflenetv2/README.md
+++ b/configs/shufflenetv2/README.md
@@ -95,19 +95,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ------------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------- |
-| shufflenet_v2_x0_5 | 1.37 | 8 | 64 | 224x224 | O2 | 100s | 47.32 | 10819.95 | 60.65 | 82.26 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/shufflenetv2/shufflenet_v2_0.5_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/shufflenet/shufflenetv2/shufflenet_v2_x0_5-39d05bb6-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ------------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------------------------------- |
-| shufflenet_v2_x0_5 | 1.37 | 8 | 64 | 224x224 | O2 | 62s | 41.87 | 12228.33 | 60.53 | 82.11 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/shufflenetv2/shufflenet_v2_0.5_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/shufflenet/shufflenetv2/shufflenet_v2_x0_5-8c841061.ckpt) |
+| ------------------ | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------- |
+| shufflenet_v2_x0_5 | 1.37 | 8 | 64 | 224x224 | O2 | 100s | 48.15 | 10633.43 | 60.65 | 82.26 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/shufflenetv2/shufflenet_v2_0.5_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/shufflenet/shufflenetv2/shufflenet_v2_x0_5-39d05bb6-910v2.ckpt) |
diff --git a/configs/sknet/README.md b/configs/sknet/README.md
index 9d2ece81e..940c1d14d 100644
--- a/configs/sknet/README.md
+++ b/configs/sknet/README.md
@@ -93,19 +93,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.

 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
-| skresnet18 | 11.97 | 8 | 64 | 224x224 | O2 | 134s | 49.83 | 10274.93 | 72.85 | 90.83 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/sknet/skresnet18_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/sknet/skresnet18-9d8b1afc-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- |
-| skresnet18 | 11.97 | 8 | 64 | 224x224 | O2 | 60s | 45.84 | 11169.28 | 73.09 | 91.20 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/sknet/skresnet18_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/sknet/skresnet18-868228e5.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| -------- | -------- | -------- | ---------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
+| skresnet18 | 11.97 | 8 | 64 | 224x224 | O2 | 134s | 46.83 | 10933.16 | 72.85 | 90.83 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/sknet/skresnet18_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/sknet/skresnet18-9d8b1afc-910v2.ckpt) |
diff --git a/configs/squeezenet/README.md b/configs/squeezenet/README.md
index f56c19331..2844a4247 100644
--- a/configs/squeezenet/README.md
+++ b/configs/squeezenet/README.md
@@ -90,19 +90,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ------------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------- |
-| squeezenet1_0 | 1.25 | 8 | 32 | 224x224 | O2 | 64s | 23.48 | 10902.90 | 58.75 | 80.76 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/squeezenet/squeezenet_1.0_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/squeezenet/squeezenet1_0-24010b28-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- |
-| squeezenet1_0 | 1.25 | 8 | 32 | 224x224 | O2 | 45s | 22.36 | 11449.02 | 58.67 | 80.61 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/squeezenet/squeezenet_1.0_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/squeezenet/squeezenet1_0-eb911778.ckpt) |
+| ------------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| -------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------- |
+| squeezenet1_0 | 1.25 | 8 | 32 | 224x224 | O2 | 64s | 23.7 | 10801.68 | 58.75 | 80.76 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/squeezenet/squeezenet_1.0_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/squeezenet/squeezenet1_0-24010b28-910v2.ckpt) |
diff --git a/configs/swintransformer/README.md b/configs/swintransformer/README.md
index 33c087b28..bed287058 100644
--- a/configs/swintransformer/README.md
+++ b/configs/swintransformer/README.md
@@ -98,19 +98,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------- |
-| swin_tiny | 33.38 | 8 | 256 | 224x224 | O2 | 266s | 466.6 | 4389.20 | 80.90 | 94.90 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/swintransformer/swin_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/swin/swin_tiny-72b3c5e6-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------- |
-| swin_tiny | 33.38 | 8 | 256 | 224x224 | O2 | 226s | 454.49 | 4506.15 | 80.82 | 94.80 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/swintransformer/swin_tiny_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/swin/swin_tiny-0ff2f96d.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- | ------------- |---------| ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------- |
+| swin_tiny | 33.38 | 8 | 256 | 224x224 | O2 | 266s | 454.01 | 4510.91 | 80.90 | 94.90 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/swintransformer/swin_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/swin/swin_tiny-72b3c5e6-910v2.ckpt) |
diff --git a/configs/swintransformerv2/README.md b/configs/swintransformerv2/README.md
index 8621c3a4a..dd7b348c8 100644
--- a/configs/swintransformerv2/README.md
+++ b/configs/swintransformerv2/README.md
@@ -91,20 +91,8 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ------------------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- |
-| swinv2_tiny_window8 | 28.78 | 8 | 128 | 256x256 | O2 | 385s | 335.18 | 3055.07 | 81.38 | 95.46 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/swintransformerv2/swinv2_tiny_window8_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/swinv2/swinv2_tiny_window8-70c5e903-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.3.1 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ------------------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------- |
-| swinv2_tiny_window8 | 28.78 | 8 | 128 | 256x256 | O2 | 273s | 317.19 | 3228.35 | 81.42 | 95.43 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/swintransformerv2/swinv2_tiny_window8_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/swinv2/swinv2_tiny_window8-3ef8b787.ckpt) |
-
+| ------------------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- |
+| swinv2_tiny_window8 | 28.78 | 8 | 128 | 256x256 | O2 | 385s | 326.16 | 3139.56 | 81.38 | 95.46 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/swintransformerv2/swinv2_tiny_window8_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/swinv2/swinv2_tiny_window8-70c5e903-910v2.ckpt) |
 ### Notes
diff --git a/configs/vgg/README.md b/configs/vgg/README.md
index 1b7544a01..500912beb 100644
--- a/configs/vgg/README.md
+++ b/configs/vgg/README.md
@@ -92,21 +92,9 @@ Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
 | model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------- |
-| vgg13 | 133.04 | 8 | 32 | 224x224 | O2 | 41s | 30.52 | 8387.94 | 72.81 | 91.02 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/vgg/vgg13_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/vgg/vgg13-7756f33c-910v2.ckpt) |
-| vgg19 | 143.66 | 8 | 32 | 224x224 | O2 | 53s | 39.17 | 6535.61 | 75.24 | 92.55 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/vgg/vgg19_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/vgg/vgg19-5104d1ea-910v2.ckpt) |
-
-
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------| ------- | ------- | -------- | -------- | --------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------- |
-| vgg13 | 133.04 | 8 | 32 | 224x224 | O2 | 23s | 55.20 | 4637.68 | 72.87 | 91.02 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/vgg/vgg13_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/vgg/vgg13-da805e6e.ckpt) |
-| vgg19 | 143.66 | 8 | 32 | 224x224 | O2 | 22s | 67.42 | 3797.09 | 75.21 | 92.56 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/vgg/vgg19_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/vgg/vgg19-bedee7b6.ckpt) |
+| ---------- | --------- | ----- | ---------- | ---------- | --------- |---------------|---------| ------- | -------- | -------- | --------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------- |
+| vgg13 | 133.04 | 8 | 32 | 224x224 | O2 | 41s | 30.15 | 8490.87 | 72.81 | 91.02 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/vgg/vgg13_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/vgg/vgg13-7756f33c-910v2.ckpt) |
+| vgg19 | 143.66 | 8 | 32 | 224x224 | O2 | 53s | 38.72 | 6611.57 | 75.24 | 92.55 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/vgg/vgg19_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/vgg/vgg19-5104d1ea-910v2.ckpt) |
diff --git a/configs/visformer/README.md b/configs/visformer/README.md
index 8aa4530ac..2c2c815ae 100644
--- a/configs/visformer/README.md
+++ b/configs/visformer/README.md
@@ -83,15 +83,9 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
-*coming soon*
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-
-| model name | params(M) | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | acc@top1 | acc@top5 | recipe | weight |
-| -------------- | --------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | -------- | -------- | ------------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------------- |
-| visformer_tiny | 10.33 | 8 | 128 | 224x224 | O2 | 137s | 217.92 | 4698.97 | 78.28 | 94.15 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/visformer/visformer_tiny_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/visformer/visformer_tiny-daee0322.ckpt) |
-
+| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 |
+| -------------- | ----- | ---------- | ---------- |-----------| ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------- | -------- | -------- |
+| visformer_tiny | 8 | 128 | 224x224 | O2 | 141s | 207.35 | 4938.51 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/visformer/visformer_tiny_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/visformer/visformer_tiny-df995ba4-910v2.ckpt) | 74.93 | 92.55 |
 ### Notes
diff --git a/configs/vit/README.md b/configs/vit/README.md
index a0ae7b534..84e9f3a54 100644
--- a/configs/vit/README.md
+++ b/configs/vit/README.md
@@ -102,11 +102,10 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
-*coming soon*
+| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 |
+| ------------ | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------- | -------- | -------- |
+| vit_l_32_224 | 8 | 256 | 224x224 | O2 | 225s | 425.36 | 4814.75 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/vit/vit_l32_224_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/vit/vit_l_32_224-e0039f16-910v2.ckpt) | 74.63 | 92.21 |
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-*coming soon*
 ### Notes
diff --git a/configs/volo/README.md b/configs/volo/README.md
index 26ee17272..93e99392e 100644
--- a/configs/volo/README.md
+++ b/configs/volo/README.md
@@ -84,7 +84,10 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 performance tested on ascend 910*(8p) with graph mode
-*coming soon*
+| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 |
+| ----------- | ----- | ---------- | ---------- |-----------| ------------- | ------- | ------- |--------------------------------------------------------------------------------------------| ------------------------------------------------------------------------------------------------------- | -------- | -------- |
+| volo_d1 | 8 | 128 | 224x224 | O2 | 368s | 230.05 | 4451.21 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/volo/volo_d1_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/volo/volo_d1-177_1251_v2.ckpt) | 82.97 | 96.21 |
+
 performance tested on ascend 910(8p) with graph mode
diff --git a/configs/xception/README.md b/configs/xception/README.md
index 20ed6d449..f7b93f3cc 100644
--- a/configs/xception/README.md
+++ b/configs/xception/README.md
@@ -87,11 +87,9 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
-*coming soon*
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-*coming soon*
+| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 |
+| ----------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------- | -------- | -------- |
+| xception | 8 | 32 | 224x224 | O2 | 186s | 83.40 | 3069.54 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/xception/xception_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/xception/xception-174_5004_v2.ckpt) | 76.31 | 92.80 |
 ### Notes
diff --git a/configs/xcit/README.md b/configs/xcit/README.md
index a794e1e12..b56d7cad0 100644
--- a/configs/xcit/README.md
+++ b/configs/xcit/README.md
@@ -81,11 +81,9 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 Experiments are tested on ascend 910* with mindspore 2.5.0 graph mode.
-*coming soon*
-
-Experiments are tested on ascend 910 with mindspore 2.5.0 graph mode.
-
-*coming soon*
+| model name | cards | batch size | resolution | jit level | graph compile | ms/step | img/s | recipe | weight | acc@top1 | acc@top5 |
+| -------------------- | ----- | ---------- | ---------- | --------- | ------------- | ------- | ------- | ------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------- | -------- | -------- |
+| xcit_tiny_12_p16_224 | 8 | 128 | 224x224 | O2 | 329s | 233.86 | 4378.69 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/xcit/xcit_tiny_12_p16_224_ascend.yaml) | [weights](https://download-mindspore.osinfra.cn/toolkits/mindcv/xcit/xcit_tiny_12_p16_224-bd90776e-910v2.ckpt) | 77.16 | 93.57 |
 ### Notes
diff --git a/mindcv/version.py b/mindcv/version.py
index e5ded93b9..9bcd67d23 100644
--- a/mindcv/version.py
+++ b/mindcv/version.py
@@ -1,2 +1,2 @@
 """version init"""
-__version__ = "0.3.0"
+__version__ = "0.5.0"