Commit 70a8a9e0 authored by wangwei990215

initial commit
---
name: ❓ Questions/Help
about: If you have questions, please first search existing issues and docs
labels: 'question, needs triage'
---
Notice: To resolve your issue more efficiently, please follow this template and include details.
## ❓ Questions and Help
### Before asking:
1. Search the issues.
2. Search the docs.
<!-- If you still can't find what you need: -->
#### What is your question?
#### Code
<!-- Please paste a code snippet if your question requires it! -->
#### What have you tried?
#### What's your environment?
- OS (e.g., Linux):
- FunASR Version (e.g., 1.0.0):
- ModelScope Version (e.g., 1.11.0):
- PyTorch Version (e.g., 2.0.0):
- How you installed funasr (`pip`, source):
- Python version:
- GPU (e.g., V100M32):
- CUDA/cuDNN version (e.g., cuda11.7):
- Docker version (e.g., funasr-runtime-sdk-cpu-0.4.1):
- Any other relevant information:
---
name: 🐛 Bug Report
about: Submit a bug report to help us improve
labels: 'bug, needs triage'
---
Notice: To resolve your issue more efficiently, please follow this template and include details.
## 🐛 Bug
<!-- A clear and concise description of what the bug is. -->
### To Reproduce
Steps to reproduce the behavior (**always include the command you ran**):
1. Run cmd '....'
2. See error
<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->
#### Code sample
<!-- Ideally, attach a minimal code sample that reproduces the described issue.
Minimal means the shortest code that still preserves the bug. -->
### Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
### Environment
- OS (e.g., Linux):
- FunASR Version (e.g., 1.0.0):
- ModelScope Version (e.g., 1.11.0):
- PyTorch Version (e.g., 2.0.0):
- How you installed funasr (`pip`, source):
- Python version:
- GPU (e.g., V100M32):
- CUDA/cuDNN version (e.g., cuda11.7):
- Docker version (e.g., funasr-runtime-sdk-cpu-0.4.1):
- Any other relevant information:
### Additional context
<!-- Add any other context about the problem here. -->
blank_issues_enabled: false
---
name: 📚 Documentation/Typos
about: Report an issue related to documentation or a typo
labels: 'documentation, needs triage'
---
## 📚 Documentation
For typos and doc fixes, please go ahead and:
1. Create an issue.
2. Fix the typo.
3. Submit a PR.
Thanks!
repos:
- repo: https://github.com/psf/black
rev: 24.4.0
hooks:
- id: black
      args: ['--line-length=100'] # example argument; black uses 4-space indentation by default
## Acknowledgements
1. We borrowed a lot of code from [Kaldi](http://kaldi-asr.org/) for data preparation.
2. We borrowed a lot of code from [ESPnet](https://github.com/espnet/espnet); FunASR follows ESPnet's training and fine-tuning pipelines.
3. We referred to [Wenet](https://github.com/wenet-e2e/wenet) when building the dataloader for large-scale data training.
4. We acknowledge [ChinaTelecom](https://github.com/zhuzizyf/damo-fsmn-vad-infer-httpserver) for contributing the VAD runtime.
5. We acknowledge [RapidAI](https://github.com/RapidAI) for contributing the Paraformer and CT_Transformer-punc runtime.
6. We acknowledge [AiHealthx](http://www.aihealthx.com/) for contributing the WebSocket service and HTML5.
7. We acknowledge [XVERSE](http://www.xverse.cn/index.html) for contributing the grpc service.
8. We acknowledge [blt](https://github.com/bltcn) for developing and deploying the website.
FunASR Model Open Source License
Version 1.0
Copyright (C) [2023-2028] Alibaba Group. All rights reserved.
Thank you for choosing the FunASR open source models. The FunASR open source models are a series of open-source models that everyone may use, modify, share, and learn from.
To ensure better community collaboration, we have developed the following agreement and hope that you carefully read and abide by it.
1 Definitions
In this agreement, [FunASR software] refers to the FunASR open source models and their derivatives, including fine-tuned models. [You] refers to individuals or organizations who use, modify, share, and learn from [FunASR software].
2 License and Restrictions
2.1 License
You are free to use, copy, modify, and share [FunASR software] under the conditions of this agreement.
2.2 Restrictions
You must credit the source and author information of the code and models when using, copying, modifying, or sharing [FunASR software], and you must keep the relevant model names in [FunASR software].
3 Responsibility and Risk
[FunASR software] is provided for reference and learning purposes only; the authors are not responsible for any direct or indirect losses caused by your use or modification of [FunASR software]. You assume all responsibility and risk for your use and modification of [FunASR software].
4 Termination
If you violate any terms of this agreement, your license will be automatically terminated, and you must stop using, copying, modifying, and sharing [FunASR software].
5 Revision
This agreement may be updated and revised from time to time. The revised agreement will be published in the FunASR official repository and automatically take effect. If you continue to use, copy, modify, and share [FunASR software], it means you agree to the revised agreement.
6 Other Provisions
This agreement is subject to the laws of [Country/Region]. If any provisions are found to be illegal, invalid, or unenforceable, they shall be deemed deleted from this agreement, and the remaining provisions shall remain valid and binding.
If you have any questions or comments about this agreement, please contact us.
Copyright (c) [2023-2028] Alibaba Group. All rights reserved.
# Leaderboard IO
## Configuration
### Datasets
[Aishell1](https://www.openslr.org/33/): dev, test
[Aishell2](https://www.aishelltech.com/aishell_2): dev_ios, test_ios, test_android, test_mic
[WenetSpeech](https://github.com/wenet-e2e/WenetSpeech): dev, test_meeting, test_net
### Tools
#### [Install Requirements](https://alibaba-damo-academy.github.io/FunASR/en/installation/installation.html#installation)
Install ModelScope and FunASR via pip:
```shell
pip install -U modelscope funasr
# For users in China, you can install from a mirror:
# pip install -U funasr -i https://mirror.sjtu.edu.cn/pypi/web/simple
```
```
Or install FunASR from source:
```shell
git clone https://github.com/alibaba/FunASR.git && cd FunASR
pip install -e ./
# For users in China, you can install from a mirror:
# pip install -e ./ -i https://mirror.sjtu.edu.cn/pypi/web/simple
```
#### Recipe
##### [Test CER](https://alibaba-damo-academy.github.io/FunASR/en/modelscope_pipeline/asr_pipeline.html#inference-with-multi-thread-cpus-or-multi-gpus)
Set the `model`, `data_dir`, and `output_dir` variables in `infer.sh`, then run:
```shell
cd egs_modelscope/asr/TEMPLATE
bash infer.sh
```
## Benchmark CER
### Chinese Dataset
<table border="1">
<tr align="center">
<td style="border: 1px solid">Model</td>
<td style="border: 1px solid">Offline/Online</td>
<td colspan="2" style="border: 1px solid">Aishell1</td>
<td colspan="4" style="border: 1px solid">Aishell2</td>
<td colspan="3" style="border: 1px solid">WenetSpeech</td>
</tr>
<tr align="center">
<td style="border: 1px solid"></td>
<td style="border: 1px solid"></td>
<td style="border: 1px solid">dev</td>
<td style="border: 1px solid">test</td>
<td style="border: 1px solid">dev_ios</td>
<td style="border: 1px solid">test_ios</td>
<td style="border: 1px solid">test_android</td>
<td style="border: 1px solid">test_mic</td>
<td style="border: 1px solid">dev</td>
<td style="border: 1px solid">test_meeting</td>
<td style="border: 1px solid">test_net</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://www.modelscope.cn/models/damo/speech_paraformer-large_asr_nat-zh-cn-16k-common-vocab8404-pytorch/summary">Paraformer-large</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">1.76</td>
<td style="border: 1px solid">1.94</td>
<td style="border: 1px solid">2.79</td>
<td style="border: 1px solid">2.84</td>
<td style="border: 1px solid">3.08</td>
<td style="border: 1px solid">3.03</td>
<td style="border: 1px solid">3.43</td>
<td style="border: 1px solid">7.01</td>
<td style="border: 1px solid">6.66</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://www.modelscope.cn/models/damo/speech_paraformer-large-vad-punc_asr_nat-zh-cn-16k-common-vocab8404-pytorch/summary">Paraformer-large-long</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">1.80</td>
<td style="border: 1px solid">2.10</td>
<td style="border: 1px solid">2.78</td>
<td style="border: 1px solid">2.87</td>
<td style="border: 1px solid">3.12</td>
<td style="border: 1px solid">3.11</td>
<td style="border: 1px solid">3.44</td>
<td style="border: 1px solid">13.28</td>
<td style="border: 1px solid">7.08</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://www.modelscope.cn/models/damo/speech_paraformer-large-contextual_asr_nat-zh-cn-16k-common-vocab8404/summary">Paraformer-large-contextual</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">1.76</td>
<td style="border: 1px solid">2.02</td>
<td style="border: 1px solid">2.73</td>
<td style="border: 1px solid">2.85</td>
<td style="border: 1px solid">2.98</td>
<td style="border: 1px solid">2.95</td>
<td style="border: 1px solid">3.42</td>
<td style="border: 1px solid">7.16</td>
<td style="border: 1px solid">6.72</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://www.modelscope.cn/models/damo/speech_paraformer-large_asr_nat-zh-cn-16k-common-vocab8404-online/summary">Paraformer-large-online</a> </td>
<td style="border: 1px solid">Online</td>
<td style="border: 1px solid">2.37</td>
<td style="border: 1px solid">3.34</td>
<td style="border: 1px solid">4.04</td>
<td style="border: 1px solid">3.86</td>
<td style="border: 1px solid">4.38</td>
<td style="border: 1px solid">4.21</td>
<td style="border: 1px solid">4.55</td>
<td style="border: 1px solid">10.64</td>
<td style="border: 1px solid">7.78</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://modelscope.cn/models/damo/speech_paraformer_asr_nat-zh-cn-16k-common-vocab8358-tensorflow1/summary">Paraformer</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">3.24</td>
<td style="border: 1px solid">3.69</td>
<td style="border: 1px solid">4.58</td>
<td style="border: 1px solid">4.63</td>
<td style="border: 1px solid">4.83</td>
<td style="border: 1px solid">4.71</td>
<td style="border: 1px solid">4.19</td>
<td style="border: 1px solid">8.32</td>
<td style="border: 1px solid">9.19</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://modelscope.cn/models/damo/speech_UniASR_asr_2pass-zh-cn-16k-common-vocab8358-tensorflow1-online/summary">UniASR</a> </td>
<td style="border: 1px solid">Online</td>
<td style="border: 1px solid">3.34</td>
<td style="border: 1px solid">3.99</td>
<td style="border: 1px solid">4.62</td>
<td style="border: 1px solid">4.52</td>
<td style="border: 1px solid">4.77</td>
<td style="border: 1px solid">4.73</td>
<td style="border: 1px solid">4.51</td>
<td style="border: 1px solid">10.63</td>
<td style="border: 1px solid">9.70</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://modelscope.cn/models/damo/speech_UniASR-large_asr_2pass-zh-cn-16k-common-vocab8358-tensorflow1-offline/summary">UniASR-large</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">2.93</td>
<td style="border: 1px solid">3.48</td>
<td style="border: 1px solid">3.95</td>
<td style="border: 1px solid">3.87</td>
<td style="border: 1px solid">4.11</td>
<td style="border: 1px solid">4.11</td>
<td style="border: 1px solid">4.16</td>
<td style="border: 1px solid">10.09</td>
<td style="border: 1px solid">8.69</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://www.modelscope.cn/models/damo/speech_paraformer_asr_nat-aishell1-pytorch/summary">Paraformer-aishell</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">4.88</td>
<td style="border: 1px solid">5.43</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://modelscope.cn/models/damo/speech_paraformerbert_asr_nat-zh-cn-16k-aishell1-vocab4234-pytorch/summary">ParaformerBert-aishell</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">6.14</td>
<td style="border: 1px solid">7.01</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://www.modelscope.cn/models/damo/speech_paraformer_asr_nat-zh-cn-16k-aishell2-vocab5212-pytorch/summary">Paraformer-aishell2</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">5.82</td>
<td style="border: 1px solid">6.30</td>
<td style="border: 1px solid">6.60</td>
<td style="border: 1px solid">5.83</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
</tr>
<tr align="center">
<td style="border: 1px solid"> <a href="https://www.modelscope.cn/models/damo/speech_paraformerbert_asr_nat-zh-cn-16k-aishell2-vocab5212-pytorch/summary">ParaformerBert-aishell2</a> </td>
<td style="border: 1px solid">Offline</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">4.95</td>
<td style="border: 1px solid">5.45</td>
<td style="border: 1px solid">5.59</td>
<td style="border: 1px solid">5.83</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
<td style="border: 1px solid">-</td>
</tr>
</table>
### English Dataset
{"key": "BAC009S0764W0121", "source": "https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/BAC009S0764W0121.wav", "source_len": 90, "target": "甚至出现交易几乎停滞的情况", "target_len": 13}
{"key": "BAC009S0916W0489", "source": "https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/BAC009S0916W0489.wav", "source_len": 90, "target": "湖北一公司以员工名义贷款数十员工负债千万", "target_len": 20}
{"key": "asr_example_cn_en", "source": "https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_cn_en.wav", "source_len": 91, "target": "所有只要处理 data 不管你是做 machine learning 做 deep learning 做 data analytics 做 data science 也好 scientist 也好通通都要都做的基本功啊那 again 先先对有一些也许对", "target_len": 19}
{"key": "ID0012W0014", "source": "https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_en.wav", "source_len": 88, "target": "he tried to think how it could be", "target_len": 8}
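Each manifest line above is a standalone JSON object with `key`, `source` (audio path or URL), `source_len`, `target` (reference transcript), and `target_len` fields. A minimal sketch of reading such a JSONL manifest (the `load_manifest` helper is our illustration; the field names are taken from the lines above):

```python
import json

def load_manifest(lines):
    """Parse JSONL manifest lines into {key: (source, target)} entries."""
    entries = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        item = json.loads(line)
        entries[item["key"]] = (item["source"], item["target"])
    return entries

sample = ('{"key": "ID0012W0014", "source": "https://isv-data.oss-cn-hangzhou.'
          'aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_en.wav", '
          '"source_len": 88, "target": "he tried to think how it could be", '
          '"target_len": 8}')
manifest = load_manifest([sample])
```

One JSON object per line keeps the manifest streamable: a dataloader can read and decode entries lazily without parsing the whole file.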
BAC009S0764W0121 <|NEUTRAL|>
BAC009S0916W0489 <|NEUTRAL|>
asr_example_cn_en <|NEUTRAL|>
ID0012W0014 <|NEUTRAL|>
BAC009S0764W0121 <|Speech|>
BAC009S0916W0489 <|Speech|>
asr_example_cn_en <|Speech|>
ID0012W0014 <|Speech|>
BAC009S0764W0121 甚至出现交易几乎停滞的情况
BAC009S0916W0489 湖北一公司以员工名义贷款数十员工负债千万
asr_example_cn_en 所有只要处理 data 不管你是做 machine learning 做 deep learning 做 data analytics 做 data science 也好 scientist 也好通通都要都做的基本功啊那 again 先先对有一些也许对
ID0012W0014 he tried to think how it could be
BAC009S0764W0121 <|zh|>
BAC009S0916W0489 <|zh|>
asr_example_cn_en <|zh|>
ID0012W0014 <|en|>
BAC009S0764W0121 https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/BAC009S0764W0121.wav
BAC009S0916W0489 https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/BAC009S0916W0489.wav
asr_example_cn_en https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_cn_en.wav
ID0012W0014 https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_en.wav
{"key": "ID0012W0013", "source": "https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_zh.wav", "source_len": 88, "target": "欢迎大家来体验达摩院推出的语音识别模型", "target_len": 19}
{"key": "ID0012W0014", "source": "https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_en.wav", "source_len": 88, "target": "he tried to think how it could be", "target_len": 8}
ID0012W0013 欢迎大家来体验达摩院推出的语音识别模型
ID0012W0014 he tried to think how it could be
ID0012W0013 https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_zh.wav
ID0012W0014 https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_audio/asr_example_en.wav
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SPHINXPROJ = FunASR
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)