OpenDAS / ColossalAI — repository file view

File: colossalai/booster/mixed_precision/fp16_torch.py (4.69 KB)
Commit: 3acbf6d4968e0559629f0d6d317e5bac41ad5df0 — "[npu] add npu support for hybrid plugin and llama (#5090)"
Author: Xuanlei Zhao, Nov 22, 2023
Commit notes: llama 3d; update; fix autocast
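The file's contents are not captured in this page view. As a hedged illustration only: a module named fp16_torch.py under a mixed_precision package would typically wrap PyTorch's native automatic mixed precision (autocast plus gradient scaling). The sketch below shows that torch-native pattern under stated assumptions; the model, optimizer, and data here are illustrative placeholders, not taken from the ColossalAI source.

```python
# Hedged sketch of torch-native fp16 mixed-precision training,
# the pattern a wrapper like fp16_torch.py typically builds on.
# All names below (model, optimizer, data) are illustrative.
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

use_cuda = torch.cuda.is_available()
# GradScaler guards fp16 gradients against underflow; it is a no-op
# when disabled (e.g. on CPU, where we fall back to bfloat16 below).
scaler = torch.cuda.amp.GradScaler(enabled=use_cuda)

x = torch.randn(8, 4)
y = torch.randn(8, 2)

device_type = "cuda" if use_cuda else "cpu"
amp_dtype = torch.float16 if use_cuda else torch.bfloat16

# autocast runs eligible ops in reduced precision, keeping
# numerically sensitive ops (e.g. reductions) in float32.
with torch.autocast(device_type=device_type, dtype=amp_dtype):
    loss = torch.nn.functional.mse_loss(model(x), y)

scaler.scale(loss).backward()  # scale the loss before backward
scaler.step(optimizer)         # unscales grads; skips step on inf/nan
scaler.update()                # adapt the scale factor for next step
```

A higher-level API (as ColossalAI's booster provides) usually hides the scaler calls behind its own `backward`/`step` methods so user training loops stay unchanged.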