ModelZoo · Issues

  • Open 42
  • Closed 142
  • All 184
  • Multimer protein inference error in a container environment
    fastfold_pytorch#1 · created Dec 04, 2024 by JayFu
    • CLOSED
    • 2
    updated Dec 09, 2024
  • RuntimeError: mamba_ssm is only supported on ROCm 6.0 and above. Note: make sure HIP has a supported version by running hipcc --version.
    mamba_pytorch#1 · created Nov 19, 2024 by quyuanhao123
    • CLOSED
    • 1
    updated Nov 20, 2024
  • Error: No module named 'tyro'
    llama-factory-llama3.2_pytorch#2 · created Nov 18, 2024 by cluslab
    • CLOSED
    • 1
    updated Nov 18, 2024
  • Displaying inference results
    sadtalker_pytorch#1 · created Nov 13, 2024 by suily
    • CLOSED
    • 0
    updated Nov 13, 2024
  • ModuleNotFoundError: No module named 'torch._six'
    u-kan-optimize_pytorch#1 · created Nov 05, 2024 by chenzk
    • CLOSED
    • 1
    • 1
    updated Nov 05, 2024
  • ModuleNotFoundError: No module named 'torch._six'
    u-kan_pytorch#1 · created Nov 05, 2024 by chenzk
    • CLOSED
    • 1
    updated Nov 05, 2024
  • ConnectTimeout error when running train.py for training; how can this be fixed?
    yolov10_pytorch#1 · created Jun 27, 2024 by xiaxsh
    • CLOSED
    • 2
    updated Nov 05, 2024
  • Compilation error
    llama_fastertransformer#7 · created May 14, 2024 by xurui
    • CLOSED
    • 2
    updated Nov 05, 2024
  • qwen2-72b fails to start on K100AI with the 24.04.1 image
    qwen1.5_vllm#3 · created Jul 02, 2024 by jixx
    • CLOSED
    • 1
    updated Nov 05, 2024
  • Can vllm provide a webserver inference example?
    llama_vllm#1 · created Jun 21, 2024 by ncic_liuyao
    • CLOSED
    • 3
    updated Oct 21, 2024
  • Qwen2-72B-Instruct-GPTQ-Int4 benchmark_throughput test: with fixed input/output lengths, throughput does not scale linearly as num-prompts increases
    qwen1.5_vllm#4 · created Aug 01, 2024 by xurui
    • CLOSED
    • 2
    updated Oct 16, 2024
  • CUDA error on K100: HIPBLAS_STATUS_NOT_SUPPORTED
    qwen1.5_vllm#2 · created Jun 14, 2024 by xurui
    • CLOSED
    • 1
    updated Oct 16, 2024
  • glm4 9b inference outputs nothing but exclamation marks!!!
    chatglm_vllm#1 · created Jul 10, 2024 by quyuanhao123
    • CLOSED
    • 0
    updated Oct 16, 2024
  • Inference with CodeLlama-7b-Instruct-hf generates text that is pure noise
    llama_vllm#5 · created Jul 04, 2024 by acane
    • CLOSED
    • 2
    updated Oct 16, 2024
  • Inference test with llama-7b-chat returns unreasonable answers
    llama_vllm#4 · created Jul 02, 2024 by eating1
    • CLOSED
    • 2
    updated Oct 16, 2024
  • Can vllm be updated to the latest version to support llama3 inference?
    llama_vllm#2 · created Jun 21, 2024 by ncic_liuyao
    • CLOSED
    • 2
    updated Oct 16, 2024
  • qwen2.5 7B errors out when running on 8 cards
    qwen2.5_pytorch#1 · created Oct 15, 2024 by gaoxuan
    • CLOSED
    • 1
    updated Oct 15, 2024
  • RuntimeError: [enforce fail at /data/jenkins_workspace/workspace/pytorch@4/third_party/gloo/gloo/transport/tcp/device.cc:210] ifa != nullptr. Unable to find interface for: [0.5.118.124]
    internlm_2.5_pytorch#2 · created Sep 25, 2024 by xurui
    • CLOSED
    • 1
    updated Oct 11, 2024
  • Does this project not include the Internlm_2.5 project code?
    internlm_2.5_pytorch#1 · created Sep 24, 2024 by xurui
    • CLOSED
    • 0
    updated Sep 25, 2024
  • Dataset download link in the readme is broken
    bert-pytorch#1 · created Sep 13, 2024 by xurui
    • CLOSED
    • 1
    updated Sep 23, 2024