ModelZoo / Qwen1.5_vllm · Issues · #1

Closed
Created Jun 03, 2024 by JayFu (@JayFu)

How to resolve OOM during single-node inference on Z100?

Environment: Sugon supercomputing cluster, CentOS 7; node configuration: 4× DCU (Z100).

Steps: set the LLM path to the 32B-Chat model, changed the model config file dtype to float16, then ran single-node inference with `python vllm/examples/offline_inference.py`.

Problem: HIP out of memory.

I tried adding a tp parameter to the config file, but nothing changed.

Is cross-node or multi-node inference possible?

Edited Jun 04, 2024 by JayFu