Repository: sglang @ 7c1692aa90b208b1925292f155908fa745a52335
File: examples/runtime/engine/offline_batch_inference_vlm.py (1.95 KB)
Latest commit: [fix] added support for vlm in offline inference (#3548) · fb4c9c3a · authored by Shenggui Li, Feb 15, 2025
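
The script itself is not reproduced on this page. Below is a minimal sketch of what offline batch inference with a vision-language model through the sglang Engine can look like, assuming the offline Engine API (sgl.Engine, generate with prompt / image_data / sampling_params). The model path, prompts, and image URLs are placeholders, not the file's actual contents, and real VLM prompts usually also need the model's chat template to insert the image tokens.

import sglang as sgl


def main():
    # Launch the offline engine with a vision-language model (placeholder path).
    llm = sgl.Engine(model_path="Qwen/Qwen2-VL-7B-Instruct")

    # One image per prompt; prompts and image_data are matched element-wise.
    prompts = [
        "What is shown in this image?",
        "Describe the scene in one sentence.",
    ]
    images = [
        "https://example.com/cat.jpg",     # placeholder URL
        "https://example.com/street.jpg",  # placeholder URL
    ]
    sampling_params = {"temperature": 0.8, "top_p": 0.95, "max_new_tokens": 64}

    # Batch generation: returns one output dict per prompt.
    outputs = llm.generate(
        prompt=prompts,
        image_data=images,
        sampling_params=sampling_params,
    )

    for prompt, output in zip(prompts, outputs):
        print(f"Prompt: {prompt}\nGenerated: {output['text']}\n")

    # Release the engine's worker processes when done.
    llm.shutdown()


if __name__ == "__main__":
    main()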