    Upstream Mamba Support (`mamba_ssm`) (#1110) · 5503b274
    Hailey Schoelkopf authored
    * modularize HFLM code
    
    * pass through extra kwargs to AutoModel.from_pretrained call
    
    * remove explicit model_kwargs
    
    * rename gptq -> autogptq
    
    * fix tokenizer pad token errors
    
    * ensure model always respects device_map and autogptq's selected devices
    
    * add a _get_config helper fn
    
    * add mambaLMWrapper (see the `mamba_ssm` sketch after this list)
    
    * add mamba extra
    
    * add mamba extra
    
    * fix conditional import
    
    * Fix botched merge commit
    
    * Remove beginning-of-file comment for consistency
    
    * Add docstring for mambaLM re: supported kwargs
    
    * Alphabetize extras
    
    * Update extras table
    
    * appease precommit
    
    * run precommit on mamba_lm
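    
    The commits above register a Mamba wrapper behind the harness's usual HFLM-style interface. As a rough illustration of the upstream `mamba_ssm` API the wrapper builds on (this is not the `mambaLMWrapper` code from this PR; the checkpoint name `state-spaces/mamba-370m` and the GPT-NeoX tokenizer are assumptions taken from the `mamba_ssm` README):
    
    ```python
    import torch
    from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel
    from transformers import AutoTokenizer
    
    # mamba_ssm's fused kernels (causal-conv1d, selective scan) require a CUDA device.
    device = "cuda"
    
    # Mamba checkpoints ship without their own tokenizer and reuse the GPT-NeoX one;
    # setting the pad token explicitly mirrors the "fix tokenizer pad token errors"
    # item above.
    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
    tokenizer.pad_token_id = tokenizer.eos_token_id
    
    model = MambaLMHeadModel.from_pretrained(
        "state-spaces/mamba-370m", device=device, dtype=torch.float16
    )
    
    batch = tokenizer("The capital of France is", return_tensors="pt").to(device)
    with torch.no_grad():
        out = model(batch["input_ids"])  # returns a namedtuple with a .logits field
    print(out.logits.shape)  # (batch, seq_len, vocab_size)
    ```
    
    With the new `mamba` extra installed (something like `pip install lm_eval[mamba]`, per the extras-table update above), the model family should be selectable in the harness via the `mamba_ssm` model type named in the PR title.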