1. 12 Feb, 2024 1 commit
  2. 09 Feb, 2024 1 commit
    • ROCm AWQ support (#1514) · a4e58016
      Ilyas Moutawwakil authored
      # What does this PR do?
      
      This PR adds the possibility to run AWQ models with Exllama/GPTQ
      kernels, specifically for ROCm devices that support Exllama kernels
      but not AWQ's GEMM.

      This is done by:
      - un-packing, reordering and re-packing AWQ weights when
        `--quantize gptq` is set but the model's `quant_method` is `awq`;
      - avoiding overflows when adding 1 to zeros in the exllama and triton
        kernels.
      
      Ref: https://github.com/casper-hansen/AutoAWQ/pull/313
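
      A minimal sketch of the un-pack / reorder / re-pack step, assuming
      4-bit values packed eight to an `int32`; the packing-order constant
      and every function name below are illustrative, not the actual TGI
      implementation:

      ```python
      # Hedged sketch: move AWQ-packed 4-bit tensors to GPTQ packing order.
      import torch

      AWQ_PACK_ORDER = [0, 2, 4, 6, 1, 3, 5, 7]  # assumed AWQ nibble order
      GPTQ_PACK_ORDER = list(range(8))            # GPTQ packs sequentially

      def unpack_int4(packed: torch.Tensor, order: list) -> torch.Tensor:
          """Expand each int32 into its eight 4-bit values, undoing `order`."""
          nibbles = [(packed >> (4 * pos)) & 0xF for pos in order]
          return torch.stack(nibbles, dim=-1).reshape(packed.shape[0], -1)

      def pack_int4(values: torch.Tensor, order: list) -> torch.Tensor:
          """Inverse of unpack_int4 for the same `order`."""
          values = values.reshape(values.shape[0], -1, 8).to(torch.int32)
          packed = torch.zeros(values.shape[:2], dtype=torch.int32)
          for i, pos in enumerate(order):
              packed |= (values[..., i] & 0xF) << (4 * pos)
          return packed

      def awq_to_gptq(qweight: torch.Tensor, qzeros: torch.Tensor):
          # Re-pack weights and zero points in GPTQ nibble order. The
          # companion kernel change avoids the 4-bit overflow that the old
          # "add 1 to zeros" convention causes on already-maxed zeros.
          def reorder(t):
              return pack_int4(unpack_int4(t, AWQ_PACK_ORDER), GPTQ_PACK_ORDER)
          return reorder(qweight), reorder(qzeros)
      ```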
      
      ## Before submitting
      - [ ] This PR fixes a typo or improves the docs (you can dismiss the
      other checks if that's the case).
      - [ ] Did you read the [contributor
      guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
            Pull Request section?
      - [ ] Was this discussed/approved via a Github issue or the
      [forum](https://discuss.huggingface.co/)? Please add a link
            to it if that's the case.
      - [ ] Did you make sure to update the documentation with your changes?
      Here are the
      [documentation
      guidelines](https://github.com/huggingface/transformers/tree/main/docs),
      and
      [here are tips on formatting
      docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation
      
      ).
      - [ ] Did you write any new necessary tests?
      
      
      ## Who can review?
      
      Anyone in the community is free to review the PR once the tests have
      passed. Feel free to tag
      members/contributors who may be interested in your PR.
      
      <!-- Your PR will be replied to more quickly if you can figure out the
      right person to tag with @
      
      
      @OlivierDehaene OR @Narsil
      
       -->
      
      ---------
      Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
  3. 24 Jan, 2024 1 commit
    • Fixing non divisible embeddings. (#1476) · 7e542d4d
      Nicolas Patry authored
  4. 22 Dec, 2023 1 commit
  5. 21 Dec, 2023 1 commit
  6. 18 Dec, 2023 1 commit
  7. 14 Dec, 2023 1 commit
  8. 11 Dec, 2023 1 commit
  9. 25 Nov, 2023 1 commit
    • Exllama v2 (#1211) · ed2a3f61
      Nicolas Patry authored
      # What does this PR do?
      
      See #1165
      
      ---------
      Co-authored-by: Florian Zimmermeister <flozi00.fz@gmail.com>
      Co-authored-by: Ubuntu <ubuntu@ip-172-31-24-153.ec2.internal>
  10. 05 Oct, 2023 1 commit
    • Fixing GPTQ exllama kernel usage. (#1101) · 87f43814
      Nicolas Patry authored
      # What does this PR do?
      
      Fixes #1098 
  11. 03 Oct, 2023 1 commit
    • Handling bloom prefix. (#1090) · 85acb11b
      Nicolas Patry authored
  12. 27 Sep, 2023 1 commit
  13. 25 Sep, 2023 1 commit
    • Add AWQ quantization inference support (#1019) (#1054) · c5de7cd8
      Nicolas Patry authored
      # Add AWQ quantization inference support
      
      Fixes
      https://github.com/huggingface/text-generation-inference/issues/781
      
      This PR (partially) adds support for AWQ quantization for inference.
      More information on AWQ [here](https://arxiv.org/abs/2306.00978). In
      general, AWQ is faster and more accurate than GPTQ, which is currently
      supported by TGI.
      
      This PR installs 4-bit GEMM custom CUDA kernels released by AWQ authors
      (in `requirements.txt`, just one line change).
      
      A quick way to test this PR is to bring up TGI as follows:
      
      ```
      text-generation-server download-weights abhinavkulkarni/codellama-CodeLlama-7b-Python-hf-w4-g128-awq
      
      text-generation-launcher \
      --huggingface-hub-cache ~/.cache/huggingface/hub/ \
      --model-id abhinavkulkarni/codellama-CodeLlama-7b-Python-hf-w4-g128-awq \
      --trust-remote-code --port 8080 \
      --max-input-length 2048 --max-total-tokens 4096 --max-batch-prefill-tokens 4096 \
      --quantize awq
      ```
      
      Please note:
      * This PR was tested with FlashAttention v2 and vLLM.
      * This PR adds support for AWQ inference, not for quantizing the
        models. That needs to be done outside of TGI; instructions are
        [here](https://github.com/mit-han-lab/llm-awq/tree/f084f40bd996f3cf3a0633c1ad7d9d476c318aaa).
      * This PR only adds support for `FlashLlama` models for now.
      * Multi-GPU setup has not been tested.
      * No integration tests have been added so far; will add later if
        maintainers are interested in this change.
      * This PR can be tested on any of the models released
        [here](https://huggingface.co/abhinavkulkarni?sort_models=downloads#models).

      Please refer to the linked issue for benchmarks of
      [abhinavkulkarni/meta-llama-Llama-2-7b-chat-hf-w4-g128-awq](https://huggingface.co/abhinavkulkarni/meta-llama-Llama-2-7b-chat-hf-w4-g128-awq)
      vs
      [TheBloke/Llama-2-7b-Chat-GPTQ](https://huggingface.co/TheBloke/Llama-2-7b-Chat-GPTQ).
      
      Please note that AWQ has released faster (and, in the case of Llama,
      fused) kernels for 4-bit GEMM, currently at the top of the `main`
      branch at https://github.com/mit-han-lab/llm-awq, but this PR uses an
      older commit that has been tested to work. We can switch to the latest
      commit later on.
      
      ## Who can review?
      
      @OlivierDehaene OR @Narsil
      
      ---------
      Co-authored-by: Abhinav M Kulkarni <abhinavkulkarni@gmail.com>
      Co-authored-by: Abhinav Kulkarni <abhinav@concentric.ai>
  14. 08 Sep, 2023 1 commit
    • fit for baichuan models (#981) · 4cce8430
      xiaobin authored
      
      
      As more and more people begin to use Baichuan's open-source models,
      their influence is growing, especially in China. Many community
      members are interested in adding support for Baichuan models to TGI.
      Meanwhile, Baichuan is a very open company that plans to open-source
      more and more models in the future. Taking all this into
      consideration, we would like to add support for Baichuan models to
      TGI. To do this we need to make some changes, which we hope can be
      merged into the main branch of TGI. In the future, we would be happy
      to help maintain support for Baichuan models in TGI. We sincerely hope
      that our pull request can be accepted. Thank you.

      By the way, this change mainly adds support for Baichuan-7B.
      
      ---------
      Co-authored-by: xiaoyuze <xiaoyuze@baichuan.com>
      Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
  15. 07 Sep, 2023 2 commits
  16. 06 Sep, 2023 1 commit
    • Disabling exllama on old compute. (#986) · 211e7b7e
      Nicolas Patry authored
      # What does this PR do?
      
      Disabling exllama on old compute.
      
      Exllama and T4 GPUs don't play nice together, so this disables
      exllama right away on such hardware to avoid issues at runtime.
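
      A minimal sketch of the guard, assuming the check uses torch's device
      capability query; the threshold and the `CAN_EXLLAMA` name are
      illustrative:

      ```python
      # Hedged sketch: skip exllama kernels on pre-Ampere GPUs such as the
      # T4 (sm_75), where they are known to misbehave.
      import torch

      major, _minor = torch.cuda.get_device_capability()
      CAN_EXLLAMA = major >= 8  # Ampere (sm_80) or newer
      ```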
      
  17. 31 Jul, 2023 2 commits
    • fix(server): Failing quantize config after local read. (#743) · 15fc6466
      Nicolas Patry authored
    • Local gptq support. (#738) · 92bb56b0
      Nicolas Patry authored
      # What does this PR do?
      
      Redoes #719
      
  18. 25 Jul, 2023 1 commit
    • feat(server): Using `quantize_config.json` instead of GPTQ_BITS env variables. (#671) · a0d55358
      Nicolas Patry authored
      - The current PR is not great because we're side-stepping
        `Weights.__init__`, but `Weights` shouldn't require anything related
        to the config or the model_id, as it aims to be a simple wrapper
        over multi-file loading.
      - The ideal solution would be to use something like a Rust enum
        ```
        enum Quantize {
            Bitsandbytes(Bitsandbytes),
            GPTQ { bits: usize, groupsize: usize },
        }
        ```
        and pass that around during load. Unfortunately we don't have
        access to this, so for now side-stepping seems easier.

      - Re-enabling groupsize<0 with exllama (confirmed it works).

      Helps #601

      As next steps, we should make sure our quantization script uses that
      format and make it standard.
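
      For illustration, a sketch of what reading that file could look like;
      the filename comes from the PR title, while the function name and
      field names follow the common GPTQ convention and are assumptions
      here:

      ```python
      # Hedged sketch: read bits/groupsize from quantize_config.json shipped
      # alongside the weights instead of GPTQ_BITS / GPTQ_GROUPSIZE env vars.
      import json
      from pathlib import Path

      def load_quantize_config(model_path: str):
          data = json.loads((Path(model_path) / "quantize_config.json").read_text())
          return data["bits"], data["group_size"]
      ```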
      
      
  19. 24 Jul, 2023 1 commit
  20. 21 Jul, 2023 1 commit
    • feat(server): Add exllama GPTQ CUDA kernel support #553 (#666) · d5b5bc75
      Nicolas Patry authored
      Just trying to get the integration tests to pass.
      
      
      ---------
      Co-authored-by: Felix Marty <9808326+fxmarty@users.noreply.github.com>
  21. 12 Jul, 2023 4 commits
    • GPTQ Env vars: catch correct type of error (#596) · 36285595
      ssmi153 authored
      # What does this PR do?
      
      When passing in environment variables like `GPTQ_BITS`, we still get
      errors thrown from TGI because the try/catch block is catching the
      wrong type of error. This PR aims to fix that.
      
      @Narsil - let me know if this is how you want this formatted. My Python
      is a little shaky, so I hope this syntax is correct.
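
      To illustrate the class of bug (names and exception types here are
      assumptions, not the exact TGI code): if the config lookup raises one
      exception type but the `except` clause names another, the env-var
      fallback is never reached:

      ```python
      # Hedged sketch of the fallback pattern this PR repairs: the except
      # clause must name the exception the lookup actually raises, or the
      # environment-variable path becomes dead code.
      import os

      def get_gptq_bits(config: dict) -> int:
          try:
              return int(config["bits"])  # raises KeyError when absent
          except KeyError:                # catching the wrong type here
              return int(os.environ["GPTQ_BITS"])  # skips this fallback
      ```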
    • feat(server): Implements sharding for non divisible `vocab_size`. (#583) · 67347950
      Nicolas Patry authored
      - The code is relatively easy (just disable the checks on Embedding
        and Head).

      This cannot be done in the same easy fashion for hidden_dim/head_dim.
      It's relatively easy on some models (classic MHA), but it would make
      the other models (MQA) much more complex, and GPTQ quantization is
      another quite hairy piece of code.
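
      A sketch of the idea, under the assumption that the embedding is
      zero-padded to the next multiple of the world size so every rank gets
      an equal slice (names are illustrative):

      ```python
      # Hedged sketch: shard an embedding whose vocab_size is not divisible
      # by the number of shards; the padded rows are never indexed.
      import torch

      def shard_embedding(weight: torch.Tensor, rank: int, world_size: int):
          vocab_size, hidden = weight.shape
          block = (vocab_size + world_size - 1) // world_size  # ceil division
          padded = torch.zeros(block * world_size, hidden, dtype=weight.dtype)
          padded[:vocab_size] = weight
          return padded[rank * block : (rank + 1) * block]
      ```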
    • fix(server): Bug fixes for GPTQ_BITS environment variable passthrough (#590) · 2c4bf882
      ssmi153 authored
      # What does this PR do?
      
      This fixes a typo and extends the GPTQ_BITS environment variables
      through to the second method, which requires the same logic. Please
      let me know if there's anything I've misunderstood in this change.
      
      Thanks @Narsil for the original fix.
    • feat(server): Support for env value for GPTQ_BITS and GPTQ_GROUPSIZE. (#580) · 5bd2ab65
      Nicolas Patry authored
      # What does this PR do?
      
      Some models are already converted and do not have those values in
      their files; this enables users to run them with less friction.

      Went for a purely env-based approach because adding flags would end up
      (imo) very tedious to maintain. There's a lot of sanitation to do:
      those flags would be errors if not used in conjunction with
      `--quantize gptq`, and the flags would need to exist in both the
      launcher and the server, passed throughout all function calls.

      This PR is intended as an easy escape hatch, not the de facto method
      to use GPTQ in TGI.

      Fixes #500
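
      A sketch of the escape hatch, assuming the values come straight from
      the environment when the weight files carry no quantization metadata
      (the helper name is illustrative):

      ```python
      # Hedged sketch: fall back to GPTQ_BITS / GPTQ_GROUPSIZE from the
      # environment for converted models that lack these values on disk.
      import os

      def gptq_params_from_env():
          return int(os.environ["GPTQ_BITS"]), int(os.environ["GPTQ_GROUPSIZE"])
      ```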
  22. 30 Jun, 2023 1 commit
  23. 26 Jun, 2023 1 commit
    • feat(server): Add inference support for GPTQ (llama + falcon tested) + Quantization script (#438) · aefde28b
      Nicolas Patry authored
      Let's start discussing implementation.

      - Need to expose the quantization scripts (either included here or add
        docs on how to use https://github.com/qwopqwop200/GPTQ-for-LLaMa).
      - Make sure GPTQ works for multiple models (priority to Falcon).

      Currently this means checking for quantization in every place we use
      `get_{tensor|sharded}`.

      My idea is to reintegrate as much as possible into `utils/layer.py` by
      expanding `load_multi` to be a bit more generic.
      This might require some thinking, but ultimately the
      `qweight,qzeros,scales,g_idx` tensors should live in a single place,
      independent of bias presence.
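
      A sketch of the "single place" idea: group the four GPTQ tensors into
      one object so layers stop poking at `get_{tensor|sharded}` directly
      (the class name and fields are illustrative, not TGI's final API):

      ```python
      # Hedged sketch: keep qweight/qzeros/scales/g_idx together, independent
      # of whether the layer has a bias.
      from dataclasses import dataclass
      import torch

      @dataclass
      class GPTQWeight:
          qweight: torch.Tensor  # packed quantized weights
          qzeros: torch.Tensor   # packed per-group zero points
          scales: torch.Tensor   # per-group dequantization scales
          g_idx: torch.Tensor    # group index for each input column
          bits: int
          groupsize: int
      ```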
      
      ---------
      Co-authored-by: Ubuntu <ubuntu@ip-172-31-41-161.ec2.internal>
      Co-authored-by: OlivierDehaene <olivier@huggingface.co>
  24. 23 Jun, 2023 1 commit
  25. 08 Jun, 2023 1 commit