Add `local-completions` support using OpenAI interface (#1277)
* Add `local-completions` support using OpenAI interface (see the usage sketch after this list)
* Refactor oa_completion
* Address tokenizer review comments and chunk requests by batch size
* Add warning message for tiktoken backend
* Fix formatting
* Fix whitespace
* Update README.md
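The change points the harness's OpenAI-style completions requests at a locally hosted server. As a rough illustration of the interface being targeted (not code from this PR), the sketch below queries a local OpenAI-compatible completions endpoint with the official `openai` client; the base URL, port, model name, and API key are placeholder assumptions.

```python
# Illustrative sketch only: the endpoint URL, port, model name, and API key
# are placeholders, not values defined by this PR.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",    # assumed local OpenAI-compatible server
    api_key="not-needed-for-local-server",  # local servers typically ignore the key
)

response = client.completions.create(
    model="my-local-model",       # whatever model the local server is serving
    prompt="The capital of France is",
    max_tokens=8,
    temperature=0.0,
    logprobs=5,                   # token logprobs are needed for loglikelihood scoring
)
print(response.choices[0].text)
```

Within the harness itself, this backend is selected as the `local-completions` model type, with the server address and tokenizer backend passed through model arguments; the exact argument names follow the README updated in this PR.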
---------
Co-authored-by: Hailey Schoelkopf <65563625+haileyschoelkopf@users.noreply.github.com>