oobabooga/text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
text-generation-webui is a Gradio web UI for large language models, supporting multiple model backends (including Transformers and GPTQ) and several text-generation and chat modes.
- Text Generation Web UI for Large Language Models
- Interface modes: default, notebook, chat
- Model backends: Transformers, Llama.cpp, ExLlamaV2, AutoGPTQ, AutoAWQ, GPTQ-for-LLaMa, CTransformers, QuIP#
- Extensions for TTS, STT, translation, multimodal pipelines, and Stable Diffusion integration (see the extension sketch after this list)
- LoRA support: train, load, and unload LoRA adapters for generation (see the LoRA sketch after this list)
- Transformers library integration: precision options and CPU inference (see the loading sketch after this list)
- OpenAI-compatible API server with Chat and Completions endpoints (see the API sketch after this list)
- Installation via the start scripts; the UI is then served at http://localhost:7860/?__theme=dark
- Models are downloaded from Hugging Face and placed in the models folder (see the download sketch after this list)
- Colab notebook available for GPU setup
- Contributions welcome: check Contributing guidelines
- Community on subreddit and Discord
- Grant from a16z in August 2023
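Extension sketch. Extensions live in their own folder under extensions/ and expose hook functions that the UI calls around generation. The example below is a minimal, hypothetical extension; the hook names follow the project's documented interface, but exact signatures vary between versions, so treat it as illustrative rather than definitive.

```python
# extensions/example_extension/script.py -- minimal, hypothetical extension.
# Hook names follow the documented extension interface; exact signatures
# may differ between versions of the web UI.

params = {
    "display_name": "Example extension",  # name shown in the UI's extensions list
}

def input_modifier(string, state=None, is_chat=False):
    """Called on the user's prompt before it is sent to the model."""
    return string

def output_modifier(string, state=None, is_chat=False):
    """Called on the model's reply before it is displayed."""
    return string.strip()
```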
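LoRA sketch. The LoRA bullet refers to attaching and detaching adapters at generation time. The snippet below shows the general mechanism using the peft library rather than the web UI's internal code; the base model id and adapter path are placeholders.

```python
# Sketch of loading and unloading a LoRA adapter with peft; this mirrors the
# idea behind the UI's LoRA loading, not its actual implementation.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-1.3b", torch_dtype=torch.float16  # placeholder base model
)
model = PeftModel.from_pretrained(base, "path/to/lora-adapter")  # attach the LoRA

# ... run generation with the adapter applied ...

base = model.unload()  # detach the LoRA layers and recover the plain base model
```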
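Loading sketch. The Transformers integration amounts to loading a causal language model at a chosen precision and falling back to CPU when no GPU is available. The snippet below is a rough illustration of that idea, not the UI's own loader; the model id is only an example.

```python
# Illustrative Transformers loading: pick a dtype, fall back to CPU, generate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/opt-1.3b"  # example model; any causal LM works
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32  # CPU inference in 32-bit

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=dtype).to(device)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(device)
output = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```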
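API sketch. The OpenAI-compatible endpoints can be exercised with any HTTP client once the API is enabled. A minimal example, assuming the server is listening on its default port 5000 (adjust the URL for your setup):

```python
# Query the OpenAI-compatible Chat Completions endpoint.
import requests

url = "http://localhost:5000/v1/chat/completions"  # default port; adjust if changed
payload = {
    "messages": [{"role": "user", "content": "Write a haiku about GPUs."}],
    "max_tokens": 128,
    "temperature": 0.7,
}
response = requests.post(url, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```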
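Download sketch. The repository ships a download-model.py helper, but any method that places the model files under models/ works. The snippet below uses huggingface_hub as one way to do this; the model id and folder name are examples, and the subfolder naming is an assumption about the layout the UI expects.

```python
# Fetch a model from Hugging Face into the web UI's models folder.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="facebook/opt-1.3b",           # example model id
    local_dir="models/facebook_opt-1.3b",  # assumed layout: one subfolder per model
)
```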