Integrate 'thinking' chat models into Tabby's chat functionality. Given that latency requirements are not stringent for chat, models that use reasoning can provide better outputs.
**Please describe the feature you want**

Can you add support for thinking models among the [chat models](https://tabby.tabbyml.com/docs/models/#chat-models---chat-model)? Given that latency requirements are not stringent in this scenario, models that use reasoning can provide better outputs.

---

Please reply with a 👍 if you want this feature.
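As a rough sketch of what this could look like today, Tabby's `~/.tabby/config.toml` already supports pointing the chat model at an OpenAI-compatible HTTP endpoint, so a reasoning model exposed through such an API might be wired in like this (the model name and endpoint below are placeholders for illustration, not a confirmed or tested integration):

```toml
# ~/.tabby/config.toml — hypothetical example, not verified against a
# specific reasoning model; adjust model_name/api_endpoint for your provider.
[model.chat.http]
kind = "openai/chat"
model_name = "your-reasoning-model"      # placeholder
api_endpoint = "https://example.com/v1"  # placeholder
api_key = "your-api-key"
```

First-class support would presumably also need the chat UI to handle the model's reasoning output (e.g. hiding or collapsing any "thinking" tokens the provider returns) rather than rendering it inline.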