Request to add Responses API support to the `langchain-xai` module and make it the default for a better user experience.
### Checked other resources

- [x] This is a feature request, not a bug report or usage question.
- [x] I added a clear and descriptive title that summarizes the feature request.
- [x] I used the GitHub search to find a similar feature request and didn't find it.
- [x] I checked the LangChain documentation and API reference to see if this feature already exists.
- [x] This is not related to the langchain-community package.

### Package (Required)

- [x] langchain-xai

### Feature Description

I would like to add support for using the Responses API with ChatXAI, instead of the Chat Completions API. This is the [recommended way of interacting with the models](https://docs.x.ai/developers/model-capabilities/text/generate-text). I believe this will be beneficial for agents using xAI models.

### Use Case

I want to improve the performance of the Grok models in the Deepagents CLI. My interest in these models is that Grok 4.20 is showing good performance, I would say on par with Sonnet 4.5, but at a