Request to add support for OpenAI's Responses API `conversation` parameter for durable server-side conversation state, including the necessary history-stripping behavior, to `OpenAIResponsesModel`.
### Description

OpenAI's Responses API supports durable server-side conversation state via the `conversation` parameter (e.g. `conversation='conv_...'`), backed by the Conversations API. Pydantic AI currently supports OpenAI server-side state via `openai_previous_response_id`, but does not appear to expose the Responses API `conversation` field or implement the associated history-stripping behavior needed to use it safely.

This is not just a passthrough setting. A Responses request with a conversation ID expects only the new input items. If Pydantic AI sends the full accumulated `message_history` on subsequent run steps, it can duplicate prior items already stored in the OpenAI conversation. The desired behavior would be similar to `openai_previous_response_id`, but keyed by a durable conversation ID rather than the prior response ID.

Potential user-facing API:

```python
OpenAIResponsesModelSettings(
    openai_conversation_id='conv_...',
)
```

When this setting is present, `OpenAIResponsesModel` would pass the conversation ID on the request and send only the new input items, mirroring the history-stripping behavior used for `openai_previous_response_id`.
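To illustrate the stripping behavior described above, here is a rough, self-contained sketch (not Pydantic AI code; `ConversationState` and `build_request_input` are hypothetical names). The idea is to track how many history items the server-side conversation already holds and send only the delta on each run step:

```python
from dataclasses import dataclass


@dataclass
class ConversationState:
    """Hypothetical tracker for a durable OpenAI conversation."""
    conversation_id: str
    sent_count: int = 0  # items already stored server-side


def build_request_input(state: ConversationState, message_history: list) -> dict:
    """Return only the new input items for a Responses API request.

    Items before `sent_count` are already stored in the OpenAI
    conversation, so resending them would duplicate history.
    """
    new_items = message_history[state.sent_count:]
    state.sent_count = len(message_history)
    return {'conversation': state.conversation_id, 'input': new_items}


# A second run step must not resend items from the first.
state = ConversationState('conv_123')
first = build_request_input(state, ['user: hi'])
# first['input'] == ['user: hi']
second = build_request_input(state, ['user: hi', 'assistant: hello', 'user: more'])
# second['input'] == ['assistant: hello', 'user: more']
```

This mirrors how `openai_previous_response_id` avoids resending history, except the key is a durable conversation ID that survives across processes rather than the ID of the immediately preceding response.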