
[codex-analytics] ingest server requests and responses#17088

Open
rhan-oai wants to merge 1 commit into main from pr17088

Conversation

let (outgoing_tx, mut outgoing_rx) = mpsc::channel::<OutgoingEnvelope>(channel_capacity);
let outgoing_message_sender = Arc::new(OutgoingMessageSender::new(outgoing_tx));
let auth_manager =
    AuthManager::shared_from_config(args.config.as_ref(), args.enable_codex_api_key_env);
Collaborator

don't we have an auth_manager on InProcessStartArgs?

@rhan-oai force-pushed the pr17088 branch 2 times, most recently from 254f07f to 32d2d05 on April 22, 2026 09:19
@rhan-oai
Collaborator Author

@codex

Contributor

@chatgpt-codex-connector (Bot) left a comment


💡 Codex Review

if let Err(err) = self
    .sender
    .send(OutgoingEnvelope::ToConnection {
        connection_id,
        message: OutgoingMessage::Request(request),
        write_complete_tx: None,
    })

P2: Track replayed requests when resending to a connection

replay_requests_to_connection_for_thread resends pending ServerRequests to a rejoined connection but does not call track_server_request. If that connection responds, notify_client_response records a ServerResponse for the new connection ID without a corresponding request fact, breaking per-connection request/response correlation.
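A minimal sketch of the suggested fix: re-track each pending request under the rejoined connection's id before resending. All names here (`RequestTracker`, `ServerRequest`, the `(connection_id, request_id)` key) are illustrative stand-ins inferred from the comment, not the actual codebase types.

```rust
use std::collections::HashMap;

// Hypothetical stand-ins for the PR's types, for illustration only.
type ConnectionId = u64;

#[derive(Clone, Debug)]
struct ServerRequest {
    id: u64,
}

#[derive(Default)]
struct RequestTracker {
    // (connection_id, request_id) -> pending request fact
    pending: HashMap<(ConnectionId, u64), ServerRequest>,
}

impl RequestTracker {
    fn track_server_request(&mut self, connection_id: ConnectionId, request: &ServerRequest) {
        self.pending
            .insert((connection_id, request.id), request.clone());
    }

    fn has_request_fact(&self, connection_id: ConnectionId, request_id: u64) -> bool {
        self.pending.contains_key(&(connection_id, request_id))
    }

    // Replaying to a rejoined connection must re-track each request under
    // the *new* connection id so a later response can be correlated.
    fn replay_requests_to_connection(
        &mut self,
        new_connection_id: ConnectionId,
        requests: &[ServerRequest],
    ) {
        for request in requests {
            self.track_server_request(new_connection_id, request); // the missing call
            // ...then send OutgoingEnvelope::ToConnection { .. } as before.
        }
    }
}

fn main() {
    let mut tracker = RequestTracker::default();
    tracker.replay_requests_to_connection(42, &[ServerRequest { id: 7 }]);
    // A response from connection 42 now has a matching request fact.
    println!("tracked: {}", tracker.has_request_fact(42, 7));
}
```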


Comment on lines 309 to 310
None => {
    self.sender
Contributor


P2: Track broadcast server requests before dispatching them

send_request_to_connections only records track_server_request in the Some(connection_ids) branch. The broadcast branch (None) sends the request but never emits a request fact. Paths like ExternalAuthRefreshBridge::refresh use broadcast, so notify_client_response can emit a ServerResponse with no matching ServerRequest, yielding incomplete analytics.

Useful? React with 👍 / 👎.
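A sketch of recording a request fact in both dispatch branches before sending. Type and method names follow the review comment (`OutgoingMessageSender`, `send_request_to_connections`, `track_server_request`), but the bodies are hypothetical:

```rust
// Hypothetical sketch: emit a request fact in the targeted branch AND the
// broadcast branch, before dispatching.
type ConnectionId = u64;

#[derive(Clone)]
struct ServerRequest {
    id: u64,
}

struct OutgoingMessageSender {
    live_connections: Vec<ConnectionId>,
    tracked: Vec<(ConnectionId, u64)>,
}

impl OutgoingMessageSender {
    fn track_server_request(&mut self, conn: ConnectionId, request: &ServerRequest) {
        self.tracked.push((conn, request.id));
    }

    fn send_request_to_connections(
        &mut self,
        connection_ids: Option<Vec<ConnectionId>>,
        request: ServerRequest,
    ) {
        match connection_ids {
            Some(ids) => {
                for conn in ids {
                    self.track_server_request(conn, &request);
                    // targeted send...
                }
            }
            // Broadcast branch: previously dispatched without tracking.
            // Emit a request fact for every live connection here too.
            None => {
                let conns = self.live_connections.clone();
                for conn in conns {
                    self.track_server_request(conn, &request);
                    // broadcast send...
                }
            }
        }
    }
}

fn main() {
    let mut sender = OutgoingMessageSender {
        live_connections: vec![1, 2],
        tracked: Vec::new(),
    };
    sender.send_request_to_connections(None, ServerRequest { id: 9 });
    println!("facts: {:?}", sender.tracked);
}
```

With this shape, broadcast callers such as `ExternalAuthRefreshBridge::refresh` produce one request fact per live connection, so later responses always have a matching request.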

Comment on lines +198 to +205
AnalyticsFact::ServerRequest {
    connection_id: _connection_id,
    request: _request,
} => {}
AnalyticsFact::ServerResponse {
    connection_id: _connection_id,
    response: _response,
} => {}
Contributor


P2: Process server request/response facts in the reducer

OutgoingMessageSender now emits AnalyticsFact::ServerRequest/ServerResponse, but the reducer drops both variants with empty arms. The added instrumentation therefore produces no analytics events; under load these no-op facts still consume slots in the bounded queue and can increase drops of meaningful events.

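An illustrative reducer that turns the two new variants into events instead of dropping them. The event strings are placeholders, not the PR's actual analytics schema:

```rust
// Illustrative sketch: handle AnalyticsFact::ServerRequest/ServerResponse
// in the reducer rather than consuming them with empty arms.
enum AnalyticsFact {
    ServerRequest { connection_id: u64, request: String },
    ServerResponse { connection_id: u64, response: String },
}

#[derive(Default)]
struct Reducer {
    events: Vec<String>,
}

impl Reducer {
    fn reduce(&mut self, fact: AnalyticsFact) {
        match fact {
            // Previously `=> {}`: the fact was consumed but produced nothing.
            AnalyticsFact::ServerRequest { connection_id, request } => {
                self.events
                    .push(format!("server_request conn={connection_id} {request}"));
            }
            AnalyticsFact::ServerResponse { connection_id, response } => {
                self.events
                    .push(format!("server_response conn={connection_id} {response}"));
            }
        }
    }
}

fn main() {
    let mut reducer = Reducer::default();
    reducer.reduce(AnalyticsFact::ServerRequest {
        connection_id: 1,
        request: "tool_call".into(),
    });
    println!("{}", reducer.events[0]);
}
```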


match entry {
    Some((id, entry)) => {
        if let Ok(response) = entry.request.response_from_result(result.clone()) {
Contributor


P2: Eliminate full JSON clone on server-response tracking

notify_client_response clones result before deserializing it into a typed ServerResponse, then still forwards the original result to the waiting callback. For large payloads (for example dynamic tool call outputs), this doubles JSON allocations and adds extra parse CPU on a hot response path.

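One way to avoid the clone is to derive the tracked fact from a borrowed view of the result, and only then move the original value to the waiting callback. A sketch with illustrative types (`RawResult`, `ServerResponseFact`, and `fact_from_result` are hypothetical; with serde_json the borrowed parse would be `ServerResponse::deserialize(&value)`, since `&serde_json::Value` implements `Deserializer`):

```rust
// Sketch: track by reference first, then forward the original untouched.
#[derive(Debug)]
struct RawResult {
    body: String,
}

// The minimal typed summary the analytics tracker actually needs.
struct ServerResponseFact {
    body_len: usize,
}

fn fact_from_result(result: &RawResult) -> ServerResponseFact {
    // Borrow instead of `result.clone()`: no duplicate allocation, no
    // second parse of a potentially large payload.
    ServerResponseFact {
        body_len: result.body.len(),
    }
}

fn notify_client_response(
    result: RawResult,
    callback: impl FnOnce(RawResult),
) -> ServerResponseFact {
    let fact = fact_from_result(&result); // track first, by reference
    callback(result); // then move the original to the waiting callback
    fact
}

fn main() {
    let fact = notify_client_response(
        RawResult { body: "large payload".into() },
        |r| println!("callback got: {}", r.body),
    );
    println!("tracked body_len: {}", fact.body_len);
}
```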



2 participants