How to Update Logged Inferences

Every inference logged to Athina returns a prompt_run_id.

If you store this prompt_run_id, you can update the log with additional information as new events occur in your application.

You can do this by sending a PATCH request to the endpoint below with the prompt_run_id of the original inference log.

  • Method: PATCH

  • Endpoint: https://log.athina.ai/api/v1/log/inference/{prompt_run_id}

  • Headers:

    • athina-api-key: YOUR_ATHINA_API_KEY
    • Content-Type: application/json
Request Body

{
  // ...fields you want to update, e.g.:
  "prompt_slug": "new_prompt_slug"
}
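Putting the method, endpoint, headers, and body together, the request can be sketched in Python. This is a minimal illustration, not official client code: the helper name build_update_request is ours, and you would substitute your real API key and prompt_run_id.

```python
import json

ATHINA_LOG_URL = "https://log.athina.ai/api/v1/log/inference"

def build_update_request(prompt_run_id: str, api_key: str, fields: dict):
    """Assemble the URL, headers, and JSON body for a PATCH update.

    `fields` should contain only the updatable fields listed below.
    """
    url = f"{ATHINA_LOG_URL}/{prompt_run_id}"
    headers = {
        "athina-api-key": api_key,
        "Content-Type": "application/json",
    }
    body = json.dumps(fields)
    return url, headers, body

url, headers, body = build_update_request(
    "your_prompt_run_id",              # returned when the inference was logged
    "YOUR_ATHINA_API_KEY",
    {"prompt_slug": "new_prompt_slug"},
)
# To actually send it, e.g. with the third-party `requests` library:
#   requests.patch(url, headers=headers, data=body)
```

The send step is left as a comment so the sketch stays dependency-free; any HTTP client that supports the PATCH method will work.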

Allowed fields to update

  • prompt_slug
  • language_model_id
  • prompt_sent
  • functions
  • prompt_response
  • function_call_response
  • response_time
  • prompt_tokens
  • completion_tokens
  • total_tokens
  • cost
  • context
  • environment
  • customer_id
  • customer_user_id
  • session_id
  • user_query
  • external_reference_id
  • expected_response
💡 Note: You can only update the fields listed above. The prompt_run_id and created_at fields cannot be changed.
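Since the API rejects fields outside this list, one convenient pattern (our suggestion, not part of the Athina API) is a small client-side guard that strips disallowed keys before sending:

```python
# Allowed fields, transcribed from the list above.
ALLOWED_UPDATE_FIELDS = {
    "prompt_slug", "language_model_id", "prompt_sent", "functions",
    "prompt_response", "function_call_response", "response_time",
    "prompt_tokens", "completion_tokens", "total_tokens", "cost",
    "context", "environment", "customer_id", "customer_user_id",
    "session_id", "user_query", "external_reference_id", "expected_response",
}

def filter_update_payload(fields: dict) -> dict:
    """Drop keys the PATCH endpoint does not accept (e.g. prompt_run_id, created_at)."""
    return {k: v for k, v in fields.items() if k in ALLOWED_UPDATE_FIELDS}
```

For example, filter_update_payload({"prompt_slug": "x", "created_at": "2024-01-01"}) keeps only the prompt_slug entry.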