Installation
Python
```bash
pip install aitracer
```
Requirements: Python 3.8 or higher
PHP
```bash
composer require haro/aitracer
```
Requirements: PHP 8.0 or higher
TypeScript / JavaScript
```bash
npm install @haro/aitracer
# or
yarn add @haro/aitracer
```
Requirements: Node.js 18 or higher
Initialization
AITracer Class
```python
from aitracer import AITracer

tracer = AITracer(
    api_key="at-xxxx",                   # Required (or env var AITRACER_API_KEY)
    project="my-project",                # Project name
    enabled=True,                        # Enable/disable logging
    sync=False,                          # Synchronous mode
    batch_size=10,                       # Batch size
    flush_interval=5.0,                  # Flush interval (seconds)
    pii_detection=False,                 # PII detection
    pii_action="mask",                   # Action on PII detection
    base_url="https://api.aitracer.co"   # API endpoint
)
```
Constructor Options
| Parameter | Type | Default | Description |
|---|---|---|---|
| api_key | str | - | AITracer API key (required) |
| project | str | None | Project identifier |
| enabled | bool | True | Enable logging |
| sync | bool | False | Synchronous mode (recommended for serverless) |
| batch_size | int | 10 | Batch send size |
| flush_interval | float | 5.0 | Auto-flush interval (seconds) |
| flush_on_exit | bool | True | Auto-flush on exit |
| pii_detection | bool | False | Automatic PII detection |
| pii_action | str | "mask" | mask / redact / hash / none |
| pii_types | list | ["email", "phone", ...] | PII types to detect |
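For intuition, here is a rough sketch of what a mask-style `pii_action` could do to an email address in logged text. This is illustrative only: the `mask_emails` helper and the regex are not part of the SDK, whose detection runs internally on logged payloads.

```python
import re

# Illustrative only: a toy version of a "mask" pii_action applied to emails.
# The SDK performs detection internally; this helper is NOT part of its API.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask_emails(text: str) -> str:
    """Replace any email address in `text` with a masked placeholder."""
    return EMAIL_RE.sub("***@***", text)

print(mask_emails("Contact me at alice@example.com"))  # Contact me at ***@***
```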
Wrapper Methods
Wrap existing LLM clients to automatically record logs.
wrap_openai(client)
Wraps an OpenAI client.
```python
from openai import OpenAI

client = tracer.wrap_openai(OpenAI())

# Use as normal - logs are recorded automatically
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}]
)
```
wrap_anthropic(client)
Wraps an Anthropic client.
```python
from anthropic import Anthropic

client = tracer.wrap_anthropic(Anthropic())

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}]
)
```
wrap_gemini(model)
Wraps a Google Gemini model.
```python
import google.generativeai as genai

model = tracer.wrap_gemini(genai.GenerativeModel("gemini-pro"))
response = model.generate_content("Hello")
```
Manual Logging
Record logs manually without using auto-wrap.
log(data)
```python
tracer.log({
    "model": "gpt-4",
    "provider": "openai",
    "input_data": {
        "messages": [{"role": "user", "content": "Hello"}]
    },
    "output_data": {
        "content": "Hi there!"
    },
    "input_tokens": 10,
    "output_tokens": 5,
    "latency_ms": 450,
    "status": "success",
    "metadata": {
        "user_id": "user-123",
        "feature": "chat"
    }
})
```
Log Data Fields
| Field | Type | Required | Description |
|---|---|---|---|
| model | str | Yes | Model name used |
| provider | str | Yes | openai / anthropic / gemini / other |
| input_data | dict | No | Input data |
| output_data | dict | No | Output data |
| input_tokens | int | No | Number of input tokens |
| output_tokens | int | No | Number of output tokens |
| latency_ms | int | No | Latency (milliseconds) |
| status | str | No | success / error |
| error_message | str | No | Error message |
| metadata | dict | No | Custom metadata (max 10 keys) |
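For a failed request, the same `log()` call takes `status="error"` together with an `error_message`. The values below are made up for illustration; pass a dict like this to `tracer.log()`.

```python
# Illustrative entry for a failed request, using fields from the table above.
# Values are examples only; pass a dict like this to tracer.log().
error_log = {
    "model": "gpt-4",
    "provider": "openai",
    "status": "error",
    "error_message": "Request timed out",
    "latency_ms": 30000,
    "metadata": {"user_id": "user-123"},
}
```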
Session Management
Group logs by user session.
session(session_id, **kwargs)
```python
with tracer.session(
    session_id="session-abc123",
    user_id="user-456",
    metadata={"channel": "web"}
) as session:
    # All requests within the session are automatically grouped
    response1 = client.chat.completions.create(...)
    response2 = client.chat.completions.create(...)

    # Record feedback
    session.thumbs_up()                  # Upvote the last response
    session.thumbs_down()                # Downvote the last response
    session.feedback("Great response!")  # Text feedback
```
Session Options
| Parameter | Type | Description |
|---|---|---|
| session_id | str | Session identifier (required) |
| user_id | str | User identifier |
| metadata | dict | Custom metadata |
App User Tracking
The App Users feature lets you track usage per end-user of your AI application, giving you per-user cost and usage data to inform billing and usage limits.
App User analytics is available on the Starter plan and above.
Specifying user_id
The recommended approach is to specify user_id when starting a session.
```python
# Recommended: Specify user_id in session
with tracer.session(
    session_id="session-abc123",
    user_id="user-456",  # Your app's user ID
) as session:
    # All requests in this session are linked to user-456
    response = client.chat.completions.create(...)
```
Specifying via Metadata
If not using sessions, you can specify user_id per request via metadata.
```python
# Specify user_id in metadata
response = client.chat.completions.create(
    model="gpt-4",
    messages=[...],
    extra_body={
        "aitracer_metadata": {
            "user_id": "user-456"
        }
    }
)

# Or set globally with set_metadata
tracer.set_metadata({"user_id": "user-456"})
response = client.chat.completions.create(...)
```
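When both a global `set_metadata` value and per-request metadata are present, a reasonable mental model is a dict merge in which per-request keys take precedence. This is an assumption about the SDK's behavior, not documented semantics; `merge_metadata` below is a hypothetical helper, not SDK API.

```python
# Hypothetical helper (NOT SDK API) illustrating one plausible merge rule:
# global metadata as a base, per-request metadata overriding on key clashes.
def merge_metadata(global_md: dict, request_md: dict) -> dict:
    merged = dict(global_md)
    merged.update(request_md)
    return merged

merged = merge_metadata(
    {"user_id": "user-456", "env": "prod"},  # set via tracer.set_metadata(...)
    {"user_id": "user-789"},                 # per-request metadata
)
# merged == {"user_id": "user-789", "env": "prod"}
```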
Viewing in Dashboard
Logs tagged with a user_id are aggregated and can be analyzed in the "App Users" menu of the dashboard.
- User List: Request count, cost, token count, and error count for all users
- User Details: Usage by model, recent logs, and statistics
- User Search: Search and filter logs by user_id
For more details, see Dashboard Guide - App User Analytics.
Tracing
Group multiple API calls into a single trace.
trace(trace_id)
```python
with tracer.trace("request-123") as trace:
    # Multiple API calls belong to the same trace
    response1 = client.chat.completions.create(...)
    response2 = client.chat.completions.create(...)

    # Add metadata
    trace.set_metadata({
        "user_id": "user-456",
        "feature": "summarization"
    })

    # Add tags
    trace.add_tag("production")
    trace.add_tag("high-priority")
```
Flush
Send buffered logs immediately.
flush()
```python
# Flush manually
tracer.flush()

# Recommended to call before Lambda / Cloud Functions exit
def handler(event, context):
    try:
        response = client.chat.completions.create(...)
        return response
    finally:
        tracer.flush()  # Always flush
```
For Lambda / Cloud Functions, set `sync=True` or always call `flush()` before the function exits.
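The try/finally pattern above can be exercised with a stand-in tracer to show why the `finally` block matters: buffered entries survive only if `flush()` runs before the serverless runtime freezes the process. `StubTracer` below is a test double, not the real AITracer.

```python
# Test double standing in for AITracer to demonstrate the flush-on-exit
# pattern; the real SDK buffers entries and sends them over the network.
class StubTracer:
    def __init__(self):
        self.buffer = []  # logs waiting to be sent
        self.sent = []    # logs "delivered" by flush()

    def log(self, data):
        self.buffer.append(data)

    def flush(self):
        self.sent.extend(self.buffer)
        self.buffer.clear()

tracer = StubTracer()

def handler(event, context):
    try:
        tracer.log({"model": "gpt-4", "status": "success"})
        return {"statusCode": 200}
    finally:
        tracer.flush()  # runs even if the body raises

handler({}, None)
# After the handler returns, nothing is left buffered:
# tracer.buffer is empty and tracer.sent holds the one entry.
```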
Error Handling
The SDK is designed not to impact your application.
```python
from aitracer.exceptions import AITracerError, APIError, RateLimitError

try:
    tracer.log({...})
except RateLimitError:
    # Rate limit reached
    pass
except APIError as e:
    # API communication error
    print(f"API Error: {e}")
except AITracerError:
    # Other errors
    pass
```
By default, an SDK error means only that log recording failed; it does not affect your application's behavior.
