Working with Events¶
Events are the core telemetry data sent to Honeycomb. The Events API allows you to send data programmatically for ingestion.
Note
For production workloads, prefer batch sending over single events for better efficiency and throughput.
Basic Event Operations¶
Send a Single Event¶
```python
async def send_event(client: HoneycombClient, dataset: str) -> None:
    """Send a single event.

    Args:
        client: Authenticated HoneycombClient
        dataset: Dataset slug to send event to

    Note: For production use, prefer send_batch() for better efficiency.
    """
    await client.events.send_async(
        dataset,
        data={
            "service": "api",
            "endpoint": "/users",
            "duration_ms": 45,
            "status_code": 200,
        },
    )
```
Send Event with Timestamp¶
```python
async def send_event_with_timestamp(client: HoneycombClient, dataset: str) -> None:
    """Send an event with a custom timestamp.

    Args:
        client: Authenticated HoneycombClient
        dataset: Dataset slug to send event to
    """
    await client.events.send_async(
        dataset,
        data={
            "service": "api",
            "endpoint": "/posts",
            "duration_ms": 120,
            "status_code": 201,
        },
        timestamp=int(time.time()),  # Unix timestamp
    )
```
Send Batch (Recommended)¶
```python
async def send_batch(client: HoneycombClient, dataset: str) -> None:
    """Send multiple events in a batch.

    Args:
        client: Authenticated HoneycombClient
        dataset: Dataset slug to send events to

    Note: This is the recommended way to send events for production use.
    """
    events = [
        BatchEvent(
            data={
                "service": "api",
                "endpoint": "/users",
                "duration_ms": 45,
                "status_code": 200,
            }
        ),
        BatchEvent(
            data={
                "service": "api",
                "endpoint": "/orders",
                "duration_ms": 120,
                "status_code": 200,
            }
        ),
        BatchEvent(
            data={
                "service": "api",
                "endpoint": "/products",
                "duration_ms": 85,
                "status_code": 200,
            }
        ),
    ]
    results = await client.events.send_batch_async(dataset, events)

    # Check for any failures
    for i, result in enumerate(results):
        if result.status != 202:
            print(f"Event {i} failed: {result.error}")
```
Verify Events via Query¶
Events take ~30 seconds to become queryable after sending:
```python
async def verify_events(client: HoneycombClient, dataset: str) -> int:
    """Verify events were ingested by running a query.

    Args:
        client: Authenticated HoneycombClient
        dataset: Dataset slug to query

    Returns:
        The count of events found

    Note: Events take ~30 seconds to become queryable after sending.
    """
    # Wait for events to be queryable
    await asyncio.sleep(30)

    # Query to count recent events
    query, result = await client.query_results.create_and_run_async(
        QueryBuilder().dataset(dataset).last_10_minutes().count(),
    )

    # Get the count from the result
    if result.data and result.data.rows:
        count = result.data.rows[0].get("COUNT", 0)
        print(f"Found {count} events in the last 10 minutes")
        return int(count)
    return 0
```
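The fixed 30-second sleep above works, but a polling loop returns as soon as events become queryable. A minimal sketch, where `wait_for_events` and its `fetch_count` callable are illustrative helpers (not part of the SDK) — `fetch_count` stands in for any async function that runs the count query, such as a wrapper around `verify_events`:

```python
import asyncio


async def wait_for_events(fetch_count, attempts: int = 6, delay: float = 5.0) -> int:
    """Poll `fetch_count` (an async callable returning an int) until it
    reports a positive count, sleeping `delay` seconds between attempts.

    Returns the first positive count seen, or 0 if all attempts found nothing.
    """
    for _ in range(attempts):
        count = await fetch_count()
        if count > 0:
            return count
        await asyncio.sleep(delay)
    return 0
```

With the defaults this gives the same ~30-second worst case as the fixed sleep, but succeeds early when ingestion is fast.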
Additional Options¶
Events support optional parameters:

- `timestamp`: Unix timestamp for when the event occurred
- `samplerate`: Sampling rate (e.g., 10 means 1 in 10 events)

`BatchEvent` also supports `time` (ISO 8601 string) and `samplerate` fields.
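For illustration, the values for these fields can be produced with the standard library alone (the variable names below are hypothetical, not SDK parameters):

```python
import time
from datetime import datetime, timezone

# Unix timestamp (seconds) for the `timestamp` parameter on single events
unix_ts = int(time.time())

# ISO 8601 string for BatchEvent's `time` field
iso_time = datetime.now(timezone.utc).isoformat()

# With samplerate=10, each sent event stands in for 10 original events,
# so 25 sampled events represent roughly 250 originals.
samplerate = 10
estimated_total = 25 * samplerate
```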
Event Size Limits¶
- Single event body: 1MB maximum
- Maximum columns per event: 2000 distinct fields
- Batch size: Limited by total request size (1MB)
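Given the 1MB request cap, a long event list may need splitting before sending. A rough sketch that chunks plain event dicts using their JSON size as a proxy for the wire size (the helper and constant names are made up for this example, and the estimate is deliberately conservative):

```python
import json

MAX_BATCH_BYTES = 1_000_000  # stay under the 1MB request limit


def chunk_events(events: list[dict], limit: int = MAX_BATCH_BYTES) -> list[list[dict]]:
    """Split event dicts into batches whose serialized JSON size stays under `limit`."""
    batches: list[list[dict]] = []
    current: list[dict] = []
    size = 2  # account for the enclosing "[]"
    for event in events:
        event_size = len(json.dumps(event).encode()) + 2  # +2 for ", " separator
        if current and size + event_size > limit:
            batches.append(current)
            current, size = [], 2
        current.append(event)
        size += event_size
    if current:
        batches.append(current)
    return batches
```

Each resulting batch can then be wrapped in `BatchEvent` objects and sent with `send_batch_async`.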
Sync Usage¶
All event operations have sync equivalents:
```python
with HoneycombClient(api_key="...", sync=True) as client:
    # Send single event
    client.events.send("my-dataset", data={"field": "value"})

    # Send batch
    events = [BatchEvent(data={"event": 1}), BatchEvent(data={"event": 2})]
    results = client.events.send_batch("my-dataset", events)
```
Note: Events cannot be deleted once sent. They become part of your dataset's telemetry data.