A2A Log Export

Export Agent-to-Agent (A2A) communication logs to enterprise tools for compliance, auditing, analytics, and security monitoring.

Overview

A2A Log Export enables you to stream agent communication data to external destinations in real time:

  • Compliance: Meet regulatory requirements for AI audit trails
  • Security Monitoring: Detect anomalies in agent behavior via SIEM
  • Analytics: Build dashboards and reports on agent interactions
  • Archival: Long-term storage for data governance

Access Requirements

A2A Log Export is available on Enterprise plans only.

Supported Destinations

Destination      Use Case                        Formats
Amazon S3        Data lake, archival, analytics  JSON, JSONL, Parquet
Splunk           SIEM, security monitoring       HEC events
Datadog          Observability, APM              Logs API
Elasticsearch    Search, analytics               Documents
Syslog           Legacy SIEM, network devices    RFC3164, RFC5424, CEF, LEEF
Webhook          Custom integrations             JSON payloads

Quick Start

Via Dashboard

  1. Navigate to Settings > Integrations > A2A Export
  2. Click Add Destination
  3. Select destination type and configure
  4. Test the connection
  5. Enable export

Via API

const destination = await client.a2aExport.createDestination({
  name: 'Splunk Production',
  type: 'splunk',
  enabled: true,
  config: {
    type: 'splunk',
    hecUrl: 'https://splunk.company.com:8088/services/collector',
    hecToken: 'your-hec-token',
    index: 'deeployd-a2a',
    source: 'deeployd',
    sourcetype: 'deeployd:a2a'
  }
});

Destination Configuration

Amazon S3

Stream logs to S3 for data lake storage and analytics pipelines.

{
  type: 's3',
  bucket: 'company-deeployd-logs',
  region: 'us-east-1',
  prefix: 'a2a-logs/',
  accessKeyId: 'AKIA...', // Optional: uses IAM role if omitted
  secretAccessKey: '...',
  format: 'jsonl', // json, jsonl, or parquet
  compression: 'gzip', // gzip or none
  partitionBy: ['tenant', 'date'] // Partition scheme
}

Partition schemes:

  • tenant - Separate folders per tenant
  • team - Separate folders per team
  • date - Daily folders (YYYY/MM/DD)

Output paths:

s3://bucket/prefix/tenant=abc123/2024/01/15/a2a-1705312800.jsonl.gz
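
As an illustration of how these options compose into an object key, here is a TypeScript sketch (the helper is hypothetical; actual file naming is determined by the export service):

// Hypothetical illustration of the key layout shown above for partitionBy: ['tenant', 'date'].
function buildObjectKey(prefix: string, tenantId: string, date: Date, epochSeconds: number): string {
  const y = date.getUTCFullYear();
  const m = String(date.getUTCMonth() + 1).padStart(2, '0');
  const d = String(date.getUTCDate()).padStart(2, '0');
  // <prefix>tenant=<id>/<YYYY>/<MM>/<DD>/a2a-<epoch>.jsonl.gz
  return `${prefix}tenant=${tenantId}/${y}/${m}/${d}/a2a-${epochSeconds}.jsonl.gz`;
}

// buildObjectKey('a2a-logs/', 'abc123', new Date('2024-01-15T10:00:00Z'), 1705312800)
// => 'a2a-logs/tenant=abc123/2024/01/15/a2a-1705312800.jsonl.gz'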

Splunk

Send logs to Splunk HTTP Event Collector (HEC) for security monitoring.

{
  type: 'splunk',
  hecUrl: 'https://splunk.company.com:8088/services/collector',
  hecToken: 'your-hec-token',
  index: 'deeployd_a2a', // Target index
  source: 'deeployd', // Event source
  sourcetype: 'deeployd:a2a' // Event type
}

Splunk event format:

{
  "time": 1705312800,
  "host": "deeployd",
  "source": "deeployd",
  "sourcetype": "deeployd:a2a",
  "index": "deeployd_a2a",
  "event": {
    "id": "msg_abc123",
    "threadId": "a2a_xyz789",
    "fromAgentId": "agent_001",
    "fromAgentName": "Sales Analyst",
    "toAgentId": "agent_002",
    "messageType": "request",
    "content": "...",
    "timestamp": "2024-01-15T10:00:00Z"
  }
}

Datadog

Stream logs to Datadog for observability and APM correlation.

{
  type: 'datadog',
  apiKey: 'dd-api-key',
  site: 'datadoghq.com', // or datadoghq.eu, us3, us5
  service: 'deeployd-agents',
  tags: ['env:production', 'team:ai-ops']
}

Datadog log format:

{
  "ddsource": "deeployd",
  "ddtags": "env:production,team:ai-ops",
  "hostname": "deeployd",
  "service": "deeployd-agents",
  "message": "Agent communication: Sales Analyst -> Data Processor",
  "a2a": {
    "thread_id": "a2a_xyz789",
    "from_agent": "agent_001",
    "to_agent": "agent_002",
    "message_type": "request"
  }
}

Elasticsearch

Index logs in Elasticsearch for full-text search and analytics.

{
  type: 'elasticsearch',
  nodes: ['https://es1.company.com:9200', 'https://es2.company.com:9200'],
  index: 'deeployd-a2a',
  username: 'elastic', // Basic auth
  password: '...',
  // Or use an API key instead:
  // apiKey: 'base64-encoded-key',
  pipeline: 'a2a-enrichment' // Optional ingest pipeline
}
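
Once documents are indexed, you can query them with the standard Elasticsearch search API. A TypeScript sketch (assuming each exported record is indexed with the field names shown under Export Data Format below; the field mappings and credentials are placeholders):

// Hypothetical consumer-side query against the configured index.
const response = await fetch('https://es1.company.com:9200/deeployd-a2a/_search', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Basic ' + Buffer.from('elastic:your-password').toString('base64')
  },
  body: JSON.stringify({
    query: { match: { senderAgentId: 'agent_001' } }, // messages sent by a specific agent
    sort: [{ timestamp: 'desc' }],
    size: 20
  })
});
const hits = (await response.json()).hits.hits;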

Syslog

Send logs to syslog-compatible SIEM systems (QRadar, ArcSight, etc.).

{
  type: 'syslog',
  host: 'siem.company.com',
  port: 514,
  protocol: 'tls', // tcp, udp, or tls
  facility: 16, // Local0 (16-23 for local use)
  severity: 6, // Informational
  format: 'cef', // rfc3164, rfc5424, cef, or leef
  appName: 'Deeployd'
}

SIEM Format Examples:

CEF (Common Event Format) - ArcSight, Splunk:

CEF:0|Deeployd|A2A|1.0|a2a-message|Agent Communication|5|src=agent_001 dst=agent_002 msg=Request sent

LEEF (Log Event Extended Format) - IBM QRadar:

LEEF:1.0|Deeployd|A2A|1.0|a2a-message|src=agent_001	dst=agent_002	msg=Request sent

RFC5424:

<134>1 2024-01-15T10:00:00Z deeployd.com Deeployd - - - Agent communication: agent_001 -> agent_002

Webhook

Send logs to any HTTP endpoint for custom integrations.

{
  type: 'webhook',
  url: 'https://api.company.com/a2a-logs',
  method: 'POST',
  headers: {
    'X-Custom-Header': 'value'
  },
  authType: 'bearer', // none, bearer, basic, or hmac
  authToken: 'your-bearer-token',
  // For HMAC:
  // hmacSecret: 'shared-secret'
}

HMAC Signature: When using HMAC auth, a signature header is included:

X-Deeployd-Signature: sha256=<hex-encoded-hmac>
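
To verify the signature on your endpoint, recompute the HMAC over the raw request body and compare. A TypeScript (Node.js) sketch, assuming the signature is an HMAC-SHA256 of the raw body computed with the configured hmacSecret:

import crypto from 'node:crypto';

// Assumption: the signature is HMAC-SHA256 of the raw request body, hex-encoded,
// and sent as "sha256=<hex>" in the X-Deeployd-Signature header.
function verifySignature(rawBody: string, signatureHeader: string, hmacSecret: string): boolean {
  const expected = 'sha256=' + crypto.createHmac('sha256', hmacSecret).update(rawBody).digest('hex');
  // timingSafeEqual throws on unequal lengths, so guard first.
  if (expected.length !== signatureHeader.length) return false;
  return crypto.timingSafeEqual(Buffer.from(expected), Buffer.from(signatureHeader));
}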

Filtering

Control which A2A communications are exported:

const destination = await client.a2aExport.createDestination({
  name: 'Security SIEM',
  type: 'splunk',
  config: { ... },
  filter: {
    teamIds: ['team_security', 'team_finance'], // Specific teams
    agentIds: ['agent_001', 'agent_002'], // Specific agents
    threadStatuses: ['active', 'completed'], // Thread states
    messageTypes: ['request', 'handoff'], // Message types
    keywords: ['confidential', 'pii'], // Content keywords
    minPriority: 'high' // Priority threshold
  }
});

Batching & Delivery

Configure batching for optimal throughput:

{
  batchSize: 100, // Messages per batch (1-10000)
  flushIntervalMs: 30000, // Flush every 30 seconds
  maxRetries: 3, // Retry failed deliveries
  retryDelayMs: 1000 // Delay between retries
}

Delivery Guarantees

  • At-least-once delivery: Messages may be delivered more than once on retry, so deduplicate downstream (see the sketch after this list)
  • Ordering: Messages within a batch are ordered by timestamp
  • Retry with backoff: Exponential backoff on failures
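
Because delivery is at-least-once, treat the exported id as an idempotency key in your consumer. A minimal TypeScript sketch (your own consumer code; the in-memory set is illustrative, a real consumer would use a persistent store):

// Hypothetical consumer-side deduplication keyed on the exported message id.
const seen = new Set<string>();

function handleExportedRecord(record: { id: string }): void {
  if (seen.has(record.id)) return; // duplicate delivery from a retry; skip
  seen.add(record.id);
  // ... process the record (index it, forward it, alert on it, etc.)
}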

Export Data Format

Each exported message contains:

{
  "id": "msg_abc123",
  "timestamp": "2024-01-15T10:00:00.000Z",
  "threadId": "a2a_xyz789",
  "messageId": "msg_abc123",
  "tenantId": "tenant_001",
  "teamId": "team_sales",
  "senderAgentId": "agent_001",
  "senderAgentName": "Sales Analyst",
  "receiverAgentId": "agent_002",
  "receiverAgentName": "Data Processor",
  "messageType": "request",
  "content": "Analyze Q4 sales data...",
  "contentSummary": "Request for Q4 sales analysis",
  "threadTopic": "Q4 Sales Analysis",
  "threadStatus": "active",
  "tags": ["sales", "q4", "analysis"],
  "priority": "medium",
  "metadata": {
    "toolCalls": [...],
    "handoffToAgentId": null
  }
}
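
For typed consumers, the record can be modeled roughly as follows (a TypeScript sketch derived from the example above; the exact set of message types, priorities, and optional fields is an assumption):

// Sketch of the exported record shape, based on the example above.
interface A2AExportRecord {
  id: string;
  timestamp: string;            // ISO 8601
  threadId: string;
  messageId: string;
  tenantId: string;
  teamId: string;
  senderAgentId: string;
  senderAgentName: string;
  receiverAgentId: string;
  receiverAgentName: string;
  messageType: 'request' | 'handoff' | string;
  content: string;
  contentSummary: string;
  threadTopic: string;
  threadStatus: string;
  tags: string[];
  priority: 'medium' | 'high' | string;
  metadata: {
    toolCalls?: unknown[];
    handoffToAgentId: string | null;
  };
}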

API Reference

List Destinations

GET /api/a2a/export/destinations

Create Destination

POST /api/a2a/export/destinations
Content-Type: application/json

{
  "name": "Production SIEM",
  "type": "splunk",
  "enabled": true,
  "config": { ... },
  "filter": { ... },
  "batchSize": 100,
  "flushIntervalMs": 30000
}

Update Destination

PATCH /api/a2a/export/destinations/:id
Content-Type: application/json

{
  "enabled": false
}

Delete Destination

DELETE /api/a2a/export/destinations/:id

Test Destination

Send a test message to verify configuration:

POST /api/a2a/export/destinations/:id/test

Force Flush

Immediately send buffered messages:

POST /api/a2a/export/destinations/:id/flush
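
A quick TypeScript sketch of calling the test and flush endpoints with fetch (the base URL, bearer-token auth, and destination id are placeholders; substitute your own API host and credentials):

// Hypothetical values; replace with your API host, key, and destination id.
const baseUrl = 'https://api.deeployd.example.com';
const headers = { Authorization: `Bearer ${process.env.DEEPLOYD_API_KEY}` };
const destinationId = 'dest_abc123';

// Send a test message, then force-flush anything still buffered.
await fetch(`${baseUrl}/api/a2a/export/destinations/${destinationId}/test`, { method: 'POST', headers });
await fetch(`${baseUrl}/api/a2a/export/destinations/${destinationId}/flush`, { method: 'POST', headers });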

Security Considerations

Credential Storage

  • Credentials are encrypted at rest
  • API responses mask sensitive fields (tokens show first 4 characters only)
  • Use IAM roles for S3 instead of access keys when possible
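
For example, an S3 destination that relies on an IAM role simply omits the key fields (a sketch based on the S3 options above):

{
  type: 's3',
  bucket: 'company-deeployd-logs',
  region: 'us-east-1',
  prefix: 'a2a-logs/',
  format: 'jsonl',
  compression: 'gzip'
  // accessKeyId / secretAccessKey omitted: authentication falls back to the IAM role
}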

Network Security

  • Use TLS for all destination connections
  • Configure firewall rules to allow Deeployd IP ranges
  • Use private endpoints where available (AWS PrivateLink, etc.)

Data Privacy

  • Consider filtering PII before export
  • Use content summaries instead of full messages where appropriate
  • Apply data retention policies at the destination

Monitoring

Track export health via:

  1. Dashboard: Settings > Integrations > A2A Export
  2. Metrics: Export success/failure rates, latency, batch sizes
  3. Alerts: Configure alerts for export failures

Troubleshooting

Connection Failures

Error: Connection refused to splunk.company.com:8088
  • Verify firewall rules allow Deeployd IPs
  • Check destination service is running
  • Verify TLS certificates if using HTTPS

Authentication Errors

Error: 401 Unauthorized
  • Verify credentials are correct
  • Check token has not expired
  • Ensure API key has write permissions

Rate Limiting

Error: 429 Too Many Requests
  • Increase flushIntervalMs to reduce export frequency
  • Increase batchSize to send more messages per request (both can be changed on an existing destination, as shown below)
  • Contact the destination provider about raising its rate limits
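
A sketch of that adjustment via the update endpoint (the values are illustrative):

PATCH /api/a2a/export/destinations/:id
Content-Type: application/json

{
  "batchSize": 500,
  "flushIntervalMs": 60000
}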

Feature Availability

Feature                    Enterprise
Export Destinations        10
S3 Export                  Yes
Splunk Export              Yes
Datadog Export             Yes
Elasticsearch Export       Yes
Syslog Export              Yes
Webhook Export             Yes
Custom Filters             Yes
Parquet Format             Yes
SIEM Formats (CEF/LEEF)    Yes

Related: SIEM Integration | Audit Logs | Agent-to-Agent