A high-performance webhook management and delivery service built in Go for the PyAirtable ecosystem. This service handles outgoing webhooks when events occur in PyAirtable, providing reliable event delivery with comprehensive retry mechanisms, circuit breakers, and rate limiting.
- CRUD Operations: Complete webhook lifecycle management
- Multi-tenant Support: Isolated webhook configurations per tenant
- Event Subscription: Subscribe to specific PyAirtable events
- URL Validation: Automatic webhook endpoint validation
- Custom Headers: Support for custom HTTP headers
- Filter Rules: Advanced filtering based on event data
- Asynchronous Processing: High-throughput event processing with worker pools
- Retry Logic: Exponential backoff with configurable retry attempts (see the sketch after this list)
- Circuit Breaker: Protection against failing endpoints
- Dead Letter Queue: Handling of permanently failed deliveries
- Delivery Tracking: Comprehensive delivery status and history
- HMAC-SHA256 Signing: Secure payload signing for authenticity
- Rate Limiting: Per-endpoint rate limiting to prevent abuse
- JWT Authentication: Secure API access with JWT tokens
- Health Monitoring: Endpoint health tracking and alerting
- Kafka Ready: Built-in Kafka integration for event streaming
- Event Filtering: Smart filtering to reduce unnecessary processing
- Scalable Architecture: Horizontally scalable event processing
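The retry behaviour above can be pictured in a few lines of Go. This is an illustrative sketch only (backoffDelay is not a function from the service's code): it shows exponential backoff from a base interval, capped at a maximum delay, using the documented defaults of 3 retries and a 60-second base interval.

package main

import (
    "fmt"
    "time"
)

// backoffDelay returns the wait before retry attempt n (0-based),
// doubling a base interval each attempt and capping the result.
func backoffDelay(attempt int, base, max time.Duration) time.Duration {
    d := base << attempt // base * 2^attempt
    if d <= 0 || d > max {
        return max
    }
    return d
}

func main() {
    // With a 60-second base interval and 3 attempts, this sketch
    // yields a schedule of 60s, 120s, 240s.
    for attempt := 0; attempt < 3; attempt++ {
        fmt.Println(backoffDelay(attempt, 60*time.Second, 10*time.Minute))
    }
}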
- POST /api/v1/webhooks - Create a new webhook
- GET /api/v1/webhooks - List webhooks (tenant-scoped)
- GET /api/v1/webhooks/:id - Get webhook details
- PUT /api/v1/webhooks/:id - Update webhook configuration
- DELETE /api/v1/webhooks/:id - Delete webhook
- POST /api/v1/webhooks/:id/test - Test webhook endpoint
- GET /api/v1/webhooks/:id/deliveries - Get delivery history
- POST /api/v1/webhooks/:id/retry - Retry failed deliveries
- GET /api/v1/webhooks/:id/stats - Get webhook statistics
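If you prefer calling the API from Go instead of curl, the standard library is enough. The snippet below is only a usage sketch: it lists webhooks using the same Authorization and X-Tenant-ID headers shown in the curl examples further down, with placeholder token and tenant values.

package main

import (
    "fmt"
    "io"
    "net/http"
)

func main() {
    // Placeholder credentials; substitute a real JWT and tenant ID.
    req, err := http.NewRequest("GET", "http://localhost:8080/api/v1/webhooks", nil)
    if err != nil {
        panic(err)
    }
    req.Header.Set("Authorization", "Bearer <jwt-token>")
    req.Header.Set("X-Tenant-ID", "your-tenant-id")

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()

    body, _ := io.ReadAll(resp.Body)
    fmt.Println(resp.Status)
    fmt.Println(string(body))
}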
- GET /health - Service health check
- GET /metrics - Prometheus metrics endpoint
- PORT - Server port (default: 8080)
- HOST - Server host (default: 0.0.0.0)
- LOG_LEVEL - Log level: debug, info, warn, error (default: info)

- DB_HOST - PostgreSQL host (default: localhost)
- DB_PORT - PostgreSQL port (default: 5432)
- DB_USER - Database user (default: postgres)
- DB_PASSWORD - Database password
- DB_NAME - Database name (default: webhook_service_db)
- DB_SSL_MODE - SSL mode: disable, require, verify-ca, verify-full (default: disable)

- REDIS_HOST - Redis host (default: localhost)
- REDIS_PORT - Redis port (default: 6379)
- REDIS_PASSWORD - Redis password (optional)
- REDIS_DB - Redis database number (default: 0)

- JWT_SECRET - JWT signing secret (required in production)
- JWT_TTL - JWT token TTL in seconds (default: 3600)

- KAFKA_ENABLED - Enable Kafka integration (default: false)
- KAFKA_BROKERS - Comma-separated list of Kafka brokers (default: localhost:9092)
- KAFKA_TOPIC - Kafka topic for webhook events (default: webhook-events)

- WEBHOOK_MAX_RETRIES - Maximum retry attempts (default: 3)
- WEBHOOK_DEFAULT_TIMEOUT - Default webhook timeout in seconds (default: 30)
- WEBHOOK_DEFAULT_RETRY_INTERVAL - Default retry interval in seconds (default: 60)
- WEBHOOK_WORKER_COUNT - Number of delivery workers (default: 10)
- WEBHOOK_BATCH_SIZE - Batch size for processing (default: 100)

- CORS_ALLOWED_ORIGINS - Allowed origins for CORS (default: *)
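Unset variables fall back to the defaults listed above. As a rough illustration of that lookup-with-fallback pattern (getEnv is a hypothetical helper, not necessarily the service's actual code):

package main

import (
    "fmt"
    "os"
)

// getEnv returns the value of an environment variable, or a fallback
// when the variable is unset or empty.
func getEnv(key, fallback string) string {
    if v := os.Getenv(key); v != "" {
        return v
    }
    return fallback
}

func main() {
    // Mirrors the documented defaults: PORT=8080, DB_HOST=localhost.
    fmt.Println(getEnv("PORT", "8080"))
    fmt.Println(getEnv("DB_HOST", "localhost"))
}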
- Go 1.21+
- PostgreSQL 12+
- Redis 6+ (optional, for caching)
- Kafka (optional, for event streaming)
- Clone the repository

  git clone <repository-url>
  cd webhookservice

- Install dependencies

  go mod download

- Set up environment variables

  cp .env.example .env
  # Edit .env with your configuration

- Set up the database

  # Create database
  createdb webhook_service_db
  # Run migrations (automatic on startup)

- Run the service

  go run cmd/webhook-service/main.go

- Using Docker Compose

  docker-compose up -d

- Using Docker

  docker build -t webhook-service .
  docker run -p 8080:8080 webhook-service
curl -X POST http://localhost:8080/api/v1/webhooks \
-H "Content-Type: application/json" \
-H "Authorization: Bearer <jwt-token>" \
-H "X-Tenant-ID: your-tenant-id" \
-d '{
"name": "Record Created Webhook",
"url": "https://your-app.com/webhooks/records",
"events": ["record.created", "record.updated"],
"description": "Webhook for record events",
"timeout": 30,
"max_retries": 3,
"filter_rules": [
{
"field": "table_id",
"operator": "eq",
"value": "tbl123456"
}
]
}'
curl -X POST http://localhost:8080/api/v1/webhooks/1/test \
-H "Content-Type: application/json" \
-H "Authorization: Bearer <jwt-token>" \
-H "X-Tenant-ID: your-tenant-id" \
-d '{
"event_type": "record.created",
"payload": {
"id": "rec123456",
"name": "Test Record",
"table_id": "tbl123456"
}
}'
Webhooks receive payloads in the following standardized format:
{
"id": "evt_1234567890",
"event": "record.created",
"created_at": "2024-01-15T10:30:00Z",
"data": {
"id": "rec123456",
"name": "Test Record",
"fields": {
"Name": "Test Record",
"Status": "Active"
}
},
"metadata": {
"tenant_id": "tenant-123",
"user_id": "user-456",
"source": "api",
"version": "1.0"
}
}
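A receiver written in Go could model this envelope with a struct such as the one below. This is a consumer-side sketch rather than a type exported by the service; the field names simply mirror the JSON above, and the event-specific data object is kept as raw JSON for the caller to decode.

package main

import (
    "encoding/json"
    "fmt"
    "time"
)

// WebhookEvent mirrors the payload envelope shown above. The "data" object
// varies by event type, so it is left as raw JSON.
type WebhookEvent struct {
    ID        string          `json:"id"`
    Event     string          `json:"event"`
    CreatedAt time.Time       `json:"created_at"`
    Data      json.RawMessage `json:"data"`
    Metadata  struct {
        TenantID string `json:"tenant_id"`
        UserID   string `json:"user_id"`
        Source   string `json:"source"`
        Version  string `json:"version"`
    } `json:"metadata"`
}

func main() {
    raw := []byte(`{"id":"evt_1234567890","event":"record.created","created_at":"2024-01-15T10:30:00Z","data":{"id":"rec123456"},"metadata":{"tenant_id":"tenant-123","user_id":"user-456","source":"api","version":"1.0"}}`)

    var evt WebhookEvent
    if err := json.Unmarshal(raw, &evt); err != nil {
        panic(err)
    }
    fmt.Println(evt.Event, evt.Metadata.TenantID)
}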
All webhook payloads are signed with HMAC-SHA256. Verify the signature using the X-Webhook-Signature
header:
import hmac
import hashlib
def verify_webhook_signature(payload, signature, secret):
expected_signature = 'sha256=' + hmac.new(
secret.encode('utf-8'),
payload.encode('utf-8'),
hashlib.sha256
).hexdigest()
return hmac.compare_digest(signature, expected_signature)
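The equivalent check in Go, for receivers written in that language, assuming the same "sha256=" + hex digest format shown in the Python example:

package main

import (
    "crypto/hmac"
    "crypto/sha256"
    "encoding/hex"
    "fmt"
)

// verifyWebhookSignature recomputes the HMAC-SHA256 of the raw request body
// and compares it, in constant time, against the X-Webhook-Signature value.
func verifyWebhookSignature(payload []byte, signature, secret string) bool {
    mac := hmac.New(sha256.New, []byte(secret))
    mac.Write(payload)
    expected := "sha256=" + hex.EncodeToString(mac.Sum(nil))
    return hmac.Equal([]byte(signature), []byte(expected))
}

func main() {
    // Illustrative values only.
    body := []byte(`{"id":"evt_1234567890"}`)
    fmt.Println(verifyWebhookSignature(body, "sha256=...", "my-secret"))
}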
The service supports the following PyAirtable events:
- record.created - New record created
- record.updated - Record updated
- record.deleted - Record deleted
- table.created - New table created
- table.updated - Table schema updated
- table.deleted - Table deleted
- base.created - New base created
- base.updated - Base updated
- base.deleted - Base deleted
- GET /health - Basic service health
- Database connectivity check
- Redis connectivity check (if enabled)
- Kafka connectivity check (if enabled)
- HTTP request metrics
- Webhook delivery success/failure rates
- Circuit breaker states
- Rate limiting metrics
- Event processing metrics
Structured logging with configurable levels:
- Request/response logging
- Webhook delivery attempts
- Error tracking and debugging
- Performance metrics
# Run unit tests
go test ./test/unit/...
# Run integration tests
go test ./test/integration/...
# Run all tests with coverage
go test -v -race -coverprofile=coverage.out ./...
# Generate mocks for testing
go generate ./...
# Build for current platform
go build -o webhook-service cmd/webhook-service/main.go
# Build for Linux
GOOS=linux GOARCH=amd64 go build -o webhook-service-linux cmd/webhook-service/main.go
- Event Processing: 10,000+ events/second
- Webhook Deliveries: 1,000+ concurrent deliveries
- API Requests: 5,000+ requests/second
- API Response Time: < 10ms (95th percentile)
- Event Processing Latency: < 100ms (95th percentile)
- Webhook Delivery: < 500ms (95th percentile, excluding remote latency)
- Horizontal Scaling: Stateless design for easy scaling
- Database: Optimized queries with proper indexing
- Memory Usage: < 100MB base memory footprint
- Worker Pools: Configurable concurrency levels
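The worker-pool model mentioned above comes down to a fixed number of goroutines draining a shared job queue. The sketch below illustrates the pattern only (it is not the service's delivery code); the worker count plays the role of WEBHOOK_WORKER_COUNT.

package main

import (
    "fmt"
    "sync"
)

func main() {
    const workerCount = 10 // analogous to WEBHOOK_WORKER_COUNT
    jobs := make(chan string, 100)
    var wg sync.WaitGroup

    // Start a fixed pool of workers draining the shared queue.
    for i := 0; i < workerCount; i++ {
        wg.Add(1)
        go func(id int) {
            defer wg.Done()
            for url := range jobs {
                // A real worker would POST the event to url, with retries.
                fmt.Printf("worker %d delivering to %s\n", id, url)
            }
        }(i)
    }

    for _, url := range []string{"https://a.example/hook", "https://b.example/hook"} {
        jobs <- url
    }
    close(jobs)
    wg.Wait()
}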
- Authentication: JWT-based API authentication
- Authorization: Tenant-based resource isolation
- Input Validation: Comprehensive request validation
- Rate Limiting: Per-tenant and per-endpoint limits
- Payload Signing: HMAC-SHA256 webhook signatures
- HTTPS: TLS encryption for all communications
- Database: Prepared statements prevent SQL injection
- Webhook Deliveries Failing
  - Check endpoint URL accessibility
  - Verify webhook signature validation
  - Review rate limiting settings
  - Check circuit breaker status

- High Memory Usage
  - Reduce worker count
  - Decrease batch sizes
  - Check for memory leaks in event processing

- Database Connection Issues
  - Verify database credentials
  - Check connection pool settings
  - Monitor database performance

- Kafka Integration Issues
  - Verify Kafka broker connectivity
  - Check topic permissions
  - Review consumer group settings
Enable debug logging for detailed information:
export LOG_LEVEL=debug
Check service metrics:
curl http://localhost:8080/metrics
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Run the test suite
- Submit a pull request
This project is part of the PyAirtable ecosystem and is subject to the PyAirtable license terms.
For issues and questions:
- GitHub Issues: Repository Issues
- Documentation: PyAirtable Docs
- Community: PyAirtable Community