Overview
Formal can forward all activity logs to your SIEM, data lake, or observability platform. This enables centralized log management, long-term retention, compliance reporting, and integration with your existing security tools.
By default, Formal forwards all logs to your configured destination.
Setup
AWS S3
1. Navigate to Log Integrations
2. Click Create Log Integration
3. Choose AWS S3 as your provider
4. Configure the integration:
   - S3 Bucket Name: your bucket name
   - Cloud Integration: select your AWS Cloud Integration
# First, create the S3 bucket
resource "aws_s3_bucket" "formal_logs" {
  bucket = "formal-connector-logs"
}

# Create the AWS Cloud Integration
resource "formal_integration_cloud" "aws" {
  name         = "aws-integration"
  cloud_region = "us-east-1"

  aws {
    template_version = "1.2.0"
    allow_s3_access  = true
    s3_bucket_arn    = "${aws_s3_bucket.formal_logs.arn}/*"
  }
}

# Deploy the CloudFormation stack (see the AWS Integration guide)
resource "aws_cloudformation_stack" "formal" {
  # ... (see AWS Integration docs)
}

# Create the log integration
resource "formal_integration_log" "s3" {
  name = "s3-log-drain"

  s3 {
    s3_bucket_name       = aws_s3_bucket.formal_logs.bucket
    cloud_integration_id = formal_integration_cloud.aws.id
  }
}
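Formal does not require any particular bucket configuration, but if your compliance posture calls for it, you can also encrypt the log bucket at rest. A minimal sketch, assuming AWS provider v4 or later (this resource is an optional addition, not part of the Formal setup):

```hcl
# Optional: server-side encryption for the log bucket.
resource "aws_s3_bucket_server_side_encryption_configuration" "formal_logs" {
  bucket = aws_s3_bucket.formal_logs.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "AES256" # or "aws:kms" with a kms_master_key_id
    }
  }
}
```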
Datadog
1. From your Datadog account, retrieve your Application Key, API Key, and Site (e.g., datadoghq.com, datadoghq.eu)
2. Navigate to Log Integrations
3. Click Create Log Integration
4. Choose Datadog as your provider
5. Enter your credentials:
   - Application Key: your Datadog Application Key
   - API Key: your Datadog API Key
   - Site: your Datadog site
resource "formal_integration_log" "datadog" {
  name = "datadog-log-drain"

  datadog {
    account_id = var.datadog_application_key
    api_key    = var.datadog_api_key
    site       = "datadoghq.com" # or datadoghq.eu, etc.
  }
}
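The example above reads the credentials from Terraform variables. One way to declare them (the names below simply match the references used in the example), marking them sensitive so the values are redacted from plan output:

```hcl
variable "datadog_application_key" {
  description = "Datadog Application Key"
  type        = string
  sensitive   = true
}

variable "datadog_api_key" {
  description = "Datadog API Key"
  type        = string
  sensitive   = true
}
```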
Datadog Dashboard
Import this dashboard template JSON for a sample of the analyses you can run with the Datadog log integration.
Splunk
1. In Splunk, create a new HTTP Event Collector (HEC) token
2. Navigate to Log Integrations
3. Click Create Log Integration
4. Choose Splunk as your provider
5. Enter the configuration:
   - Access Token: your HEC token
   - Host: your Splunk instance hostname
   - Port: HEC port (usually 8088)
resource "formal_integration_log" "splunk" {
  name = "splunk-log-drain"

  splunk {
    access_token = var.splunk_hec_token
    host         = "splunk.example.com"
    port         = 8088
  }
}
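As with the Datadog example, the HEC token referenced above can be declared as a sensitive variable so it never appears in plan output; a minimal sketch:

```hcl
variable "splunk_hec_token" {
  description = "Splunk HTTP Event Collector token"
  type        = string
  sensitive   = true
}
```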
Use Cases
Compliance and Auditing
Forward logs to long-term storage for compliance requirements:
# Archive to S3 for 7 years (SOC 2, HIPAA, etc.)
resource "formal_integration_log" "compliance_archive" {
  name = "compliance-s3-archive"

  s3 {
    s3_bucket_name       = aws_s3_bucket.compliance_logs.bucket
    cloud_integration_id = formal_integration_cloud.aws.id
  }
}

# Configure an S3 lifecycle policy
resource "aws_s3_bucket_lifecycle_configuration" "compliance" {
  bucket = aws_s3_bucket.compliance_logs.id

  rule {
    id     = "archive-old-logs"
    status = "Enabled"

    filter {} # apply to all objects (a filter or prefix is expected in AWS provider v4+)

    transition {
      days          = 90
      storage_class = "GLACIER"
    }

    expiration {
      days = 2555 # 7 years
    }
  }
}
Real-Time Security Monitoring
Send logs to your SIEM for real-time threat detection:
# Send to Datadog for real-time monitoring
resource "formal_integration_log" "security_monitoring" {
  name = "datadog-security"

  datadog {
    account_id = var.datadog_application_key
    api_key    = var.datadog_api_key
    site       = "datadoghq.com"
  }
}
Create alerts in Datadog for:
Failed authentication attempts
Policy violations
Unusual query patterns
Off-hours access
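With the Datadog Terraform provider, alerts like these can themselves be managed as code. A sketch of a log monitor; the search query (`source:formal status:error`) and the threshold are placeholders, and the real facets depend on how your logs are tagged in Datadog:

```hcl
# Hypothetical log monitor; tune the query to your log attributes.
resource "datadog_monitor" "formal_failed_auth" {
  name    = "Formal: spike in failed authentication attempts"
  type    = "log alert"
  message = "More than 10 authentication failures in 5 minutes."

  # Placeholder query: adjust the source/status facets to match your logs.
  query = "logs(\"source:formal status:error\").index(\"*\").rollup(\"count\").last(\"5m\") > 10"
}
```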
Data Lake Integration
Forward logs to your data lake for analytics:
# Send to the S3 data lake
resource "formal_integration_log" "data_lake" {
  name = "data-lake-integration"

  s3 {
    s3_bucket_name       = "my-data-lake-formal-logs"
    cloud_integration_id = formal_integration_cloud.aws.id
  }
}
Then use Athena, Redshift Spectrum, or Databricks to analyze:
User access patterns
Query performance
Policy effectiveness
Resource utilization
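To make the S3 logs queryable from Athena, one option is a Glue crawler that infers the table schema. A sketch, assuming an existing IAM role (`aws_iam_role.glue_crawler`, a hypothetical name) with read access to the bucket:

```hcl
# Hypothetical setup; assumes aws_iam_role.glue_crawler already exists.
resource "aws_glue_catalog_database" "formal_logs" {
  name = "formal_logs"
}

resource "aws_glue_crawler" "formal_logs" {
  name          = "formal-logs-crawler"
  database_name = aws_glue_catalog_database.formal_logs.name
  role          = aws_iam_role.glue_crawler.arn

  s3_target {
    path = "s3://my-data-lake-formal-logs/"
  }
}
```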
Multi-Destination Forwarding
Send logs to multiple destinations:
# Real-time monitoring
resource "formal_integration_log" "splunk_realtime" {
  name = "splunk-realtime"

  splunk {
    access_token = var.splunk_hec_token
    host         = "splunk.example.com"
    port         = 8088
  }
}

# Long-term archive
resource "formal_integration_log" "s3_archive" {
  name = "s3-archive"

  s3 {
    s3_bucket_name       = "formal-logs-archive"
    cloud_integration_id = formal_integration_cloud.aws.id
  }
}

# Security analytics
resource "formal_integration_log" "datadog_security" {
  name = "datadog-security"

  datadog {
    account_id = var.datadog_application_key
    api_key    = var.datadog_api_key
    site       = "datadoghq.com"
  }
}
Next Steps
AWS Integration Set up AWS Cloud Integration for S3 logs
View Logs Monitor logs in the Formal console