feat!: Add support for Log Analytics and Remove BigQuery log destination #1025

Merged
2 changes: 1 addition & 1 deletion 1-org/README.md
@@ -81,7 +81,7 @@ to Bigquery and Pub/Sub. This will result in additional charges for those copies

- This module implements but does not enable [bucket policy retention](https://cloud.google.com/storage/docs/bucket-lock) for organization logs. If needed, enable a retention policy by configuring the `log_export_storage_retention_policy` variable.

- This module implements but does not enable [object versioning](https://cloud.google.com/storage/docs/object-versioning) for organization logs. If needed, enable object versioning by setting the `audit_logs_table_delete_contents_on_destroy` variable to true.
- This module implements but does not enable [object versioning](https://cloud.google.com/storage/docs/object-versioning) for organization logs. If needed, enable object versioning by setting the `log_export_storage_versioning` variable to true.

- Bucket policy retention and object versioning are **mutually exclusive**.
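
For example, a minimal `terraform.tfvars` sketch for this step (values are illustrative; the retention-policy field names are assumptions to be checked against `variables.tf`):

```hcl
# Enable object versioning for the org log export bucket.
log_export_storage_versioning = true

# Mutually exclusive alternative: a bucket retention policy instead of versioning.
# (Field names are assumptions; see the log_export_storage_retention_policy type.)
# log_export_storage_retention_policy = {
#   is_locked             = false
#   retention_period_days = 365
# }
```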

6 changes: 2 additions & 4 deletions 1-org/envs/shared/README.md
@@ -4,8 +4,6 @@
| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| audit\_data\_users | Google Workspace or Cloud Identity group that has access to audit logs. | `string` | n/a | yes |
| audit\_logs\_table\_delete\_contents\_on\_destroy | (Optional) If set to true, delete all the tables in the dataset when destroying the resource; otherwise, destroying the resource will fail if tables are present. | `bool` | `false` | no |
| audit\_logs\_table\_expiration\_days | Period before tables expire for all audit logs in milliseconds. Default is 30 days. | `number` | `30` | no |
| billing\_data\_users | Google Workspace or Cloud Identity group that has access to the billing data set. | `string` | n/a | yes |
| billing\_export\_dataset\_location | The location of the dataset for billing data export. | `string` | `"US"` | no |
| create\_access\_context\_manager\_access\_policy | Whether to create access context manager access policy. | `bool` | `true` | no |
@@ -37,8 +35,8 @@
| domains\_to\_allow | The list of domains to allow users from in IAM. |
| interconnect\_project\_id | The Dedicated Interconnect project ID |
| interconnect\_project\_number | The Dedicated Interconnect project number |
| logs\_export\_bigquery\_dataset\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets |
| logs\_export\_logbucket\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets |
| logs\_export\_logbucket\_linked\_dataset\_name | The resource name of the Log Bucket linked BigQuery dataset created for Log Analytics. See https://cloud.google.com/logging/docs/log-analytics . |
| logs\_export\_logbucket\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets . |
| logs\_export\_pubsub\_topic | The Pub/Sub topic for destination of log exports |
| logs\_export\_storage\_bucket\_name | The storage bucket for destination of log exports |
| org\_audit\_logs\_project\_id | The org audit logs project ID |
34 changes: 13 additions & 21 deletions 1-org/envs/shared/log_sinks.tf
@@ -18,15 +18,16 @@ locals {
parent_resource_id = local.parent_folder != "" ? local.parent_folder : local.org_id
parent_resource_type = local.parent_folder != "" ? "folder" : "organization"
parent_resources = { resource = local.parent_resource_id }
main_logs_filter = <<EOF
logs_filter = <<EOF
logName: /logs/cloudaudit.googleapis.com%2Factivity OR
logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR
logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR
logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency OR
logName: /logs/cloudaudit.googleapis.com%2Fpolicy OR
logName: /logs/compute.googleapis.com%2Fvpc_flows OR
logName: /logs/compute.googleapis.com%2Ffirewall OR
logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency
logName: /logs/dns.googleapis.com%2Fdns_queries
EOF
all_logs_filter = ""
}

resource "random_string" "suffix" {
@@ -42,22 +43,11 @@ module "logs_export" {
resource_type = local.parent_resource_type
logging_destination_project_id = module.org_audit_logs.project_id

/******************************************
Send logs to BigQuery
*****************************************/
bigquery_options = {
logging_sink_name = "sk-c-logging-bq"
logging_sink_filter = local.main_logs_filter
dataset_name = "audit_logs"
expiration_days = var.audit_logs_table_expiration_days
delete_contents_on_destroy = var.audit_logs_table_delete_contents_on_destroy
}

/******************************************
Send logs to Storage
*****************************************/
storage_options = {
logging_sink_filter = local.all_logs_filter
logging_sink_filter = local.logs_filter
logging_sink_name = "sk-c-logging-bkt"
storage_bucket_name = "bkt-${module.org_audit_logs.project_id}-org-logs-${random_string.suffix.result}"
location = var.log_export_storage_location
@@ -72,7 +62,7 @@ module "logs_export"
Send logs to Pub/Sub
*****************************************/
pubsub_options = {
logging_sink_filter = local.main_logs_filter
logging_sink_filter = local.logs_filter
logging_sink_name = "sk-c-logging-pub"
topic_name = "tp-org-logs-${random_string.suffix.result}"
create_subscriber = true
@@ -82,14 +72,16 @@
Send logs to Logbucket
*****************************************/
logbucket_options = {
logging_sink_name = "sk-c-logging-logbkt"
logging_sink_filter = local.all_logs_filter
name = "logbkt-org-logs-${random_string.suffix.result}"
location = local.default_region
logging_sink_name = "sk-c-logging-logbkt"
logging_sink_filter = local.logs_filter
name = "logbkt-org-logs-${random_string.suffix.result}"
location = local.default_region
enable_analytics = true
linked_dataset_id = "ds_c_logbkt_analytics"
linked_dataset_description = "BigQuery Dataset for Logbucket analytics"
}
}
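
Since the BigQuery sink is gone, downstream consumers should read the new `logs_export_logbucket_linked_dataset_name` output instead of the removed `logs_export_bigquery_dataset_name`. A hypothetical consumer sketch, assuming a GCS state backend (bucket and prefix are placeholders):

```hcl
# Read the shared environment's outputs from remote state.
data "terraform_remote_state" "org" {
  backend = "gcs"
  config = {
    bucket = "my-tf-state-bucket"  # placeholder: your foundation state bucket
    prefix = "terraform/org/state" # placeholder: the 1-org state prefix
  }
}

# Resource name of the BigQuery dataset linked to the org log bucket.
output "org_logbucket_linked_dataset" {
  value = data.terraform_remote_state.org.outputs.logs_export_logbucket_linked_dataset_name
}
```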


/******************************************
Billing logs (Export configured manually)
*****************************************/
8 changes: 4 additions & 4 deletions 1-org/envs/shared/outputs.tf
@@ -106,12 +106,12 @@ output "logs_export_storage_bucket_name" {

output "logs_export_logbucket_name" {
value = module.logs_export.logbucket_destination_name
description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets"
description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets ."
}

output "logs_export_bigquery_dataset_name" {
value = module.logs_export.bigquery_destination_name
description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets"
output "logs_export_logbucket_linked_dataset_name" {
value = module.logs_export.logbucket_linked_dataset_name
description = "The resource name of the Log Bucket linked BigQuery dataset created for Log Analytics. See https://cloud.google.com/logging/docs/log-analytics ."
}

output "tags" {
12 changes: 0 additions & 12 deletions 1-org/envs/shared/variables.tf
@@ -35,12 +35,6 @@ variable "domains_to_allow" {
type = list(string)
}

variable "audit_logs_table_expiration_days" {
description = "Period before tables expire for all audit logs in milliseconds. Default is 30 days."
type = number
default = 30
}

variable "scc_notification_name" {
description = "Name of the Security Command Center Notification. It must be unique in the organization. Run `gcloud scc notifications describe <scc_notification_name> --organization=org_id` to check if it already exists."
type = string
@@ -94,12 +88,6 @@ variable "log_export_storage_versioning" {
default = false
}

variable "audit_logs_table_delete_contents_on_destroy" {
description = "(Optional) If set to true, delete all the tables in the dataset when destroying the resource; otherwise, destroying the resource will fail if tables are present."
type = bool
default = false
}

variable "log_export_storage_retention_policy" {
description = "Configuration of the bucket's data retention policy for how long objects in the bucket should be retained."
type = object({
9 changes: 6 additions & 3 deletions 1-org/modules/centralized-logging/README.md
@@ -1,6 +1,6 @@
# Centralized Logging Module

This module handles logging configuration enabling one or more resources such as organization, folders, or projects to send logs to multiple destinations: [GCS bucket](https://cloud.google.com/logging/docs/export/using_exported_logs#gcs-overview), [Big Query](https://cloud.google.com/logging/docs/export/bigquery), [Pub/Sub](https://cloud.google.com/logging/docs/export/using_exported_logs#pubsub-overview), and [Log Buckets](https://cloud.google.com/logging/docs/routing/overview#buckets).
This module handles logging configuration enabling one or more resources such as organization, folders, or projects to send logs to multiple destinations: [GCS bucket](https://cloud.google.com/logging/docs/export/using_exported_logs#gcs-overview), [Pub/Sub](https://cloud.google.com/logging/docs/export/using_exported_logs#pubsub-overview), and [Log Buckets](https://cloud.google.com/logging/docs/routing/overview#buckets) with [Log Analytics](https://cloud.google.com/logging/docs/log-analytics#analytics).

## Usage

@@ -33,9 +33,11 @@ module "logs_export" {
logName: /logs/cloudaudit.googleapis.com%2Factivity OR
logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR
logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR
logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency OR
logName: /logs/cloudaudit.googleapis.com%2Fpolicy OR
logName: /logs/compute.googleapis.com%2Fvpc_flows OR
logName: /logs/compute.googleapis.com%2Ffirewall OR
logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency
logName: /logs/dns.googleapis.com%2Fdns_queries
EOF
}
}
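
With the new `logbucket_options` fields, a caller can turn on Log Analytics and attach a linked BigQuery dataset to the destination log bucket. A sketch under assumed paths and IDs (every name below is a placeholder):

```hcl
module "logs_export_with_analytics" {
  source = "../../modules/centralized-logging" # assumption: relative path to this module

  resources                      = { org = "123456789012" } # placeholder org ID
  resource_type                  = "organization"
  logging_destination_project_id = "my-logging-project" # placeholder

  logbucket_options = {
    name                       = "logbkt-org-logs"
    logging_sink_name          = "sk-logbkt-analytics"
    enable_analytics           = true
    linked_dataset_id          = "ds_logbkt_analytics"
    linked_dataset_description = "BigQuery dataset linked to the org log bucket"
  }
}
```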
@@ -73,7 +75,7 @@ module "logging_logbucket" {
| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| bigquery\_options | Destination BigQuery options:<br>- dataset\_name: The name of the bigquery dataset to be created and used for log entries.<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.<br>- expiration\_days: Table expiration time. If null logs will never be deleted.<br>- partitioned\_tables: Options that affect sinks exporting data to BigQuery. use\_partitioned\_tables - (Required) Whether to use BigQuery's partition tables.<br>- delete\_contents\_on\_destroy: If set to true, delete all contained objects in the logging destination. | <pre>object({<br> dataset_name = optional(string, null)<br> logging_sink_name = optional(string, null)<br> logging_sink_filter = optional(string, "")<br> expiration_days = optional(number, null)<br> partitioned_tables = optional(bool, true)<br> delete_contents_on_destroy = optional(bool, false)<br> })</pre> | `null` | no |
| logbucket\_options | Destination LogBucket options:<br>- name: The name of the log bucket to be created and used for log entries matching the filter.<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.<br>- location: The location of the log bucket. Default: global.<br>- retention\_days: The number of days data should be retained for the log bucket. Default 30. | <pre>object({<br> name = optional(string, null)<br> logging_sink_name = optional(string, null)<br> logging_sink_filter = optional(string, "")<br> location = optional(string, "global")<br> retention_days = optional(number, 30)<br> })</pre> | `null` | no |
| logbucket\_options | Destination LogBucket options:<br>- name: The name of the log bucket to be created and used for log entries matching the filter.<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.<br>- location: The location of the log bucket. Default: global.<br>- enable\_analytics: Whether or not Log Analytics is enabled. A Log bucket with Log Analytics enabled can be queried in the Log Analytics page using SQL queries. Cannot be disabled once enabled.<br>- linked\_dataset\_id: The ID of the linked BigQuery dataset. A valid linked dataset ID must only have alphanumeric characters and underscores within it and have up to 100 characters.<br>- linked\_dataset\_description: A user-friendly description of the linked BigQuery dataset. The maximum length of the description is 8000 characters.<br>- retention\_days: The number of days data should be retained for the log bucket. Default 30. | <pre>object({<br> name = optional(string, null)<br> logging_sink_name = optional(string, null)<br> logging_sink_filter = optional(string, "")<br> location = optional(string, "global")<br> enable_analytics = optional(bool, true)<br> linked_dataset_id = optional(string, null)<br> linked_dataset_description = optional(string, null)<br> retention_days = optional(number, 30)<br> })</pre> | `null` | no |
| logging\_destination\_project\_id | The ID of the project that will have the resources where the logs will be created. | `string` | n/a | yes |
| logging\_project\_key | (Optional) The key of logging destination project if it is inside resources map. It is mandatory when resource\_type = project and logging\_target\_type = logbucket. | `string` | `""` | no |
| pubsub\_options | Destination Pubsub options:<br>- topic\_name: The name of the pubsub topic to be created and used for log entries matching the filter.<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.<br>- create\_subscriber: Whether to create a subscription to the topic that was created and used for log entries matching the filter. If 'true', a pull subscription is created along with a service account that is granted roles/pubsub.subscriber and roles/pubsub.viewer to the topic. | <pre>object({<br> topic_name = optional(string, null)<br> logging_sink_name = optional(string, null)<br> logging_sink_filter = optional(string, "")<br> create_subscriber = optional(bool, true)<br> })</pre> | `null` | no |
@@ -87,6 +89,7 @@ module "logging_logbucket" {
|------|-------------|
| bigquery\_destination\_name | The resource name for the destination BigQuery. |
| logbucket\_destination\_name | The resource name for the destination Log Bucket. |
| logbucket\_linked\_dataset\_name | The resource name of the Log Bucket linked BigQuery dataset. |
| pubsub\_destination\_name | The resource name for the destination Pub/Sub. |
| storage\_destination\_name | The resource name for the destination Storage. |

5 changes: 4 additions & 1 deletion 1-org/modules/centralized-logging/main.tf
@@ -100,14 +100,17 @@ module "log_export" {
#-------------------------#
module "destination_logbucket" {
source = "terraform-google-modules/log-export/google//modules/logbucket"
version = "~> 7.5.0"
version = "~> 7.7"

count = var.logbucket_options != null ? 1 : 0

project_id = var.logging_destination_project_id
name = coalesce(var.logbucket_options.name, local.logging_tgt_name.lbk)
log_sink_writer_identity = module.log_export["${local.value_first_resource}_lbk"].writer_identity
location = var.logbucket_options.location
enable_analytics = var.logbucket_options.enable_analytics
linked_dataset_id = var.logbucket_options.linked_dataset_id
linked_dataset_description = var.logbucket_options.linked_dataset_description
retention_days = var.logbucket_options.retention_days
grant_write_permission_on_bkt = false
}
5 changes: 5 additions & 0 deletions 1-org/modules/centralized-logging/outputs.tf
@@ -33,3 +33,8 @@ output "logbucket_destination_name" {
description = "The resource name for the destination Log Bucket."
value = try(module.destination_logbucket[0].resource_name, "")
}

output "logbucket_linked_dataset_name" {
description = "The resource name of the Log Bucket linked BigQuery dataset."
value = try(module.destination_logbucket[0].linked_dataset_name, "")
}
16 changes: 11 additions & 5 deletions 1-org/modules/centralized-logging/variables.tf
@@ -55,14 +55,20 @@ Destination LogBucket options:
- logging_sink_name: The name of the log sink to be created.
- logging_sink_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.
- location: The location of the log bucket. Default: global.
- enable_analytics: Whether or not Log Analytics is enabled. A Log bucket with Log Analytics enabled can be queried in the Log Analytics page using SQL queries. Cannot be disabled once enabled.
- linked_dataset_id: The ID of the linked BigQuery dataset. A valid linked dataset ID must only have alphanumeric characters and underscores within it and have up to 100 characters.
- linked_dataset_description: A user-friendly description of the linked BigQuery dataset. The maximum length of the description is 8000 characters.
- retention_days: The number of days data should be retained for the log bucket. Default 30.
EOT
type = object({
name = optional(string, null)
logging_sink_name = optional(string, null)
logging_sink_filter = optional(string, "")
location = optional(string, "global")
retention_days = optional(number, 30)
name = optional(string, null)
logging_sink_name = optional(string, null)
logging_sink_filter = optional(string, "")
location = optional(string, "global")
enable_analytics = optional(bool, true)
linked_dataset_id = optional(string, null)
linked_dataset_description = optional(string, null)
retention_days = optional(number, 30)
})
default = null
}
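
Because `enable_analytics` defaults to `true` and, per the description above, cannot be disabled once enabled, callers that do not want Log Analytics must opt out explicitly at creation time. A sketch (module path and project IDs are placeholders):

```hcl
module "logging_logbucket" {
  source = "../centralized-logging" # assumption: relative path to this module

  resources                      = { prj = "my-logging-project" } # placeholder
  resource_type                  = "project"
  logging_destination_project_id = "my-logging-project" # placeholder
  logging_project_key            = "prj"

  logbucket_options = {
    enable_analytics = false # opt out up front; enabling later is a one-way door
    retention_days   = 90
  }
}
```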