feat!: Add support for Log Analytics and Remove BigQuery log destination (#1025)

daniel-cit and gtsorbo authored Dec 13, 2023
Co-authored-by: Grant Sorbo <[email protected]>
1 parent ee3a1d8 commit 25c61c4
Showing 10 changed files with 59 additions and 153 deletions.
2 changes: 1 addition & 1 deletion 1-org/README.md
@@ -81,7 +81,7 @@ to Bigquery and Pub/Sub. This will result in additional charges for those copies

- This module implements but does not enable [bucket policy retention](https://cloud.google.com/storage/docs/bucket-lock) for organization logs. If needed, enable a retention policy by configuring the `log_export_storage_retention_policy` variable.

-- This module implements but does not enable [object versioning](https://cloud.google.com/storage/docs/object-versioning) for organization logs. If needed, enable object versioning by setting the `audit_logs_table_delete_contents_on_destroy` variable to true.
+- This module implements but does not enable [object versioning](https://cloud.google.com/storage/docs/object-versioning) for organization logs. If needed, enable object versioning by setting the `log_export_storage_versioning` variable to true.

- Bucket policy retention and object versioning are **mutually exclusive**.

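The corrected bullet above points at `log_export_storage_versioning`. As a minimal sketch of how a deployer would act on it (not part of this commit; values are illustrative), the corresponding `terraform.tfvars` entry for 1-org/envs/shared would be:

```hcl
# terraform.tfvars for 1-org/envs/shared -- illustrative values only

# Enable object versioning on the organization log export bucket.
log_export_storage_versioning = true

# Mutually exclusive with versioning: leave the retention policy unset.
# If retention is needed instead, keep versioning false. The field names
# below are assumed from the variable's object type, not from this diff.
# log_export_storage_retention_policy = {
#   is_locked             = false
#   retention_period_days = 365
# }
```

Setting both would fail at apply time: Cloud Storage rejects enabling object versioning on a bucket that has a retention policy.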
6 changes: 2 additions & 4 deletions 1-org/envs/shared/README.md
@@ -4,8 +4,6 @@
| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| audit\_data\_users | Google Workspace or Cloud Identity group that have access to audit logs. | `string` | n/a | yes |
-| audit\_logs\_table\_delete\_contents\_on\_destroy | (Optional) If set to true, delete all the tables in the dataset when destroying the resource; otherwise, destroying the resource will fail if tables are present. | `bool` | `false` | no |
-| audit\_logs\_table\_expiration\_days | Period before tables expire for all audit logs in milliseconds. Default is 30 days. | `number` | `30` | no |
| billing\_data\_users | Google Workspace or Cloud Identity group that have access to billing data set. | `string` | n/a | yes |
| billing\_export\_dataset\_location | The location of the dataset for billing data export. | `string` | `"US"` | no |
| cai\_monitoring\_kms\_force\_destroy | If set to true, delete KMS keyring and keys when destroying the module; otherwise, destroying the module will fail if KMS keys are present. | `bool` | `false` | no |
@@ -42,8 +40,8 @@
| domains\_to\_allow | The list of domains to allow users from in IAM. |
| interconnect\_project\_id | The Dedicated Interconnect project ID |
| interconnect\_project\_number | The Dedicated Interconnect project number |
-| logs\_export\_bigquery\_dataset\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets |
-| logs\_export\_logbucket\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets |
+| logs\_export\_logbucket\_linked\_dataset\_name | The resource name of the Log Bucket linked BigQuery dataset created for Log Analytics. See https://cloud.google.com/logging/docs/log-analytics . |
+| logs\_export\_logbucket\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets . |
| logs\_export\_pubsub\_topic | The Pub/Sub topic for destination of log exports |
| logs\_export\_storage\_bucket\_name | The storage bucket for destination of log exports |
| network\_folder\_name | The network folder name. |
34 changes: 13 additions & 21 deletions 1-org/envs/shared/log_sinks.tf
@@ -18,15 +18,16 @@ locals {
parent_resource_id = local.parent_folder != "" ? local.parent_folder : local.org_id
parent_resource_type = local.parent_folder != "" ? "folder" : "organization"
parent_resources = { resource = local.parent_resource_id }
-main_logs_filter = <<EOF
+logs_filter = <<EOF
logName: /logs/cloudaudit.googleapis.com%2Factivity OR
logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR
logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR
+logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency OR
+logName: /logs/cloudaudit.googleapis.com%2Fpolicy OR
logName: /logs/compute.googleapis.com%2Fvpc_flows OR
logName: /logs/compute.googleapis.com%2Ffirewall OR
-logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency
+logName: /logs/dns.googleapis.com%2Fdns_queries
EOF
-all_logs_filter = ""
}

resource "random_string" "suffix" {
@@ -42,22 +43,11 @@ module "logs_export" {
resource_type = local.parent_resource_type
logging_destination_project_id = module.org_audit_logs.project_id

-/******************************************
-  Send logs to BigQuery
-*****************************************/
-bigquery_options = {
-  logging_sink_name = "sk-c-logging-bq"
-  logging_sink_filter = local.main_logs_filter
-  dataset_name = "audit_logs"
-  expiration_days = var.audit_logs_table_expiration_days
-  delete_contents_on_destroy = var.audit_logs_table_delete_contents_on_destroy
-}

/******************************************
Send logs to Storage
*****************************************/
storage_options = {
-logging_sink_filter = local.all_logs_filter
+logging_sink_filter = local.logs_filter
logging_sink_name = "sk-c-logging-bkt"
storage_bucket_name = "bkt-${module.org_audit_logs.project_id}-org-logs-${random_string.suffix.result}"
location = var.log_export_storage_location
@@ -72,7 +62,7 @@ module "logs_export" {
Send logs to Pub\Sub
*****************************************/
pubsub_options = {
-logging_sink_filter = local.main_logs_filter
+logging_sink_filter = local.logs_filter
logging_sink_name = "sk-c-logging-pub"
topic_name = "tp-org-logs-${random_string.suffix.result}"
create_subscriber = true
@@ -82,14 +72,16 @@ module "logs_export" {
Send logs to Logbucket
*****************************************/
logbucket_options = {
-logging_sink_name = "sk-c-logging-logbkt"
-logging_sink_filter = local.all_logs_filter
-name = "logbkt-org-logs-${random_string.suffix.result}"
-location = local.default_region
+logging_sink_name = "sk-c-logging-logbkt"
+logging_sink_filter = local.logs_filter
+name = "logbkt-org-logs-${random_string.suffix.result}"
+location = local.default_region
+enable_analytics = true
+linked_dataset_id = "ds_c_logbkt_analytics"
+linked_dataset_description = "BigQuery Dataset for Logbucket analytics"
}
}


/******************************************
Billing logs (Export configured manually)
*****************************************/
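For context on what the new `logbucket_options` fields do: `enable_analytics = true` upgrades the log bucket for Log Analytics (irreversible once set), and the linked dataset exposes the bucket to BigQuery for SQL queries. A rough sketch of the underlying provider resources the log-export logbucket submodule manages on the module's behalf (assuming a recent google provider; project and bucket names are illustrative, not this module's internals):

```hcl
resource "google_logging_project_bucket_config" "org_logs" {
  project          = "prj-c-logging"        # illustrative logging project ID
  location         = "us-central1"          # stands in for local.default_region
  bucket_id        = "logbkt-org-logs-abcd" # illustrative bucket name
  retention_days   = 30
  enable_analytics = true # cannot be disabled once enabled
}

# Linked BigQuery dataset that makes the log bucket queryable with SQL.
resource "google_logging_linked_dataset" "analytics" {
  link_id     = "ds_c_logbkt_analytics"
  bucket      = google_logging_project_bucket_config.org_logs.id
  description = "BigQuery Dataset for Logbucket analytics"
}
```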
8 changes: 4 additions & 4 deletions 1-org/envs/shared/outputs.tf
@@ -111,12 +111,12 @@ output "logs_export_storage_bucket_name" {

output "logs_export_logbucket_name" {
value = module.logs_export.logbucket_destination_name
description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets"
description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets ."
}

output "logs_export_bigquery_dataset_name" {
value = module.logs_export.bigquery_destination_name
description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets"
output "logs_export_logbucket_linked_dataset_name" {
value = module.logs_export.logbucket_linked_dataset_name
description = "The resource name of the Log Bucket linked BigQuery dataset created for Log Analytics. See https://cloud.google.com/logging/docs/log-analytics ."
}

output "tags" {
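Downstream stages that previously consumed `logs_export_bigquery_dataset_name` would switch to the new output. A hedged sketch of reading it from this environment's remote state (backend bucket and prefix are placeholders, not taken from this commit):

```hcl
data "terraform_remote_state" "org" {
  backend = "gcs"
  config = {
    bucket = "my-tf-state-bucket" # placeholder state bucket
    prefix = "terraform/org/envs/shared"
  }
}

locals {
  # Resembles "projects/.../locations/.../buckets/.../links/ds_c_logbkt_analytics"
  linked_dataset = data.terraform_remote_state.org.outputs.logs_export_logbucket_linked_dataset_name
}
```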
12 changes: 0 additions & 12 deletions 1-org/envs/shared/variables.tf
@@ -35,12 +35,6 @@ variable "domains_to_allow" {
type = list(string)
}

variable "audit_logs_table_expiration_days" {
description = "Period before tables expire for all audit logs in milliseconds. Default is 30 days."
type = number
default = 30
}

variable "scc_notification_name" {
description = "Name of the Security Command Center Notification. It must be unique in the organization. Run `gcloud scc notifications describe <scc_notification_name> --organization=org_id` to check if it already exists."
type = string
@@ -94,12 +88,6 @@ variable "log_export_storage_versioning" {
default = false
}

variable "audit_logs_table_delete_contents_on_destroy" {
description = "(Optional) If set to true, delete all the tables in the dataset when destroying the resource; otherwise, destroying the resource will fail if tables are present."
type = bool
default = false
}

variable "log_export_storage_retention_policy" {
description = "Configuration of the bucket's data retention policy for how long objects in the bucket should be retained."
type = object({
20 changes: 3 additions & 17 deletions 1-org/modules/centralized-logging/README.md
@@ -1,6 +1,6 @@
# Centralized Logging Module

-This module handles logging configuration enabling one or more resources such as organization, folders, or projects to send logs to multiple destinations: [GCS bucket](https://cloud.google.com/logging/docs/export/using_exported_logs#gcs-overview), [Big Query](https://cloud.google.com/logging/docs/export/bigquery), [Pub/Sub](https://cloud.google.com/logging/docs/export/using_exported_logs#pubsub-overview), and [Log Buckets](https://cloud.google.com/logging/docs/routing/overview#buckets).
+This module handles logging configuration enabling one or more resources such as organization, folders, or projects to send logs to multiple destinations: [GCS bucket](https://cloud.google.com/logging/docs/export/using_exported_logs#gcs-overview), [Pub/Sub](https://cloud.google.com/logging/docs/export/using_exported_logs#pubsub-overview), and [Log Buckets](https://cloud.google.com/logging/docs/routing/overview#buckets) with [Log Analytics](https://cloud.google.com/logging/docs/log-analytics#analytics).

## Usage

@@ -25,19 +25,6 @@ module "logs_export" {
storage_bucket_name = "bkt-logs"
location = "us-central1"
}
-  bigquery_options = {
-    dataset_name = "ds_logs"
-    logging_sink_name = "sk-c-logging-bq"
-    logging_sink_filter = <<EOF
-logName: /logs/cloudaudit.googleapis.com%2Factivity OR
-logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR
-logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR
-logName: /logs/compute.googleapis.com%2Fvpc_flows OR
-logName: /logs/compute.googleapis.com%2Ffirewall OR
-logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency
-EOF
-  }
}
```

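With the `bigquery_options` example gone, a caller wanting the new Log Analytics behavior configures `logbucket_options` instead. A minimal sketch (source path, org ID, and names are illustrative; defaults come from the inputs table below):

```hcl
module "logs_export" {
  source = "../../modules/centralized-logging" # illustrative relative path

  resources                      = { org = "123456789012" } # illustrative org ID
  resource_type                  = "organization"
  logging_destination_project_id = "prj-logs" # illustrative project

  logbucket_options = {
    name                       = "logbkt-logs"
    logging_sink_name          = "sk-c-logging-logbkt"
    location                   = "global"
    enable_analytics           = true
    linked_dataset_id          = "ds_logbkt_analytics" # alphanumerics and underscores only
    linked_dataset_description = "Linked dataset for Log Analytics"
  }
}
```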
@@ -72,8 +59,7 @@ module "logging_logbucket" {

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
-| bigquery\_options | Destination BigQuery options:<br>- dataset\_name: The name of the bigquery dataset to be created and used for log entries.<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.<br>- expiration\_days: Table expiration time. If null logs will never be deleted.<br>- partitioned\_tables: Options that affect sinks exporting data to BigQuery. use\_partitioned\_tables - (Required) Whether to use BigQuery's partition tables.<br>- delete\_contents\_on\_destroy: If set to true, delete all contained objects in the logging destination. | <pre>object({<br> dataset_name = optional(string, null)<br> logging_sink_name = optional(string, null)<br> logging_sink_filter = optional(string, "")<br> expiration_days = optional(number, null)<br> partitioned_tables = optional(bool, true)<br> delete_contents_on_destroy = optional(bool, false)<br> })</pre> | `null` | no |
-| logbucket\_options | Destination LogBucket options:<br>- name: The name of the log bucket to be created and used for log entries matching the filter.<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.<br>- location: The location of the log bucket. Default: global.<br>- retention\_days: The number of days data should be retained for the log bucket. Default 30. | <pre>object({<br> name = optional(string, null)<br> logging_sink_name = optional(string, null)<br> logging_sink_filter = optional(string, "")<br> location = optional(string, "global")<br> retention_days = optional(number, 30)<br> })</pre> | `null` | no |
+| logbucket\_options | Destination LogBucket options:<br>- name: The name of the log bucket to be created and used for log entries matching the filter.<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.<br>- location: The location of the log bucket. Default: global.<br>- enable\_analytics: Whether or not Log Analytics is enabled. A Log bucket with Log Analytics enabled can be queried in the Log Analytics page using SQL queries. Cannot be disabled once enabled.<br>- linked\_dataset\_id: The ID of the linked BigQuery dataset. A valid link dataset ID must only have alphanumeric characters and underscores within it and have up to 100 characters.<br>- linked\_dataset\_description: A user-friendly description of the linked BigQuery dataset. The maximum length of the description is 8000 characters.<br>- retention\_days: The number of days data should be retained for the log bucket. Default 30. | <pre>object({<br> name = optional(string, null)<br> logging_sink_name = optional(string, null)<br> logging_sink_filter = optional(string, "")<br> location = optional(string, "global")<br> enable_analytics = optional(bool, true)<br> linked_dataset_id = optional(string, null)<br> linked_dataset_description = optional(string, null)<br> retention_days = optional(number, 30)<br> })</pre> | `null` | no |
| logging\_destination\_project\_id | The ID of the project that will have the resources where the logs will be created. | `string` | n/a | yes |
| logging\_project\_key | (Optional) The key of logging destination project if it is inside resources map. It is mandatory when resource\_type = project and logging\_target\_type = logbucket. | `string` | `""` | no |
| pubsub\_options | Destination Pubsub options:<br>- topic\_name: The name of the pubsub topic to be created and used for log entries matching the filter.<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is "" which exports all logs.<br>- create\_subscriber: Whether to create a subscription to the topic that was created and used for log entries matching the filter. If 'true', a pull subscription is created along with a service account that is granted roles/pubsub.subscriber and roles/pubsub.viewer to the topic. | <pre>object({<br> topic_name = optional(string, null)<br> logging_sink_name = optional(string, null)<br> logging_sink_filter = optional(string, "")<br> create_subscriber = optional(bool, true)<br> })</pre> | `null` | no |
@@ -85,8 +71,8 @@ module "logging_logbucket" {

| Name | Description |
|------|-------------|
-| bigquery\_destination\_name | The resource name for the destination BigQuery. |
| logbucket\_destination\_name | The resource name for the destination Log Bucket. |
+| logbucket\_linked\_dataset\_name | The resource name of the Log Bucket linked BigQuery dataset. |
| pubsub\_destination\_name | The resource name for the destination Pub/Sub. |
| storage\_destination\_name | The resource name for the destination Storage. |

41 changes: 5 additions & 36 deletions 1-org/modules/centralized-logging/main.tf
@@ -39,34 +39,30 @@ locals {
for v in local.exports_list : "${v.res}_${v.type}" => v
}
destinations_options = {
-bgq = var.bigquery_options
pub = var.pubsub_options
sto = var.storage_options
lbk = var.logbucket_options
}

logging_sink_name_map = {
bgq = try("sk-to-ds-logs-${var.logging_destination_project_id}", "sk-to-ds-logs")
pub = try("sk-to-tp-logs-${var.logging_destination_project_id}", "sk-to-tp-logs")
sto = try("sk-to-bkt-logs-${var.logging_destination_project_id}", "sk-to-bkt-logs")
lbk = try("sk-to-logbkt-logs-${var.logging_destination_project_id}", "sk-to-logbkt-logs")
}

logging_tgt_name = {
-bgq = replace("${local.logging_tgt_prefix.bgq}${random_string.suffix.result}", "-", "_")
pub = "${local.logging_tgt_prefix.pub}${random_string.suffix.result}"
sto = "${local.logging_tgt_prefix.sto}${random_string.suffix.result}"
lbk = "${local.logging_tgt_prefix.lbk}${random_string.suffix.result}"
}

destination_uri_map = {
-bgq = try(module.destination_bigquery[0].destination_uri, "")
pub = try(module.destination_pubsub[0].destination_uri, "")
sto = try(module.destination_storage[0].destination_uri, "")
lbk = try(module.destination_logbucket[0].destination_uri, "")
}

logging_tgt_prefix = {
bgq = "ds_logs_"
pub = "tp-logs-"
sto = try("bkt-logs-${var.logging_destination_project_id}-", "bkt-logs-")
lbk = "logbkt-logs-"
@@ -92,22 +88,24 @@ module "log_export" {
parent_resource_type = var.resource_type
unique_writer_identity = true
include_children = local.include_children
-bigquery_options = each.value.type == "bgq" ? { use_partitioned_tables = true } : null
}

#-------------------------#
# Send logs to Log Bucket #
#-------------------------#
module "destination_logbucket" {
source = "terraform-google-modules/log-export/google//modules/logbucket"
version = "~> 7.5.0"
version = "~> 7.7"

count = var.logbucket_options != null ? 1 : 0

project_id = var.logging_destination_project_id
name = coalesce(var.logbucket_options.name, local.logging_tgt_name.lbk)
log_sink_writer_identity = module.log_export["${local.value_first_resource}_lbk"].writer_identity
location = var.logbucket_options.location
+enable_analytics = var.logbucket_options.enable_analytics
+linked_dataset_id = var.logbucket_options.linked_dataset_id
+linked_dataset_description = var.logbucket_options.linked_dataset_description
retention_days = var.logbucket_options.retention_days
grant_write_permission_on_bkt = false
}
@@ -126,35 +124,6 @@ resource "google_project_iam_member" "logbucket_sink_member" {
member = module.log_export["${each.value}_lbk"].writer_identity
}


-#-----------------------#
-# Send logs to BigQuery #
-#-----------------------#
-module "destination_bigquery" {
-  source = "terraform-google-modules/log-export/google//modules/bigquery"
-  version = "~> 7.4"
-
-  count = var.bigquery_options != null ? 1 : 0
-
-  project_id = var.logging_destination_project_id
-  dataset_name = coalesce(var.bigquery_options.dataset_name, local.logging_tgt_name.bgq)
-  log_sink_writer_identity = module.log_export["${local.value_first_resource}_bgq"].writer_identity
-  expiration_days = var.bigquery_options.expiration_days
-  delete_contents_on_destroy = var.bigquery_options.delete_contents_on_destroy
-}
-
-#-----------------------------------------#
-# Bigquery Service account IAM membership #
-#-----------------------------------------#
-resource "google_project_iam_member" "bigquery_sink_member" {
-  for_each = var.bigquery_options != null ? var.resources : {}
-
-  project = var.logging_destination_project_id
-  role = "roles/bigquery.dataEditor"
-  member = module.log_export["${each.value}_bgq"].writer_identity
-}


#----------------------#
# Send logs to Storage #
#----------------------#
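Only 7 of the 10 changed files are shown above. Among those omitted, the module's outputs.tf presumably swaps the BigQuery output for the linked dataset, along these lines (a sketch, assuming the logbucket submodule in log-export ~> 7.7 exposes a `linked_dataset_name` output):

```hcl
output "logbucket_linked_dataset_name" {
  description = "The resource name of the Log Bucket linked BigQuery dataset."
  value       = try(module.destination_logbucket[0].linked_dataset_name, "")
}
```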