
Commit a1d636e

feat: Modularize logging components (#781)

Authored by felipecrescencio-cit, daniel-cit, and bharathkkb.

* Initial commit
* New inline Centralized Logging module
* New logbucket destination module path
* Use the new inline centralized logging module for logs; add logbucket as a new logging destination
* Fix missing logbucket name in docs
* Add support for Cloud KMS CryptoKey
* Fix typos
* Review module documentation
* Fix README log sink filter
* Fix variable description and improve module documentation
* Remove project ID from the Log Bucket name, because log bucket names are not required to be globally unique the way storage bucket names are
* Add information about Log Bucket free cost
* Add link with additional information
* Add links with additional information about sink destinations
* Improve documentation clarity
* Add link with additional info
* Clean up unused locals
* Fix example code
* Improve auto-generated names for sinks and targets; improve code readability using maps and lookup
* Fix variable description
* Refactor all destinations into a single module call
* Remove duplicated validation
* Fix handling of the retention_policy object
* Add default location for the logbucket
* Fix test output values so they do not break the module
* Address PR review comments
* Fix outputs and remote state variables

Co-authored-by: Daniel Andrade <[email protected]>
Co-authored-by: Bharath KKB <[email protected]>

1 parent 0019b00 · commit a1d636e

File tree

11 files changed: +620 −79 lines changed

1-org/README.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -78,8 +78,8 @@ Enabling Data Access logs might result in your project being charged for the add
 For details on costs you might incur, go to [Pricing](https://cloud.google.com/stackdriver/pricing).
 You can choose not to enable the Data Access logs by setting variable `data_access_logs_enabled` to false.

-**Note:** This module creates a sink to export all logs to Google Storage. It also creates sinks to export a subset of security-related logs
-to Bigquery and Pub/Sub. This will result in additional charges for those copies of logs.
+**Note:** This module creates a sink to export all logs to Google Storage and a Log Bucket. It also creates sinks to export a subset of security-related logs
+to BigQuery and Pub/Sub. This will result in additional charges for those copies of logs. For the Log Bucket destination, logs retained for the default retention period (30 days) [don't incur a storage cost](https://cloud.google.com/stackdriver/pricing#:~:text=Logs%20retained%20for%20the%20default%20retention%20period%20don%27t%20incur%20a%20storage%20cost.).
 You can change the filters & sinks by modifying the configuration in `envs/shared/log_sinks.tf`.

 **Note:** Currently, this module does not enable [bucket policy retention](https://cloud.google.com/storage/docs/bucket-lock) for organization logs, please, enable it if needed.
```
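As a hedged illustration of such a change, a narrowed Pub/Sub sink in `envs/shared/log_sinks.tf` might look like the sketch below. The attribute names follow the `centralized-logging` module call in that file; the filter value is a hypothetical example, not part of this commit:

```hcl
# Illustrative sketch only: narrow the Pub/Sub sink to Admin Activity audit logs.
# The filter value is a hypothetical example; adjust it to your own needs.
pubsub_options = {
  logging_sink_name   = "sk-c-logging-pub"
  logging_sink_filter = "logName: /logs/cloudaudit.googleapis.com%2Factivity"
  topic_name          = "tp-org-logs-${random_string.suffix.result}"
  create_subscriber   = true
}
```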

1-org/envs/shared/README.md

Lines changed: 2 additions & 0 deletions

```diff
@@ -64,6 +64,8 @@
 | dns\_hub\_project\_id | The DNS hub project ID |
 | domains\_to\_allow | The list of domains to allow users from in IAM. |
 | interconnect\_project\_id | The Dedicated Interconnect project ID |
+| logs\_export\_bigquery\_dataset\_name | The BigQuery dataset for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview |
+| logs\_export\_logbucket\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets |
 | logs\_export\_pubsub\_topic | The Pub/Sub topic for destination of log exports |
 | logs\_export\_storage\_bucket\_name | The storage bucket for destination of log exports |
 | org\_audit\_logs\_project\_id | The org audit logs project ID |
```

1-org/envs/shared/log_sinks.tf

Lines changed: 46 additions & 74 deletions

```diff
@@ -17,6 +17,7 @@
 locals {
   parent_resource_id   = local.parent_folder != "" ? local.parent_folder : local.org_id
   parent_resource_type = local.parent_folder != "" ? "folder" : "organization"
+  parent_resources     = { resource = local.parent_resource_id }
   main_logs_filter     = <<EOF
     logName: /logs/cloudaudit.googleapis.com%2Factivity OR
     logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR
@@ -34,88 +35,59 @@ resource "random_string" "suffix" {
   special = false
 }

-/******************************************
-  Send logs to BigQuery
-*****************************************/
+module "logs_export" {
+  source = "../../modules/centralized-logging"
+
+  resources                      = local.parent_resources
+  resource_type                  = local.parent_resource_type
+  logging_destination_project_id = module.org_audit_logs.project_id

-module "log_export_to_biqquery" {
-  source                 = "terraform-google-modules/log-export/google"
-  version                = "~> 7.3.0"
-  destination_uri        = module.bigquery_destination.destination_uri
-  filter                 = local.main_logs_filter
-  log_sink_name          = "sk-c-logging-bq"
-  parent_resource_id     = local.parent_resource_id
-  parent_resource_type   = local.parent_resource_type
-  include_children       = true
-  unique_writer_identity = true
+  /******************************************
+    Send logs to BigQuery
+  *****************************************/
   bigquery_options = {
-    use_partitioned_tables = true
+    logging_sink_name          = "sk-c-logging-bq"
+    logging_sink_filter        = local.main_logs_filter
+    dataset_name               = "audit_logs"
+    expiration_days            = var.audit_logs_table_expiration_days
+    delete_contents_on_destroy = var.audit_logs_table_delete_contents_on_destroy
   }
-}
-
-module "bigquery_destination" {
-  source                     = "terraform-google-modules/log-export/google//modules/bigquery"
-  version                    = "~> 7.3.0"
-  project_id                 = module.org_audit_logs.project_id
-  dataset_name               = "audit_logs"
-  log_sink_writer_identity   = module.log_export_to_biqquery.writer_identity
-  expiration_days            = var.audit_logs_table_expiration_days
-  delete_contents_on_destroy = var.audit_logs_table_delete_contents_on_destroy
-}
-
-/******************************************
-  Send logs to Storage
-*****************************************/

-module "log_export_to_storage" {
-  source                 = "terraform-google-modules/log-export/google"
-  version                = "~> 7.3.0"
-  destination_uri        = module.storage_destination.destination_uri
-  filter                 = local.all_logs_filter
-  log_sink_name          = "sk-c-logging-bkt"
-  parent_resource_id     = local.parent_resource_id
-  parent_resource_type   = local.parent_resource_type
-  include_children       = true
-  unique_writer_identity = true
-}
-
-module "storage_destination" {
-  source                      = "terraform-google-modules/log-export/google//modules/storage"
-  version                     = "~> 7.3.0"
-  project_id                  = module.org_audit_logs.project_id
-  storage_bucket_name         = "bkt-${module.org_audit_logs.project_id}-org-logs-${random_string.suffix.result}"
-  log_sink_writer_identity    = module.log_export_to_storage.writer_identity
-  uniform_bucket_level_access = true
-  location                    = var.log_export_storage_location
-  retention_policy            = var.log_export_storage_retention_policy
-  force_destroy               = var.log_export_storage_force_destroy
-  versioning                  = var.log_export_storage_versioning
-}
+  /******************************************
+    Send logs to Storage
+  *****************************************/
+  storage_options = {
+    logging_sink_filter          = local.all_logs_filter
+    logging_sink_name            = "sk-c-logging-bkt"
+    storage_bucket_name          = "bkt-${module.org_audit_logs.project_id}-org-logs-${random_string.suffix.result}"
+    location                     = var.log_export_storage_location
+    retention_policy_is_locked   = var.log_export_storage_retention_policy == null ? null : var.log_export_storage_retention_policy.is_locked
+    retention_policy_period_days = var.log_export_storage_retention_policy == null ? null : var.log_export_storage_retention_policy.retention_period_days
+    force_destroy                = var.log_export_storage_force_destroy
+    versioning                   = var.log_export_storage_versioning
+  }

-/******************************************
-  Send logs to Pub\Sub
-*****************************************/
+  /******************************************
+    Send logs to Pub\Sub
+  *****************************************/
+  pubsub_options = {
+    logging_sink_filter = local.main_logs_filter
+    logging_sink_name   = "sk-c-logging-pub"
+    topic_name          = "tp-org-logs-${random_string.suffix.result}"
+    create_subscriber   = true
+  }

-module "log_export_to_pubsub" {
-  source                 = "terraform-google-modules/log-export/google"
-  version                = "~> 7.3.0"
-  destination_uri        = module.pubsub_destination.destination_uri
-  filter                 = local.main_logs_filter
-  log_sink_name          = "sk-c-logging-pub"
-  parent_resource_id     = local.parent_resource_id
-  parent_resource_type   = local.parent_resource_type
-  include_children       = true
-  unique_writer_identity = true
+  /******************************************
+    Send logs to Logbucket
+  *****************************************/
+  logbucket_options = {
+    logging_sink_name   = "sk-c-logging-logbkt"
+    logging_sink_filter = local.all_logs_filter
+    name                = "logbkt-org-logs-${random_string.suffix.result}"
+    location            = local.default_region
+  }
 }

-module "pubsub_destination" {
-  source                   = "terraform-google-modules/log-export/google//modules/pubsub"
-  version                  = "~> 7.3.0"
-  project_id               = module.org_audit_logs.project_id
-  topic_name               = "tp-org-logs-${random_string.suffix.result}"
-  log_sink_writer_identity = module.log_export_to_pubsub.writer_identity
-  create_subscriber        = true
-}

 /******************************************
   Billing logs (Export configured manually)
```

1-org/envs/shared/outputs.tf

Lines changed: 12 additions & 2 deletions

```diff
@@ -90,11 +90,21 @@ output "domains_to_allow" {
 }

 output "logs_export_pubsub_topic" {
-  value       = module.pubsub_destination.resource_name
+  value       = module.logs_export.pubsub_destination_name
   description = "The Pub/Sub topic for destination of log exports"
 }

 output "logs_export_storage_bucket_name" {
-  value       = module.storage_destination.resource_name
+  value       = module.logs_export.storage_destination_name
   description = "The storage bucket for destination of log exports"
 }
+
+output "logs_export_logbucket_name" {
+  value       = module.logs_export.logbucket_destination_name
+  description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets"
+}
+
+output "logs_export_bigquery_dataset_name" {
+  value       = module.logs_export.bigquery_destination_name
+  description = "The BigQuery dataset for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview"
+}
```
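Downstream foundation stages can consume these outputs through remote state. The following is a minimal sketch, assuming a GCS backend; the bucket name and prefix are placeholders, not values from this commit:

```hcl
# Illustrative sketch: read the shared environment's outputs from remote state.
# Replace the bucket and prefix with your own backend layout.
data "terraform_remote_state" "org" {
  backend = "gcs"
  config = {
    bucket = "<terraform_state_bucket>"
    prefix = "terraform/org/envs/shared"
  }
}

locals {
  # For example, the central log bucket name created by the shared environment.
  central_logbucket = data.terraform_remote_state.org.outputs.logs_export_logbucket_name
}
```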
1-org/modules/centralized-logging/README.md (new file)

Lines changed: 93 additions & 0 deletions

# Centralized Logging Module

This module configures logging so that one or more resources, such as an organization, folders, or projects, can send logs to multiple destinations: a [GCS bucket](https://cloud.google.com/logging/docs/export/using_exported_logs#gcs-overview), [BigQuery](https://cloud.google.com/logging/docs/export/bigquery), [Pub/Sub](https://cloud.google.com/logging/docs/export/using_exported_logs#pubsub-overview), and [Log Buckets](https://cloud.google.com/logging/docs/routing/overview#buckets).

## Usage

Before using this module, familiarize yourself with the [log-export](https://registry.terraform.io/modules/terraform-google-modules/log-export/google/latest) module, on which it is based.

The following example exports audit logs from two folders to the same storage destination:

```hcl
module "logs_export" {
  source = "terraform-google-modules/terraform-example-foundation/google//1-org/modules/centralized-logging"

  resources = {
    fldr1 = "<folder1_id>"
    fldr2 = "<folder2_id>"
  }
  resource_type                  = "folder"
  logging_destination_project_id = "<log_destination_project_id>"

  storage_options = {
    logging_sink_filter = ""
    logging_sink_name   = "sk-c-logging-bkt"
    storage_bucket_name = "bkt-logs"
    location            = "us-central1"
  }

  bigquery_options = {
    dataset_name        = "ds_logs"
    logging_sink_name   = "sk-c-logging-bq"
    logging_sink_filter = <<EOF
logName: /logs/cloudaudit.googleapis.com%2Factivity OR
logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR
logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR
logName: /logs/compute.googleapis.com%2Fvpc_flows OR
logName: /logs/compute.googleapis.com%2Ffirewall OR
logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency
EOF
  }
}
```

**Note:** When the destination is a Log Bucket and a sink is being created in the same project, set the variable `logging_project_key` to the **key** that maps the Log Bucket project in the `resources` map.
For more details, see [Configure and manage sinks](https://cloud.google.com/logging/docs/export/configure_export_v2#dest-auth:~:text=If%20you%27re%20using%20a%20sink%20to%20route%20logs%20between%20Logging%20buckets%20in%20the%20same%20Cloud%20project%2C%20no%20new%20service%20account%20is%20created%3B%20the%20sink%20works%20without%20the%20unique%20writer%20identity.).

The following example exports all logs from three projects, including the logging destination project itself, to a Log Bucket destination. Because it exports all logs, be aware of the additional charges this volume of logs can incur:

```hcl
module "logging_logbucket" {
  source = "terraform-google-modules/terraform-example-foundation/google//1-org/modules/centralized-logging"

  resources = {
    prj1 = "<log_destination_project_id>"
    prj2 = "<prj2_id>"
    prjx = "<prjx_id>"
  }
  resource_type                  = "project"
  logging_destination_project_id = "<log_destination_project_id>"
  logging_project_key            = "prj1"

  logbucket_options = {
    logging_sink_name   = "sk-c-logging-logbkt"
    logging_sink_filter = ""
    name                = "logbkt-logs"
  }
}
```

<!-- BEGINNING OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| bigquery\_options | Destination BigQuery options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '', which exports all logs.<br>- dataset\_name: The name of the BigQuery dataset to be created and used for log entries.<br>- expiration\_days: (Optional) Table expiration time. If null, logs will never be deleted.<br>- partitioned\_tables: (Optional) Options that affect sinks exporting data to BigQuery. use\_partitioned\_tables - (Required) Whether to use BigQuery's partitioned tables.<br>- delete\_contents\_on\_destroy: (Optional) If set to true, delete all contained objects in the logging destination.<br><br>Destination BigQuery options example:<pre>bigquery_options = {<br>  logging_sink_name          = "sk-c-logging-bq"<br>  dataset_name               = "audit_logs"<br>  partitioned_tables         = "true"<br>  expiration_days            = 30<br>  delete_contents_on_destroy = false<br>  logging_sink_filter        = <<EOF<br>  logName: /logs/cloudaudit.googleapis.com%2Factivity OR<br>  logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR<br>  logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR<br>  logName: /logs/compute.googleapis.com%2Fvpc_flows OR<br>  logName: /logs/compute.googleapis.com%2Ffirewall OR<br>  logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency<br>EOF<br>}</pre> | `map(string)` | `null` | no |
| logbucket\_options | Destination LogBucket options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '', which exports all logs.<br>- name: The name of the log bucket to be created and used for log entries matching the filter.<br>- location: The location of the log bucket. Default: global.<br>- retention\_days: (Optional) The number of days data should be retained in the log bucket. Default: 30.<br><br>Destination LogBucket options example:<pre>logbucket_options = {<br>  logging_sink_name   = "sk-c-logging-logbkt"<br>  logging_sink_filter = ""<br>  name                = "logbkt-org-logs"<br>  retention_days      = "30"<br>  location            = "global"<br>}</pre> | `map(any)` | `null` | no |
| logging\_destination\_project\_id | The ID of the project that will host the resources where the logs will be created. | `string` | n/a | yes |
| logging\_project\_key | (Optional) The key of the logging destination project if it is inside the resources map. It is mandatory when resource\_type is project and a logbucket destination is used. | `string` | `""` | no |
| pubsub\_options | Destination Pub/Sub options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '', which exports all logs.<br>- topic\_name: The name of the Pub/Sub topic to be created and used for log entries matching the filter.<br>- create\_subscriber: (Optional) Whether to create a subscription to the topic that was created and used for log entries matching the filter. If 'true', a pull subscription is created along with a service account that is granted roles/pubsub.subscriber and roles/pubsub.viewer on the topic.<br><br>Destination Pub/Sub options example:<pre>pubsub_options = {<br>  logging_sink_name = "sk-c-logging-pub"<br>  topic_name        = "tp-org-logs"<br>  create_subscriber = true<br>  logging_sink_filter = <<EOF<br>  logName: /logs/cloudaudit.googleapis.com%2Factivity OR<br>  logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR<br>  logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR<br>  logName: /logs/compute.googleapis.com%2Fvpc_flows OR<br>  logName: /logs/compute.googleapis.com%2Ffirewall OR<br>  logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency<br>EOF<br>}</pre> | `map(any)` | `null` | no |
| resource\_type | Resource type of the resources that will export logs to the destinations. Must be: project, organization, or folder. | `string` | n/a | yes |
| resources | Export logs from the specified resources. | `map(string)` | n/a | yes |
| storage\_options | Destination Storage options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '', which exports all logs.<br>- storage\_bucket\_name: The name of the storage bucket to be created and used for log entries matching the filter.<br>- location: (Optional) The location of the logging destination. Default: US.<br>- Retention policy variables: (Optional) Configuration of the bucket's data retention policy, i.e. how long objects in the bucket should be retained.<br>  - retention\_policy\_is\_locked: Set if the policy is locked.<br>  - retention\_policy\_period\_days: Set the period of days for log retention. Default: 30.<br>- versioning: (Optional) Toggles bucket versioning, the ability to retain a non-current object version when the live object version gets replaced or deleted.<br>- force\_destroy: When deleting the bucket, this boolean option will delete all contained objects.<br><br>Destination Storage options example:<pre>storage_options = {<br>  logging_sink_name   = "sk-c-logging-bkt"<br>  logging_sink_filter = ""<br>  storage_bucket_name = "bkt-org-logs"<br>  location            = "US"<br>  force_destroy       = false<br>  versioning          = false<br>}</pre> | `map(any)` | `null` | no |

## Outputs

| Name | Description |
|------|-------------|
| bigquery\_destination\_name | The resource name of the BigQuery destination. |
| logbucket\_destination\_name | The resource name of the Log Bucket destination. |
| pubsub\_destination\_name | The resource name of the Pub/Sub destination. |
| storage\_destination\_name | The resource name of the Storage destination. |

<!-- END OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
