* Initial commit
* New inline Centralized Logging module
* New logbucket destination module path
* - Using new inline centralized logging module for logs
- Added logbucket as new logging destination
* Fix missing logbucket name in doc
* Add support for Cloud KMS CryptoKey
* Fix typos
* Reviewed module documentation
* Fix readme log sink filter
* Fix variable description and improve module documentation
* Project ID removed from Log Bucket name because, unlike storage names, it is not globally unique
* Added information about Log Bucket's cost-free default retention
* Added link with additional information
Co-authored-by: Daniel Andrade <[email protected]>
* Added links with additional information about sink destinations
Co-authored-by: Daniel Andrade <[email protected]>
* Improve documentation clarity
Co-authored-by: Daniel Andrade <[email protected]>
* Added link with additional info
* Clean unused locals
* Fix example code
* - Improve auto-generated names for sinks and targets
- Improve code readability using maps and lookup
* Fix var description
Co-authored-by: Bharath KKB <[email protected]>
* Refactor all destinations into one module call
* Removed duplicated validation
* Fix handling of the retention_policy object
* Fix: added logbucket default location
* Fix test output values to not break the module
* Address PR review comments
* Fix outputs and remote state vars
Co-authored-by: Daniel Andrade <[email protected]>
Co-authored-by: Bharath KKB <[email protected]>
1-org/README.md (+2 -2)
@@ -78,8 +78,8 @@ Enabling Data Access logs might result in your project being charged for the add
 For details on costs you might incur, go to [Pricing](https://cloud.google.com/stackdriver/pricing).
 You can choose not to enable the Data Access logs by setting variable `data_access_logs_enabled` to false.
 
-**Note:** This module creates a sink to export all logs to Google Storage. It also creates sinks to export a subset of security-related logs
-to Bigquery and Pub/Sub. This will result in additional charges for those copies of logs.
+**Note:** This module creates a sink to export all logs to Google Storage and Log Bucket. It also creates sinks to export a subset of security-related logs
+to Bigquery and Pub/Sub. This will result in additional charges for those copies of logs. For Log Bucket destination, logs retained for the default retention period (30 days) [don't incur a storage cost](https://cloud.google.com/stackdriver/pricing#:~:text=Logs%20retained%20for%20the%20default%20retention%20period%20don%27t%20incur%20a%20storage%20cost.).
 
 You can change the filters & sinks by modifying the configuration in `envs/shared/log_sinks.tf`.
 
 **Note:** Currently, this module does not enable [bucket policy retention](https://cloud.google.com/storage/docs/bucket-lock) for organization logs, please, enable it if needed.
1-org/envs/shared/README.md (+2 -0)
@@ -64,6 +64,8 @@
 | dns\_hub\_project\_id | The DNS hub project ID |
 | domains\_to\_allow | The list of domains to allow users from in IAM. |
 | interconnect\_project\_id | The Dedicated Interconnect project ID |
+| logs\_export\_bigquery\_dataset\_name | The BigQuery dataset for destination of log exports. See https://cloud.google.com/logging/docs/export/bigquery |
+| logs\_export\_logbucket\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets |
 | logs\_export\_pubsub\_topic | The Pub/Sub topic for destination of log exports |
 | logs\_export\_storage\_bucket\_name | The storage bucket for destination of log exports |
 | org\_audit\_logs\_project\_id | The org audit logs project ID |
This module handles logging configuration, enabling one or more resources such as organizations, folders, or projects to send logs to multiple destinations: [GCS bucket](https://cloud.google.com/logging/docs/export/using_exported_logs#gcs-overview), [BigQuery](https://cloud.google.com/logging/docs/export/bigquery), [Pub/Sub](https://cloud.google.com/logging/docs/export/using_exported_logs#pubsub-overview), and [Log Buckets](https://cloud.google.com/logging/docs/routing/overview#buckets).

## Usage

Before using this module, get familiar with the [log-export](https://registry.terraform.io/modules/terraform-google-modules/log-export/google/latest) module that is the base for it.

The following example exports audit logs from two folders to the same storage destination:
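A minimal sketch of such a call, assuming a local module path and illustrative folder IDs, bucket name, and sink name; the variables follow the inputs documented below:

```hcl
module "logs_export_storage" {
  # Assumed path; point this at the centralized-logging module in your checkout.
  source = "../../modules/centralized-logging"

  resources = {
    fldr1 = "111111111111" # illustrative folder IDs
    fldr2 = "222222222222"
  }
  resource_type                  = "folder"
  logging_destination_project_id = "my-logging-project" # illustrative project ID

  storage_options = {
    logging_sink_name   = "sk-audit-logs-bkt"
    logging_sink_filter = "logName: /logs/cloudaudit.googleapis.com%2Factivity"
    storage_bucket_name = "bkt-audit-logs"
    location            = "US"
  }
}
```

Both folders share a single storage destination; the module creates one sink per entry in `resources`, all pointing at the same bucket.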
**Note:** When the destination is a Log Bucket and a sink is being created in the same project, set the variable `logging_project_key` to the **key** used to map the Log Bucket project in the `resources` map.
Get more details at [Configure and manage sinks](https://cloud.google.com/logging/docs/export/configure_export_v2#dest-auth:~:text=If%20you%27re%20using%20a%20sink%20to%20route%20logs%20between%20Logging%20buckets%20in%20the%20same%20Cloud%20project%2C%20no%20new%20service%20account%20is%20created%3B%20the%20sink%20works%20without%20the%20unique%20writer%20identity.).
The following example exports all logs from three projects, including the logging destination project, to a Log Bucket destination. Since it exports all logs, be aware of the additional charges for that volume of logs:
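A minimal sketch under the same assumptions (illustrative module path, project IDs, and names), showing `logging_project_key` pointing at the destination project's key in `resources`:

```hcl
module "logs_export_logbucket" {
  # Assumed path; point this at the centralized-logging module in your checkout.
  source = "../../modules/centralized-logging"

  resources = {
    prj1    = "project-id-1" # illustrative project IDs
    prj2    = "project-id-2"
    logging = "my-logging-project" # destination project also exports its own logs
  }
  resource_type                  = "project"
  logging_destination_project_id = "my-logging-project"

  # Key that maps the Log Bucket (destination) project in `resources`;
  # required when resource_type = "project" and the destination is a Log Bucket.
  logging_project_key = "logging"

  logbucket_options = {
    logging_sink_name   = "sk-all-logs-logbkt"
    logging_sink_filter = "" # empty filter exports all logs
    name                = "logbkt-all-logs"
    location            = "global"
    retention_days      = "30"
  }
}
```

Because the destination project routes logs to a Log Bucket in the same project, no new writer identity is needed for that sink, which is why the module only needs the map key rather than extra IAM configuration.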
## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| bigquery\_options | Destination BigQuery options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '' which exports all logs.<br>- dataset\_name: The name of the bigquery dataset to be created and used for log entries.<br>- expiration\_days: (Optional) Table expiration time. If null logs will never be deleted.<br>- partitioned\_tables: (Optional) Options that affect sinks exporting data to BigQuery. use\_partitioned\_tables - (Required) Whether to use BigQuery's partitioned tables.<br>- delete\_contents\_on\_destroy: (Optional) If set to true, delete all contained objects in the logging destination.<br><br>Destination BigQuery options example:<pre>bigquery_options = {<br> logging_sink_name = "sk-c-logging-bq"<br> dataset_name = "audit_logs"<br> partitioned_tables = "true"<br> expiration_days = 30<br> delete_contents_on_destroy = false<br> logging_sink_filter = <<EOF<br> logName: /logs/cloudaudit.googleapis.com%2Factivity OR<br> logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR<br> logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR<br> logName: /logs/compute.googleapis.com%2Fvpc_flows OR<br> logName: /logs/compute.googleapis.com%2Ffirewall OR<br> logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency<br>EOF<br>}</pre> | `map(string)` | `null` | no |
| logbucket\_options | Destination LogBucket options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '' which exports all logs.<br>- name: The name of the log bucket to be created and used for log entries matching the filter.<br>- location: The location of the log bucket. Default: global.<br>- retention\_days: (Optional) The number of days data should be retained for the log bucket. Default 30.<br><br>Destination LogBucket options example:<pre>logbucket_options = {<br> logging_sink_name = "sk-c-logging-logbkt"<br> logging_sink_filter = ""<br> name = "logbkt-org-logs"<br> retention_days = "30"<br> location = "global"<br>}</pre> | `map(any)` | `null` | no |
| logging\_destination\_project\_id | The ID of the project that will have the resources where the logs will be created. | `string` | n/a | yes |
| logging\_project\_key | (Optional) The key of the logging destination project if it is inside the resources map. It is mandatory when resource\_type = project and logging\_target\_type = logbucket. | `string` | `""` | no |
| pubsub\_options | Destination Pubsub options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '' which exports all logs.<br>- topic\_name: The name of the pubsub topic to be created and used for log entries matching the filter.<br>- create\_subscriber: (Optional) Whether to create a subscription to the topic that was created and used for log entries matching the filter. If 'true', a pull subscription is created along with a service account that is granted roles/pubsub.subscriber and roles/pubsub.viewer to the topic.<br><br>Destination Pubsub options example:<pre>pubsub_options = {<br> logging_sink_name = "sk-c-logging-pub"<br> topic_name = "tp-org-logs"<br> create_subscriber = true<br> logging_sink_filter = <<EOF<br> logName: /logs/cloudaudit.googleapis.com%2Factivity OR<br> logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR<br> logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR<br> logName: /logs/compute.googleapis.com%2Fvpc_flows OR<br> logName: /logs/compute.googleapis.com%2Ffirewall OR<br> logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency<br>EOF<br>}</pre> | `map(any)` | `null` | no |
| resource\_type | Resource type of the resource that will export logs to destination. Must be: project, organization, or folder. | `string` | n/a | yes |
| resources | Export logs from the specified resources. | `map(string)` | n/a | yes |
| storage\_options | Destination Storage options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '' which exports all logs.<br>- storage\_bucket\_name: The name of the storage bucket to be created and used for log entries matching the filter.<br>- location: (Optional) The location of the logging destination. Default: US.<br>- Retention Policy variables: (Optional) Configuration of the bucket's data retention policy for how long objects in the bucket should be retained.<br> - retention\_policy\_is\_locked: Set if policy is locked.<br> - retention\_policy\_period\_days: Set the period of days for log retention. Default: 30.<br>- versioning: (Optional) Toggles bucket versioning, ability to retain a non-current object version when the live object version gets replaced or deleted.<br>- force\_destroy: When deleting a bucket, this boolean option will delete all contained objects.<br><br>Destination Storage options example:<pre>storage_options = {<br> logging_sink_name = "sk-c-logging-bkt"<br> logging_sink_filter = ""<br> storage_bucket_name = "bkt-org-logs"<br> location = "US"<br> force_destroy = false<br> versioning = false<br>}</pre> | `map(any)` | `null` | no |
## Outputs

| Name | Description |
|------|-------------|
| bigquery\_destination\_name | The resource name for the destination BigQuery. |
| logbucket\_destination\_name | The resource name for the destination Log Bucket. |
| pubsub\_destination\_name | The resource name for the destination Pub/Sub. |
| storage\_destination\_name | The resource name for the destination Storage. |