GCP Logging Sink

A Logging Sink in Google Cloud Platform (GCP) is a routing rule that selects log entries with a user-defined filter and exports them to a chosen destination such as BigQuery, Cloud Storage, Pub/Sub, or another Cloud Logging bucket. Sinks are the building blocks of GCP's Log Router and are used to retain, analyse, or stream logs outside the originating project, folder, or organisation.
Official documentation: https://cloud.google.com/logging/docs/export
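Conceptually, a sink is just a named pair of filter and destination. The sketch below is illustrative only (not a client-library call): the field names mirror the LogSink resource, and the helper simply classifies a destination URI by its documented service prefix.

```python
def destination_service(destination: str) -> str:
    """Classify a sink destination by its service prefix."""
    prefixes = {
        "bigquery.googleapis.com/": "bigquery",
        "storage.googleapis.com/": "storage",
        "pubsub.googleapis.com/": "pubsub",
        "logging.googleapis.com/": "logging",
    }
    for prefix, service in prefixes.items():
        if destination.startswith(prefix):
            return service
    raise ValueError(f"unrecognised sink destination: {destination}")


# A minimal sink definition: route ERROR-and-above entries to BigQuery.
# Project and dataset names are made up for the example.
sink = {
    "name": "error-logs-to-bq",
    "filter": "severity >= ERROR",
    "destination": "bigquery.googleapis.com/projects/my-project/datasets/error_logs",
}
```

The destination prefix determines which of the dependency types described below applies to a given sink.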

Supported Methods

  • GET: Get GCP Logging Sink by "gcp-logging-sink-name"
  • LIST: List all GCP Logging Sink items
  • SEARCH

gcp-big-query-dataset

If the sink's destination is a BigQuery table, it must reference a BigQuery dataset where the tables will be created and written to. The dataset therefore appears as a child dependency of the logging sink.
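A BigQuery-backed sink names its dataset directly in the destination URI, using the format Cloud Logging documents for BigQuery destinations. A small sketch (project and dataset IDs are placeholders):

```python
def bigquery_destination(project_id: str, dataset_id: str) -> str:
    """Build the documented destination URI for a BigQuery-backed sink."""
    return f"bigquery.googleapis.com/projects/{project_id}/datasets/{dataset_id}"


# Example: a sink writing into the (hypothetical) "audit_logs" dataset.
dest = bigquery_destination("my-project", "audit_logs")
```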

gcp-iam-service-account

Every sink is assigned a writer_identity, which is an IAM service account that needs permission to write into the chosen destination. The sink's correct operation depends on this service account having the required roles on the target resource.
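The role the writer_identity needs varies by destination type. The mapping below lists the roles Cloud Logging's export documentation names for each destination; the helper function itself is an illustrative sketch, not part of any GCP library.

```python
# Role each destination type requires on the target resource,
# per the Cloud Logging export documentation.
REQUIRED_ROLE = {
    "bigquery": "roles/bigquery.dataEditor",
    "storage": "roles/storage.objectCreator",
    "pubsub": "roles/pubsub.publisher",
    "logging": "roles/logging.bucketWriter",
}


def required_role_for(destination: str) -> str:
    """Return the IAM role the sink's writer_identity needs on its destination."""
    service = destination.split(".googleapis.com/", 1)[0]
    return REQUIRED_ROLE[service]
```

Granting the role on the destination (rather than project-wide) keeps the writer identity's access minimal.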

gcp-logging-bucket

A sink can route logs to another Cloud Logging bucket (including aggregated buckets at the folder or organisation level). In this case the sink targets, and must have write access to, the specified logging bucket.
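Logging-bucket destinations differ from the others in that they are location-scoped. A sketch of the documented URI format (all identifiers are placeholders):

```python
def logging_bucket_destination(project_id: str, location: str, bucket_id: str) -> str:
    """Build the documented destination URI for a Cloud Logging bucket sink."""
    return (
        f"logging.googleapis.com/projects/{project_id}"
        f"/locations/{location}/buckets/{bucket_id}"
    )


# Example: routing into a (hypothetical) centralised bucket in "global".
dest = logging_bucket_destination("central-project", "global", "org-audit")
```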

gcp-pub-sub-topic

When the destination is Pub/Sub, the sink exports each matching log entry as a message on a particular topic. The topic therefore represents an external linkage for onward streaming or event-driven processing.
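For Pub/Sub destinations, the sink targets a specific topic using the documented URI format. A minimal sketch (project and topic names are made up):

```python
def pubsub_destination(project_id: str, topic_id: str) -> str:
    """Build the documented destination URI for a Pub/Sub-backed sink."""
    return f"pubsub.googleapis.com/projects/{project_id}/topics/{topic_id}"


# Example: streaming matching entries to a (hypothetical) "log-events" topic.
dest = pubsub_destination("my-project", "log-events")
```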

gcp-storage-bucket

For archival purposes a sink may export logs to a Cloud Storage bucket. The bucket must exist and grant the sink's writer service account permission to create objects, making the storage bucket a direct dependency of the sink.
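Unlike the other destination types, a Cloud Storage destination names only the bucket, with no project path, since bucket names are globally unique. A sketch of the documented format (the bucket name is a placeholder):

```python
def storage_destination(bucket_name: str) -> str:
    """Build the documented destination URI for a Cloud Storage-backed sink."""
    return f"storage.googleapis.com/{bucket_name}"


# Example: archiving matching entries to a (hypothetical) archive bucket.
dest = storage_destination("my-log-archive")
```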