GCP AI Platform Custom Job
A Google Cloud AI Platform Custom Job (part of Vertex AI) represents an on-demand training workload that you define yourself—either by supplying a Docker image or by pointing to a Python package and associated command. Google Cloud orchestrates the required compute resources, runs the containers, streams logs, and stores the resulting artefacts. Custom Jobs allow you to train models in a fully managed manner while retaining complete control over the training logic and environment.
For full details see the official documentation: https://cloud.google.com/vertex-ai/docs/training/custom-training
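The minimal sketch below shows one way such a job can be defined and submitted with the google-cloud-aiplatform Python SDK, using a custom container image. The project, region, bucket, and image references are placeholders rather than values taken from this page.

```python
# Minimal sketch: define and run a Custom Job from a custom Docker image.
# All project, region, bucket, and image names below are placeholders.
from google.cloud import aiplatform

aiplatform.init(
    project="my-project",                      # placeholder GCP project ID
    location="us-central1",                    # placeholder region
    staging_bucket="gs://my-staging-bucket",   # required staging location
)

# worker_pool_specs describes the machines and the container to run on them.
worker_pool_specs = [
    {
        "machine_spec": {"machine_type": "n1-standard-4"},
        "replica_count": 1,
        "container_spec": {
            "image_uri": "us-docker.pkg.dev/my-project/my-repo/trainer:latest",
            "args": ["--epochs", "10"],
        },
    }
]

job = aiplatform.CustomJob(
    display_name="example-custom-job",
    worker_pool_specs=worker_pool_specs,
)

# Blocks until the job finishes; Vertex AI provisions the compute and streams logs.
job.run()
```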
Supported Methods
GET
: Get a gcp-ai-platform-custom-job by its "name"
LIST
: List all gcp-ai-platform-custom-job
SEARCH
Possible Links
gcp-ai-platform-model
A Custom Job often ends by invoking model.upload() or by specifying model_to_upload in its specification, which creates a Vertex AI Model resource. Overmind therefore links the Custom Job to any Model it produces or references.
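As a sketch of the first pattern, a training script might end with a call like the following once the trained artefacts have been written to Cloud Storage; the display name, artifact location, and serving image are illustrative placeholders.

```python
# Sketch: uploading a trained model at the end of a Custom Job's training
# script, which is one way the job produces a linked gcp-ai-platform-model.
# The artifact URI and serving image below are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

model = aiplatform.Model.upload(
    display_name="model-from-custom-job",
    artifact_uri="gs://my-output-bucket/model/",  # where training wrote the model
    serving_container_image_uri=(
        # Placeholder prebuilt serving image; any valid serving container works.
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)
print(model.resource_name)  # the Model resource the Custom Job is linked to
```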
gcp-cloud-kms-crypto-key
If customer-managed encryption keys (CMEK) are used, the Custom Job specification includes the full name of the KMS Crypto Key that encrypts its output artefacts. Overmind records this link to show the encryption dependency.
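The sketch below shows where that key name is supplied when creating a job with the Python SDK; the key ring, key, and all other identifiers are placeholders.

```python
# Sketch: enabling CMEK on a Custom Job via encryption_spec_key_name.
# All resource names below are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1",
                staging_bucket="gs://my-staging-bucket")

job = aiplatform.CustomJob(
    display_name="cmek-custom-job",
    worker_pool_specs=[{
        "machine_spec": {"machine_type": "n1-standard-4"},
        "replica_count": 1,
        "container_spec": {"image_uri": "us-docker.pkg.dev/my-project/my-repo/trainer:latest"},
    }],
    # Output artefacts of this job are encrypted with this KMS crypto key.
    encryption_spec_key_name=(
        "projects/my-project/locations/us-central1/keyRings/my-ring/cryptoKeys/my-key"
    ),
)
job.run()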
gcp-compute-network
Custom Jobs can be configured to run inside a specific VPC network for private-service-connect or restricted egress scenarios (network field). The referenced Compute Network is linked so that network isolation and reachability risks can be assessed.
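A minimal sketch, assuming the VPC has already been peered with Vertex AI via Private Service Access; the project number, network name, and other identifiers are placeholders.

```python
# Sketch: running a Custom Job inside a specific VPC network.
# Project number, network name, and all other identifiers are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1",
                staging_bucket="gs://my-staging-bucket")

job = aiplatform.CustomJob(
    display_name="vpc-custom-job",
    worker_pool_specs=[{
        "machine_spec": {"machine_type": "n1-standard-4"},
        "replica_count": 1,
        "container_spec": {"image_uri": "us-docker.pkg.dev/my-project/my-repo/trainer:latest"},
    }],
)

# `network` populates the Custom Job's network field, so the workers run
# inside this VPC rather than a Google-managed default network.
job.run(network="projects/123456789012/global/networks/my-vpc")
```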
gcp-iam-service-account
The workload executes under a user-specified service account (serviceAccount field). Overmind links the Custom Job to that IAM Service Account to surface permission inheritance and least-privilege issues.
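A minimal sketch of attaching a dedicated service account, assuming the account already holds the roles the training code needs; the account address and other identifiers are placeholders.

```python
# Sketch: running a Custom Job under a user-specified service account.
# The account address and all other identifiers are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1",
                staging_bucket="gs://my-staging-bucket")

job = aiplatform.CustomJob(
    display_name="least-privilege-custom-job",
    worker_pool_specs=[{
        "machine_spec": {"machine_type": "n1-standard-4"},
        "replica_count": 1,
        "container_spec": {"image_uri": "us-docker.pkg.dev/my-project/my-repo/trainer:latest"},
    }],
)

# `service_account` populates the Custom Job's serviceAccount field; the
# training code inherits exactly this account's IAM permissions.
job.run(service_account="trainer@my-project.iam.gserviceaccount.com")
```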
gcp-storage-bucket
Training data, output models, logs, and checkpoints are usually stored in Cloud Storage (inputUri, outputUri, stagingBucket, etc.). Overmind links the Custom Job to every referenced Storage Bucket so that data-access policies and bucket configurations can be inspected.
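The sketch below gathers the typical bucket references in a single job definition: an input bucket passed to the training code, a staging bucket for packaged code, and an output directory for models and checkpoints. Every bucket name and path is a placeholder.

```python
# Sketch: the Cloud Storage locations that commonly appear in a Custom Job.
# All bucket names and paths below are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

job = aiplatform.CustomJob(
    display_name="storage-custom-job",
    worker_pool_specs=[{
        "machine_spec": {"machine_type": "n1-standard-4"},
        "replica_count": 1,
        "container_spec": {
            "image_uri": "us-docker.pkg.dev/my-project/my-repo/trainer:latest",
            # Training data is read from one bucket...
            "args": ["--train-data", "gs://my-data-bucket/train.csv"],
        },
    }],
    # ...packaged code and temporary files are staged in another...
    staging_bucket="gs://my-staging-bucket",
    # ...and models, logs, and checkpoints are written under a third.
    base_output_dir="gs://my-output-bucket/runs/001",
)
job.run()
```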