I’m a big fan of Google’s Cloud Build, and not just as a CI/CD pipeline. I use it as a cheap, on-demand serverless Docker process with easy access to shell scripting.
On the lowest tier, the first 120 build-minutes per day are free, making it perfect for small projects and utility tasks.
You can start a Cloud Build as a cron job from Cloud Scheduler, or in reaction to events such as files being uploaded to a Google Cloud Storage bucket. In both cases, a Cloud Function needs to act as the intermediary.
You’d create a new Service Account and credentials in your GCP project, and include the key file in the Cloud Function upload package.
Here’s an example Node.js script that references the GCP credentials in a separate file called gcp-creds.json.
const {auth} = require('google-auth-library');

// Point the auth library at the service-account key bundled with the function
process.env.GOOGLE_APPLICATION_CREDENTIALS = 'gcp-creds.json';

async function triggerBuild() {
  const client = await auth.getClient({
    scopes: 'https://www.googleapis.com/auth/cloud-platform'
  });
  const projectId = await auth.getProjectId();
  const url = `https://cloudbuild.googleapis.com/v1/projects/${projectId}/builds`;

  const buildSteps = {
    steps: [
      {
        id: 'action-step',
        name: 'gcr.io/cloud-builders/gcloud',
        entrypoint: 'sh',
        args: ['-c', 'echo "A Cloud Build Step"\n']
      }
    ],
    timeout: '86400s'
  };

  // client.request uses gaxios, which serializes the `data` option as JSON
  const res = await client.request({url, method: 'POST', data: buildSteps});
  return res.data;
}
For security, I’d recommend iterating further: store the GCP credentials encrypted with Cloud KMS instead of shipping them in plaintext inside the upload package.
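A sketch of what that could look like, decrypting the key file at cold start with the @google-cloud/kms client. The project, key-ring, and key names, and the `gcp-creds.json.enc` filename, are placeholders of my own; the require is done lazily so the KMS dependency is only loaded when decryption actually runs.

```javascript
// Decode a decrypted plaintext buffer into a credentials object.
// Pure helper, testable without calling GCP.
function parseCredentials(plaintextBuffer) {
  return JSON.parse(plaintextBuffer.toString('utf8'));
}

// Decrypt the encrypted key file with Cloud KMS.
// Key ring / key names below are hypothetical placeholders.
async function loadCredentials() {
  const fs = require('fs');
  const {KeyManagementServiceClient} = require('@google-cloud/kms');
  const kms = new KeyManagementServiceClient();
  const name = kms.cryptoKeyPath('my-project', 'global', 'my-keyring', 'creds-key');
  const ciphertext = fs.readFileSync('gcp-creds.json.enc');
  const [result] = await kms.decrypt({name, ciphertext});
  return parseCredentials(result.plaintext);
}

module.exports = {parseCredentials, loadCredentials};
```

The decrypted object can then be handed to google-auth-library directly (via `auth.fromJSON`) rather than going through the GOOGLE_APPLICATION_CREDENTIALS file path.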