Add the package as a dev dependency:
npm install -D @space48/cloud-seed
Note: Versions prior to 4.1.2 were published to GitHub Packages. Starting with version 4.1.2, all versions are published to npm, as per the Space48 package publishing guidelines. If you need to install an older version (< 4.1.2), you'll need to configure npm to use GitHub Packages.
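If you do need an older version from GitHub Packages, a `.npmrc` along these lines scopes `@space48` packages to the GitHub registry. This is a sketch: the token environment variable name is an assumption, and you should follow GitHub's authentication docs for your own setup.

```ini
# Route @space48-scoped packages to GitHub Packages (only needed for versions < 4.1.2).
@space48:registry=https://npm.pkg.github.com
# Authenticate with a GitHub personal access token that has the read:packages scope.
//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}
```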
Please see CONTRIBUTING.md for guidelines on how to contribute to this project.
For information about the release process, please see RELEASING.md.
Define your first cloud function:
src/myFirstFunction.ts
import { HttpFunction } from "@google-cloud/functions-framework";
import { GcpConfig } from "@space48/cloud-seed";
const myFunction: HttpFunction = (req, res) => {
console.log("Hello World");
return res.sendStatus(200);
};
export default myFunction;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "http",
public: true,
};
Then run your build command:
npx @space48/cloud-seed build
To apply the Terraform config, run the following:
terraform -chdir=[buildDir] init
terraform -chdir=[buildDir] plan -out=plan
terraform -chdir=[buildDir] apply plan
You can start the local development server for a function by running:
npx @space48/cloud-seed run src/myFirstFunction.ts --env environment
The development server will build the function and expose it on http://localhost:3000/ so you can trigger it with an HTTP request. Any change to the function's source code triggers a rebuild and a server restart, so you can keep testing without manually running any additional commands.
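As a sketch, you can exercise the running function from a small script. The port and path assume the defaults above; adjust them if you pass `--port`.

```typescript
// Build the URL the development server exposes a function on.
// 3000 is the documented default port; override it when using --port.
function devServerUrl(port = 3000, path = "/"): string {
  return `http://localhost:${port}${path}`;
}

// Trigger the function with a plain HTTP request, e.g. from a test script.
async function triggerLocal(port = 3000): Promise<number> {
  const response = await fetch(devServerUrl(port));
  return response.status; // 200 if the function ran successfully
}
```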
Tips and tricks:
- The development server does not require access to a cloud project to run the function locally. However, if your function code accesses any cloud resources, such as message queues or databases, you will need to authenticate with the cloud to allow it. For example, you can authenticate with GCP by running the `gcloud auth login` command in the gcloud CLI.
- The development server runs a single cloud function in isolation. If you want to test multiple functions, you can run multiple development servers on different ports using the `--port` option. Keep in mind that this won't automatically connect the functions and allow them to interact with each other. Your function code will have to be aware that it is running in development and adjust the endpoints it uses accordingly. You can use the `CLOUD_SEED_ENVIRONMENT` environment variable to write conditional code for running in the development environment.
- Any cloud function can be triggered by an HTTP request, regardless of its type. However, some function types may require the request data to follow a specific format or use a specific encoding. Refer to the official documentation for the cloud service you're using for details on how to format your function input.
- The development server automatically loads the environment variables you have defined in `cloudseed.json` for the provided environment and exposes them to the function. You can use this to configure a dedicated development environment, with appropriate configuration, to be shared by anyone running the project locally. Additionally, any environment variables with the same names defined in your shell environment will override the values from the configuration file. This is useful for temporarily modifying the configuration during testing without editing the `cloudseed.json` file.
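As a sketch of the conditional-endpoint tip above: the environment name checked and the endpoint URLs are hypothetical placeholders, assuming `CLOUD_SEED_ENVIRONMENT` holds the name passed via `--env`.

```typescript
// CLOUD_SEED_ENVIRONMENT lets function code detect which environment it is
// running in (assumed here to hold the environment name passed via --env).
// The environment name and endpoint URLs below are hypothetical examples.
function resolveApiBase(environment: string | undefined): string {
  return environment === "local"
    ? "http://localhost:3001" // e.g. another function served with --port 3001
    : "https://api.example.com";
}

const apiBase = resolveApiBase(process.env.CLOUD_SEED_ENVIRONMENT);
```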
Each Cloud Function you wish to define can live anywhere within the `src` directory; how you structure this is up to you.
The only thing you need to define for your function to work is the named export `runtimeConfig`. Take a look at the `RuntimeConfig` type to see what options you have (that's in `runtime.ts`).
import { HttpFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const myFunction: HttpFunction = (req, res) => {
console.log("Hello World");
return res.sendStatus(200);
};
export default myFunction;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "http",
// Configure whether the cloud function should be publicly executable.
public: true,
};
import { CloudEventFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const fn: CloudEventFunction = (data) => {
console.log("This is an event-triggered function", data);
};
export default fn;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "event",
// Setting topicName creates the topic for you.
topicName: "hello-world",
};
import { CloudEventFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const fn: CloudEventFunction = (data) => {
console.log("This is a scheduled triggered function", data);
};
export default fn;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "schedule",
schedule: "* * * * *",
};
Supported only in versions of cloud-seed with gen 2 support:
import { CloudEventFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const fn: CloudEventFunction = (data) => {
console.log("This is a scheduled triggered function, triggered directly from the scheduled job", data);
};
export default fn;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
schedule: "* * * * *",
type: "scheduledJob",
version: "gen2",
attemptDeadline: "1800s", // optional; defaults to 3 minutes
};
import { HttpFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const fn: HttpFunction = (req, res) => {
console.log(req.body);
return res.sendStatus(200);
};
export default fn;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "queue",
};
import { CloudEventFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const fn: CloudEventFunction = (data) => {
console.log("This is a firestore triggered function", data);
};
export default fn;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "firestore",
document: "collection/{doc}",
// Optional event type (defaults to 'write').
firestoreEvent: "create"
};
import { CloudEventFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const fn: CloudEventFunction = (data) => {
console.log("This is a cloud storage triggered function", data);
};
export default fn;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "storage",
bucket: {
default: "myBucket",
// optional environment-specific buckets
environmentSpecific: {
production: "myProdBucket",
...
},
},
// Optional event type (defaults to 'finalize').
storageEvent: "finalize"
};
You can override any value in `runtimeConfig` for a specific environment by including an `environmentOverrides` object. For example:
import { GcpConfig } from "@space48/cloud-seed";
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "http",
public: true,
minInstances: 0,
environmentOverrides: {
production: {
minInstances: 1,
},
},
};
If your function needs access to resources on a VPC network (e.g. a back-office service over an IPSec VPN), you can configure it to route egress traffic through a VPC. All VPC settings are grouped under the `vpc` key in `runtimeConfig`.
Note: VPC resources (connectors, networks, subnets) must already exist in your GCP project. Cloud Seed will configure the function to use them but will not create them. For gen1 functions, only the `connector` option is supported; `network` and `subnet` are ignored. For gen2 functions, if `network` or `subnet` is provided, direct VPC egress is used and the `connector` setting is ignored. Direct VPC egress requires `@cdktf/provider-google` to be built against Terraform Google provider >= 7.x.
Note: These options are incompatible with static IP settings. If a function is configured to use a static IP address, it won't be able to access VPN resources.
For gen2 functions, the recommended method to connect functions to a VPC is using Direct VPC Egress. This allows functions to directly communicate with a VPC which is cheaper and more performant than using a VPC connector. To configure direct VPC egress, provide the vpc.network and/or vpc.subnet fields.
Note: gen2 functions prioritise direct VPC egress over a VPC connector. If both `vpc.network`/`vpc.subnet` and `vpc.connector` are specified, the connector setting will be ignored.
import { HttpFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const myFunction: HttpFunction = (req, res) => {
return res.sendStatus(200);
};
export default myFunction;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "http",
public: false,
version: "gen2",
vpc: {
network: "my-vpc-network",
subnet: "my-vpc-subnet",
// Optional. Defaults to "internal_only". Set to "all_traffic" to route all egress through the VPC network.
egressSettings: "all_traffic",
},
};
Route traffic through a Serverless VPC Access connector. This works for both gen1 and gen2 functions:
import { HttpFunction } from "@google-cloud/functions-framework";
import type { GcpConfig } from "@space48/cloud-seed";
const myFunction: HttpFunction = (req, res) => {
return res.sendStatus(200);
};
export default myFunction;
export const runtimeConfig: GcpConfig = {
cloud: "gcp",
type: "http",
public: false,
vpc: {
// Full resource name of an existing VPC Access connector.
connector: "projects/my-project/locations/europe-west2/connectors/my-connector",
// Optional. Defaults to "internal_only". Set to "all_traffic" to route all egress through the connector.
egressSettings: "internal_only",
},
};
You can set the Cloud Seed config by adding a `cloudseed.json` file in the project root directory. An example is provided below:
{
"$schema": "./node_modules/@space48/cloud-seed/schemas/cloudseed.schema.json",
"default": {
"cloud": {
"gcp": {
"region": "europe-west2"
}
},
"buildConfig": {
"dir": "./src",
"outDir": "./.build",
},
"secretVariableNames": [
"apiKey1",
"apiKey2"
]
},
"environmentOverrides": {
"staging": {
"cloud": {
"gcp": {
"project": "example-project-staging"
}
},
"tfConfig": {
"backend": {
"type": "gcs",
"backendOptions": {
"bucket": "example-backend-bucket",
"prefix": "path/to/staging/statefile/directory"
}
}
},
"runtimeEnvironmentVariables": {
"FOO": "Staging1",
"BAR": "Staging2"
}
},
"production": {
"cloud": {
"gcp": {
"project": "example-project-production"
}
},
"tfConfig": {
"backend": {
"type": "gcs",
"backendOptions": {
"bucket": "example-backend-bucket",
"prefix": "path/to/production/statefile/directory"
}
}
},
"runtimeEnvironmentVariables": {
"FOO": "Prod1",
"BAR": "Prod2"
}
}
}
}
This package can also be called via a JS API.
import { build, BaseConfig } from "@space48/cloud-seed";
import GoogleProvider from "@cdktf/provider-google";
import { GcsBackend, TerraformStack } from "cdktf";
import { Construct } from "constructs";
class CustomStack extends TerraformStack {
constructor(scope: Construct, id: string, options: BaseConfig) {
super(scope, id);
if (options.tfConfig.backend.type === "gcs") {
new GcsBackend(this, {
bucket: options.tfConfig.backend.backendOptions.bucket,
prefix: options.tfConfig.backend.backendOptions.prefix + "-custom-stack",
});
}
new GoogleProvider.GoogleProvider(this, "Google", {
region: options.cloud.gcp.region,
project: options.cloud.gcp.project,
});
/**
* Example infrastructure
*/
new GoogleProvider.StorageBucket(this, "CustomBucket", {
name: `my-custom-bucket-${process.env.ENVIRONMENT}`,
location: options.cloud.gcp.region.toUpperCase(),
storageClass: "STANDARD",
uniformBucketLevelAccess: true,
});
}
}
// build() is the same as per the CLI command, and returns the parsed Cloud Seed config and the app construct
const { config, app } = build({ environment: process.env.ENVIRONMENT });
new CustomStack(app, "CustomStack", config);
app.synth();
Google recommends using gen 2 functions wherever possible.
cloud-seed v1.3.6-alpha adds support for gen 2 functions and is still in the alpha stage. It is currently being used in Sneakers-n-stuff.
If you upgrade a project from an older version of cloud-seed (e.g. v1.3.0, the latest stable version) to 1.3.6-alpha, all existing cloud functions (which use older cdktf npm modules) will be deleted and recreated. This can cause problems, because deleted functions/topics/secrets can't be recreated with the same name for 7 days. Upgrading an existing cloud-seed project therefore has to be done cautiously.
For a new project, however, it is recommended to use the 1.3.6-alpha version with gen 2 function support.
The structure is similar to what is explained above, but add version: "gen2" to the runtime config:
export const runtimeConfig: GcpConfig = {
runtime: "nodejs18",
cloud: "gcp",
type: "schedule",
schedule: "0 0 * * *",
memory: 2048,
timeout: 540,
version: "gen2",
};
If a version is not given, the function will automatically be deployed as gen 1.
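For contrast, a minimal sketch of a config with no version field, which therefore deploys as gen 1:

```typescript
import type { GcpConfig } from "@space48/cloud-seed";

// No "version" field, so cloud-seed deploys this as a gen 1 function.
export const runtimeConfig: GcpConfig = {
  cloud: "gcp",
  type: "http",
  public: false,
};
```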
`python` and `make` must be installed on the build server:
- Add the following to the Bitbucket pipeline build step:
- step: &build
name: "Build"
caches:
- npm
script:
# Install Python and Make
- chmod +x ./bitbucket-pipelines/install_python_and_make.sh
- ./bitbucket-pipelines/install_python_and_make.sh
- Add a new bash script:
install_python_and_make.sh
#!/bin/bash
apt update && apt -y install python3 && apt -y install build-essential
- Enable the Eventarc API in GCP, as gen2 cloud functions are built on Cloud Run and Eventarc.
This project is licensed under the MIT License.
The authors wish to acknowledge our collaboration with the open-source Cloud Seed project by user MNahad, and that certain features in our project are derived from it.