
Friday, February 28, 2020

Set Up Your Free Serverless Webhook - in Minutes!

Get started on your serverless journey. Right here. Right now.

Note: This article is a few months out of date; after all, these days, who would want to deploy a serverless webhook from the Google Cloud dashboard, when you can do the same with just a few clicks - on the world's best serverless IDE?!


Often you need to set up an HTTP/S endpoint (webhook) to accept data posted from another application or service, such as GitHub webhooks. Here is a quick way to set one up, without having to run, pay for, or maintain a server of your own. (Hence the term "serverless webhook".)

We'll stick to Google Cloud Platform; it's quick to register if you already have a Google account (which I guess you do ;)), and it's totally free. You do have to provide a credit/debit card (all cloud platforms require one); but unless your endpoint receives huge traffic, you will be completely covered by the free tier. Plus, you receive $300 in free credits to try out any of the other cool Google Cloud Platform services.

Let's assume you want a webhook that accepts POST requests on the path /webhook.

We need two main things:

  • an HTTP endpoint that accepts the data, and
  • a compute entity (Google Cloud Function) to consume and process the data

Create a new Cloud Platform Project

If you haven't already done so,

  • Click on the project name drop-down on the header, and then New Project.

create new project

  • Provide a name for your project (or let Google auto-generate one for you).

naming your project

  • Click Create. Google will start creating your project; it could take a few seconds. You can check the status via the notification drop-down (bell icon) on the page header.

Cloud Console notifications drop-down

  • When the project is ready, you will be taken to the project dashboard.

project created; we're on the dashboard

Sign up for Google Cloud Functions

Cloud Functions menu item

  • Since you are probably new to Cloud Functions, the dashboard will first ask you to enable the Cloud Functions API. (If not, you can skip the next few steps.)

Cloud Functions: sign-up page; 'Cloud Functions API not enabled'

If you already have a billing account configured, you can simply select it and proceed. Otherwise, add your card details here and proceed. (To repeat: this is mostly a formality, and your project will be totally free.)

filling in your card details

  • Once your card is confirmed, you'll end up back on the Cloud Functions dashboard.

Create a new function

  • Click Create Function. The Create Function page will open up.

Cloud Functions enabled; now you can create a function

  • Provide a name for your function; this will also be the last segment of the webhook URL path, so I will name mine webhook. You also need to select a runtime; I chose NodeJS 6.
  • Pick HTTP as the Trigger Type.

function configuration

Write the code

  • Click Next. You'll be taken to a page where you can edit the function code.

function code editor

  • Now you can write the custom logic for handling the webhook request. The request will be available via the req parameter as an Express.js Request object.

After handling the request, you can respond via the res parameter, which is an Express.js Response object:

res.send("success!");

If you want to use external (NPM) dependencies, switch to the package.json tab and define them under a dependencies entry, as usual.
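Putting it all together, a minimal handler for our webhook could look like the following (just a sketch; the actual processing logic is up to you):

exports.webhook = (req, res) => {
  // accept only POST requests on our webhook
  if (req.method !== "POST") {
    return res.status(405).send("Only POST is accepted!");
  }

  // for JSON payloads, req.body arrives already parsed into an object
  console.log("Received payload:", JSON.stringify(req.body));

  // ... your custom processing logic goes here ...

  res.status(200).send("success!");
};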

Deploy it

  • When done, click Create. You'll be taken back to the dashboard.

back in the dashboard; function is being created

You'll see your function listed in the previously-empty list, with a spinner in front. Wait till it changes to a green check mark - indicating that the function is live.

Once the function is live, your webhook is ready!


Test it

To test what you just built,

  • Open an HTTP client (e.g. Postman), and set the URL to https://<region>-<project-name>.cloudfunctions.net/<function-name> (e.g. https://us-central1-myscellanius.cloudfunctions.net/webhook). You can also find the URL through the Trigger tab:

    function details: 'Triggers' tab

  • Set the request method to POST, if it isn't already.
  • Paste the payload you want to send, and send the request.
  • You should receive the response generated by your cloud function. (See below if you'd rather script the test.)
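Scripting the test takes just a few lines of NodeJS (a sketch; the hostname below is from our example, so substitute your own):

const https = require("https");

const payload = JSON.stringify({ message: "Hello!" });

const req = https.request({
  hostname: "us-central1-myscellanius.cloudfunctions.net",
  path: "/webhook",
  method: "POST",
  headers: { "Content-Type": "application/json" }
}, res => {
  let body = "";
  res.on("data", chunk => (body += chunk));
  res.on("end", () => console.log(res.statusCode, body));
});

req.write(payload);
req.end();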

You can also use the built-in testing feature of the Cloud Functions dashboard to directly invoke your function with a suitable payload:

testing your function from the Cloud Functions dashboard

Checking the logs

If you receive an error, or would like to see any logs generated by the function, you can use the View Logs command on the ellipsis drop-down of the dashboard entry to visit the full-blown StackDriver logging dashboard.

For test invocations, the logs are displayed right below the Output pane:

logs for the function


What's next?

This was quick and easy; but it can become a headache to switch between dashboards and manually upload code bundles whenever your handler logic changes.

Using a proper deployment tool would save you time, and also allow you to keep your cloud resources grouped together. For example, you may need to incorporate a Cloud Storage bucket or a Pub/Sub topic into your logic; in that case it would be quite easy to deploy everything automatically as one unit, instead of managing each resource via its own dashboard.

And in case you didn't know, that tool is already here: create function, write code, add dependencies; and save, build and deploy with one button click!

Friday, November 29, 2019

Google Cloud has the fastest build now - a.k.a. "boo, AWS!"

SLAppForge Sigma cloud IDE - for an ultra-fast serverless ride! (courtesy of Pexels)

Building a deployable serverless code artifact is a key functionality of our SLAppForge Sigma cloud IDE - the first-ever, completely browser-based solution for serverless development. The faster the serverless build, the sooner you can get along with the deployment - and the sooner you get to see your serverless application up and running.

AWS was "slow" too - but then came Quick Build.

In the early days, back when we supported only AWS, our mainstream build was driven by CodeBuild. This had several drawbacks: it usually took 10-20 seconds for a build to complete, and it was rather repetitive - cloning the repo and downloading dependencies each time. Plus, you only get 100 free build minutes per month, so it added a cost - though a small one - to ourselves, as well as to our users.

Then we noticed that we only needed to modify the previous build artifact in order to get the new code rolling. So I wrote a "quick build": basically a Lambda that downloads the last build artifact, updates the zipfile with the changed code files, and re-uploads it as the current artifact. This was accompanied by a "quick deploy" that directly updates the code of the affected functions, thereby avoiding the overhead of a complete CloudFormation deployment.
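In essence, the quick build Lambda did something along these lines (a rough sketch using aws-sdk and jszip; the event shape and names are purely illustrative):

const AWS = require("aws-sdk");
const JSZip = require("jszip");

const s3 = new AWS.S3();

exports.handler = async (event) => {
  // event: { bucket, key, changedFiles: [{ name, code }, ...] } - illustrative
  const { bucket, key, changedFiles } = event;

  // download the last build artifact
  const artifact = await s3.getObject({ Bucket: bucket, Key: key }).promise();

  // patch only the changed code files into the zip
  const zip = await JSZip.loadAsync(artifact.Body);
  changedFiles.forEach(f => zip.file(f.name, f.code));

  // re-upload it as the current artifact
  const body = await zip.generateAsync({ type: "nodebuffer" });
  await s3.putObject({ Bucket: bucket, Key: key, Body: body }).promise();

  return { Bucket: bucket, Key: key };
};

The accompanying "quick deploy" was little more than a wrapper around Lambda's updateFunctionCode API call, pointing each affected function at the fresh artifact.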

Then our ex-Adroit wizard Chathura built a test environment, and things changed drastically. The test environment (basically a warm Lambda, replicating the content of the user's project) already had everything - all code files and dependencies - pre-loaded. Now "quick build" was just a matter of zipping everything up from within the test environment itself and uploading it to S3: just one network call instead of two.

GCP build - still in the stone age?

When we introduced GCP support, the build was again based on their Cloud Build, a.k.a. Container Builder, service. Although GCP did offer 3600(!) free build minutes per month (120 each day; see what I'm talking about, AWS?), theirs was generally slower than CodeBuild. So, for several months, Sigma's GCP support had the bad reputation of having the slowest build-deployment cycle.

But now, it is no longer the case.

Wait, what? It only needs code - no dependencies?

There's a very interesting characteristic of Cloud Functions:

When you deploy your function, Cloud Functions installs dependencies declared in the package.json file using the npm install command.

-Google Cloud Functions: Specifying dependencies in Node.js

This means that, to deploy, you just have to upload a zipfile containing the sources and a dependency manifest (package.json, requirements.txt and the like). No more npm install, and no more megabyte-sized bundle uploads.
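For instance, a NodeJS function's archive could contain nothing more than the source files and a package.json like the one below; Cloud Functions would pull down lodash at deployment time (the content here is just an example):

{
  "name": "webhook",
  "version": "1.0.0",
  "dependencies": {
    "lodash": "^4.17.15"
  }
}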

But, the coolest part is...

... you can do it completely within the browser!

jszip FTW!

That awesome jszip package does it all for us, in just a couple of lines:

let zip = new JSZip();

files.forEach(file => zip.file(file.name, file.code));

/*
a bit more complex, actually - e.g. for a nested file 'folder/file'
zip.folder(folder.name).file(file.name, file.code)
*/

let data = await zip.generateAsync({
 type: "string",
 streamFiles: true,
 compression: "DEFLATE"
});

We just zip up all code files in our project, plus the Node/npm package.json and/or Python/pip requirements.txt...

...and upload them to a Cloud Storage bucket:

let bucket = "your-bucket-name";
let key = "path/to/upload";

gapi.client.request({
 path: `/upload/storage/v1/b/${bucket}/o`,
 method: "POST",
 params: {
  uploadType: "media",
  name: key
 },
 headers: {
  "Content-Type": "application/zip",
  "Content-Encoding": "base64"
 },
 body: btoa(data)
}).then(res => {
 console.debug("GS upload successful", res);

 return {
  Bucket: res.result.bucket,
  Key: res.result.name
 };
});

Now we can add the Cloud Storage object path into our Deployment Manager template right away!

...
{
 "name": "goofunc",
 "type": "cloudfunctions.v1beta2.function",
 "properties": {
  "function": "goofunc",
  "sourceArchiveUrl": "gs://your-bucket-name/path/to/upload",
  "entryPoint": ...
 }
}
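And kicking off the deployment itself is just one more REST call via the same gapi client (a sketch based on the Deployment Manager v2 REST API; the project and templateContent variables, and the deployment name, are illustrative):

gapi.client.request({
 path: `/deploymentmanager/v2/projects/${project}/global/deployments`,
 method: "POST",
 body: {
  name: "sigma-deployment",
  target: {
   config: {
    // the Deployment Manager template - including our function - as a string
    content: templateContent
   }
  }
 }
}).then(res => console.debug("deployment started", res));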

So, how fast is it - for real?

  1. jszip runs in-memory and takes just a few millis - as expected.
  2. If it's the first time after the IDE is loaded, the Google APIs JS client library takes a few seconds to load.
  3. After that, it's a single Cloud Storage API call - to upload our teeny tiny zipfile into our designated Cloud Storage bucket sigma-slappforge-{your Cloud Platform project name}-build-artifacts!
  4. If the bucket is not yet available, and the upload fails as a result, we have two more steps - create the bucket and then re-run the upload. This happens only once in a lifetime.

So for a routine serverless developer (skipping steps 2 and 4), the whole process takes just around one second - and the faster your network, the faster it all is!

Compared to AWS builds - where we first have to run a dependency sync and then a build, each preceded by HTTP OPTIONS requests thanks to CORS restrictions - this is lightning fast!

(And yeah, this is one of those places where the googleapis client library shines - high above aws-sdk.)

Enough reading - let's roll!

I am a Google Cloud fan by nature - perhaps because my "online" life started with Gmail, and my "cloud dev" life started with Google Apps Script and App Engine. So I'm certainly biased here.

Still, when you really think about it, Google Cloud is way simpler and far more organized than AWS. While this could be a disadvantage when it comes to advanced serverless apps - say, "how do I trigger my Cloud Function periodically?" - GCF is pretty simple, easy and fast. Very much so, when all you need is a serverless HTTP endpoint (webhook) or a bucket/queue consumer up and running in a few minutes.

And, when you do that with Sigma IDE, that few minutes could even drop down to a matter of seconds - thanks to the brand new quick build!

So, why waste time reading this - when you can just go and do it right away?!

Friday, April 5, 2019

Google Cloud Functions: a (looong overdue) "hello, world!" on GCP

In 2018, Google Cloud Functions went GA. Some time before that - in March 2018, while it was still in beta - I took some screenshots during my first shot at Cloud Functions. But then one thing led to another, and to another, and another, and this blog post was never born.

Now, after a whole year of downright obsolescence, I present to you my "getting started with Google Cloud Functions [beta]" guide.

Surely a lot has changed: the beta tag is gone, and hordes of new features have been introduced - the Python runtime, environment variables, in-built test invocations and logs, and a lot more that I haven't even seen yet.

Activate billing on your GCP account

Now, if you rush to the Cloud Functions dashboard, you may notice that you need to enable the Cloud Functions API - unless you have done so already, with your currently active GCP project.

'Cloud Functions' link on the GCP console's left menu

'Cloud Functions API not enabled'

Cloud Functions needs a billing-enabled GCP project, so that's the first thing we need to set up. Google's official guide is fairly easy to follow.

filling in payment details for your billing account

Google promises that we won't be charged during our 1-year, $300 free trial, so we're covered here.

Create a cloud function

When done, head over to the Cloud Functions dashboard and click Create function.

a fresh Cloud Functions dashboard (back from the beta days)

This will take you to a Create function wizard. The first phase would resemble:

GCP Create Function: configuration phase 1

Here you define most of the basics of your cloud function: name, max memory limit, trigger, and source code.

Later on, you get the chance to define the handler/entrypoint (the NodeJS function to invoke when the Cloud Function is hit), and other settings like the function's deployment region and timeout (maximum running time per request).

GCP Create Function: name (handler), region and timeout

Trigger options

Triggers can invoke your cloud function in response to external actions: active ones like HTTP requests, or passive ones like events from Cloud Storage buckets or Cloud Pub/Sub topics.

On platforms like AWS this may not make much of a difference: Lambda configures and handles both event types in the same way. But in GCP they are handled quite differently - so differently that the cloud function's method signature itself is different.

If you try switching between the trigger types, you would see how the sample code under Source code changes:

HTTP functions

Google generates the signature:

/**
 * Responds to any HTTP request that can provide a "message" field in the body.
 *
 * @param {!Object} req Cloud Function request context.
 * @param {!Object} res Cloud Function response context.
 */
exports.helloWorld = (req, res) => {
  // Example input: {"message": "Hello!"}
  if (req.body.message === undefined) {
    // This is an error case, as "message" is required.
    res.status(400).send('No message defined!');
  } else {
    // Everything is okay.
    console.log(req.body.message);
    res.status(200).send('Success: ' + req.body.message);
  }
};

An HTTP function accepts a request and writes back to a response.

The HTTP(S) endpoint is automatically provisioned by Google, at https://{region}-{project-id}.cloudfunctions.net/{function-name}. So there is nothing more to configure in terms of triggers.

Event-based functions

Google's signature looks like:

/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 *
 * @param {!Object} event The Cloud Functions event.
 * @param {!Function} callback The callback function.
 */
exports.subscribe = (event, callback) => {
  // The Cloud Pub/Sub Message object.
  const pubsubMessage = event.data;

  // We're just going to log the message to prove that
  // it worked.
  console.log(Buffer.from(pubsubMessage.data, 'base64').toString());

  // Don't forget to call the callback.
  callback();
};

Event-based functions accept an event and convey success/failure (and optionally a result) via a callback.

Here we need to configure an event source (Pub/Sub topic or Storage bucket) to trigger the function.

configuring a Cloud Storage bucket to trigger a function

The good thing is, GCP allows you to pick an existing topic/bucket, or create a new one, right there inside the cloud function wizard page.

defining a new Cloud Storage bucket via the 'Create new bucket' pop-up

defining a new Cloud Pub/Sub topic via the 'Create new topic' pop-up

Picking an existing entity is just as easy:

selecting an existing Cloud Storage bucket

Automatic retry

These non-HTTP functions also have a retry mechanism: you can configure GCP to automatically redeliver an event to the function, if the function failed to process it on the previous attempt.

Cloud Function retry configuration

This is good for riding out temporary failures, but it can be dangerous if the error is due to a bug in your code: GCP will keep on retrying the failing event for up to 7 days, draining your quotas and growing your bill.
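A common safeguard is to give up on events that have been failing for too long - for example, by checking the event's age before processing (a sketch, assuming the incoming event carries a timestamp field, as Pub/Sub events do):

const MAX_EVENT_AGE = 60 * 60 * 1000; // give up after one hour

exports.subscribe = (event, callback) => {
  const age = Date.now() - Date.parse(event.timestamp);
  if (age > MAX_EVENT_AGE) {
    console.error("Dropping event; too old:", event.timestamp);
    return callback(); // report success, so GCP stops retrying
  }

  // ... actual processing; pass an error to callback() on genuine failures ...
  callback();
};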

A function is born

Now we are done with the configurations; click Create.

GCP takes you back to the function dashboard, where the new function will be listed with a "pending"/"creating" status.

Cloud Functions dashboard, while your new function gets created

It may take a while, but eventually you will see the successfully created function in Active status:

Cloud Function created and in 'Active' status

Back in the beta days, I got some sporadic errors when trying to create functions, for no apparent reason. Hopefully nobody is that unlucky these days.

create function: 'Request failed with unknown error'

It is a good practice to label your functions, so you can find and manage them easily. GCP allows you to do this right from the dashboard.

labelling your Cloud Functions, right from the dashboard

Function actions

Click the three-dots button at the far right end of the function entry, to see what you can do next:

Cloud Function actions

Back in the beta days, there were some hiccups with some of these options (like Copy Function); hopefully they are long gone now!

Copy Function: 'no source code location' error

Testing

Test Function gives a nice interface where you can invoke the function with a custom payload, and view the output and execution logs right away. However, it still lacks the ability to define and reuse predefined custom test events, as in AWS or Sigma. Another caveat is that test invocations hit the same production function instance (unlike, say, the test environment of Sigma), so they count towards the logs and statistics of your actual function.

'Test Function' tab of the function detailed view

Test result for the standard HTTP function sample

Logs

View Logs takes you to the familiar StackDriver Logging page, where you can browse, sort, search, stream and do all sorts of things with your function's logs. It takes a few seconds for the latest logs to appear, as is the case with other platforms as well.

'View Logs' takes you to StackDriver Logging

More function details

You can click the function entry to see more details:

  • General tab shows a nice stats graph, along with basic function configs like runtime, memory etc.

'General' tab of our new function

  • Trigger tab shows the trigger config of the function.

'Triggers' tab with an HTTP trigger config

As of now, GCP doesn't allow you to edit/change the trigger after you have created the function; we sincerely hope this will be relaxed in the future!

a read-only trigger config

  • Source tab has the familiar code viewer (although you cannot directly update and deploy the code from there). It also has a Download zip button for the code archive.

'Source' tab with function code

  • The Testing tab we have already seen. It's also pretty neat, for something that went GA just a few months ago.

So, that's what a cloud function looks like.

Or, to be precise, how it used to be - back in the pre- and post-beta days.

I'm sure Google will catch up in the serverless race - with more event sources, languages, monitoring and so forth.

But do we need to wait? Absolutely not.

Cloud Functions is mature enough for most of your routine integrations. One major bummer is that it doesn't yet support timer schedules, but folks are already using workarounds.
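For example, a popular one is to have an external scheduler (App Engine cron, or the shiny new Cloud Scheduler) publish "ticks" to a Pub/Sub topic, and hang your function off that topic (a sketch; wiring up the scheduler and topic is up to you):

exports.scheduled = (event, callback) => {
  // invoked whenever the scheduler publishes a tick to our topic
  console.log("Timer tick at", new Date().toISOString());

  // ... your periodic logic goes here ...

  callback();
};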

Plus, many of the leading serverless development frameworks are already supporting GCP!

So hop in - write your own serverless success story on GCP!