Monday, May 14, 2018

How to rob a bank: no servers - just a ballpoint pen!

Okay, let's face it: this article has nothing to do with robbery, banks or, heck, ballpoint pens; but it's a good attention grabber (hopefully!), thanks to Chef Horst of Gusteau's. (Apologies if that broke your heart!)

Rather, this is about getting your own gossip feed—one that sends you the latest and the hottest within minutes of them going public—with just an AWS account and a web browser!

Maybe not as exciting as a bank robbery, but still worth reading on—especially if you're a gossip fan and like to always have an edge over the rest of your buddies.

Kicking out the server

Going with the recent hype, we will be using serverless technologies for our mission. You guessed it, there's no server involved. (But, psst, there is one!)

Let's go with AWS, which offers an attractive Free Tier in addition to a myriad of rich serverless utilities: CloudWatch scheduled events to trigger our gossip hunt, DynamoDB to store gossips and track changes, and SNS-based SMS to dispatch new gossips right to your mobile!

And the best part is: you will be doing everything—from defining entities and composing lambdas to building, packaging and deploying the whole set-up—right inside your own web browser, without ever having to open up a single tedious AWS console!

All of it made possible thanks to Sigma, the brand new truly serverless IDE from SLAppForge.

Sigma: Think Serverless!

The grocery list

First things first: sign up for a Sigma account, if you haven't already. All it takes is an email address, an AWS account (which comes with that cool free tier, if you're a new user!), a GitHub account (also free) and a good web browser. We have a short-and-sweet writeup to get you started within minutes; and we will probably come up with a nice video as well, pretty soon!

A project is born

Once you are in, create a new project (with a catchy name to impress your buddies—how about GossipHunter?). The Sigma editor will create a template lambda for you, and we can start right away.

GossipHunter at t = 0

Nurtured with <3 by NewsAPI

As my gossip source, I picked the Entertainment Weekly API by newsapi.org. Their API is simple and straightforward, and a free signup with just an email address gets you an API key with 1000 requests per day! If you have a favourite source of your own, feel free to swap out just the API request part of the code (coming up soon!); the rest should work just fine.
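
For reference, a top-headlines response from NewsAPI looks roughly like the following (trimmed, with placeholder values); the only fields we will actually use are status, plus the title, description and url of each article:

  {
    "status": "ok",
    "totalResults": 10,
    "articles": [
      {
        "source": { "id": "entertainment-weekly", "name": "Entertainment Weekly" },
        "title": "Some juicy headline",
        "description": "A one-line teaser of the story",
        "url": "https://ew.com/some/juicy/story/",
        "publishedAt": "2018-05-14T12:34:56Z"
      }
    ]
  }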

The recipe

Our lambda will periodically pull data from this API, compare the results with what we already know (stored in DynamoDB), and send out SMS notifications (via SNS) to your phone number (or email, or whatever other preferred medium SNS offers) for any previously unseen (hence "hot") results. We will store every newly seen topic in DynamoDB, so that we don't send out the same gossip over and over again.
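
In other words, the logic we are about to assemble boils down to something like this (just a sketch of the flow; the actual code follows below):

  // on each scheduled run:
  //   fetch the latest headlines from NewsAPI
  //   for each article:
  //     if its URL is already in the 'gossips' DynamoDB table -> skip it
  //     otherwise:
  //       send the headline out as an SNS SMS
  //       save the URL into the 'gossips' table so it never gets sent again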

(By the way, if you have access to a gossip API that actually emits/notifies you of latest updates (e.g. via webhooks) rather than us having to poll for and filter them, you can use a different, more efficient approach such as configuring an API Gateway trigger and pointing the API webhook to the trigger endpoint.)
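
For the curious, such a push-based variant could look roughly like the sketch below: API Gateway hands the webhook payload to the lambda, which dispatches it straight away, with no polling, scheduling or DynamoDB "memory" involved. (The payload shape and the phone number here are purely hypothetical.)

  const AWS = require('aws-sdk');
  const sns = new AWS.SNS();

  exports.handler = (event, context, callback) => {
    // with an API Gateway proxy integration, the webhook payload arrives as a string in event.body
    let article = JSON.parse(event.body);

    sns.publish({
      Message: `${article.title}\n${article.url}`,
      PhoneNumber: '+15551234567' // your number; better yet, an environment variable
    }).promise()
      .then(() => callback(null, { statusCode: 200, body: 'OK' }))
      .catch(err => callback(err));
  };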

Okay, let's chop away!

The wake-up call(s)

First, let's drag a CloudWatch entry from the left Resources pane and configure it to fire our lambda; to prevent distractions during working hours, we will configure it to run every 15 minutes, only from 7 PM (when you are back from work) to midnight, and from 5 AM to 8 AM (before you head back to work). This can be easily achieved through a New, Schedule-type trigger that uses a cron expression such as 0/15 5-7,19-23 ? * MON-FRI *. (Simply paste 0/15, 5-7,19-23 (no spaces) and MON-FRI into the Minutes, Hours and Day of Week fields, and type a ? under Day of Month.)

CloudWatch Events trigger: weekdays

But wait! The real value of gossip is certainly in the weekend! So let's add (drag, drop, configure) another trigger to run GossipHunter all day (5 AM - midnight!) over the weekend; just another cron with 0/10 (every ten minutes this time! we need to be quick!) in Minutes, 5-23 in Hours, ? in Day of Month and SAT,SUN in Day of Week.
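
For the record, the assembled schedule expression for this weekend trigger (AWS cron fields run minute, hour, day-of-month, month, day-of-week, year) would be:

  0/10 5-23 ? * SAT,SUN *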

CloudWatch Events trigger: weekends

Okay, time to start coding!

Grabbing the smoking hot stuff

Let's first fetch the latest gossips from the API. The request module could do this for us in a heartbeat, so let's go get it: click the Add Dependency button on the toolbar, type in request and click Add once our subject appears in the list:

'Add Dependency' button

Now for the easy part:

  request.get(`https://newsapi.org/v2/top-headlines?sources=entertainment-weekly&apiKey=your-api-key`,
  (error, response, body) => {

    callback(null,'Successfully executed');
  })

Gotta hide some secrets?

Wait! The apiKey parameter: do I really have to hard-code its value? Since all of this will probably end up in GitHub (yup, you guessed right!), won't that compromise my token?

We also had the same question; and that's exactly why, just a few weeks ago, we introduced the environment variables feature!

Go ahead, click the Environment Variables ((x)) button, and define a KEY variable (associated with our lambda) holding your API key. This value will be available to your lambda at runtime, but it will not be committed into your source; you simply provide the value during your first deployment after opening the project. And so can any of your colleagues (with their own API keys, of course!) when they get jealous and want to try out their own copy of your GossipHunter!

Defining the 'KEY' environment variable

(Did I mention that your friends can simply grab your GossipHunter's GitHub repo URL—once you have saved your project—and open it in Sigma right away, and deploy it on their own AWS account? Oh yeah, it's that easy!)

Cool! Okay, back to business.

Before we forget it, let's append process.env.KEY to our NewsAPI URL:

  request.get(`https://newsapi.org/v2/top-headlines?sources=entertainment-weekly&apiKey=${process.env.KEY}`,

And extract out the gossips list, with a few sanity checks:

  (error, response, body) => {
    if (error) {
      return callback(error);
    }
    let result = JSON.parse(body);
    if (result.status !== "ok") {
      return callback('NewsAPI call failed!');
    }
    result.articles.forEach(article => {

    });

    callback(null,'Successfully executed');
  })

Sifting out the not-so-hot

Now the tricky part: we have to compare these against the gossips that we have already dispatched, and keep only the truly "new" ones, i.e. filter out anything that has already been sent.

For starters, we shall maintain a DynamoDB table gossips to retain the gossips that we have dispatched, serving as GossipHunter's "memory". Whenever a "new" gossip (i.e. one that is not already in our table) is encountered, we shall send it out via SNS (the Simple Notification Service) and add it to our table so that we won't send it out again. (Later on we can improve our "memory" to "forget" (delete) old entries so that it doesn't keep growing indefinitely; but for the moment, let's not worry about that.)

What's that, Dynamo-DB?

For the DynamoDB table, simply drag a DynamoDB entry from the resources pane into the editor, right into the forEach callback. Sigma will show you a pop-up where you can define your table (without a round trip to the DynamoDB dashboard!) and the operation you intend to perform on it. Right now we need to query the table for the gossip in the current iteration, so we can set it up by

  • entering gossips into the Table Name field and url for the Partition Key,
  • selecting the Get Document operation, and
  • entering @{article.url} (note the familiar, ${}-like syntax?) as the Partition Key value for the operation.

Your brand new DynamoDB table 'gossips' with a 'Get Document' operation

      result.articles.forEach(article => {
        ddb.get({
          TableName: 'gossips',
          Key: { 'url': article.url }
        }, function (err, data) {
          if (err) {
            //handle error
          } else {
            //your logic goes here
          }
        });

      });

In the callback, let's check if DynamoDB found a match (ignoring any failed queries):

        }, function (err, data) {
          if (err) {
            console.log(`Failed to check for ${article.url}`, err);
          } else {
            if (data.Item) {  // match found, meaning we have already saved it
              console.log(`Gossip already dispatched: ${article.url}`);
            } else {

            }
          }
        });

Compose (160 characters remaining)

In the nested else block (when we cannot find a matching gossip), we prepare an SMS-friendly gossip text: the title, plus the description and URL if we can stuff them in (remember the 160-character limit?). (Later you can tidy things up by throwing in a URL shortener and so on, but for the sake of simplicity, I'll pass.)

            } else {
              let titleLen = article.title.length;
              let descrLen = article.description.length;
              let urlLen = article.url.length;

              let gossipText = article.title;
              if (gossipText.length + descrLen < 160) {
                gossipText += "\n" + article.description;
              }
              if (gossipText.length + urlLen < 160) {
                gossipText += "\n" + article.url;
              }

Hitting "Send"

Now we can send out our gossip as an SNS SMS. For this,

  • drag an SNS entry from the left pane into the editor, right after the last if block,
  • select Direct SMS as the Resource Type,
  • enter your mobile number into the Mobile Number field,
  • populate the SMS text field with @{gossipText},
  • type in GossipHuntr as the Sender ID (unfortunately the sender ID cannot be longer than 11 characters, but it doesn't really matter since it is just the text message sender's name; besides, GossipHuntr is more catchy, right? :)), and
  • click Inject.

But...

Wait! What would happen if your best buddy grabbed your repo and deployed it? Their gossips would also start flowing into your phone!

Perhaps a clever trick would be to extract out the phone number into another environment variable, so that you and your best buddy can pick your own numbers (and part ways, still as friends) at deployment time. So click the (x) again and add a new PHONE variable (with your phone number), and use it in the Mobile Number field instead as (you guessed it!) @{process.env.PHONE}:

Behold: gossip SMSs are on their way!

            } else {
              let titleLen = article.title.length;
              let descrLen = article.description.length;
              let urlLen = article.url.length;

              let gossipText = article.title;
              if (gossipText.length + descrLen < 160) {
                gossipText += "\n" + article.description;
              }
              if (gossipText.length + urlLen < 160) {
                gossipText += "\n" + article.url;
              }

              sns.publish({
                Message: gossipText,
                MessageAttributes: {
                  'AWS.SNS.SMS.SMSType': {
                    DataType: 'String',
                    StringValue: 'Promotional'
                  },
                  'AWS.SNS.SMS.SenderID': {
                    DataType: 'String',
                    StringValue: 'GossipHuntr'
                  },
                },
                PhoneNumber: process.env.PHONE
              }).promise()
                .then(data => {
                  // your code goes here
                })
                .catch(err => {
                  // error handling goes here
                });
            }

(In case you got overexcited and clicked Inject before reading the but... part, chill out! Dive right into the code, and change the PhoneNumber parameter under the sns.publish(...) call to process.env.PHONE; ta da!)

Tick it off, and be done with it!

One last thing: for this whole contraption to work properly, we also need to save the "new" gossip in our table. Since you have already defined the table during the query operation, you can simply drag it from under the DynamoDB list on the resources pane (click the down arrow on the DynamoDB entry to see the table definition entry); drop it right under the SNS SDK call, select Put Document as the operation, and configure the new entry as url = @{article.url} (by clicking the Add button under Values and entering url as the key and @{article.url} as the value).

Dragging the existing DynamoDB table in; for our last mission

Adding a 'sent' marker for the 'hot' gossip that we just texted out

                .then(data => {
                  ddb.put({
                    TableName: 'gossips',
                    Item: { 'url': article.url }
                  }, function (err, data) {
                    if (err) {
                      console.log(`Failed to save marker for ${article.url}`, err);
                    } else {
                      console.log(`Saved marker for ${article.url}`);
                    }
                  });
                })
                .catch(err => {
                  console.log(`Failed to dispatch SMS for ${article.url}`, err);
                });
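
(By the way, when you eventually get around to the "forgetting" improvement mentioned earlier, one low-effort option would be DynamoDB's Time-to-Live feature: enable TTL on an attribute of the gossips table (say, expires; the attribute name and the one-week retention below are just examples) and stamp each saved marker with an expiry timestamp, after which DynamoDB quietly purges the item:)

  ddb.put({
    TableName: 'gossips',
    Item: {
      'url': article.url,
      // epoch seconds, one week from now; DynamoDB's TTL sweeper removes the item some time after this
      'expires': Math.floor(Date.now() / 1000) + 7 * 24 * 60 * 60
    }
  }).promise();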

Time to polish it up!

Since we'd be committing this code to GitHub, let's clean it up a bit (all your buddies would see this, remember?) and throw in some comments:

let AWS = require('aws-sdk');
const sns = new AWS.SNS();
const ddb = new AWS.DynamoDB.DocumentClient();
let request = require('request');

exports.handler = function (event, context, callback) {

  // fetch the latest headlines
  request.get(`https://newsapi.org/v2/top-headlines?sources=entertainment-weekly&apiKey=${process.env.KEY}`,
    (error, response, body) => {

      // early exit on failure
      if (error) {
        return callback(error);
      }
      let result = JSON.parse(body);
      if (result.status !== "ok") {
        return callback('NewsAPI call failed!');
      }

      // check each article, processing if it hasn't been already
      result.articles.forEach(article => {
        ddb.get({
          TableName: 'gossips',
          Key: { 'url': article.url }
        }, function (err, data) {
          if (err) {
            console.log(`Failed to check for ${article.url}`, err);
          } else {
            if (data.Item) {  // we've seen this previously; ignore it
              console.log(`Gossip already dispatched: ${article.url}`);

            } else {
              let titleLen = article.title.length;
              let descrLen = article.description.length;
              let urlLen = article.url.length;

              // stuff as much content into the text as possible
              let gossipText = article.title;
              if (gossipText.length + descrLen < 160) {
                gossipText += "\n" + article.description;
              }
              if (gossipText.length + urlLen < 160) {
                gossipText += "\n" + article.url;
              }

              // send out the SMS
              sns.publish({
                Message: gossipText,
                MessageAttributes: {
                  'AWS.SNS.SMS.SMSType': {
                    DataType: 'String',
                    StringValue: 'Promotional'
                  },
                  'AWS.SNS.SMS.SenderID': {
                    DataType: 'String',
                    StringValue: 'GossipHuntr'
                  },
                },
                PhoneNumber: process.env.PHONE
              }).promise()
                .then(data => {
                  // save the URL so we won't send this out again
                  ddb.put({
                    TableName: 'gossips',
                    Item: { 'url': article.url }
                  }, function (err, data) {
                    if (err) {
                      console.log(`Failed to save marker for ${article.url}`, err);
                    } else {
                      console.log(`Saved marker for ${article.url}`);
                    }
                  });
                })
                .catch(err => {
                  console.log(`Failed to dispatch SMS for ${article.url}`, err);
                });
            }
          }
        });
      });

      // notify AWS that we're good (no need to track/notify errors at the moment)
      callback(null, 'Successfully executed');
    })
}

All done!

3, 2, 1, ignition!

Click Deploy on the toolbar, which will set a chain of actions in motion: first the project will be saved (committed to your own GitHub repo, with a commit message of your choosing), then built and packaged (fully automated!) and finally deployed into your AWS account (giving you a chance to review the deployment summary before it is executed).

deployment progress

Once the progress bar hits the end and the deployment status says CREATE_COMPLETE (or UPDATE_COMPLETE in case you missed a spot and had to redeploy), GossipHunter is ready for action!

Houston, we're GO!

Until your DynamoDB table is primed (i.e. populated with the gossips currently in circulation), you will receive a burst of gossip texts. After that, whenever a new gossip comes up, you will receive it on your mobile within a matter of minutes!

All thanks to the awesomeness of serverless and AWS, and Sigma that brings it all right into your web browser.

Friday, March 9, 2018

No more running around the block: Lambda-S3 thumbnailer, nailed by SLAppForge Sigma!

In case you hadn't noticed already, I have been recently babbling about the pitfalls I suffered when trying to get started with the official AWS lambda-S3 example. While the blame for most of those silly mistakes is on my own laziness, overconfidence and lack of attention to detail, I personally felt that getting started with a leading serverless provider should not have been that hard.

banging head against the wall

And so did my team at SLAppForge. And they built Sigma to make it a reality.

Sigma logo

(Alert: the cat is out of the bag!)

Let's see what Sigma can do to make your serverless life easy.

how Sigma works

Sigma already comes with a ready-made version of the S3 thumbnailing sample. Deploying it should take just a few minutes, as per the Readme, if you dare.

In this discussion, let's take a more hands-on approach: grabbing the code from the original thumbnailing sample, pasting it into Sigma, and deploying it into AWS—the exact same thing that got me running around the block, the last time I tried.

As you may know, Sigma manages much of the "behind the scenes" stuff regarding your app—including function permissions, trigger configurations and related resources—on your behalf. This relies on certain syntactic guidelines being followed in the code, which—luckily—are quite simple and ordinary. So all we have to do is to grab the original source, paste it into Sigma, and make some adjustments and drag-and-drop configuration stuff—and Sigma will understand and handle the rest.

If you haven't already, now is a great time to sign up for Sigma so that we could start inspiring you with the awesomeness of serverless. (Flattery aside, you do need a Sigma account in order to access the IDE.) Have a look at this small guide to get going.

Sigma: create an account

Once you're in, just copy the S3 thumbnail sample code from AWS docs and shove it down Sigma's throat.

S3 thumbnail code pasted into Sigma

The editor, which would otherwise have been rather plain and boring, now starts showing some specks of interesting stuff, especially on the left border of the editor area.

operation and trigger indicators on left border

The lightning sign at the top (against the function header with the highlighted event variable) indicates a trigger; an invocation (entry) point for the lambda function. While this is not a part of the function itself, it should nevertheless be properly configured, with the necessary source (S3 bucket), destination (lambda function) and permissions.

trigger indicator: red (unset)

Good thing is, with Sigma, you only need to indicate the source (S3 bucket) configuration; Sigma will take care of the rest.

At this moment the lightning sign is red, indicating that a trigger has not been configured. Simply drag an S3 entry from the left pane onto the above line (the function header) to indicate to Sigma that this lambda should be triggered by an S3 event.

dragging S3 entry

As soon as you do the drag-and-drop, Sigma will ask you about the missing pieces of the puzzle: namely the S3 bucket that should be the trigger point for the lambda, and the nature of the operation that should trigger it, which, in our case, is the "object created" event for image files.

S3 trigger pop-up

When it comes to specifying the source bucket, Sigma offers you two options: you could either

  • select an existing bucket via the drop-down list (Existing Bucket tab), or
  • define a new bucket name via the New Bucket tab, so that Sigma would create it afresh as part of the project deployment.

Since the "image files" category involves several file types, we would need to define multiple triggers for our lambda, each corresponding to a different file type. (Unfortunately S3 triggers do not yet support patterns for file name prefixes/suffixes; if they did, we could have gotten away with a single trigger!) So let's first define a trigger for JPG files by selecting "object created" as the event and entering ".png" as the suffix, and drag, drop and configure another trigger with ".jpg" as the suffix—for, you guessed it, JPG files.

S3 trigger for PNG files

There's a small thing to remember when you select the bucket for the second trigger: even if you entered a new bucket name for the first trigger, you have to select that same, already-defined bucket from the "Existing Bucket" tab for the second trigger, rather than providing the bucket name again as a "new" bucket. The reason is that Sigma keeps track of each newly-defined resource (since it has to create the bucket at deployment time); if you define a new bucket twice, Sigma would get "confused" and the deployment may not go as planned. To mitigate the ambiguity, we mark newly defined buckets as "(New)" when we display them under the existing buckets list (such as my-new-bucket (New) for a newly added my-new-bucket), at least for now, until we find a better alternative; if you have a cool idea, feel free to chip in!

selecting new S3 bucket from existing buckets list

Now both triggers are ready, and we can move on to operations.

S3 trigger list pop-up with both triggers configured

You may have already noticed two S3 icons on the editor's left pane, somewhat below the trigger indicator, right against the s3.getObject and s3.putObject calls. The parameter blocks of the two operations would also be highlighted. This indicates that Sigma has identified the API calls and can help you by automatically generating the necessary bells and whistles to get them working (such as execution permissions).
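
(For reference, in the pasted AWS sample those two calls look roughly like the excerpts below, where srcBucket/srcKey, dstBucket/dstKey, data and contentType are variables the sample derives inside its async.waterfall steps:)

  // download the image from the source bucket (the "get" operation Sigma spotted)
  s3.getObject({ Bucket: srcBucket, Key: srcKey }, next);

  // ...and upload the resized image to the destination bucket (the "put" operation)
  s3.putObject({
    Bucket: dstBucket,
    Key: dstKey,
    Body: data,
    ContentType: contentType
  }, next);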

S3 operation highlighted

Click on the first icon (against s3.getObject) to open the operation edit pop-up. All we have to do here is to select the correct bucket name for the Bucket parameter (again, ensure that you select the "(New)"-prefixed bucket on the "existing" tab, rather than re-entering the bucket name on the "new" tab) and click Update.

S3 getObject operation pop-up

Similarly, with the second icon (s3.putObject), select a destination bucket. Because we haven't yet added or played around with a destination bucket definition, here you will be adding a fresh bucket definition to Sigma; hence you can either select an existing bucket or name a new bucket, just like in the case of the first trigger.

S3 putObject operation pop-up

Just one more step: adding the dependencies.

While Sigma lets you add third-party dependencies to your project, it does need to know the name and version of each dependency at build time. Since we copied and pasted an alien block of code into the editor, we have to tell Sigma separately about the dependencies it uses, so that it can bundle them along with our project sources. Just click the "Add Dependency" button on the toolbar, search for the dependency and click "Add"; all the added dependencies (along with two defaults, aws-sdk and @slappforge/slappforge-sdk) will appear on the dependencies drop-down under the "Add Dependency" button.

Add Dependency button with dependencies drop-down

In our case, keeping with the original AWS sample guidelines, we have to add the async (for waterfall-style execution flow) and gm (for GraphicsMagick) dependencies.
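
(If you're wondering where those come from, the pasted sample declares them at the top of the file, roughly like so:)

  var async = require('async');
  var gm = require('gm').subClass({ imageMagick: true }); // GraphicsMagick/ImageMagick wrapper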

adding async dependency

Done!

Now all that remains is to click the Deploy button on the IDE toolbar, to set the wheels in motion!

First, Sigma will save (commit) the app source to your GitHub repo, so be sure to provide a nice commit message when Sigma asks you for one :) You can pick your favourite repo name too, and Sigma will create the repo if it does not exist. (However, Sigma has a known glitch when an "empty" repo (i.e. one that does not have a master branch) is encountered; so if you have a brand new repo, make sure that you have at least one commit on the master branch. The easiest way is to create a Readme, which can be done with one click at repo creation.)

commit dialog

Once saving is complete, Sigma will automatically build your project, and open up a deployment summary pop-up showing everything that it would deploy to your AWS account with regard to your brand new S3 thumbnail generator. Some of the names will look gibberish, but they will generally reflect the type and name of the deployed resource (e.g. s3MyAwesomeBucket may represent a new S3 bucket named my-awesome-bucket).

build progress in status bar

deployment changes summary

Review the list (if you dare) and click Deploy. The deployment mechanism will kick in, displaying a live progress bar (and a log view showing the changes taking place in the underlying CloudFormation stack of your project).

deployment in progress

Once the deployment is complete, your long-awaited thumbnail generator lambda is ready for testing! Just upload a JPG or PNG file to the source bucket you chose (via the S3 console, or via an aws s3 cp if you are more like me), and marvel at the thumbnail that would pop up in your destination bucket within a matter of seconds!
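
(If you prefer the CLI route, something along these lines should do; the bucket names are, of course, whatever you picked earlier:)

  aws s3 cp ./my-photo.jpg s3://my-source-bucket/my-photo.jpg
  aws s3 ls s3://my-destination-bucket/   # resized-my-photo.jpg should show up here shortly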

If you don't see anything interesting in the destination bucket (after a small wait), you can find out what went wrong by checking the lambda's execution logs, just like with any other lambda; we know it's painful to go back to the AWS consoles to do this, and we hope to find a cooler alternative to that as well, pretty soon.

If you want to make the generated thumbnail public (as I said in my previous article, what good is a private thumbnail?), you don't have to run around reading IAM docs, updating IAM roles and pulling your hair out; simply click the S3 operation edit icon against the s3.putObject call, set the "ACL to apply to the object" parameter to public-read from the drop-down, and click "Deploy" to go through another save-build-deploy cycle. (We are already working on speeding up these "small change" deployments, so bear with us for now :) ) Once the new deployment is complete, you can view any newly generated thumbnail by simply entering the URL http://<bucketname>.s3.amazonaws.com/resized-<original image name> into your favourite web browser and pressing Enter!
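
(Under the hood, that drop-down selection simply adds an ACL: 'public-read' entry to the s3.putObject parameter block, roughly like this:)

  s3.putObject({
    Bucket: dstBucket,
    Key: dstKey,
    Body: data,
    ContentType: contentType,
    ACL: 'public-read'   // the bit that makes the thumbnail publicly readable
  }, next);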

making thumbnails public: S3 pop-up

Oh, and if you run into anything unusual (a commit/build/deployment failure, a strange error, or a bug with Sigma itself), don't forget to ping us via Slack, or post an issue on our public issue tracker; you can do it right within the IDE, using the "Help" → "Report an Issue" menu item. Same goes for any improvements or cool features that you would like to see in Sigma in the future: faster builds and deployments, the ability to download the build/deployment artifacts, a shiny new set of themes, whatever. Just let us know, and we'll add it to our backlog and give it a try in the not-too-distant future!

Okay folks, time to go back and start playing with Sigma, while I write my next blog post! Stay tuned for more from SLAppForge!