Issue #1: Scalable Video Processing for Frame Extraction

If you work with video, you know how difficult and time-consuming video processing is. Now assume you run a video portal that receives thousands of videos per minute at peak time, and you want to extract frames from each video so that you can show frame previews in the slider of your custom video player. If it takes around 5 minutes at full CPU capacity to extract a sufficient number of frames from a single video, it's easy to see you'll need an architecture that scales in both computing and storage.

In this short article I'll show how you can build a solution for this problem in the AWS cloud in minutes.

The Trigger

Let's start with the trigger. We assume the portal application uses S3 as its scalable cloud storage. We can easily configure a Lambda microservice that is invoked automatically when a new video is uploaded to an S3 bucket. The trigger event provides the bucket name and object key of the video to the Lambda function, which in turn calls the processing layer. Below is a simple Lambda function for this task: it reads the bucket and key from the event and calls an API with an HTTP GET request.


const http = require('http');

// Endpoint of the processing API running on EC2 (set to your own URL).
const APIURL = process.env.API_URL;

exports.handler = (event, context, callback) => {

    const bucket = event.Records[0].s3.bucket.name;

    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));

    http.get(APIURL + "?bucket=" + encodeURIComponent(bucket) + "&key=" + encodeURIComponent(key), function(res) {
        callback(null, 'Triggered extraction for ' + bucket + '/' + key);
    }).on('error', callback);
};


The Processing Layer

Lambda microservices are not well suited to long-running processes, so for the heavy lifting we used EC2 servers and scaled them out and in with Auto Scaling Groups. You can use any server-side development stack on EC2; in this example I've used Node.js. Here's what I did within the API:

Get the bucket and key from the request parameters and download the video into a temporary folder for processing:

    var AWS = require('aws-sdk');
    var fs = require('fs');
    var path = require('path');

    var s3 = new AWS.S3();

    var options = {
        Bucket: req.query.bucket,
        Key: req.query.key
    };

    var writeStream = fs.createWriteStream(req.query.key);
    var fileStream = s3.getObject(options).createReadStream();
    var folderPath = "videosource/frames." + path.basename(req.query.key);

    fileStream.pipe(writeStream);
    writeStream.on('finish', function() {
    ... }


To extract the frames from the video I've used an NPM module named node-frame-extractor. You can also easily develop your own on top of FFMPEG. Here's how I did it:

var Extractor = require('node-frame-extractor/lib/extract-frames');

var inst = new Extractor({
    inputVideo: './' + req.query.key,
    videoId: path.basename(req.query.key),
    mode: 'all',
    s3Bucket: 'extractortestbucket',
    pushToCloud: true,
    extractAllFrames: true,
    awsProfile: 'default'
});

inst.extract([250, 333, 432, 456, 502], function(err, result) {
... }

This node module uses FFMPEG (via command-line execution) to extract all or selected frames from the video, stores them in a local folder, and uploads them to an S3 bucket.

To run this operation asynchronously and in parallel, you can use the NPM async module.
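The async module's eachLimit is a natural fit: run the extraction over many videos with a bounded number in flight at once. The sketch below hand-rolls an equivalent so it runs with no extra dependency; processVideo is a hypothetical stand-in for the download-and-extract logic above.

```javascript
// Mirrors async.eachLimit from the NPM async module: run the iteratee
// over every item with at most `limit` operations in flight at once.
function eachLimit(items, limit, iteratee, done) {
    var next = 0, running = 0, finished = 0, failed = false;
    function launch() {
        while (running < limit && next < items.length) {
            running++;
            iteratee(items[next++], function(err) {
                running--; finished++;
                if (failed) return;
                if (err) { failed = true; return done(err); }
                if (finished === items.length) return done(null);
                launch();
            });
        }
    }
    if (items.length === 0) return done(null);
    launch();
}

var processed = [];

// Hypothetical stand-in for the real download-and-extract work.
function processVideo(key, cb) {
    setImmediate(function() {
        processed.push(key);
        cb(null);
    });
}

// Extract frames for three uploads, at most two at a time, so a burst
// of uploads does not saturate the instance CPU.
eachLimit(['a.mp4', 'b.mp4', 'c.mp4'], 2, processVideo, function(err) {
    if (!err) console.log('all videos processed:', processed.length);
});
```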

I used the NPM express module to expose this function as a REST API:

var express = require('express')
var app = express()

app.get('/', function(req, res) {
... //function logic here.
})

app.listen(3000, function() {
    console.log('Video frame extractor app listening on port 3000!')
})

Do not forget to delete the temporary video and frame images from your local filesystem after the upload is finished.

Once the EC2 instance is ready with the REST service, all you need to do is configure the Auto Scaling Group and Elastic Load Balancer. I'll explain these in another short article.


After the cloud upload is finished, you can push a message to SQS, send a notification via SNS, or trigger another Lambda function, depending on what you want to do with the frame images.
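For the SQS case, a minimal sketch: the snippet below only builds the message (the queue URL and field names are hypothetical); the aws-sdk call that would actually send it is shown in the trailing comment.

```javascript
// Hypothetical queue URL; replace with your own.
var QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/123456789012/frame-events';

// Build the SQS message announcing that the frames for a video are ready.
function buildFrameMessage(bucket, key, frameCount) {
    return {
        QueueUrl: QUEUE_URL,
        MessageBody: JSON.stringify({
            bucket: bucket,
            key: key,
            frameCount: frameCount
        })
    };
}

// With the aws-sdk it would then be sent along these lines:
//   var sqs = new AWS.SQS();
//   sqs.sendMessage(buildFrameMessage(bucket, key, count), callback);
```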

In my next article, I'll explain how you can create a scalable cloud IVR with AWS Lambda, API Gateway and Twilio in 60 minutes.
