Using AWS SQS with a Node.js application

Posted on November 10, 2021 at 08:11 AM


Simple Queue Service (SQS) is a fully managed message queuing service from Amazon. SQS eliminates the complexity and overhead of operating message-oriented middleware and lets developers focus on differentiating work. With SQS, you can send, store, and receive messages without losing them. Amazon SQS can be combined with other AWS services such as Redshift, DynamoDB, RDS, EC2, ECS, Lambda, and S3 to make distributed applications more scalable and reliable.

SQS pricing

Amazon SQS is a pay-per-use web service.

  • Pay only for what you use
  • No minimum fee

See the official Amazon SQS pricing page for current rates.


More About SQS

SQS offers two types of message queues.

  1. Standard queue
  2. FIFO queue

Standard Queue

  1. Nearly Unlimited Throughput: Standard queues support a nearly unlimited number of transactions per second (TPS) per API action.
  2. At-Least-Once Delivery: A message is delivered at least once, but occasionally more than one copy of a message is delivered.
  3. Best-Effort Ordering: Occasionally, messages might be delivered in an order different from the order in which they were sent. See the official documentation for details.
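Because of at-least-once delivery, a standard-queue consumer should be idempotent, meaning a redelivered copy of a message must not repeat its side effects. A minimal sketch of one common approach, deduplicating on the message's MessageId (the in-memory Set here is for illustration only):

```javascript
// Sketch: an in-memory guard against duplicate deliveries. In production this
// check would live in a shared store such as DynamoDB, not process memory.
const processed = new Set();

function handleOnce(message, work) {
  if (processed.has(message.MessageId)) {
    return 'duplicate ignored'; // a redelivered copy becomes a no-op
  }
  processed.add(message.MessageId);
  work(message); // real processing happens exactly once
  return 'processed';
}

// Simulate SQS delivering the same message twice:
let count = 0;
const msg = { MessageId: 'abc-123', Body: '{"id":1}' };
console.log(handleOnce(msg, () => count++)); // processed
console.log(handleOnce(msg, () => count++)); // duplicate ignored
console.log(count); // 1
```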

FIFO Queue

  1. High Throughput: FIFO queues support up to 300 messages per second (300 send, receive, or delete operations per second), and up to 3,000 messages per second with batching of 10 messages per operation.
  2. Exactly-Once Processing: A message is delivered once and remains available until a consumer processes and deletes it. Duplicates aren't introduced into the queue.
  3. First-In-First-Out Delivery: The order in which messages are sent and received is strictly preserved (i.e. First-In-First-Out).

What queue type should I use?

You should use standard queues as long as your application can process messages that arrive out of order or more than once. For example, resizing images after upload. On the other hand, you should use FIFO queues if your application cannot tolerate duplicates or out-of-order delivery. For example, preventing a customer from being debited twice after an order is placed.
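If you do choose a FIFO queue, each sent message additionally needs a MessageGroupId (the ordering scope) and, unless content-based deduplication is enabled on the queue, a MessageDeduplicationId. A minimal sketch of the send parameters, where the queue URL and IDs are placeholders, not values from this post:

```javascript
// Sketch of FIFO-specific send parameters; the queue URL and IDs are placeholders.
const params = {
  QueueUrl: 'https://sqs.eu-west-1.amazonaws.com/account-id/orders.fifo',
  MessageBody: JSON.stringify({ orderId: 42, status: 'PLACED' }),
  MessageGroupId: 'customer-42',            // ordering is preserved within a group
  MessageDeduplicationId: 'order-42-PLACED' // repeats within 5 minutes are dropped
};
// With an AWS.SQS instance: sqs.sendMessage(params, callback);
console.log(params.QueueUrl.endsWith('.fifo')); // FIFO queue names must end in .fifo
```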

Example application in Node.js

What we will do:

  1. Create a standard SQS Queue on AWS.
  2. Create a Producer(producer.js) to send messages to the queue.
  3. Create a Consumer(consumer.js) to consume messages from the queue.

Let’s begin.

Create Standard SQS Queue

Queue Prerequisites

  1. Create an Amazon Web Service account
  2. Create an IAM user
  3. Get your access key ID and secret access key.

See the official documentation for more details.

Steps to create a standard queue

  1. Log in to your AWS account
  2. In the AWS Management Console, search for and select Simple Queue Service
  3. Click “Create queue” under the Get started heading
  4. Type “TestQueue” as the queue name
  5. Click “Create queue” at the bottom of the page. We will use the default settings.
  6. Copy the queue URL. It will be used later.
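The console steps above could also be done programmatically with the SDK's createQueue call. A sketch of the equivalent request, where the attribute values shown are the console defaults; the call itself is commented out because it needs valid credentials:

```javascript
// Sketch only, not run against AWS here: creating the same queue with aws-sdk.
// The attribute values are the console defaults used in the steps above.
const params = {
  QueueName: 'TestQueue',
  Attributes: {
    VisibilityTimeout: '30',          // seconds a received message stays hidden
    MessageRetentionPeriod: '345600'  // 4 days, in seconds
  }
};
// const sqs = require('./sqs');
// sqs.createQueue(params, (err, data) => console.log(err || data.QueueUrl));
console.log(params.QueueName); // TestQueue
```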

See the official documentation for more details.

Create the Node.js application

Application Prerequisites

  1. Knowledge of ES6 standards
  2. Code editor (I’ll be using VSCode)
  3. Get the following from AWS:
    1. AWS access key id
    2. AWS secret access key
    3. AWS region

Steps to create the application

  1. Run the following commands in a terminal:
    1. mkdir queueServices
    2. cd queueServices
    3. npm i aws-sdk sqs-consumer
  2. Open the folder in VSCode
  3. Create these files:
    1. sqs.js
    2. producer.js
    3. consumer.js
  4. Code snippets:

sqs.js

```javascript
// sqs.js: shared SQS client. The region and credentials were left blank in
// the original; here they are read from environment variables.
const AWS = require('aws-sdk');

AWS.config.update({
  region: process.env.AWS_REGION,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

const sqs = new AWS.SQS({ apiVersion: '2012-11-05' });

module.exports = sqs;
```

producer.js

```javascript
// producer.js: sends a message to the queue. Wrapping the callback in a
// Promise lets callers await the result.
const sqs = require('./sqs');

function sendMessage(id, status) {
  return new Promise((resolve, reject) => {
    const params = {
      QueueUrl: process.env.QUEUE_URL, // the queue URL copied earlier (left blank in the original)
      MessageBody: JSON.stringify({ id, status })
    };
    sqs.sendMessage(params, function (err, result) {
      if (err) reject('Message not sent to the queue');
      else resolve('Message sent to the queue');
    });
  });
}

module.exports = sendMessage;
```

Call the method sendMessage(id, status) to create a new message in the queue.
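When the producer has many messages to send, up to 10 can go out in a single request with sendMessageBatch, which also costs less than 10 individual sendMessage calls. A sketch, assuming a hypothetical QUEUE_URL environment variable (not part of the original post):

```javascript
// Sketch: batching up to 10 messages per request with sendMessageBatch.
// QUEUE_URL is a placeholder environment variable, an assumption of this sketch.
const entries = [
  { id: 'img-1', status: 'UPLOADED' },
  { id: 'img-2', status: 'UPLOADED' }
].map((msg, i) => ({
  Id: String(i),                    // batch entry id, unique within one request
  MessageBody: JSON.stringify(msg)
}));

const params = { QueueUrl: process.env.QUEUE_URL, Entries: entries };
// With the sqs client from sqs.js: sqs.sendMessageBatch(params, callback);
console.log(entries.length); // 2
```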

consumer.js

```javascript
// consumer.js: long-polls the queue and processes messages.
const { Consumer } = require('sqs-consumer');
const sqs = require('./sqs');

const app = Consumer.create({
  queueUrl: 'https://sqs.eu-west-1.amazonaws.com/account-id/queue-name',
  sqs, // reuse the client configured in sqs.js
  handleMessage: async (message) => {
    // do some work with `message`
  }
});

app.on('error', (err) => {
  console.error(err.message);
});

app.on('processing_error', (err) => {
  console.error(err.message);
});

console.log('Consumer service is running');
app.start();
```

Run node consumer.js to start polling the queue for messages.

  • The queue is polled continuously for messages using long polling.
  • Messages are deleted from the queue once the handler function has completed successfully.
  • If you use handleMessageBatch, messages processed successfully should be deleted manually; throwing an error sends the remaining messages back to the queue.
  • Throwing an error (or returning a rejected promise) from the handler function will cause the message to be left on the queue. An SQS redrive policy can be used to move messages that cannot be processed to a dead-letter queue.
  • By default, messages are processed one at a time: a new message won't be received until the current one has been processed. To process messages in parallel, use the batchSize option.
  • By default, the Node.js HTTP/HTTPS agent creates a new TCP connection for every request (see the AWS SQS documentation). To avoid the cost of establishing new connections, pass an SQS instance configured with an agent that has keepAlive: true.
