Mateus Nava

April 26, 2022

Active Job with huge payload

by Dan Meyers (unsplash.com)
Today's problem involves Active Job with Sidekiq. We need to enqueue a job with a huge payload (more than 10 MB). Remember that when you enqueue a job with Sidekiq you actually push a new element onto a Redis queue, so if the arguments are big, that very large payload ends up in Redis. Is that a problem? It depends; in our case it was, because our Redis lives entirely in RAM, so every huge payload increases memory usage. (It's possible to use different persistence strategies for Redis.)
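
To get a sense of how much space a queued job occupies, you can inspect the raw JSON that Sidekiq stores in Redis for each entry. A minimal sketch using the Sidekiq API (the 'default' queue name is just an example):

require 'sidekiq/api'

# Each entry in the queue is a JSON string that includes the serialized
# job arguments, so its size reflects the payload you enqueued.
Sidekiq::Queue.new('default').each do |job|
  puts "#{job.klass}: #{job.value.bytesize} bytes"
end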

The solution we developed for this situation is to upload the payload to Amazon S3 and enqueue the job with just the S3 object key.

Class to use the Ruby AWS SDK (the aws-sdk-s3 gem):
require 'aws-sdk-s3'

class AwsS3Client
  BUCKET = ENV.fetch('S3_BUCKET')
  ACCESS_KEY_ID = ENV.fetch('S3_ACCESS_KEY_ID')
  SECRET_ACCESS_KEY = ENV.fetch('S3_SECRET_ACCESS_KEY')
  REGION = ENV.fetch('S3_REGION')

  attr_reader :bucket

  def initialize(bucket: BUCKET)
    @bucket = bucket
  end

  def put(name:, content:)
    client.put_object(
      bucket: bucket,
      body: content,
      key: name
    )
  end

  def delete(name)
    client.delete_object(
      bucket: bucket,
      key: name
    )
  end

  # Downloads the object to a local file at `path`
  # (the SDK writes the body to the file instead of returning it)
  def download(name:, path:)
    client.get_object(
      response_target: path,
      bucket: bucket,
      key: name
    )
  end

  private

  def client
    @client ||= Aws::S3::Client.new(sdk_options)
  end

  def sdk_options
    {
      credentials: Aws::Credentials.new(
        ACCESS_KEY_ID,
        SECRET_ACCESS_KEY
      ),
      region: REGION
    }    
  end
end


Creating a job:
s3_key = "#{SecureRandom.uuid}.json"
AwsS3Client.new.put(name: s3_key, content: my_big_payload.to_json)
MyCoolJob.perform_later(s3_key)
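
If you create these jobs in more than one place, the two steps can be wrapped in a small helper. A minimal sketch (BigPayloadEnqueuer is a hypothetical name, not part of the original solution):

class BigPayloadEnqueuer
  # Uploads the payload to S3 and enqueues the job with only the S3 key
  def self.call(job_class, payload)
    s3_key = "#{SecureRandom.uuid}.json"
    AwsS3Client.new.put(name: s3_key, content: payload.to_json)
    job_class.perform_later(s3_key)
    s3_key
  end
end

# Usage: BigPayloadEnqueuer.call(MyCoolJob, my_big_payload)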


The job:
...

def perform(s3_key)
  client = AwsS3Client.new
  # Download the payload from S3 to a local file, then parse it
  # (any writable path works; here we use the app's tmp directory)
  path = Rails.root.join('tmp', s3_key).to_s
  client.download(name: s3_key, path: path)
  data = JSON.parse(File.read(path))

  ...

  client.delete(s3_key)
end

...

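One detail of the sketch above: the file downloaded to tmp/ is not removed when the job finishes. If that matters for your disk usage, the local copy can be cleaned up in an ensure block, for example:

def perform(s3_key)
  client = AwsS3Client.new
  path = Rails.root.join('tmp', s3_key).to_s
  client.download(name: s3_key, path: path)
  data = JSON.parse(File.read(path))

  ...

  client.delete(s3_key)
ensure
  # Remove the local copy even if the job raises; the S3 object is only
  # deleted on success, so Sidekiq retries can download it again.
  File.delete(path) if path && File.exist?(path)
end
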

That's all :)