Queuify

A fast and robust queue for Node!

Queuify Pattern!

  • Options (For whole Queuify), with a config sketch after this list

    1. [Worker Type] Sandbox / Embedded / Hybrid
    2. [Max workers] Default 0 (infinite)
    3. [Max concurrency] (default 100) The number of jobs to be processed at once by each queue
    4. [Batch Concurrency] Defaults to Max Concurrency, or 1 if max is infinite! Only needed when batches run in parallel
    5. [Run Batch in Parallel] Whether to hold off other jobs during batch execution! On enabling, total queue concurrency will be Max concurrency + Batch concurrency
    6. [Max execution time] If a sandboxed worker doesn't finish in the given time, the process will be killed (useful to stop memory leaks; make sure it's roughly 10x longer than you expect a job to take)
    7. [Connection options]
      1. [DB] Redis, DragonFly
      2. [DB Options] Underlying connection options (for ioredis), for anything we want to set specifically
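
    To make these options concrete, here is a minimal sketch of what a global configuration could look like. The option names below (workerType, maxWorkers, and so on) are assumptions mapped from this list, not a confirmed Queuify API, and where this object ultimately gets passed is not specified in this document.

       // Hypothetical global configuration shape, mapped from the options above.
       const queuifyOptions = {
         workerType: 'sandbox', // 'sandbox' | 'embedded' | 'hybrid'
         maxWorkers: 0, // 0 = infinite (the default)
         maxConcurrency: 100, // jobs processed at once by each queue (default 100)
         batchConcurrency: 100, // defaults to maxConcurrency, or 1 if max is infinite
         runBatchInParallel: true, // total concurrency becomes maxConcurrency + batchConcurrency
         maxExecutionTime: 600000, // ms; a sandboxed worker is killed past this (aim for ~10x a job's expected time)
         connection: {
           db: 'redis', // 'redis' | 'dragonfly'
           dbOptions: { host: '127.0.0.1', port: 6379 }, // underlying ioredis options
         },
       };
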
  • Options (For each queue), also sketched after this list

    1. [Worker Type] Sandbox / Embedded / Hybrid (takes priority over the global option)
    2. [Max concurrency] Default 0 (takes priority over the global option)
      1. Beware that max concurrency currently applies per worker. That means if it is set to 2 but the queue has three workers, the final concurrency for that queue will be 6!
    3. [Batch concurrency] (takes priority over the global option)
    4. [Run Batch in parallel] (takes priority over the global option)
    5. [Max execution time] (takes priority over the global option)
    6. [Heap limit] (in MB)
      1. [For Sandbox] The limit gets divided per concurrency: 2048 MB with a concurrency of 2 will allocate 1024 MB for each sandboxed processor
      2. [For Hybrid] Creates one sandboxed processor with the heap limit and processes jobs in parallel as per concurrency!
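
    As an illustration, a per-queue override might look like the following. The options argument to the Queue constructor and the option names are assumptions based on this list, not a confirmed API.

       const { Queue } = require('queuify'); // import path assumed from the repository name
       // Hypothetical per-queue overrides; these take priority over the global options.
       const emailQueue = new Queue('email-queue', {
         workerType: 'hybrid',
         maxConcurrency: 2, // applies per worker: with three workers, this queue effectively runs 6 jobs at once
         heapLimit: 2048, // MB; for sandbox workers, split per concurrency => 1024 MB per processor
       });
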
  • Workers

    1. Total three types
      1. [Sandboxes] Spawns a new Node process for the jobs; more scalable, and keeps the parent server clean of heap issues!
      2. [Embedded] Works on the same server where the processor is added, so the more jobs run in parallel, the more heap memory it consumes
      3. [Hybrid] A combination of the above two; creates one process and runs everything in parallel inside it to avoid any harm to the parent server!
    2. [Pre-start hooks] Used to execute code that prepares the worker. Mostly this will be DB and websocket connections, along with clients like Shopify, so that everything required is ready before the worker runs
    3. Hooks/Events (see the sketch after this list)
      1. [Before Work] Picks X jobs based on concurrency, runs some logic, and returns an array. The length of that array determines how many workers are spawned (if five jobs were picked because of a concurrency of 5 and the new array has 7 entries, a total of seven workers will be created)
      2. [After Work] Perform clean up if required.
      3. Other generic events like job success, fail, etc.
      4. Batch specific events
        1. Before Batch Start
        2. Before Item Start
        3. On Item Success
        4. On Item Fail
        5. After Item End
        6. After Batch End
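
    To illustrate these hooks, here is a sketch of how pre-start and work hooks might be wired up. The queue.on registration style, the event names, and all helper functions are assumptions; only queue.process appears elsewhere in this document.

       const { Queue } = require('queuify'); // import path assumed from the repository name
       const queue = new Queue('report-queue');

       // Pre-start hook (hypothetical event name): prepare DB and websocket
       // connections and clients before the worker starts processing.
       queue.on('worker:pre-start', async () => {
         await connectDb(); // hypothetical helper
       });

       // Before Work (hypothetical event name): receives the jobs picked per
       // concurrency; the length of the returned array decides how many workers run.
       queue.on('before-work', async (jobs) => expandJobs(jobs)); // hypothetical helper

       // After Work (hypothetical event name): perform cleanup if required.
       queue.on('after-work', async () => closeConnections()); // hypothetical helper

       await queue.process(async (job) => generateReport(job)); // hypothetical processor
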
  • Batching

    1. A dedicated method to schedule multiple jobs (an array) in a queue, with the additional hooks mentioned above (see the sketch after this list)
    2. You can provide a unique id, or it'll return a newly generated one
    3. Option to delay the execution to wait for more items, so items can be added later on
    4. Once all items are added, or the batch is set to start immediately, the batch starts
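
    A sketch of what the batching method and its events might look like. The method names addBatch and batch.add, their options, and the event names are assumptions drawn from this list, not a confirmed API.

       const { Queue } = require('queuify'); // import path assumed from the repository name
       const queue = new Queue('batch-queue');

       const jobs = [{ data: 1 }, { data: 2 }]; // payload shape is illustrative
       const batch = await queue.addBatch(jobs, {
         id: 'nightly-sync', // optional unique id; otherwise a newly generated id is returned
         delay: 30000, // wait (ms) for more items before starting
       });

       batch.add({ data: 3 }); // hypothetical: items can still be added while the batch is delayed

       batch.on('batch:start', () => {}); // Before Batch Start
       batch.on('item:start', (item) => {}); // Before Item Start
       batch.on('item:success', (item) => {}); // On Item Success
       batch.on('item:fail', (item, err) => {}); // On Item Fail
       batch.on('batch:end', () => {}); // After Batch End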

Notes

  • There can be a few limitations due to the nature of Queuify's design; below are a few examples.
    • Queuify is promise-based, meaning it expects async tasks to be awaited in order to work perfectly!
    • When adding workers to a queue via queue.process, one must await the response. The Queuify engine starts the worker pool when it receives the first worker for a given queue; if other workers get added during that window, there's a high chance those workers will never be used for processing, because they were never added to the worker pool!
      • For example, use the second snippet instead of the first.
         const queue = new Queue('test-queue');
         const workerFunc = (job) => console.log(job.id);
         const workerFunc2 = (job) => console.log(job.id);
         queue.process(workerFunc);
         // This causes a problem: the `.process` call above has not finished yet,
         // so the worker pool is likely not started and the second worker will most likely never get any jobs!
         queue.process(workerFunc2);
        Instead of the above, do the following!
         const queue = new Queue('test-queue');
         const workerFunc = (job) => console.log(job.id);
         const workerFunc2 = (job) => console.log(job.id);
         // Awaiting makes sure the second worker is only added after the worker pool is set up!
         await queue.process(workerFunc);
         await queue.process(workerFunc2);

Feature Roadmap

Queuify

  • Worker Type
    • Embedded
    • Sandbox
    • Hybrid
  • Max workers
    • Default 0 (infinite)
  • Max concurrency
    • Default 100
  • Batch concurrency
    • Defaults to Max Concurrency, or 1 if max is infinite!
    • Only needed when batches run in parallel
  • Run Batch in Parallel
    • Whether to hold off other jobs during batch execution!
    • On enabling, total queue concurrency will be Max concurrency + Batch concurrency.
  • Max execution time
    • If a sandboxed worker doesn't finish in the given time, the process will be killed
    • Useful to stop memory leaks; make sure it's roughly 10x longer than you expect each job to take!
  • Connection options
    • DB
      • Redis
      • DragonFly
    • DB Options
      • Underlying connection options for ioredis

For each queue

  • Worker Type
    • Embedded
    • Sandbox
    • Hybrid
  • Max concurrency
    • Default 100
    • Takes priority over the global setting
  • Batch concurrency
    • Defaults to Max Concurrency, or 1 if max is infinite!
    • Takes priority over the global setting
    • Only needed when batches run in parallel
  • Run Batch in Parallel
    • Whether to hold off other jobs during batch execution!
    • Takes priority over the global setting
    • On enabling, total queue concurrency will be Max concurrency + Batch concurrency.
  • Max execution time
    • If a sandboxed worker doesn't finish in the given time, the process will be killed
    • Takes priority over the global setting
    • Useful to stop memory leaks; make sure it's roughly 10x longer than you expect each job to take!
  • Heap limit (in MB)
    • For Sandbox
      • The limit gets divided per concurrency
      • 2048 MB with a concurrency of 2 will allocate 1024 MB for each sandboxed processor
    • For Hybrid
      • Creates one sandboxed processor with the heap limit and processes jobs in parallel as per concurrency!

Workers

  • Three types of workers
    • Embedded
      • Works on the same server where the processor is added, so the more jobs run in parallel, the more heap memory it consumes
    • Sandbox
      • Spawns a new Node process for the jobs; more scalable, and keeps the parent server clean of heap issues!
    • Hybrid
      • A combination of the above two; creates one process and runs everything in parallel inside it to avoid any harm to the parent server!
  • Pre-start hooks
    • Used to execute code that prepares the worker
      • Mostly this will be DB and websocket connections, along with clients like Shopify, so that everything required is ready before the worker runs.
  • Hooks & Events
    • Before Work
      • Picks X jobs based on concurrency, runs some logic, and returns an array.
      • The length of that array determines how many workers are spawned (if five jobs were picked because of a concurrency of 5 and the new array has 7 entries, a total of seven workers will be created)
    • After Work
      • Perform cleanup if required.
    • Other generic events
      • Job Add
      • Job Start
      • Job Success
      • Job Fail
    • Batch specific events
      • Before Batch Start
      • Before Item Start
      • On Item Success
      • On Item Fail
      • After Item End
      • After Batch End

Batching

  • A dedicated method to schedule multiple jobs (an array) in a queue, with the additional hooks mentioned above
  • You can provide a unique id, or it'll return a newly generated one
  • Option to delay the execution to wait for more items, so items can be added later on
  • Once all items are added, or the batch is set to start immediately, the batch starts

Contributing

The Queuify project welcomes all constructive contributions. Contributions take many forms, from code for bug fixes and enhancements, to additions and fixes to documentation, additional tests, triaging incoming pull requests and issues, and more!

See the Contributing Guide for more technical details on contributing.

Security Issues

If you discover a security vulnerability in Queuify, please see Security Policies and Procedures.
