High Throughput Data Processing with GenStage

Ian Butler


Data processing is part of many technical disciplines, and as the amount of data you need to process grows, so must the techniques you use to ingest it safely and quickly. Join us as we walk through the building blocks Elixir offers for constructing ingestion topologies that handle backpressure, partitioning, retries, and batching.
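To give a flavor of what the talk covers, here is a minimal sketch of GenStage's core idea: a demand-driven producer/consumer pair where the consumer's demand propagates upstream, giving backpressure for free. The module names (`Counter`, `Printer`) and the `max_demand` value are illustrative choices, not anything prescribed by the library.

```elixir
defmodule Counter do
  use GenStage

  def start_link(initial) do
    GenStage.start_link(__MODULE__, initial, name: __MODULE__)
  end

  def init(counter), do: {:producer, counter}

  # Events are only emitted when a downstream stage asks for them,
  # so a slow consumer naturally throttles the producer.
  def handle_demand(demand, counter) when demand > 0 do
    events = Enum.to_list(counter..(counter + demand - 1))
    {:noreply, events, counter + demand}
  end
end

defmodule Printer do
  use GenStage

  def start_link(_) do
    GenStage.start_link(__MODULE__, :ok)
  end

  def init(:ok) do
    # max_demand caps how many events are in flight at once.
    {:consumer, :ok, subscribe_to: [{Counter, max_demand: 10}]}
  end

  def handle_events(events, _from, state) do
    Enum.each(events, &IO.inspect/1)
    {:noreply, [], state}
  end
end
```

The same subscription mechanism extends to multi-stage pipelines, where partitioning and batching are layered on top of this demand protocol.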

Talk objectives:

  • Make listeners aware of tools like GenStage
  • Show how GenStage can be used to build high-throughput data processing systems with features like backpressure and batching
  • Explain why Elixir is uniquely suited to this type of data processing system
  • Provide examples of real-world use cases where this has been done (including ones I’ve built)

Target audience:

  • Engineers at companies that need fast, scalable, online data processing systems capable of high throughput with reasonable resource usage.
  • Data Engineers looking to dip into Elixir.


Big Data