About The Position

Azumo is seeking a highly motivated Big Data Engineer to develop and enhance data and analytics infrastructure. This position is FULLY REMOTE and based in Latin America. The role focuses on building scalable, reliable, and well-governed data systems that power analytics, live operations, player insights, publishing intelligence, and business decision-making across the gaming ecosystem. The ideal candidate will have strong hands-on experience with Databricks, dbt, AWS, scalable ETL pipelines, data governance, and modern analytics engineering practices. Experience supporting gaming, telemetry, live operations, or high-scale analytics environments is strongly preferred.

Requirements

  • Strong experience with Databricks
  • Strong SQL and Python programming experience
  • Experience building scalable ETL/ELT pipelines
  • Experience with AWS data ecosystem and cloud-native architectures
  • Experience with dbt, Apache Airflow, or similar transformation and orchestration tools
  • Strong understanding of data modeling and analytics engineering
  • Experience working with large-scale structured and semi-structured datasets
  • Experience implementing data governance and data quality practices
  • Experience optimizing pipeline performance and scalability
  • Strong communication and collaboration skills
  • Ability to work independently within distributed engineering teams
  • BS or Master’s degree in Computer Science or a related field, or equivalent experience
  • 5+ years of experience in data engineering and data management
  • Deep expertise in designing and building data warehouses and big data analytics systems
  • Practical experience manipulating, analyzing, and visualizing data
  • Self-driven and motivated, with a strong work ethic and a passion for problem solving
  • Professional English proficiency (B2/C1)

Nice To Haves

  • Gaming industry experience
  • LiveOps or telemetry pipeline experience
  • Experience supporting analytics for player behavior, engagement, monetization, or publishing systems
  • Experience with real-time or streaming data systems
  • Experience with viewer analytics or esports data ecosystems
  • Experience with KPI reporting and operational dashboards
  • Familiarity with cost optimization and cloud infrastructure analytics
  • Experience working in fast-paced product engineering organizations

Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines using Databricks and AWS
  • Improve ingestion pipeline quality, reliability, scalability, and governance
  • Develop and optimize core data models and foundational data tables
  • Build analytics-ready datasets to support player insights, publishing analytics, esports analytics, and operational reporting
  • Implement data governance, data quality, lineage, and observability practices
  • Collaborate with product, analytics, engineering, and business stakeholders to support data-driven decision-making
  • Optimize large-scale data processing workflows for performance and cost efficiency
  • Support centralized player data models, viewer analytics, publishing activity systems, and operational metrics
  • Contribute to the unification of fragmented data ecosystems across multiple game teams and organizations
  • Build and maintain reliable orchestration workflows and scheduling systems
  • Participate in architectural discussions around scalability, governance, and data platform modernization

Benefits

  • Paid Time Off
  • Mentored Career Development
  • U.S. Holidays
  • USD Remuneration
  • Profit Sharing
  • Maternity Coverage