Data Analytics Engineer

Activision Blizzard, Inc.
Santa Monica, CA
Onsite

About The Position

The Data Analytics Engineer role within Marketing Technology at Activision is for an excellent communicator who can transform ambiguous business problems into well-designed technical solutions. The position involves building high-scale, cloud-native backend systems that power marketing, profiling, and reporting across Activision's iconic gaming portfolio, with a focus on personalizing the player experience. The team emphasizes building reliable, secure, optimized, and maintainable applications using state-of-the-art technologies.

This experienced data engineer will join a diverse team to build and evolve data products and services that understand and activate players across Activision titles, enabling timely in-game and out-of-game communication. The team operates with a problem-solving mindset, constantly reviewing processes, learning, and innovating with new technology.

Activision Blizzard is the world's largest interactive entertainment company, home to beloved franchises like Call of Duty®, Crash Bandicoot™, and Diablo®. The company aims to deliver unrivaled gaming experiences and is committed to creating an inclusive company culture. It has over 10,000 global employees and has been recognized as one of FORTUNE's "100 Best Companies To Work For®". This role is based in the Santa Monica, CA office and follows an onsite work schedule of four days per week.

Requirements

  • BA/BS degree in Computer Science or related technical field (or equivalent practical experience).
  • 3–5+ years of professional experience building and maintaining production-grade data pipelines or backend data applications.
  • Strong proficiency in Python and SQL, including clean architecture, testing practices, and solid engineering fundamentals.
  • Hands-on experience with Databricks (Spark, Delta Lake, job orchestration, performance tuning, and best practices).
  • Understanding of distributed processing systems and designing for scale, reliability, and cost efficiency.
  • Experience working effectively in ambiguous problem spaces with cross-functional stakeholders and evolving requirements.
  • Able to clearly explain technical designs and tradeoffs to both technical and non-technical audiences.

Nice To Haves

  • Experience with cloud platforms (AWS, GCP, or Azure), including storage, compute, networking, and IAM/security concepts.
  • Experience in modeling and working with marketing and player behavioral data.
  • Experience with Spark Structured Streaming and streaming design patterns (watermarking, late data handling, idempotency, replay strategies).
  • Experience with distributed messaging systems such as Kafka and/or Pub/Sub (including schema evolution, ordering guarantees, retries, and dead-letter queues).
  • Experience building identity resolution, attribution, or cross-platform player profile systems.
  • Experience with Docker and containerized workloads (Kubernetes a plus).
  • Familiarity with Looker/LookML and semantic modeling layers.
  • Experience with data governance concepts, including data contracts, cataloging, lineage, and access controls.
  • Experience scaling internal data products and services to stakeholder teams.
  • Track record applying AI-accelerated engineering practices.

Responsibilities

  • Design and implement scalable data pipelines and backend data services using Databricks (Spark), Python, and SQL with a strong emphasis on reliability, observability, security, and cost/performance optimization.
  • Build event-driven data applications and services that consume, transform, and publish player events and signals for downstream activation and analytics (e.g., player profiling, marketing triggers, identity resolution, reporting signals).
  • Design both batch and streaming systems with strong guarantees around data correctness, idempotency, replayability, and schema evolution.
  • Translate ambiguous business needs into technical designs, engineering plans, and measurable outcomes; clearly articulate tradeoffs and architectural decisions.
  • Collaborate with stakeholders across Marketing, Analytics, Engineering, Product, and Legal to ensure solutions are aligned, secure, privacy-compliant, and fit-for-purpose.
  • Apply strong engineering judgment: select appropriate architectural patterns, enforce data contracts, document decisions, and build for long-term maintainability and scalability.
  • Improve and uphold team standards across code quality, testing practices, CI/CD pipelines, observability, monitoring, data governance, and shared libraries.
  • Innovate thoughtfully with AI-enabled engineering tools to accelerate development while maintaining high quality, correctness, and security.
  • Work under general guidance from senior engineers or technical leads on complex cross-system and architectural decisions.

Benefits

  • Medical, dental, vision, health savings account or health reimbursement account, healthcare spending accounts, dependent care spending accounts, life and AD&D insurance, disability insurance
  • 401(k) with Company match, tuition reimbursement, charitable donation matching
  • Paid holidays and vacation, paid sick time, floating holidays, compassion and bereavement leaves, parental leave
  • Mental health & wellbeing programs, fitness programs, free and discounted games, and a variety of other voluntary benefit programs like supplemental life & disability, legal service, ID protection, rental insurance, and others
  • Relocation assistance (if the Company requires that you move geographic locations for the job)