Senior Data Engineer

RAPP | Chicago, IL
$90,000 - $99,000 | Hybrid | Posted 6h ago

About The Position

RAPP Chicago is looking for a Senior Data Engineer to join our award-winning Technology team.

Who We Are

We are RAPP – world leaders in activating growth with precision and empathy at scale. As a global, next-generation precision marketing agency, we leverage data, creativity, technology, and empathy to foster client growth. We champion individuality in the marketing solutions we create and in our workplace, and we fight for solutions that adapt to each individual's needs, beliefs, behaviors, and aspirations. We foster an inclusive workplace that emphasizes personal well-being.

How We Do It

At RAPP, our fearless superconnectors create value from personal brand experiences by focusing on three key areas: connected data, connected content, and connected decisioning. Our data analysts identify who a person is, our strategists understand what they want, and our award-winning technologists and creatives know how to deliver it, ensuring we can activate authentic customer connections for our clients. Part of Omnicom's Precision Marketing Group, RAPP comprises 2,000+ creatives, technologists, strategists, and data and marketing scientists across 15+ global markets.

Your Role

We are looking for a Senior Data Engineer with deep expertise in building scalable, cloud-native data pipelines and platforms. The ideal candidate is highly skilled in Python, Apache Airflow, AWS Lambda, DynamoDB, and dbt, and has experience designing reliable data workflows that enable advanced analytics, reporting, and machine learning use cases. They will bring strong attention to detail, a passion for information management, and the ability to work collaboratively with creative teams to improve the efficiency and scalability of our asset workflows.
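
For a concrete flavor of that stack, here is a minimal, illustrative sketch of an Airflow DAG that chains a Python extract step into a dbt run (it assumes Airflow 2.4+ and the dbt CLI; the DAG, task, and project-dir names are hypothetical, not taken from this posting):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def extract_orders(**context):
        # Stub: a real pipeline would pull from an API, S3, or Kinesis
        # and stage raw records for transformation.
        print(f"extracting orders for {context['ds']}")


    with DAG(
        dag_id="orders_daily",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(
            task_id="extract_orders",
            python_callable=extract_orders,
        )

        # Transform in the warehouse once the raw data has landed.
        transform = BashOperator(
            task_id="run_dbt",
            bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
        )

        extract >> transform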

Requirements

  • 5–8+ years of experience in data engineering, software engineering, or a related role.
  • Strong expertise in Python for data engineering and automation.
  • Hands-on experience with Apache Airflow for orchestration.
  • Proficiency with AWS Lambda and serverless design patterns.
  • Solid experience with DynamoDB (schema design, performance tuning, scaling); a brief illustrative sketch follows this list.
  • Strong knowledge of dbt for transformation and analytics modeling.
  • Experience with cloud environments (AWS preferred).
  • Familiarity with CI/CD workflows, Git, and DevOps practices.
  • Exceptional attention to detail and organizational skills.
  • Strong written and verbal communication skills, with the ability to explain complex metadata systems to non-technical users.
  • Ability to work collaboratively and cross-functionally with creative, marketing, and IT teams.
  • Proactive problem-solver who can identify issues and suggest improvements.
  • Time management skills with the ability to prioritize and manage multiple tasks in a fast-paced environment.
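
As a sketch of the DynamoDB schema-design skills called out above (the table, key, and attribute names are hypothetical, chosen only to illustrate partition/sort-key thinking with boto3):

    import boto3

    dynamodb = boto3.client("dynamodb", region_name="us-east-1")

    dynamodb.create_table(
        TableName="customer_events",  # hypothetical table
        # Partition on customer_id so one customer's reads hit one partition;
        # sort on event_ts to support per-customer time-range queries.
        KeySchema=[
            {"AttributeName": "customer_id", "KeyType": "HASH"},
            {"AttributeName": "event_ts", "KeyType": "RANGE"},
        ],
        AttributeDefinitions=[
            {"AttributeName": "customer_id", "AttributeType": "S"},
            {"AttributeName": "event_ts", "AttributeType": "S"},
        ],
        # On-demand billing avoids capacity planning while traffic is spiky;
        # switch to provisioned capacity once load becomes predictable.
        BillingMode="PAY_PER_REQUEST",
    )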

Nice To Haves

  • Experience with other AWS services (S3, Glue, Redshift, Kinesis).
  • Familiarity with data warehouse and data lake architectures.
  • Exposure to real-time streaming and event-driven data pipelines.
  • Knowledge of containerization (Docker, Kubernetes).

Responsibilities

  • Design, build, and maintain robust ETL/ELT pipelines using Python and Airflow.
  • Develop serverless workflows leveraging AWS Lambda for scalable, event-driven data processing (see the sketch after this list).
  • Implement and optimize dbt models for analytics and transformations.
  • Design schemas and manage data in DynamoDB and other cloud-native storage solutions.
  • Ensure high availability, scalability, and performance of data systems.
  • Integrate structured, semi-structured, and unstructured data sources.
  • Build workflow orchestration strategies using Airflow for scheduling and monitoring pipelines.
  • Automate infrastructure deployment and CI/CD pipelines for data services.
  • Implement data validation, testing, and monitoring frameworks.
  • Ensure compliance with security, privacy, and governance standards.
  • Partner with analytics, product, and engineering teams to deliver reliable datasets.
  • Mentor junior engineers and enforce best practices in data engineering.
  • Actively contribute to improving team efficiency, scalability, and standards.
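
As a sketch of the serverless, event-driven pattern referenced above: an AWS Lambda handler triggered by an S3 upload that parses newline-delimited JSON and batch-writes the records to DynamoDB (the bucket wiring, table name, and payload shape are hypothetical assumptions):

    import json

    import boto3

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("customer_events")  # hypothetical table


    def handler(event, context):
        # S3 put notifications arrive as a list of records.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

            # batch_writer buffers puts and flushes them in chunks of 25 items.
            with table.batch_writer() as batch:
                for line in body.decode("utf-8").splitlines():
                    if line.strip():
                        batch.put_item(Item=json.loads(line))

        return {"statusCode": 200}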

Benefits

  • Health, vision, and dental insurance
  • 401(k)
  • Stock options
  • Healthcare and dependent-care flexible spending accounts
  • Vacation, sick, and personal days, plus positive activism days
  • Paid parental leave and disability benefits