Data Engineer

RAPP | Chicago, IL
$81,000 - $97,000 | Hybrid

About The Position

RAPP Chicago is looking for a Data Engineer to join our award-winning Technology team.

WHO WE ARE: We are RAPP – world leaders in activating growth with precision and empathy at scale. As a global, next-generation precision marketing agency, we leverage data, creativity, technology, and empathy to foster client growth. We champion individuality in the marketing solutions we create and in our workplace. We fight for solutions that adapt to the individual’s needs, beliefs, behaviors, and aspirations, and we foster an inclusive workplace that emphasizes personal well-being.

HOW WE DO IT: At RAPP, our fearless superconnectors help create value from personal brand experiences by focusing on three key areas: connected data, connected content, and connected decisioning. Our data analysts identify who that person is, our strategists understand what they want, and our award-winning technologists and creatives know how to deliver it – ensuring we can activate authentic customer connections for our clients. Part of Omnicom’s Precision Marketing Group, RAPP comprises 2,000+ creatives, technologists, strategists, and data and marketing scientists across 15+ global markets.

YOUR ROLE: We are looking for a Data Engineer who is eager to learn and grow while contributing to the development of scalable, cloud-native data pipelines and platforms. The ideal candidate has foundational knowledge of Python and an interest in building data workflows with modern technologies such as Apache Airflow, AWS Lambda, DynamoDB, and dbt. You should have a strong curiosity about how data systems operate, a willingness to learn data engineering best practices, and the motivation to support advanced analytics as you gain hands-on experience.

Requirements

  • 1–3 years of experience in data engineering, software engineering, or a related role.
  • Proficiency in Python for data engineering and automation.
  • Familiarity with Apache Airflow for workflow orchestration.
  • Basic understanding of AWS Lambda and serverless design patterns.
  • Exposure to DynamoDB (schema design and performance considerations); see the sketch after this list.
  • Knowledge of dbt for data transformation and analytics modeling.
  • Experience working in cloud environments (AWS preferred).
  • Understanding of CI/CD workflows, Git, and DevOps practices.
  • Strong analytical, problem-solving, and communication skills.
  • Exceptional attention to detail and organizational skills.
  • Strong written and verbal communication skills, with the ability to explain complex metadata systems to non-technical users.
  • Ability to work collaboratively and cross-functionally with creative, marketing, and IT teams.
  • Proactive problem-solver who can identify issues and suggest improvements.
  • Time management skills with the ability to prioritize and manage multiple tasks in a fast-paced environment.
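
As a point of reference for the DynamoDB item above, here is a minimal boto3 sketch of the common single-table key design that "schema design and performance considerations" usually refers to. The table name, key names, and item attributes are illustrative placeholders, not part of the role's actual systems.

    import boto3
    from boto3.dynamodb.conditions import Key

    dynamodb = boto3.resource("dynamodb")

    # A generic partition key (pk) and sort key (sk) keep all items for one
    # customer in the same partition so they can be read together efficiently.
    table = dynamodb.create_table(
        TableName="customer_events",                      # hypothetical table
        KeySchema=[
            {"AttributeName": "pk", "KeyType": "HASH"},   # partition key
            {"AttributeName": "sk", "KeyType": "RANGE"},  # sort key
        ],
        AttributeDefinitions=[
            {"AttributeName": "pk", "AttributeType": "S"},
            {"AttributeName": "sk", "AttributeType": "S"},
        ],
        BillingMode="PAY_PER_REQUEST",  # no capacity planning; suits spiky workloads
    )
    table.wait_until_exists()

    # Writes and reads address a single partition, which keeps queries predictable.
    table.put_item(Item={"pk": "CUSTOMER#123", "sk": "EVENT#2024-05-01T12:00:00Z", "type": "email_open"})
    events = table.query(KeyConditionExpression=Key("pk").eq("CUSTOMER#123"))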

Nice To Haves

  • Experience with other AWS services (S3, Glue, Redshift, Kinesis).
  • Familiarity with data warehouse and data lake architectures.
  • Exposure to real-time streaming and event-driven data pipelines.
  • Knowledge of containerization (Docker, Kubernetes).

Responsibilities

  • Design, build, and maintain robust ETL/ELT pipelines using Python and Airflow (see the sketch after this list).
  • Develop serverless workflows leveraging AWS Lambda for scalable event-driven data processing.
  • Implement and optimize dbt models for analytics and transformations.
  • Design schemas and manage data in DynamoDB and other cloud-native storage solutions.
  • Ensure high availability, scalability, and performance of data systems.
  • Integrate structured, semi-structured, and unstructured data sources.
  • Build workflow orchestration strategies using Airflow for scheduling and monitoring pipelines.
  • Automate infrastructure deployment and CI/CD pipelines for data services.
  • Implement data validation, testing, and monitoring frameworks.
  • Ensure compliance with security, privacy, and governance standards.
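
A minimal sketch of the kind of Airflow pipeline these responsibilities describe, assuming Airflow 2.x. The DAG name, tasks, schedule, and dbt project path are illustrative placeholders only.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def extract_orders():
        # Placeholder: pull raw records from a source system into staging storage.
        print("extracting raw order data")


    def load_to_warehouse():
        # Placeholder: load the staged records into the analytics warehouse.
        print("loading staged data into the warehouse")


    with DAG(
        dag_id="orders_elt",              # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                # Airflow handles scheduling and retries
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
        # Run dbt transformations once raw data has landed (assumes a dbt project at /opt/dbt).
        transform = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")

        # Task dependencies give Airflow the DAG it schedules and monitors.
        extract >> load >> transform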

Benefits

  • Health, vision, and dental insurance
  • 401(k)
  • Stock options
  • Healthcare & Dependent Flexible Spending Accounts
  • Vacation, sick, and personal days, and positive activism days
  • Paid parental leave and disability benefits