Data Engineer

Ibotta · Denver, CO
Posted 1d · $110,000 - $126,000 · Hybrid

About The Position

Ibotta is looking for a software-focused Data Engineer to join our team and contribute to our mission to Make Every Purchase Rewarding. As a leader in the Data Platform Organization, you will accelerate development of our cutting-edge data platform, working with both engineering and analytics to develop and own stable, scalable, and approachable data platforms. We're looking for a self-motivated engineer with a passion for enabling data mesh concepts while heavily leveraging AWS cloud and Databricks Lakehouse technologies. The data engineering team is central to delivering and maintaining our modern data, analytics, and decisioning platforms across Ibotta. This is a hybrid position located in Denver, Colorado, requiring 3 days in office (Tuesday, Wednesday, and Thursday). Candidates must live in the United States. Not based in Denver? We will offer a relocation bonus to help make your move to the Mile High City a smooth one.

Requirements

  • 3+ years of software development experience (Python preferred), with a specific focus on building data engineering frameworks and automation tools.
  • Proven track record of designing and deploying end-to-end data pipelines within Databricks or similar platforms.
  • Experience leading technical projects through the entire software development lifecycle (SDLC), from initial concept to production support.
  • Proficiency in using AI-assisted coding platforms (Claude, Copilot, Cursor, etc.) to improve development velocity and solve complex logic problems.
  • Strong SQL and Python abilities with platforms like Databricks or similar.
  • Bachelor’s degree in Computer Science, Engineering, or a related field.

Nice To Haves

  • Experience with the following is a strong plus:
  • AWS Cloud Services (e.g., EC2, S3)
  • Experience with Scala and Spark
  • Experience with Delta Lake, Apache Iceberg, or Apache Hudi
  • Message Brokers such as Kafka or Kinesis
  • ETL tools and processes (Airflow or other similar tools)
  • Infrastructure as code using Terraform, CloudFormation, etc.
  • Experience building APIs and libraries
  • Agile (Kanban or Scrum) development experience

Responsibilities

  • Initiative Leadership: Own small-to-medium sized technical initiatives from ideation and architectural design to full implementation, ensuring solutions are scalable and maintainable.
  • Automation & Frameworks: Design, build, and maintain internal automation tools and software frameworks that standardize data ingestion, transformation, and quality checks across the organization.
  • E2E Pipeline Ownership: Develop and manage end-to-end (E2E) data pipelines that are robust, self-healing, and compliant with Data Governance and Security standards.
  • AI-Augmented Development: Proactively leverage AI tools (e.g., Claude, GitHub Copilot, Cursor) to accelerate framework development, optimize Spark queries, and maintain high code quality.
  • Platform Evolution: Work with cross-functional teams to enable self-service data access patterns, advocating for "Data as a Product" and infrastructure-as-code.
  • Operational Excellence: Perform root cause analysis for critical outages and implement systematic fixes to prevent recurrence. Provide rotational on-call support.
  • Assist with documentation of the environments and data tooling that support our products.
  • Embrace and uphold Ibotta’s Core Values: Integrity, Boldness, Ownership, Teamwork, Transparency, & A good idea can come from anywhere.

Benefits

  • Competitive pay
  • Flexible time off
  • Benefits package (including medical, dental, and vision)
  • Employee Stock Purchase Program
  • 401(k) match
  • Paid parking
  • Snacks and occasional meals
  • Relocation bonus