Analytics Engineer, Feature Engineering

Ibotta | Denver, CO
$113,000 - $132,000 | Hybrid

About The Position

Ibotta is seeking an Analytics Engineer, Feature Engineering to join our innovative Core Data & Analytics team and contribute to our mission to Make Every Purchase Rewarding. Our growing Core Data & Analytics team provides high-quality, on-time analytics products and services across all business areas and teams within Ibotta. We achieve this by offering expert guidance, sharing specialized knowledge, promoting data-driven solutions, and overseeing mission-critical data science projects that impact the entire organization. This is a hybrid position based in Denver, Colorado, requiring 3 days in office (Tuesday, Wednesday, and Thursday). Candidates must live in the United States. Not based in Denver? We will offer a relocation bonus to help make your move to the Mile High City a smooth one.

Requirements

  • 3+ years of practical work experience in data engineering, machine learning, or equivalent experience as an analytics engineer
  • Bachelor’s degree in Computer Science, Engineering, Analytics, or a related field required
  • Working knowledge and some practical experience with some or all of the following:
  • End-to-end analytics automation, data pipelines, ETL/ELT processes, and tools (AWS Glue, dbt, etc.)
  • The AWS ecosystem and cloud-based data warehouses and architecture
  • Airflow, Databricks, Git, Monte Carlo
  • Multiple languages and frameworks (Python, Scala, SQL, Spark, command line), with strong SQL highly preferred
  • Development in a modern BI/data visualization platform (Looker, Tableau, etc.)
  • Event-driven architectures and platforms are a strong plus
  • An ability to develop solutions by applying data quality principles
  • Ability to think creatively, provide thoughtful insights, and solve problems to answer business questions using data
  • Collaboration with SMEs to understand the business context of the data
  • Experience identifying and troubleshooting data anomalies and pipeline issues
  • Exposure to managing and updating cluster configurations to ensure workflow operation
  • Ownership of data throughout its lifecycle
  • Excellent oral and written communication skills

Responsibilities

  • Collaborate with data scientists to identify and extract relevant features from raw data
  • Preprocess and transform data to make it suitable for machine learning models
  • Create new features by combining, modifying, or aggregating existing features
  • Evaluate the quality of features and assess their impact on model performance
  • Monitor feature drift and update features over time to maintain model performance
  • Keep up with the latest advancements in feature engineering techniques and tools
  • Implement and utilize engineering best practices to deploy and maintain high-quality, curated data sets using Airflow, building automated alerting and anomaly detection into data flows to ensure data quality and integrity
  • Optimize pipeline performance for real-time or near-real-time model execution
  • Work across our full technology stack (Databricks, Spark, command line, Airflow, GitHub, Python, Monte Carlo, etc.) to develop and maintain these datasets
  • Manage new data requirements and develop solutions that minimize the creation of technical debt
  • Embrace and uphold Ibotta’s Core Values: Integrity, Boldness, Ownership, Teamwork, Transparency, & A good idea can come from anywhere

Benefits

  • Competitive pay
  • Flexible time off
  • Benefits package (including medical, dental, vision)
  • Employee Stock Purchase Program
  • 401(k) match
  • Paid parking
  • Snacks and occasional meals
  • Relocation bonus