Data Engineer

Crocs, Inc. | Westminster, CO
$110,000 - $125,000 | Hybrid

About The Position

At Crocs, Inc., every career offers a chance to make a real impact. No two journeys look the same, and that's exactly how we like it. Whether you're welcoming customers into our stores, collaborating with global teams at our headquarters, or keeping operations moving at our distribution centers, your impact is real and valued. At Crocs, Inc., you're not expected to fit a mold. You're encouraged to break it and create something better.

Overview

Crocs is seeking an experienced Data Engineer to design, implement, and maintain scalable data pipelines, decoupled data infrastructure, dynamic transformations, and a comprehensive orchestration layer using a combination of Snowflake, DBT, Airflow, Azure Data Lake, GitHub, and other tools as needed, including various AI models. In this role, you will solve unique and complex problems at a rapid pace, using the latest technologies to create highly scalable solutions. As part of the Enterprise Data Analytics team, you will help advance the adoption of data-driven insights and advanced AI analytics across multiple business domains within the Crocs enterprise.

Requirements

  • Bachelor’s degree in computer science, information technology, engineering, or an equivalent technical field.
  • 2+ years in a Data Engineering or Software Development role.
  • Strong proficiency in SQL, DBT, Python, Snowflake, Airflow, and Git.
  • Experience designing data models following Kimball dimensional modeling best practices.
  • Experience working with modern ETL/ELT tools, such as Databricks.
  • Prior experience working in a cloud platform (Azure preferred).

Nice To Haves

  • Knowledge of Astronomer and Apache NiFi
  • Experience with REST APIs and Streamlit
  • Experience in distributed computing

Responsibilities

  • Data Modeling – Design and implement scalable, high-quality data models that support the Enterprise Data Platform (EDP) leveraging best practices while aligning with business and technical constraints.
  • ELT – Design and implement efficient, scalable, and maintainable data pipelines supporting both batch and near-real-time data processing.
  • CI/CD – Automate code integration, testing, and deployment using tools such as GitHub Actions to enable fast, reliable, and consistent delivery of data pipelines.
  • Engineering Best Practices – Apply modern engineering standards, including AI-assisted development, comprehensive end-to-end testing, agile methodologies, and robust CI/CD practices.
  • Troubleshooting – Proactively identify and resolve data processing, data quality, and performance issues in a timely and effective manner.
  • Continuous Automation – Automate wherever possible to streamline, optimize, and scale data engineering workflows.
  • Continuous Learning – Stay current with advancements in data, analytics, and AI/ML, actively building skills that enhance productivity and product quality. Demonstrate curiosity, adaptability, and a forward-thinking mindset.

Benefits

  • This position is eligible to participate in a company incentive program.
  • This position is eligible for company benefits including but not limited to medical, dental, and vision coverage, life and AD&D, short and long-term disability coverage, paid time off, employee assistance, participation in a 401k program that includes company match, and many other additional voluntary benefits.