Data Engineer II

IDEXX
Westbrook, ME
$90,000 - $100,000 | Hybrid

About The Position

IT Data Engineering is responsible for designing and developing enterprise data collection, aggregation, and access systems for IDEXX's global business environment. We work with traditional data architectures such as enterprise data warehousing as well as cloud-based data centralization systems like data integration hubs. Our products are used to synchronize billions of data points between enterprise applications such as global ERP and CRM systems, collect and normalize reference lab results and clinical utilization data, and create easy-to-access data sources for enterprise analytics, sales enablement tools, and data science initiatives.

We are seeking a Data Engineer to join our centralized enterprise data engineering team. In this role, you will design, build, and maintain reliable, scalable data pipelines and data models that support analytics, reporting, and operational use cases across the organization. You will work with shared, enterprise-wide datasets and collaborate with data engineers, data analysts, and business partners to ensure data is accurate, accessible, well-governed, and easy to use.

Location: We are looking for candidates local to our HQ in Westbrook, Maine, where we require a minimum of 8 days per month on-site, with the flexibility of a hybrid schedule.

Requirements

  • Typically 2–5 years of experience in data engineering or a related role, or equivalent practical experience
  • On-call is required
  • Strong SQL skills and experience working with cloud data warehouses, especially Snowflake
  • Proficiency in Python for data pipeline development and orchestration
  • Familiarity with AWS services commonly used in data platforms, such as EC2, S3, and Lambda
  • Experience building and scheduling workflows with Apache Airflow
  • Experience working with relational databases such as MySQL and Oracle
  • Solid understanding of data warehousing concepts and relational data modeling
  • Experience supporting shared datasets in a centralized or enterprise data environment
  • Familiarity with Git and collaborative development practices

Nice To Haves

  • Experience working in enterprise or regulated environments, or with complex data ecosystems
  • Exposure to data governance, data quality frameworks, or metadata management
  • Experience optimizing performance and cost in Snowflake
  • Familiarity with CI/CD practices for data pipelines
  • Familiarity with Terraform for infrastructure-as-code (IaC)
  • Experience working in Agile development environments

Responsibilities

  • Design, build, and maintain ETL/ELT pipelines using SQL, Python, and Apache Airflow
  • Develop, optimize, and maintain data models and transformations in Snowflake
  • Integrate and manage data from source systems including MySQL and Oracle
  • Build shared enterprise datasets that support analytics, reporting, and operational workflows
  • Implement data quality checks and monitoring to ensure accuracy, completeness, and reliability
  • Monitor, troubleshoot, and improve pipeline performance, reliability, and cost efficiency
  • Contribute to data governance, documentation, naming conventions, and testing standards
  • Work in an established scrum team
  • Collaborate with technical and non-technical partners to understand data requirements and explain data solutions clearly
  • Support ongoing maintenance and continuous improvement of enterprise data platforms

Benefits

  • Base annual salary target: $90,000 - $100,000 (yes, we do have flexibility if needed)
  • Opportunity for annual cash bonus
  • Day-one health, dental, and vision benefits
  • 401(k) with 5% company match
  • Additional benefits including but not limited to financial support, pet insurance, mental health resources, volunteer paid days off, employee stock program, foundation donation matching, and much more!