Data Engineer

Industrial Electric Manufacturing | Jacksonville, FL
$130,000 - $170,000 | Remote

About The Position

The Data Engineer is a core builder on IEM’s data and analytics team. This role develops and maintains the data pipelines and transformation models that power Tableau dashboards and business decisions across the organization. Working within an established modern data stack (Fivetran, Snowflake, dbt, Tableau), the Data Engineer turns raw source data into reliable, well-tested, and well-documented analytics-ready datasets. This is a hands-on individual contributor role with real ownership of production data models and growing influence on engineering standards.

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, Data Science, or related field, or equivalent experience
  • 3–5 years of experience in data engineering, analytics engineering, or BI development with hands-on experience building production data pipelines
  • Strong SQL skills including CTEs, window functions, complex joins, and query optimization
  • Experience with Snowflake or similar cloud data warehouses
  • Working experience with dbt (data build tool) for building and testing data transformation workflows
  • Proficiency in Python for data processing, scripting, and API integrations
  • Experience with data integration platforms such as Fivetran or similar ELT tools
  • Familiarity with Tableau or similar BI tools and understanding of how data structure impacts dashboard performance
  • Comfortable with Git version control and modern development workflows including code review and CI/CD
  • Strong problem-solving skills with ability to debug data issues systematically
  • Clear written and verbal communication skills with ability to document work and explain technical concepts to non-technical stakeholders
  • Self-motivated with ability to work independently in a remote environment while collaborating across a distributed team
  • Experience using AI coding assistants (e.g., Claude, GitHub Copilot) and comfort directing AI agents to perform data engineering tasks such as writing SQL, generating dbt models, and debugging pipeline issues
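To make the SQL expectations above concrete, here is a toy sketch of the kind of query this role involves (a CTE plus a window function), run against an in-memory SQLite database standing in for Snowflake; the table, columns, and data are invented for illustration.

```python
import sqlite3

# In-memory SQLite stands in for Snowflake; schema and rows are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme',   100.0),
        (2, 'acme',   250.0),
        (3, 'globex',  75.0);
""")

# CTE + window function: each order with its customer's running total.
rows = conn.execute("""
    WITH ranked AS (
        SELECT
            order_id,
            customer,
            SUM(amount) OVER (
                PARTITION BY customer ORDER BY order_id
            ) AS running_total
        FROM orders
    )
    SELECT order_id, customer, running_total
    FROM ranked
    ORDER BY order_id;
""").fetchall()

for row in rows:
    print(row)
```

The same pattern (CTEs feeding windowed aggregates) carries over directly to Snowflake SQL and dbt models.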

Nice To Haves

  • Experience with manufacturing, construction, or project management data systems such as Procore, ERP platforms, or supply chain tools
  • Familiarity with dimensional modeling concepts (star schemas, fact/dimension tables)
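As a minimal sketch of the dimensional modeling concept mentioned above: one fact table joined to one dimension table, again using in-memory SQLite with invented names and data.

```python
import sqlite3

# Minimal star schema: one dimension table, one fact table (invented data).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (sale_id INTEGER, product_key INTEGER, qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'breaker'), (2, 'panel');
    INSERT INTO fact_sales  VALUES (10, 1, 5), (11, 1, 3), (12, 2, 2);
""")

# Typical star-schema query: aggregate the fact table,
# label results with dimension attributes.
totals = conn.execute("""
    SELECT p.name, SUM(f.qty) AS total_qty
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
    ORDER BY p.name;
""").fetchall()

print(totals)  # [('breaker', 8), ('panel', 2)]
```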

Responsibilities

  • Data Pipeline Development: Build and maintain ELT pipelines using Fivetran and custom integrations that ingest data from source systems including Procore, Salesforce, ERP platforms, and internal databases into Snowflake
  • dbt Transformation Models: Develop, test, and document dbt models that transform raw data into clean, reliable datasets for analytics and reporting across Finance, Production, Supply Chain, and Engineering
  • Data Modeling: Build dimensional models and staging layers following team conventions, ensuring data is structured for optimal Tableau dashboard performance and ad-hoc analysis
  • Data Quality: Write and maintain dbt tests, monitor data freshness, and investigate data quality issues when they arise, owning resolution through to root cause
  • Source System Integration: Work with APIs and data connectors to integrate new data sources, troubleshoot ingestion issues, and ensure reliable data flow into the warehouse
  • Documentation: Maintain clear documentation for data models, pipeline configurations, and business logic so the team can understand and extend your work
  • Collaboration: Partner with business stakeholders to understand data needs, clarify requirements, and deliver datasets that answer real operational questions
  • Performance: Monitor query performance and pipeline efficiency, identifying opportunities to optimize warehouse costs and model run times
  • Engineering Standards: Participate in code reviews, follow Git workflows and CI/CD practices, and contribute to improving the team’s development processes
  • Continuous Learning: Stay current with modern data stack tools and practices, bringing ideas for improvement back to the team
  • AI-Assisted Development: Use AI coding assistants and agent-based tools to accelerate pipeline development, code generation, testing, and documentation. Manage AI agents as part of your daily workflow to increase throughput and quality
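The Data Quality responsibility above maps to checks like dbt's `not_null`, `unique`, and source freshness tests. A hypothetical plain-Python sketch of what those checks assert (records, keys, and thresholds are invented for illustration):

```python
from datetime import datetime, timedelta, timezone

# Invented sample records, as if loaded from a warehouse table.
records = [
    {"id": 1, "loaded_at": datetime.now(timezone.utc)},
    {"id": 2, "loaded_at": datetime.now(timezone.utc) - timedelta(hours=2)},
]

def check_unique_not_null(rows, key):
    """Equivalent of dbt's not_null + unique tests on one column."""
    values = [r.get(key) for r in rows]
    return all(v is not None for v in values) and len(values) == len(set(values))

def check_freshness(rows, ts_key, max_age):
    """Equivalent of a source freshness check: newest row within max_age."""
    newest = max(r[ts_key] for r in rows)
    return datetime.now(timezone.utc) - newest <= max_age

print(check_unique_not_null(records, "id"))
print(check_freshness(records, "loaded_at", timedelta(hours=6)))
```

In practice these checks live as declarative tests in a dbt project rather than hand-rolled scripts; the sketch only shows the logic they encode.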