IT Data Platform Engineer 2

Evergen
Remote, US

About The Position

We are looking for a hands-on Data Platform Engineer to own, operate, and evolve our modern cloud data stack. You will be the primary technical owner of our data infrastructure — responsible for keeping data flowing reliably from source systems into Snowflake and ensuring clean, trusted data reaches our business teams through Power BI. This is a high-impact role on a small, focused team where your work will be directly visible to the business.

Requirements

  • 3 to 5 years of experience in data engineering, analytics engineering, or a closely related role
  • Hands-on experience with dbt (Core or Cloud) including incremental models, tests, macros, and documentation
  • Proficiency with Snowflake including schema design, query optimization, warehouses, and role-based access control
  • Experience with a managed ingestion tool such as Fivetran or dlt, including connector configuration and monitoring
  • Strong SQL skills with the ability to write and debug complex analytical queries
  • Familiarity with ELT pipeline patterns and medallion-style data warehouse architecture
  • Experience troubleshooting pipeline failures independently and communicating issues clearly to non-technical stakeholders
  • Comfort working autonomously in a small team environment with limited oversight
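
Several of the requirements above (incremental dbt models, watermark-style loads, strong SQL) come together in a typical dbt incremental model. A minimal sketch, with hypothetical source and column names:

```sql
-- models/staging/stg_orders.sql (hypothetical model)
-- Incremental merge keyed on order_id, with an updated_at watermark
-- so each run only scans rows changed since the last load.
{{ config(
    materialized='incremental',
    unique_key='order_id',
    incremental_strategy='merge'
) }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ source('erp', 'orders') }}

{% if is_incremental() %}
  -- Only pull rows newer than the current high-water mark in this table.
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On Snowflake, the `merge` strategy compiles to a `MERGE` statement, so late-arriving updates to existing rows are applied in place rather than duplicated.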

Nice To Haves

  • Experience connecting to on-premises source systems (SQL Server, SAP HANA, Oracle) via ODBC or CDC tooling
  • Familiarity with ERP financial data in NetSuite, SAP, or similar, particularly GL structures and chart of accounts
  • Exposure to Power BI including dataset refresh management and understanding of how semantic models consume warehouse data
  • Experience with Git-based workflows and basic CI/CD practices for data projects
  • Prior involvement in an EDW build or dimensional modeling project (star schema, slowly changing dimensions)

Responsibilities

  • Own and manage Fivetran connectors across all source systems including NetSuite, HubSpot, ADP, SQL Server, SAP HANA, and SharePoint
  • Configure and monitor sync schedules, column exclusions, and incremental load strategies to balance cost and reliability
  • Troubleshoot connector failures and proactively manage schema drift from upstream sources
  • Maintain and extend our dbt project across three layers: staging (L1), core dimensions and facts (L2), and business-ready marts (L3)
  • Write and optimize SQL models using incremental merge strategies and watermark patterns
  • Author and maintain dbt tests, model documentation, and source freshness checks to ensure data quality
  • Support the buildout of our finance EDW including GL activity, planning data from Workday Adaptive, and NetSuite financials
  • Manage end-to-end pipeline scheduling and monitoring, ensuring daily refreshes complete reliably before business hours
  • Maintain the integration between Fivetran, dbt, and Power BI dataset refresh triggers
  • Build and maintain alerting so pipeline failures are caught and communicated before the business is impacted
  • Manage Snowflake environments including databases, schemas, roles, warehouses, and cost controls
  • Implement and maintain access controls and role-based permissions across the data platform
  • Contribute to data catalog and lineage documentation to support a growing team and reduce knowledge concentration risk
  • Partner with Finance, Sales, and Operations teams to understand reporting requirements and translate them into reliable data models
  • Support and mentor the junior member of the data team as they develop their skills
  • Work closely with the incoming NetSuite implementation team to ensure clean data integration into the warehouse
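
The Snowflake environment and access-control responsibilities above typically reduce to a small set of role grants. A minimal sketch, assuming hypothetical database, warehouse, and role names:

```sql
-- Hypothetical role-based access setup for a reporting layer.
create role if not exists reporter;

-- Read-only access to the business-ready marts (L3) only.
grant usage on warehouse reporting_wh to role reporter;
grant usage on database analytics to role reporter;
grant usage on schema analytics.marts to role reporter;
grant select on all tables in schema analytics.marts to role reporter;
grant select on future tables in schema analytics.marts to role reporter;

-- Keep compute costs bounded for ad-hoc use.
alter warehouse reporting_wh set auto_suspend = 60;
```

Granting on `future tables` keeps newly built dbt models readable without re-running grants after each deployment.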

Additional Information

  • Visa sponsorship is not available for this role, now or in the future