Reach Financial · Posted about 10 hours ago
Full-time • Mid Level
Remote
251-500 employees

As a data engineer at Reach, you’ll work on our data platform modernization effort alongside our existing Data Engineering team. You’ll be a key contributor to our holistic data practice, implementing pipelines, models, transformations, and more as we complete the modernization of Reach’s legacy data stack and begin a new era of cohesive data design and management within the greater Engineering organization.

  • Develop and manage data pipelines, models, schemas, and data artifacts as a core member of our data engineering squad
  • Work alongside our Lead and Senior Data Engineers as well as peer Software and Platform Engineers to evolve and enhance existing best practices and infrastructure
  • Partner with data scientists, analysts, and stakeholders to understand data requirements and translate them into technical solutions
  • Stay up-to-date on the latest data engineering tools and technologies, and help evaluate new solutions to optimize our data ecosystem
  • Collaborate with Platform Engineers to ensure seamless integration and deployment of data pipelines via common infrastructure and CI/CD toolkits like Terraform and GitHub Actions
  • Monitor and troubleshoot data pipelines, proactively identifying and resolving issues
  • Develop and maintain documentation for data pipelines and processes
  • 3+ years of hands-on experience in data engineering, working with dbt (Core or Cloud) and Snowflake, plus familiarity with related systems like GitHub, Amazon RDS, and PostgreSQL
  • Extensive knowledge of SQL and NoSQL data stores
  • Production experience working with and managing a Snowflake deployment
  • Production experience using dbt Cloud or other source-driven transformation management tools
  • Experience with operational database management, including backup, recovery, tuning, data modeling, scaling
  • Must be self-motivated, focused, and highly committed to delivering value
  • Experience extracting information from SaaS systems like Salesforce using ETL tools like Fivetran
  • Experience with Infrastructure as Code and Database Change Management tooling
  • Experience preparing data for analysis in Tableau or other BI tools
  • Developer certifications (and/or real-world experience) in building clean data models that flow into Snowflake
  • Demonstrated experience with Python, Java, Scala, or similar languages
  • Familiarity with dbt coding conventions and the Medallion data architecture (i.e., Bronze, Silver, and Gold data layers)
  • Experience working in the Financial Services industry and familiarity with working in a SOC 2 and PCI-DSS environment
  • Remote First Culture with optional Hybrid opportunities
  • Healthcare, Life Insurance, 401k Match
  • Paid Time Off, Paid 12-week Parental Leave
  • Disability (short-term, long-term), Employee Assistance Program
  • Spending Accounts (Transit/Parking, Medical, Dependent Care)
  • Insurance Discounts (home, auto, pet)