About The Position

Dominium is helping tackle the affordable housing crisis, and we're looking for motivated candidates to join our team and help further our mission. With offices in Minneapolis, Atlanta, Dallas, and Phoenix, Dominium is one of the nation's most respected and innovative affordable housing development and management companies. We create quality, affordable homes and engage with our residents daily to build a strong sense of community and connectivity. Join us in making a difference in people's lives every day at a company where you can challenge yourself to develop both personally and professionally.

We are seeking a Senior Data Engineer to lead the design, development, and optimization of scalable, secure, and high-performance data pipelines. This individual will play a critical role in shaping Dominium's data infrastructure and platform, driving the adoption of modern data engineering best practices, and mentoring junior team members. You will be responsible for building advanced data pipelines, data models, and orchestration solutions that power business intelligence, analytics, and strategic decision-making across the organization.

Requirements

  • Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
  • 5+ years of professional experience in cloud-based data engineering, platform engineering, or a similar role; OR 9+ years of relevant professional experience in lieu of a Bachelor's degree.
  • Expert-level knowledge of Azure and Snowflake.
  • Hands-on expertise in Python and dbt, with strong system design experience.
  • Proficiency in SQL for complex queries, performance tuning, and data modeling.
  • Experience building data pipelines using APIs and SQL-based systems.
  • Strong understanding of Power BI and its integration with Snowflake.
  • Experience with Terraform and CI/CD pipelines.
  • Proactive and self-driven, with a focus on automation, scalability, and operational efficiency.
  • Strong analytical and problem-solving skills.
  • Clear and effective communication skills, with the ability to translate technical concepts to non-technical stakeholders.
  • Comfortable collaborating across engineering, analytics, and business teams.
  • Familiarity with project management methodologies such as Agile and Waterfall.

Responsibilities

  • Leads the design and implementation of enterprise-grade data pipelines that extract, load, and transform (ELT) data from diverse sources (SQL Server, APIs, SharePoint, etc.) into Snowflake.
  • Designs and refines scalable, maintainable data models using dbt to support efficient, high-performance analytics and reporting.
  • Drives data engineering best practices including data quality, observability, orchestration, and automation to ensure system resilience and accuracy.
  • Collaborates with stakeholders across departments to understand business requirements and translate them into scalable data solutions.
  • Provides technical leadership and mentorship to data management team members, fostering growth and knowledge-sharing.
  • Leads infrastructure-as-code efforts using Terraform, and supports CI/CD pipeline implementation for data deployments.
  • Serves as a subject matter expert for Power BI integration with Snowflake and assists in re-architecting legacy reports for performance and scalability.
  • Enforces compliance with data governance policies, including access control, lineage, data privacy, and security standards.
  • Stays abreast of industry trends and evaluates new technologies to continuously improve data engineering capabilities.
  • Additional projects as assigned.

Benefits

  • Competitive salary
  • Incentive bonus program
  • Training and development programs
  • Career growth opportunities
  • Community volunteer and outreach programs
  • Comprehensive benefits package including Medical, Dental, Life & Disability, Paid Time Off, 401(K), Flexible Spending Accounts, Employee Recognition & Wellness Programs