Principal Data Engineer (Remote Work Eligible)

Dimensional Fund Advisors
Remote - Texas
Hybrid

About The Position

The Technology Department at Dimensional leverages the rapidly evolving state of the art to engineer scalable, innovative, and research-driven solutions that improve our clients' financial lives. Data Engineers at Dimensional participate in the design and development of data solutions across an array of domains, from Research and Investments to Sales and Marketing, collaboratively developing MVPs to test their ideas and iterating rapidly with constant feedback from users. Dimensional invests heavily in developer tools, platforms, paradigms, and experience, enabling teams to deliver modern solutions that contribute profoundly to our clients' success.

We are looking for a Principal Data Engineer to join our team and drive company success by helping Dimensional leverage its critical data. The most important qualifications are a passion for driving business value through quality data solutions and a strong enthusiasm for learning new technologies and approaches to solving complex business problems.

You may be a fit for this role if you:

  • Are open-minded, curious, and resourceful
  • Are passionate about and stay current with modern technologies
  • Solve problems systematically and transparently
  • Share ideas, solicit and integrate feedback, and design and solve collaboratively
  • Take a software engineering approach and demonstrate automation and security mindsets

What you might work on: As a Principal Data Engineer at Dimensional, you will have the opportunity to understand users' needs and solve problems spanning database and schema design, ETL pipeline engineering, APIs, and microservice design and engineering.

Requirements

  • 10+ years of professional software development experience, including a decade-long track record in data modeling, architecture, and data warehouse/lake engineering.
  • Expert-level proficiency with Snowflake, dbt, and Airflow.
  • Advanced experience with Python and JavaScript (or equivalent languages).
  • Deep hands-on experience with Relational and Non-Relational databases, including expert-level schema design and optimization.
  • Strong working knowledge of DevOps tools and CI/CD pipelines to support automated, reliable data workflows.
  • A deep understanding of cybersecurity best practices and a demonstrated ability to integrate them into development activities.
  • Bachelor’s degree in a technical field or equivalent practical experience.


Responsibilities

  • Architect and oversee the engineering of large-scale data warehouses and lakes, specifically utilizing Snowflake for high-concurrency and complex modeling needs.
  • Design and manage sophisticated data workflows and transformation layers using Airflow and dbt.
  • Write and review high-quality, production-ready code, primarily utilizing Python and JavaScript.
  • Drive the strategy for relational and non-relational database schemas, focusing on performance, scalability, and long-term maintainability.
  • Lead the implementation of robust DevOps practices and continuous delivery pipelines to streamline data deployment and operations.
  • Embed cybersecurity best practices throughout the engineering process, ensuring data protection and compliance by design.

Benefits

  • Comprehensive benefits
  • Educational initiatives
  • Special celebrations of our history, culture, and growth