Data Engineering Manager

AllianceBernstein · Nashville, TN

About The Position

AB (Private Alternatives) is seeking a VP of Data Engineering to lead our data engineering function within the Private Alternatives technology team. This leader will be responsible for building and maintaining the data infrastructure that powers investment decision-making across our CLO, structured credit, and direct lending portfolios. The role sits at the intersection of modern data stack technologies and alternative investment management, requiring someone who can translate complex financial data requirements into robust, scalable data pipelines.

The VP will manage a team of data engineers responsible for moving, transforming, and storing data from diverse sources including loan servicers, trustees, fund administrators, market data providers, and internal systems. This person will work closely with investment professionals, portfolio managers, and technology colleagues to ensure that data flows seamlessly from source systems through to analytics tools like Sigma, with Validio providing continuous data quality monitoring so that the data stored in Snowflake remains trustworthy at every stage of the pipeline.

Private credit data presents unique challenges: complex hierarchical structures spanning deals, tranches, and individual loans; semi-structured documents like indentures and credit agreements; irregular reporting cadences from counterparties; and the need to reconcile data across multiple authoritative sources. The ideal candidate will have experience navigating these complexities or will bring transferable experience from other domains involving complex financial instruments.

Requirements

  • 8 or more years of experience in data engineering.
  • At least 3 years in a leadership role managing data engineering teams.
  • Deep expertise with Snowflake including performance tuning, cost optimization, and familiarity with advanced features such as Dynamic Tables, Snowpipe Streaming, and Cortex AI capabilities.
  • Strong proficiency with dbt including model design patterns, testing strategies, incremental processing, and experience managing large-scale dbt projects.
  • Hands-on experience with Apache Airflow including DAG design, operational monitoring, and understanding of Airflow 3.0 capabilities including DAG versioning and event-driven scheduling.
  • Experience with Fivetran or similar managed ingestion platforms, including connector configuration, custom connector development, and cost optimization strategies.
  • Experience with data observability and quality platforms is required, with strong preference for hands-on experience with Validio including validator configuration, ML-based threshold management, segmented anomaly detection, and lineage integration with dbt.
  • Candidates should understand how to balance automated anomaly detection with rule-based validation, and how to tune alerting to minimize noise while catching genuine issues.
  • Strong Python skills for custom ingestion development and data processing.
  • Advanced SQL skills including complex analytical queries, window functions, and performance optimization.
  • Familiarity with AWS services commonly used in data platforms including S3, Lambda, and related infrastructure.

Responsibilities

  • Manage a team of data engineers responsible for moving, transforming, and storing data from diverse sources including loan servicers, trustees, fund administrators, market data providers, and internal systems.
  • Ensure that data flows seamlessly from source systems through to analytics tools like Sigma.
  • Oversee continuous data quality monitoring in Validio to ensure that the data stored in Snowflake remains trustworthy at every stage of the pipeline.