Director - Enterprise Data Platform

BlackRock
New York, NY
Hybrid

About The Position

About the Team: You will be joining a newly launched Forward Deployed Engineering team dedicated to client engagement and platform adoption. This team operates with a startup mentality within the company – high collaboration, agility, and a focus on delivering results. As a principal FDE, you’ll serve as a key leader in this team, helping establish its practices and culture. You’ll work alongside seasoned data platform engineers and product managers, directly contributing to our mission of empowering users to unlock maximum value from data.

Job Summary: We are seeking a Principal-level Forward Deployed Engineer (FDE) Lead to spearhead client-facing technical engagements for our Enterprise Data Platform (EDP). In this high-impact individual contributor role, you will bring product-grade data engineering expertise directly to internal (and later external) clients to drive adoption of our EDP. This is not primarily a pipeline engineering role. Instead, you will act as a technical authority, advisor, and enablement lead – working with our Data Platform as a Service (DPaaS) adopters to design high-quality data pipelines, teach best practices, and guide teams to build and operate pipelines effectively themselves. You will help clients understand how to build pipelines well on EDP, review designs and implementations, and unblock complex issues, while leaving day-to-day pipeline construction and ownership with the adopting teams. This role combines deep hands-on technical work (in Airflow, Snowflake, dbt, Great Expectations, DataHub, etc.) with strategic advisory responsibilities. As a founding member of the Forward Deployed Engineering team, you will play a pivotal role in shaping how we accelerate data platform adoption and deliver insights in the investment management domain. The position is remote-friendly with minimal travel (occasional client visits as needed).

Requirements

  • Extensive Data Engineering Experience: 8+ years of hands-on experience in data engineering, data architecture, or related fields, with a track record of designing and delivering large-scale data solutions.
  • Expert SQL Skills: Deep proficiency in SQL development and optimization, including writing and tuning complex SQL queries for both ETL processing and analytical reporting/BI use cases. Proven ability to refactor SQL code for efficiency and readability.
  • Pipeline & ETL Orchestration: Strong experience building and orchestrating data pipelines using tools such as Apache Airflow. Ability to develop, schedule, and monitor complex ETL workflows in a production environment.
  • Data Transformation & Modeling: Proficiency with dbt (data build tool) for data transformations and schema management. Solid background in data modeling (relational, dimensional modeling) and designing data architectures following medallion (layered data lake) and/or data mesh principles.
  • Snowflake Expertise: In-depth knowledge of Snowflake’s Data Cloud platform, including its internal architecture, unique features, and best practices. Experience with Snowflake performance tuning (e.g., optimizing warehouses, clustering keys, query profiling) and understanding of how to leverage Snowflake’s capabilities (like zero-copy cloning, data sharing, etc.) in solution designs.
  • Data Quality & Governance Tools: Hands-on experience with data quality/validation frameworks (e.g., Great Expectations) and familiarity with metadata management or data catalog tools (e.g., DataHub) to ensure transparency and trust in data.
  • Problem Solving & Analytical Thinking: Exceptional analytical and troubleshooting skills. Demonstrated ability to solve complex data engineering problems, optimize performance, and handle large data sets and complex data integration challenges in real-world scenarios.
  • Communication & Stakeholder Engagement: Excellent communication and interpersonal skills. Comfortable working directly with non-technical stakeholders and senior client leaders to gather requirements, explain technical concepts in business terms, and ensure solutions meet business objectives.
  • Agile & Product Mindset: Experience working in Agile/Scrum teams with an iterative delivery approach. Strong product mindset – ability to think of data pipelines and models as products that need to deliver ongoing value and adapt to changing requirements.
  • Education: Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent practical experience).

Nice To Haves

  • Industry Expertise: Experience in the investment management or financial services industry, with understanding of common data challenges and use cases in this domain (e.g. portfolio analytics, risk, trading data, regulatory reporting).
  • Professional Certifications: Relevant certifications, such as Snowflake SnowPro or dbt certifications, are highly desirable, demonstrating validated expertise.
  • Additional Data Tool Experience: Familiarity with complementary data technologies and frameworks (e.g., cloud data processing with Spark or Kafka streaming, data lake platforms, BI/analytics tools) is a plus.
  • Consulting/Client-Facing Experience: Prior experience in a forward-deployed engineering role, solutions engineering, technical consulting, or technical account management is beneficial. This includes working on-site or closely with customers to deliver technical solutions, gather requirements, and ensure customer success.
  • Leadership in Practice: While this is an IC role, experience leading projects or mentoring other engineers can be advantageous, indicating readiness to operate at a principal/IC-lead level.
  • Advanced Degree: Master’s or PhD in a related field.

Responsibilities

  • Direct Client Engagement & Solution Delivery: Serve as a trusted technical advisor for DPaaS adopters, partnering with teams to understand their business goals and data needs. Guide clients in designing EDP-native data pipelines, helping them select appropriate patterns, architectures, and platform features. Enable teams to become self-sufficient and effective builders by transferring knowledge, patterns, and best practices rather than owning pipeline delivery.
  • Data Pipeline Design & Best Practices: Provide expert guidance on EDP data pipeline design, including ingestion, transformation, orchestration, and consumption patterns. Advise on Airflow DAG design, dbt project structure, dependency management, and operational best practices—without routinely building or owning pipelines yourself. Help teams adopt scalable, maintainable approaches aligned to medallion (bronze–silver–gold) architectures, data mesh principles, and EDP standards.
  • Snowflake Architecture & Optimization Guidance: Act as a Snowflake subject-matter expert, advising teams on data modeling, SQL design, warehouse sizing, and performance optimization. Review and provide feedback on query patterns, schema designs, and workload configurations to help adopters achieve cost-efficient, performant solutions. Support teams in diagnosing performance or cost issues, guiding them toward effective remediation.
  • Design Review, Quality & Governance: Review pipeline designs, data models, and implementation approaches to ensure alignment with enterprise data quality, governance, and security standards. Coach teams on implementing automated data quality checks using Great Expectations and on improving metadata, lineage tracking, and discoverability via DataHub or similar catalog tools, enabling trust in data for decision-making. Promote consistent engineering standards, documentation practices, and operational readiness.
  • Cross-Functional Collaboration: Collaborate with platform engineering, product management, analytics teams, and business units to align technical solutions with business needs. Provide feedback from client engagements to internal product and engineering teams to influence the EDP’s roadmap and enhancements. Act as a bridge between technical teams and client stakeholders, ensuring clarity and mutual understanding of requirements and outcomes.
  • Technical Leadership & Mentorship: As a principal-level expert, provide thought leadership in data engineering and analytics. Mentor and guide junior forward deployed engineers or analytics engineers in the team (though this role has no direct managerial duties). Lead by example in following agile methodologies, maintaining a data product mindset (delivering iterative improvements and focusing on end-user value), and upholding engineering best practices such as version control, CI/CD for data pipelines, and documentation.
  • Problem Solving & Support: Tackle complex technical challenges in real-time. Troubleshoot critical issues across the EDP stack – from pipeline failures to data inconsistencies – and drive issues to resolution. Serve as the go-to expert for diagnosing problems in SQL queries, data models, Airflow DAGs, and Snowflake performance, ensuring high reliability and responsiveness for client-facing data services.
  • Continuous Improvement: Stay up-to-date with the latest developments in data engineering, analytics, and the investment management industry. Proactively identify opportunities to improve our platform’s practices (e.g., new features in Snowflake, emerging tools in the data ecosystem) and help integrate them to continually enhance the value we deliver to clients.

Benefits

  • Annual discretionary bonus eligibility
  • Comprehensive healthcare
  • Strong retirement plan
  • Tuition reimbursement
  • Support for working parents
  • Leave benefits, including Flexible Time Off (FTO)