Vice President, Senior Data Engineer

Oaktree Capital Management, L.P.
Los Angeles, CA

About The Position

Oaktree is a leader among global investment managers specializing in alternative investments, with more than $220 billion in assets under management. The firm emphasizes an opportunistic, value-oriented, and risk-controlled approach to investments in credit, equity, and real estate. The firm has more than 1,400 employees and offices in more than 25 cities worldwide. We are committed to cultivating an environment that is collaborative, curious, and inclusive, and that honors diversity of thought. Providing training and career development opportunities and emphasizing strong support for our local communities through philanthropic initiatives are essential to our culture.

The Data Solutions team at Oaktree Capital Management delivers trusted, high-quality data that powers the firm’s global investment and business operations. Through close collaboration with Technology and business partners, we drive data strategy, governance, and product delivery to enable reliable insights, operational efficiency, and a unified data foundation.

For additional information, please visit Oaktree’s website at http://www.oaktreecapital.com/

Responsibilities

As a Senior Data Engineer within Oaktree's Information Solutions team, you will design, build, and maintain scalable and efficient data pipelines using Azure and Fabric technologies, ensuring data quality, integrity, and availability for various business needs. You will play a crucial role in integrating data from diverse sources, transforming it into meaningful insights, and enabling data-driven decision-making across the organization.

Hands-On Technical Leadership & Architecture

  • Serve as the primary technical authority for data engineering across the firm
  • Personally design, review, and guide complex data pipelines, transformations, and architectures
  • Lead architecture and implementation using Azure Data Factory for orchestration, dbt Core for transformation and modeling, and Azure Functions where appropriate
  • Design and own Microsoft Fabric / OneLake architecture, including workspace strategy (single vs. multi-workspace); user access, security, and governance models; and OneLake data organization and lifecycle standards
  • Remain hands-on with SQL and dbt for complex investment and performance-related use cases

Asset Management Data Engineering

  • Lead the design and implementation of data models and pipelines supporting portfolio construction and holdings; security and account reference data; AUM, investment performance, and attribution; and benchmarks, indices, ESG, and alternative data
  • Ensure data solutions align with front-office, risk, and operational workflows
  • Translate asset management concepts into scalable, well-modeled data structures

Engineering Delivery & Consultant Management

  • Provide hands-on technical leadership to a hybrid team of internal engineers and offshore consultants
  • Set clear expectations for code quality, testing, documentation, and deployment
  • Review and approve consultant-delivered code and architecture
  • Act as the technical escalation point for complex or high-risk issues
  • Enforce DevOps and CI/CD practices for data engineering

Collaboration & Stakeholder Engagement

  • Partner with investment, risk, operations, and technology stakeholders to define data requirements and priorities
  • Communicate technical decisions, tradeoffs, and risks clearly to both technical and non-technical audiences
  • Contribute to the data platform roadmap and help prioritize initiatives based on business value

Requirements

  • 7-10 years of experience in data engineering and analytics, with deep hands-on technical experience
  • Proven experience acting as a lead engineer, architect, or technical VP for enterprise data platforms
  • Knowledge of the asset management industry and asset classes to bridge business and technology for data solutions
  • Strong understanding of key data concepts (e.g., Portfolio Construction, Security/Account Reference data, AUM, Investment Results/Attribution, Index/Benchmark)
  • Experience with industry data (e.g., Bloomberg, FactSet, Morningstar, ESG, Index, alternative data)
  • Relationship Building: works effectively with strong, diverse teams of people with multiple perspectives, talents, and backgrounds
  • Communication: strong interpersonal and verbal/written communication skills; ability to present complex material
  • Independence & Collaboration: experience working both independently and in a team-oriented, collaborative environment
  • Work Ethic: focus on continual development, performance, accountability, and self-motivation
  • Flexibility & Organization: adapts to shifting priorities, demands, and timelines through analytical and problem-solving capabilities
  • Intellectual Curiosity: energized by learning new things and engaging across a wide range of issues
  • Driving Results: sets aggressive timelines and objectives to drive results, conveys a sense of urgency, and drives issues to closure
  • Judgment: makes recommendations and decisions that balance a variety of factors

Benefits

  • Discretionary bonus incentives
  • A comprehensive benefits package
  • A flexible work arrangement