Data Engineer I

Kroll
Hybrid

About The Position

Kroll’s Private Capital Markets (PCM) platform is transforming private asset valuation and portfolio workflows for alternative asset managers. We’re seeking a Data Engineer to design and implement secure, scalable data solutions across the PCM platform on Azure. You will collaborate closely with Product and Implementation teams to deliver client-ready analytics, robust APIs, and high-performance data pipelines that power financial workflows spanning private equity, fixed income, derivatives, and structured products. You’ll also help establish engineering standards and communities of practice across a global team of data professionals and developers. This is a hybrid role, requiring 2–3 days of on-site presence each week.

Requirements

  • Proven experience building ETL/ELT pipelines on Azure, AWS, or Databricks.
  • Strong proficiency in SQL (T-SQL, PL/pgSQL, Spark SQL) for data transformation and query optimization.
  • Skilled in Python, C#/.NET, or Java for data engineering and backend services.
  • Hands-on experience with REST API development, Python SDKs, and containerization tools such as Docker and Kubernetes.
  • Working knowledge of CI/CD pipelines, Git, and Azure DevOps.
  • Experience with Microsoft SQL Server, PostgreSQL, and cloud-native databases.
  • Understanding of data warehousing, dimensional modeling, and data lake architectures.
  • Hands-on experience with data pipeline orchestration tools such as Airflow, Ascend, or Azure Synapse (a minimal orchestration sketch follows this list).
  • Exposure to data quality frameworks and monitoring best practices.
  • Ability to partner effectively with Product Owners and end users in an agile environment.
  • Comfortable participating in code reviews, technical design sessions, and architecture discussions.
  • Demonstrated ability to manage multiple priorities, solve complex problems, and deliver scalable solutions.
  • Master’s degree in Computer Science, Data Science, Mathematics, Statistics, or a related field.
  • Minimum 3 years of hands-on data engineering experience, ideally within financial services.
  • Ability to handle confidential and sensitive information with discretion.
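
For context on the orchestration experience listed above, here is a minimal sketch of a daily extract-transform-load pipeline using Airflow's TaskFlow API. The DAG name, data shape, and sanity rule are illustrative assumptions, not part of Kroll's actual stack.

    # A minimal Airflow sketch (illustrative names; not Kroll's actual pipeline).
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def valuation_ingest():
        @task
        def extract() -> list[dict]:
            # Pull raw fund positions from an upstream source (stubbed here).
            return [{"fund_id": 1, "nav": 102.5}, {"fund_id": 2, "nav": None}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Drop rows that fail a basic sanity check before loading.
            return [r for r in rows if r["nav"] is not None and r["nav"] > 0]

        @task
        def load(rows: list[dict]) -> None:
            # Write to the warehouse; stubbed as a log line here.
            print(f"loading {len(rows)} validated rows")

        load(transform(extract()))


    valuation_ingest()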

Nice To Haves

  • Relevant cloud (Azure/AWS) or data engineering certifications.

Responsibilities

  • Data Pipeline Construction: Design, build, and maintain reliable data pipelines to move, transform, and integrate data from diverse sources into data warehouses or lakes.
  • ETL and Data Integration: Develop and optimize ETL/ELT processes using tools such as Azure Data Factory, Databricks, Synapse, DBT, Airflow, or Informatica.
  • Data Warehousing: Model and manage data warehouses to ensure efficient querying, high performance, and data quality using platforms like Azure Synapse, Snowflake, Redshift, or BigQuery.
  • Data Quality & Monitoring: Implement validation, cleaning, and monitoring processes to ensure data accuracy, consistency, and reliability (a minimal validation sketch follows this list).
  • Data Security: Apply robust data governance practices, manage access permissions, and ensure compliance with privacy regulations.
  • Performance & Scalability: Optimize systems to handle growing data volumes and support evolving business needs.
  • Team Leadership: Lead and mentor cross-functional teams, driving adoption of modern data technologies and best practices.
  • Strategic Initiatives: Spearhead greenfield initiatives that align with strategic business objectives, including innovation to support revenue growth and market expansion.
  • Platform Ownership: Own key functional areas of the PCM platform to ensure operational efficiency, reliability, and peak performance.
  • Engineering Culture: Promote collaboration and excellence by participating in architectural reviews, defining technical standards, and contributing to a culture of continuous improvement.
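
As a concrete illustration of the data quality and monitoring responsibility above, the following is a minimal sketch of a row-level validation step in Python with pandas. Column names, rules, and the quarantine approach are hypothetical, not Kroll's actual schema.

    # A minimal validation sketch (hypothetical columns and thresholds).
    import pandas as pd


    def validate_positions(df: pd.DataFrame) -> pd.DataFrame:
        """Return only rows passing basic accuracy/consistency checks."""
        failed = df["fund_id"].isna() | (df["nav"] <= 0)
        # In a production pipeline, failed rows would be routed to a
        # quarantine table and surfaced to monitoring, not silently dropped.
        return df[~failed]


    if __name__ == "__main__":
        sample = pd.DataFrame({"fund_id": [1, None, 3], "nav": [101.3, 98.0, -5.0]})
        print(validate_positions(sample))  # keeps only the fund_id 1 row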