Senior Data Engineer - SDE 26-04140

Navitas Partners — New York, NY

About The Position

We are seeking a highly experienced Senior Data Engineer to support the design, development, and optimization of large-scale data platforms within a financial services environment. This role focuses on building robust data pipelines, enabling advanced credit risk analytics, and ensuring compliance with regulatory standards. The ideal candidate will bring deep expertise in credit and counterparty risk, modern data engineering technologies, and cloud-based data platforms, while collaborating closely with cross-functional teams including risk, quant, and compliance stakeholders.

Requirements

  • 12+ years of experience in data engineering or data development
  • Strong experience within financial services, particularly credit or counterparty risk
  • Expertise in regulatory frameworks such as Basel III/IV, IFRS 9, CECL, or FRTB
  • Advanced proficiency in Python and PySpark / Apache Spark
  • Hands-on experience with cloud platforms and data lake technologies (e.g., Databricks, Delta Lake)
  • Strong SQL skills including complex queries, joins, and performance optimization
  • Experience building data pipelines from multiple financial data sources
  • Familiarity with workflow orchestration tools (e.g., Airflow or similar)
  • Experience with CI/CD tools such as Git, Jenkins, or Azure DevOps
  • Cloud certification (e.g., AWS Certified Cloud Practitioner or equivalent)

Nice To Haves

  • Experience with large-scale financial risk systems and analytics platforms
  • Background in designing data architectures and maintaining data dictionaries
  • Experience working in Agile environments using tools such as JIRA or Confluence
  • Exposure to modern data governance and compliance practices

Responsibilities

  • Lead architecture and technical design discussions for credit risk data platforms
  • Design and implement scalable batch and streaming data pipelines using PySpark
  • Build ingestion frameworks for structured and unstructured data from upstream systems into cloud storage (e.g., S3/ADLS)
  • Implement data processing workflows using Medallion Architecture (Bronze, Silver, Gold layers)
  • Develop and optimize transformation logic for large-scale financial datasets
  • Ensure high data quality, auditability, and regulatory compliance
  • Apply optimization techniques such as partitioning, indexing, and schema evolution
  • Model and optimize risk metrics (e.g., PD, LGD, EAD, EPE, PFE, CVA) for analytics and reporting
  • Integrate with external risk engines and support orchestration of complex batch processes
  • Ensure platform reliability, observability, and data lineage tracking
  • Implement and maintain security standards (IAM, encryption, authentication protocols)
  • Troubleshoot production issues and provide ongoing support
  • Collaborate with data scientists, risk analysts, and business stakeholders
  • Contribute to API design and data contracts for internal and external consumers
  • Maintain comprehensive technical documentation for audit and compliance purposes
  • Participate in Agile development processes and ceremonies
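For context on the risk metrics named above (PD, LGD, EAD), a minimal sketch of the standard expected-loss decomposition EL = PD × LGD × EAD; this is an illustrative example only, not part of the posting, and the function name is a generic placeholder rather than anything from the platform described:

```python
# Illustrative only: expected credit loss via the standard
# Basel-style decomposition EL = PD * LGD * EAD.
def expected_loss(pd: float, lgd: float, ead: float) -> float:
    """Expected loss given:
    pd  -- probability of default (0..1)
    lgd -- loss given default, as a fraction of exposure (0..1)
    ead -- exposure at default, in currency units
    """
    return pd * lgd * ead

# Example: 2% PD, 45% LGD, $1,000,000 exposure ≈ $9,000 expected loss
print(expected_loss(0.02, 0.45, 1_000_000))
```

In a pipeline of the kind described here, this kind of logic would typically be expressed as a PySpark column transformation over portfolio-level datasets rather than a scalar Python function.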