Senior Data Architect / Application Engineer

Hudson Manpower, New York, NY (Hybrid)

About The Position

We are seeking a highly skilled Senior Data Architect / Application Engineer to join our team. In this role, you will be responsible for designing, implementing, and maintaining data platforms and application infrastructure. You will play a key role in driving innovative data solutions while ensuring platform reliability, security, and performance.

Requirements

  • 12–15 years of experience as an application developer or data engineer.
  • Strong communication and collaboration skills.
  • Experience working in Agile environments.
  • Ability to design and document technical architectures and system designs.
  • Proactive and self-driven team player with strong analytical skills.
  • Python (expert level) including PySpark / Spark for data engineering.
  • Azure Databricks with experience implementing Medallion Lakehouse Architecture.
  • Strong SQL expertise including joins, unions, stored procedures, and query optimization.
  • REST API development using frameworks such as Django, Flask, or FastAPI.
  • Experience with CI/CD pipelines using Git, Jenkins, and Azure DevOps.
  • Experience building data ingestion and transformation pipelines.
  • Cloud platform certification such as AWS Certified Cloud Practitioner or equivalent.
  • Strong experience in Credit Risk and Counterparty Risk within the financial services or capital markets domain.

Nice To Haves

  • Advanced degree in Finance, Computer Science, or a related field.
  • Experience with risk modeling and financial analytics.
  • Knowledge of deployment, operational support, and monitoring tools.
  • Exposure to technical architecture design and system documentation.

Responsibilities

  • Lead architecture and technical design discussions using industry best practices and modern technologies.
  • Support production operations and resolve complex issues within the Credit Risk application platform.
  • Design and implement batch and ad-hoc data pipelines using the Medallion Lakehouse architecture on modern cloud platforms, primarily Databricks.
  • Build and maintain data ingestion pipelines from upstream systems into object storage (e.g., S3, ADLS) using formats such as Parquet, including partitioning, z-ordering, and schema evolution.
  • Integrate with external XVA / risk engines and develop orchestration logic for long-running computations.
  • Model and optimize risk metrics such as EPE and PFE for efficient querying and analytics.
  • Ensure platform reliability, security, observability, and auditability, including IAM roles, authentication mechanisms, and encryption.
  • Contribute to API design for internal and external consumers, including versioning, documentation, error handling, and SLAs.