Junior Data Engineer

Calamos Investments · Naperville, IL

About The Position

Calamos Investments is seeking a Junior Data Engineer to develop and maintain data pipelines in Databricks, following Agile/Scrum methodology and writing clean, well-documented code that adheres to team and firm standards. The role includes participation in the on-call rotation for overnight data delivery SLAs and close collaboration with architects, engineers, and business stakeholders to understand requirements and deliver solutions.

Equal Opportunity Employer: This employer is required to notify all applicants of their rights pursuant to federal employment laws. For further information, please review the Know Your Rights notice from the Department of Labor.

Requirements

  • Bachelor's degree in Computer Science, Data Engineering, or related field
  • 3+ years of experience developing data pipelines in a modern data platform environment
  • Strong Python development skills with experience in PySpark and Databricks
  • Understanding of medallion architecture and data lakehouse concepts
  • Experience with Azure cloud services (Data Lake Storage, Key Vault, Azure DevOps)
  • Familiarity with Delta Lake and data quality frameworks
  • Experience with modern CI/CD practices and infrastructure as code
  • Strong SQL skills and understanding of data modeling principles
  • Excellent communication skills with ability to translate technical concepts for business stakeholders

Nice To Haves

  • Financial services industry experience

Responsibilities

  • Develop and maintain data pipelines in Databricks following Agile/Scrum methodology
  • Write clean, well-documented code that adheres to team and firm standards
  • Participate in the on-call rotation to monitor overnight data delivery SLAs, troubleshoot failures, and respond to data quality issues
  • Collaborate with architects, engineers, and business stakeholders to understand requirements and deliver solutions
  • Contribute to design sessions, code reviews, and continuous improvement initiatives