Databricks Developer

Leidos | Ashburn, VA
Onsite

About The Position

The Homeland Sector within Leidos is seeking a Databricks Developer to design, develop, and architect enterprise reporting solutions for critical systems supporting the Passenger Systems Program Directorate (PSPD) within Customs and Border Protection (CBP). PSPD supports the critical missions of the Department of Homeland Security (DHS) and CBP, specifically screening and processing travelers at the ports of entry (POEs) into the United States. This position is responsible for ensuring high data quality, robust architecture, and efficient development practices. The ideal candidate will have deep expertise in data warehouse concepts and Databricks, and will drive best practices in data engineering and reporting. This position REQUIRES the candidate to be in Ashburn, VA, twice a week.

Requirements

  • 10 years’ experience with databases such as Oracle, PostgreSQL, and Aurora, as well as big data platforms
  • Minimum 5 years of software development experience, including the following technologies: Oracle and SQL
  • 3+ years of experience in data lake/data warehouse conceptual design, implementation, case studies, and performance tuning
  • Must be able to obtain and maintain a CBP Background Investigation prior to start
  • Must be a US citizen
  • Deep knowledge of AWS cloud and big data services
  • Strong PL/SQL and SQL experience, including PostgreSQL
  • Knowledge of ETL processes
  • Strong hands-on experience with Databricks (Spark, Delta Lake, notebooks, clusters).
  • Deep understanding of data warehouse concepts, dimensional modeling, and ETL processes.
  • Proficiency in SQL, Python, and/or Scala for data engineering tasks.
  • Experience with cloud platforms (preferably AWS) and integrating Databricks with cloud data services.
  • Software development experience in an Agile/SecDevOps environment with proven ability to deliver on commitments.

Nice To Haves

  • Experience with CBP
  • Active CBP full Background Investigation (BI)

Responsibilities

  • Design and architect scalable enterprise reporting solutions using Databricks and data warehouse technologies.
  • Develop and optimize ETL pipelines, data models, and reporting frameworks.
  • Implement and monitor data quality controls, ensuring accuracy and consistency across datasets.
  • Collaborate with stakeholders to translate business requirements into technical solutions.
  • Lead code reviews, enforce development standards, and mentor junior developers.
  • Troubleshoot and resolve issues related to data integration, processing, and reporting.
  • Maintain documentation for architecture, design, and operational procedures.