Analytics Data Engineer

Booz Allen Hamilton, Washington, DC

About The Position

The Opportunity:

Are you passionate about solving problems through technology? As an experienced tech professional, you know that implementing improved digital initiatives with effective data engineering, pipeline optimization, and platform modernization is crucial to an organization's future. You're eager to apply your advanced experience and Databricks knowledge to redefine what's possible, and we're looking for someone like you to help organizations solve traditional business challenges with new digital transformation architecture, from roadmap to implementation.

As an analytics data engineer on our team, you'll transform federal data systems by implementing scalable data pipelines, medallion architecture, and analytics-ready data layers. Using Python, SQL, Databricks, and AWS services, you'll work alongside key stakeholders and other team members to share your expertise on data ecosystems and implement transformative data platforms and processes. In this role, you'll directly impact federal data transparency and decision-making by migrating legacy SQL Server and Pentaho systems into modern, scalable Databricks workflows, building robust ingestion frameworks, and architecting data layers that enable analytics and future AI-driven initiatives.

With opportunities to learn new tools and skills, we focus on growing and collaborating as a team to build the best solutions for our customers. Work with us as we implement new technologies to change federal data management for the better. Join us. The world can't wait.
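The medallion pattern mentioned above layers data as bronze (raw), silver (validated), and gold (analytics-ready). A minimal plain-Python sketch of the idea follows; this is a hypothetical illustration, not the team's actual Databricks code, and the record fields (agency, amount) are invented for the example.

```python
# Medallion-architecture sketch: bronze -> silver -> gold.
# Field names and values below are hypothetical.

# Bronze layer: raw ingested records, possibly duplicated or incomplete.
bronze = [
    {"id": 1, "agency": "DOT", "amount": "100.0"},
    {"id": 1, "agency": "DOT", "amount": "100.0"},  # duplicate row
    {"id": 2, "agency": "HHS", "amount": None},     # missing amount
    {"id": 3, "agency": "DOT", "amount": "250.5"},
]

def to_silver(rows):
    """Silver layer: deduplicate by id, drop invalid rows, cast types."""
    seen, silver = set(), []
    for row in rows:
        if row["id"] in seen or row["amount"] is None:
            continue
        seen.add(row["id"])
        silver.append({**row, "amount": float(row["amount"])})
    return silver

def to_gold(rows):
    """Gold layer: aggregate spend per agency for analytics consumers."""
    totals = {}
    for row in rows:
        totals[row["agency"]] = totals.get(row["agency"], 0.0) + row["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'DOT': 350.5}
```

In a real Databricks implementation each layer would typically be a Delta table written by a PySpark job, but the contract is the same: each layer only reads from the one below it.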

Requirements

  • 2+ years of experience in data engineering with Databricks and Python, including pandas, psycopg2, boto3, and PySpark
  • 1+ years of experience with AWS datastores, including RDS, S3, and Redshift
  • 1+ years of experience in data modeling and data lake or warehouse architecture
  • Experience using Spark and Parquet
  • Experience with data process operations and maintenance
  • Experience with Linux, RHEL, or Windows
  • Knowledge of AI/ML concepts and how data engineering supports machine learning initiatives
  • Ability to translate SQL Server stored procedures and ETL logic into Databricks workflows
  • Ability to obtain a Public Trust clearance
  • Bachelor's degree
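One requirement above is translating SQL Server stored procedures and ETL logic into Databricks workflows. The core move, pulling set-based logic out of the database into a testable workflow task, can be sketched briefly; sqlite3 stands in for SQL Server here, and the table and column names are invented for the example.

```python
import sqlite3

# Hypothetical source table standing in for a SQL Server table; a stored
# procedure might compute per-department totals with GROUP BY inside the DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE awards (dept TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO awards VALUES (?, ?)",
    [("DOT", 100.0), ("DOT", 250.5), ("HHS", 75.0)],
)

def dept_totals(connection):
    """Workflow-task equivalent of the stored procedure's aggregation step:
    the same SQL, but wrapped in a function a scheduler can call and test."""
    cursor = connection.execute(
        "SELECT dept, SUM(amount) FROM awards GROUP BY dept ORDER BY dept"
    )
    return dict(cursor.fetchall())

print(dept_totals(conn))  # {'DOT': 350.5, 'HHS': 75.0}
```

In practice the same query would run against Spark SQL or a Delta table, and the function would become one task in a Databricks workflow, which is what makes the logic versionable and unit-testable in a way an in-database procedure is not.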

Nice To Haves

  • Experience working in Agile environments
  • Experience with AWS DMS
  • Knowledge of Jira and Git
  • Knowledge of data catalog tools such as Alation
  • Knowledge of Databricks AI features such as MLflow, AutoML, or Feature Engineering
  • Ability to architect data pipelines with consideration for AI/ML use cases

Benefits

  • Health, life, and disability benefits
  • Financial and retirement benefits
  • Paid leave
  • Professional development
  • Tuition assistance
  • Work-life programs
  • Dependent care
  • Recognition awards program