Data Engineer

Acuity, Inc.

About The Position

Acuity, Inc. seeks a Data Engineer to design, develop, and maintain scalable data pipelines and integration solutions supporting a federal client. This role builds and optimizes ETL/ELT workflows across the data warehouse, data lake, and analytics platform environment, with a strong emphasis on Databricks, Apache Spark, and AWS. The Data Engineer delivers reliable, secure, and high-performing data solutions that support analytics, reporting, and downstream advanced analytics use cases. This individual works closely with engineers, architects, and analysts to implement modern data engineering practices across cloud-based data platforms.

Why Acuity?

Are you ready to use your expertise in IT Modernization, Data Enablement, and Hyperautomation to make a real difference? Join Acuity, Inc., a technology consulting firm that supports federal agencies. We combine industry partnerships and long-term federal experience with innovative technical leadership to support our customers' critical missions.

We're building more than careers; we're building a place people genuinely love to work. At Acuity, you'll get:

  • Strong work-life balance and a people-first culture
  • Competitive compensation + Health/Dental/Vision coverage
  • 401(k) with company match
  • Up to $6,000/year for training & professional development

🏆 Recognized as a Best Place to Work (Washington Business Journal, 8+ years)
🏆 Named a Top Workplace by The Washington Post (2022–2025)

Ready to grow with a team that invests in you? Learn more: www.myacuity.com

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent relevant experience
  • 5+ years of experience developing big data or data engineering solutions in an Agile environment
  • Hands-on experience with Databricks, including Delta Lake or lakehouse-based data engineering
  • Experience developing and deploying ETL/ELT pipelines using Apache Spark
  • Strong SQL skills, including complex queries, joins, aggregations, and window functions
  • Experience with AWS and cloud-based data platform services
  • Experience with Python, PySpark, SQL, or similar data engineering languages and frameworks
  • Experience with code versioning, CI/CD, and configuration management concepts
  • Experience working in Agile delivery environments, including Scrum
  • Ability to troubleshoot and optimize data pipelines for quality, performance, and reliability

Nice To Haves

  • Experience with Delta Lake design and optimization
  • Experience integrating Databricks with Git-based workflows in controlled network environments
  • Experience with data quality, lineage, logging, and metadata practices
  • Experience supporting advanced analytics or machine learning data pipelines
  • Experience tuning performance in distributed data processing environments

Responsibilities

  • Design, build, and maintain scalable data pipelines, ETL/ELT processes, and integration workflows
  • Develop transformations and curated datasets using Databricks, Apache Spark, and SQL
  • Support ingestion, processing, and delivery of structured, semi-structured, and unstructured data
  • Build and optimize workflows across data lake, data warehouse, and analytics platforms
  • Implement data quality checks, validation rules, and operational monitoring across pipelines and datasets
  • Improve performance, reliability, scalability, and security of data processing solutions
  • Support cloud-based data engineering solutions in AWS, including storage and data service integration
  • Collaborate with engineers, architects, analysts, and stakeholders to implement technical requirements
  • Support Big Data, Advanced Analytics, and Machine Learning use cases from a data engineering perspective
  • Maintain documentation for pipelines, transformations, data flows, and operational procedures
  • Support code versioning, CI/CD, and controlled deployment of data engineering assets
  • Troubleshoot pipeline, data, and platform issues across development, test, and production environments

Benefits

  • Competitive compensation
  • Health/Dental/Vision coverage
  • 401(k) with company match
  • Up to $6,000/year for training & professional development
  • Strong work-life balance and a people-first culture