Data Engineer

Acuity, Inc. (Reston, VA)

About The Position

Acuity, Inc. seeks a Data Engineer to design, develop, and maintain scalable data pipelines and integration solutions supporting a federal client. This role builds and optimizes ETL/ELT workflows across the data warehouse, data lake, and analytics platform environment, with a strong emphasis on Databricks, Apache Spark, and AWS. The Data Engineer delivers reliable, secure, and high-performing data solutions that support analytics, reporting, and downstream advanced analytics use cases. This individual works closely with engineers, architects, and analysts to implement modern data engineering practices across cloud-based data platforms.

Are you ready to use your expertise in IT Modernization, Data Enablement, and Hyperautomation to make a real difference? Join Acuity, Inc., a technology consulting firm that supports federal agencies. We combine industry partnerships and long-term federal experience with innovative technical leadership to support our customers’ critical missions. Ready to grow with a team that invests in you? Learn more: www.myacuity.com

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent relevant experience
  • 5+ years of experience developing big data or data engineering solutions in an Agile environment
  • Hands-on experience with Databricks, including Delta Lake or lakehouse-based data engineering
  • Experience developing and deploying ETL/ELT pipelines using Apache Spark
  • Strong SQL skills, including complex queries, joins, aggregations, and window functions
  • Experience with AWS and cloud-based data platform services
  • Experience with Python, PySpark, SQL, or similar data engineering languages and frameworks
  • Experience with code versioning, CI/CD, and configuration management concepts
  • Experience working in Agile delivery environments, including Scrum
  • Ability to troubleshoot and optimize data pipelines for quality, performance, and reliability
  • Must be a U.S. citizen with the ability to obtain and maintain a U.S. suitability determination
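
As an illustration of the SQL skills listed above (joins, aggregations, and window functions), here is a minimal, self-contained sketch using Python's built-in sqlite3 module; the table, column names, and data are hypothetical examples, not from any Acuity system:

```python
import sqlite3

# Hypothetical sample data for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'alice', 120.0),
        (2, 'alice', 80.0),
        (3, 'bob',   200.0),
        (4, 'bob',   50.0);
""")

# Rank each customer's orders by amount and compute a per-customer total
# using window functions (ROW_NUMBER and SUM ... OVER with PARTITION BY).
rows = conn.execute("""
    SELECT customer,
           amount,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk,
           SUM(amount)  OVER (PARTITION BY customer)                      AS customer_total
    FROM orders
    ORDER BY customer, rnk
""").fetchall()

for customer, amount, rnk, total in rows:
    print(customer, amount, rnk, total)
```

The same PARTITION BY / ORDER BY pattern carries over directly to Spark SQL and PySpark window specifications.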

Nice To Haves

  • Experience with Delta Lake design and optimization
  • Experience integrating Databricks with Git-based workflows in controlled network environments
  • Experience with data quality, lineage, logging, and metadata practices
  • Experience supporting advanced analytics or machine learning data pipelines
  • Experience tuning performance in distributed data processing environments

Responsibilities

  • Design, build, and maintain scalable data pipelines, ETL/ELT processes, and integration workflows
  • Develop transformations and curated datasets using Databricks, Apache Spark, and SQL
  • Support ingestion, processing, and delivery of structured, semi-structured, and unstructured data
  • Build and optimize workflows across data lake, data warehouse, and analytics platforms
  • Implement data quality checks, validation rules, and operational monitoring across pipelines and datasets
  • Improve performance, reliability, scalability, and security of data processing solutions
  • Support cloud-based data engineering solutions in AWS, including storage and data service integration
  • Collaborate with engineers, architects, analysts, and stakeholders to implement technical requirements
  • Support Big Data, Advanced Analytics, and Machine Learning use cases from a data engineering perspective
  • Maintain documentation for pipelines, transformations, data flows, and operational procedures
  • Support code versioning, CI/CD, and controlled deployment of data engineering assets
  • Troubleshoot pipeline, data, and platform issues across development, test, and production environments
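
The data quality checks and validation rules mentioned above can be sketched as a simple rule-based pass over records. This is an illustrative, self-contained Python sketch; in practice such checks would run inside Databricks/Spark pipelines, and all rule names, fields, and data here are hypothetical:

```python
# Illustrative data-quality validation; rule names and fields are made up.
records = [
    {"id": 1, "amount": 120.0, "region": "east"},
    {"id": 2, "amount": None,  "region": "west"},
    {"id": 3, "amount": -5.0,  "region": ""},
]

# Each rule maps a name to a predicate that a valid record must satisfy.
rules = {
    "amount_not_null":     lambda r: r["amount"] is not None,
    "amount_non_negative": lambda r: r["amount"] is not None and r["amount"] >= 0,
    "region_present":      lambda r: bool(r["region"]),
}

def validate(records, rules):
    """Return, for each rule, the ids of records that fail it."""
    failures = {name: [] for name in rules}
    for rec in records:
        for name, rule in rules.items():
            if not rule(rec):
                failures[name].append(rec["id"])
    return failures

failures = validate(records, rules)
for name, ids in failures.items():
    print(f"{name}: {len(ids)} failing record(s) {ids}")
```

Reporting failure counts per rule, rather than failing fast, supports the operational-monitoring responsibility above: counts can be logged per pipeline run and alerted on when they exceed a threshold.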

Benefits

  • Strong work-life balance and a people-first culture
  • Competitive compensation + Health/Dental/Vision coverage
  • 401(k) with company match
  • Up to $6,000/year for training & professional development
  • Grow Your Career, Your Way: We invest in you with personalized development plans, mentorship, and up to $6,000 annually for training and certifications, so you can keep building the career you want.
  • Be Part of Something Innovative: You’ll work on cutting-edge solutions that support important government missions, in an environment that encourages new ideas and continuous improvement.
  • Thrive in a People-First Culture: Collaboration, respect, and support aren’t just values; they’re how we operate. Your voice is heard, your contributions are recognized, and your success is shared.
  • Feel Valued and Rewarded: We offer competitive compensation, comprehensive benefits, and a strong focus on work-life balance so you can perform at your best, at work and at home.
  • Join an Award-Winning Team: Our employees consistently rank us among the best, earning honors like Best Places to Work (Washington Business Journal, 9+ years) and Top Workplaces (The Washington Post, 2022–2025).
  • Bring Your Whole Self to Work: We’re committed to building a diverse, inclusive environment where everyone feels respected, supported, and empowered to succeed.