About The Position

Acuity, Inc. seeks a Databricks Administrator to support a federal client by managing secure, stable, and efficient Databricks environments for data engineering, analytics, and advanced analytics workloads. This role owns administration of Databricks workspaces, compute policies, SQL warehouses, user access, platform configuration, monitoring, and operational support across development, test, and production environments. The Databricks Administrator partners with data engineers, analysts, platform engineers, and security teams to maintain platform health, enforce governance, optimize performance, and support controlled delivery of analytics solutions.

Why Acuity?

Are you ready to use your expertise in the areas of IT Modernization, Data Enablement, and Hyperautomation to make a real difference? Join Acuity, Inc., a technology consulting firm that supports federal agencies. We combine industry partnerships and long-term federal experience with innovative technical leadership to support our customers’ critical missions.

We’re building more than careers: we’re building a place people genuinely love to work. At Acuity, you’ll get:

  • Strong work-life balance and a people-first culture
  • Competitive compensation + Health/Dental/Vision coverage
  • 401(k) with company match
  • Up to $6,000/year for training & professional development

🏆 Recognized as a Best Places to Work (Washington Business Journal, 8+ years)
🏆 Named a Top Workplace by The Washington Post (2022–2025)

Ready to grow with a team that invests in you? Learn more: www.myacuity.com

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent relevant experience
  • 5+ years of experience developing big data or data engineering solutions in an Agile environment
  • Hands-on experience with Databricks, including Delta Lake or lakehouse-based data engineering
  • Experience developing and deploying ETL/ELT pipelines using Apache Spark
  • Strong SQL skills, including complex queries, joins, aggregations, and window functions
  • Experience with AWS and cloud-based data platform services
  • Experience with Python, PySpark, SQL, or similar data engineering languages and frameworks
  • Experience with code versioning, CI/CD, and configuration management concepts
  • Experience working in Agile delivery environments, including Scrum
  • Ability to troubleshoot and optimize data pipelines for quality, performance, and reliability
  • Must be a U.S. Citizen with the ability to obtain and maintain U.S. suitability
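
The SQL requirement above calls out window functions specifically. A minimal sketch of what that skill looks like in practice, using Python's built-in sqlite3 module as a stand-in for a Databricks SQL warehouse (SQLite 3.25+ is needed for window-function support; the table and data are illustrative only):

```python
import sqlite3

# In-memory table standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100), ("east", 300), ("west", 200), ("west", 50)],
)

# Rank each sale within its region: a PARTITION BY window function,
# the same construct used on a Databricks SQL warehouse.
rows = conn.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()

for region, amount, rnk in rows:
    print(region, amount, rnk)
```

The same query runs unchanged on Databricks SQL; only the connection layer differs.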

Nice To Haves

  • Experience with Delta Lake design and optimization
  • Experience integrating Databricks with Git-based workflows in controlled network environments
  • Experience with data quality, lineage, logging, and metadata practices
  • Experience supporting advanced analytics or machine learning data pipelines
  • Experience tuning performance in distributed data processing environments

Responsibilities

  • Design, build, and maintain scalable data pipelines, ETL/ELT processes, and integration workflows
  • Develop transformations and curated datasets using Databricks, Apache Spark, and SQL
  • Support ingestion, processing, and delivery of structured, semi-structured, and unstructured data
  • Build and optimize workflows across data lake, data warehouse, and analytics platforms
  • Implement data quality checks, validation rules, and operational monitoring across pipelines and datasets
  • Improve performance, reliability, scalability, and security of data processing solutions
  • Support cloud-based data engineering solutions in AWS, including storage and data service integration
  • Collaborate with engineers, architects, analysts, and stakeholders to implement technical requirements
  • Support Big Data, Advanced Analytics, and Machine Learning use cases from a data engineering perspective
  • Maintain documentation for pipelines, transformations, data flows, and operational procedures
  • Support code versioning, CI/CD, and controlled deployment of data engineering assets
  • Troubleshoot pipeline, data, and platform issues across development, test, and production environments
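
One of the responsibilities above is implementing data quality checks and validation rules across pipelines. A minimal, library-free sketch of that pattern (the rule names and record shape are hypothetical, not the client's actual schema):

```python
# Hypothetical validation step for a pipeline: records failing any rule
# are quarantined with the failed rule names instead of being written
# to the curated dataset.
RULES = {
    "non_null_id": lambda r: r.get("id") is not None,
    "positive_amount": lambda r: isinstance(r.get("amount"), (int, float))
    and r["amount"] > 0,
}


def validate(records):
    """Split records into (valid, quarantined) lists.

    Quarantined entries are (record, failed_rule_names) pairs so the
    failure reason is preserved for operational monitoring.
    """
    valid, quarantined = [], []
    for rec in records:
        failed = [name for name, rule in RULES.items() if not rule(rec)]
        if failed:
            quarantined.append((rec, failed))
        else:
            valid.append(rec)
    return valid, quarantined


batch = [
    {"id": 1, "amount": 9.5},
    {"id": None, "amount": 3.0},
    {"id": 2, "amount": -1},
]
valid, bad = validate(batch)
print(len(valid), len(bad))  # prints "1 2"
```

In a real Databricks pipeline the same idea is usually expressed as Spark DataFrame filters or expectations, with quarantined rows landed in a separate table for review.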

Benefits

  • Competitive compensation
  • Health/Dental/Vision coverage
  • 401(k) with company match
  • Up to $6,000/year for training & professional development