Sev1 Tech • Posted about 18 hours ago
Full-time • Mid Level
Arlington, VA
501-1,000 employees

Overview / Job Responsibilities

We are seeking a highly skilled Senior Databricks Platform Administrator to take ownership of our already-operational, AWS-hosted Databricks platform and lead it into its next stage of maturity, automation, and operational excellence. This is not primarily a data engineering role, and it is not a greenfield design effort: we have a running enterprise Databricks environment that is expanding rapidly, and we need an experienced platform operator who can own, operate, automate, and harden the system as adoption accelerates. You will be the hands-on technical owner guiding operational excellence, shaping governance, implementing automation, developing scalable processes, and ensuring the environment is secure, efficient, and ready for large-scale enterprise usage. Your work will directly influence the platform’s ability to grow while maintaining consistency, reliability, and control. If your background includes administering Databricks at scale in AWS, rather than just using it for project-based workloads, this role is built specifically for you.

  • Platform Administration & Operations (Primary Focus): This is a hands-on-keyboard platform operations role, not a supervisory or abstract architecture position. Serve as the hands-on administrator for an established, production AWS Databricks environment. Strengthen and mature the platform as it enters a major growth stage with increasing users, workloads, and data domains. Monitor, troubleshoot, and optimize workspace performance, cluster reliability, compute utilization, job operations, and platform health. Implement and maintain compute policies, cluster configurations, pools, serverless SQL warehouses, and workload management strategies. Lead ongoing efforts to automate provisioning, governance, monitoring, and operational processes across the platform.
  • Governance, Security, and Policies: Manage and enhance Unity Catalog, metastore configuration, catalog hierarchies, permissions, and lineage. Define and enforce RBAC, workspace controls, access policies, and compliance frameworks. Implement security best practices across S3, IAM, KMS, secret scopes, encryption, and network configuration. Ensure adherence to FedRAMP/FISMA requirements and agency-specific data governance standards.
  • Automation & Process Leadership: Guide the creation of an automation roadmap for provisioning, user lifecycle, policy enforcement, workspace configuration, lineage, and monitoring. Build the frameworks and workflows required for scalable, consistent platform operations. Establish strong operational documentation, standards, and procedures.
  • AWS & Infrastructure Integration: Work closely with cloud engineering teams to optimize S3, IAM, networking, logging, monitoring, and related AWS components. Enhance the platform’s integration with enterprise identity solutions (SAML, SCIM, Entra).
  • Collaboration & Support: Partner with data engineers, analysts, and data owners to ensure best practices across jobs, clusters, Delta Lake designs, and data access patterns. Act as the platform SME, guiding teams toward secure, optimized usage of Databricks.

Qualifications

  • Bachelor's degree in computer science, information technology, or a related field. Equivalent experience will also be considered.
  • Proven experience in building and configuring enterprise-level data lake solutions using Databricks in an AWS or Azure environment.
  • In-depth knowledge of Databricks architecture, including workspaces, clusters, storage, notebook development, and automation capabilities.
  • Strong expertise in designing and implementing data ingestion pipelines, data transformations, and data quality processes using Databricks.
  • Experience with big data technologies such as Apache Spark, Apache Hive, Delta Lake, and Hadoop.
  • Solid understanding of data governance principles, data modeling, data cataloging, and metadata management.
  • Hands-on experience with cloud platforms such as AWS or Azure, including relevant services such as S3, EMR, Glue, and Data Factory.
  • Proficiency in SQL and one or more programming languages (Python, Scala, or Java) for data manipulation and transformation.
  • Knowledge of data security and privacy best practices, including data access controls, encryption, and data masking techniques.
  • Strong problem-solving and analytical skills, with the ability to identify and resolve complex data-related issues.
  • Excellent interpersonal and communication skills, with the ability to collaborate effectively with technical and non-technical stakeholders.
  • Experience in a senior or lead role, providing technical guidance and mentorship to junior team members.
  • Must be eligible to obtain a Department of Homeland Security EOD clearance (requires U.S. citizenship and a favorable background investigation).
  • Relevant certifications such as Databricks Certified Developer or Databricks Certified Professional are highly desirable.

Security Clearance

  • Active DHS/CISA suitability - 1st preference
  • Any DHS badge plus a DoD Top Secret clearance - 2nd preference
  • DoD Top Secret clearance plus willingness to obtain DHS/CISA suitability - 3rd preference (obtaining suitability can take 10-60 days, and work can only begin once suitability is fully adjudicated).