Database Administrator - Mountlake Terrace, WA

Mindful Support Services, Mountlake Terrace, WA
Onsite

About The Position

The Database & Pipeline Administrator will serve as the primary owner of Mindful Support Services' data structures, databases, and Azure DevOps (ADO) data pipelines, with end-to-end responsibility for how data is modeled, moved, and stored across our environment. Reporting to the Operations Director, this role will partner with internal stakeholders to design and maintain scalable schemas, build and operate ETL pipelines, and ensure that our databases and data workflows reliably support business processes and reporting. You will manage CI/CD for database changes in ADO, optimize performance, and enforce data governance and security standards. This is a full-time, in-person role based out of our Mountlake Terrace headquarters, with travel to other Mindful Support Services locations as needed.

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent practical experience.
  • 4+ years of experience in a Database Engineer/Developer position – responsible for delivering code to develop and improve databases.
  • 4+ years of experience in a Database Administrator position – responsible for high-level ownership, development, and direction of database systems.
  • Strong experience building and managing Azure DevOps (ADO) pipelines or similar CI/CD tooling for database and data/ETL workloads.
  • Advanced proficiency in SQL and relational database concepts, including performance tuning, indexing strategies, and query optimization.
  • Hands-on experience designing and maintaining database schemas and data models for operational and analytical workloads.
  • Demonstrated experience creating and supporting ETL/ELT pipelines using tools such as Azure Data Factory or comparable platforms.
  • Familiarity with Azure data and integration services (e.g., Azure SQL, Fabric, Logic Apps, Functions) and their role in end-to-end data solutions.
  • Experience with scripting or programming languages (e.g., PowerShell, Python, JavaScript, Go) to automate database administration and pipeline operations.
  • Strong analytical and proactive problem-solving skills, with an ability to map business processes to data flows, schemas, and ETL logic.
  • Effective communicator with the ability to explain data and infrastructure concepts to both technical and non-technical audiences.

Responsibilities

  • Own the Azure DevOps (ADO) pipelines for database and data-integration work, including build, release, and deployment of database changes and ETL jobs.
  • Leverage Python, PySpark, and SQL to engineer, transform, and optimize large-scale datasets within the Fabric ecosystem.
  • Design, document, and maintain core database schemas, tables, views, and other data structures to support both operational systems and analytics use cases.
  • Apply a strong understanding of the Medallion architecture to design and optimize data pipelines across Bronze, Silver, and Gold layers.
  • Build and refine fact and dimension tables in Microsoft Fabric Warehouse to support scalable analytics solutions.
  • Write advanced DAX measures to support complex KPI calculations and analytical models.
  • Create, maintain, and monitor ETL/ELT processes that extract, transform, and load data between source systems, the data warehouse, and downstream reporting tools.
  • Write and optimize SQL (stored procedures, functions, views, and ad hoc queries) to support applications, integrations, and reporting.
  • Administer, monitor, and tune databases (e.g., SQL and Fabric platforms), ensuring availability, performance, and capacity to meet SLAs.
  • Implement and manage backup, restore, and disaster recovery processes for all owned databases and data pipelines, regularly testing and documenting procedures.
  • Configure and enforce database and data-access security, including roles, permissions, and data-protection controls aligned with HIPAA and internal governance.
  • Use Azure services (e.g., Azure SQL, Fabric, Logic Apps, Functions) to orchestrate and operate cloud-based data workflows and integrations.
  • Monitor data and pipeline health (job runs, failures, data quality checks), troubleshoot issues, and drive root-cause analysis and long-term fixes.
  • Collaborate with engineers, analysts, and business stakeholders to translate business requirements into data models, ETL jobs, and ADO pipeline changes.
  • Continuously assess and improve database, ETL, and pipeline standards, introducing best practices around version control, testing, and deployment.

Benefits

  • 75% employer-covered Health, Dental & Vision benefits plan
  • 401(k) savings plan with employer matching upon eligibility
  • 8 paid holidays
  • 15 PTO days accrued annually
  • Professional and career development opportunities
  • Regular compensation evaluations with opportunities for advancement