Data Warehouse Developer II

National Commission on Certification of Physician Assistants (NCCPA)
Johns Creek, GA (Hybrid)

About The Position

As the Data Warehouse Developer II, you will be responsible for designing, developing, and maintaining robust ETL processes and data warehouse solutions that support enterprise-wide data initiatives and ensure the integrity, performance, and scalability of the data infrastructure.

Reports To: Senior Director of Application Development

Who We Are

National Commission on Certification of Physician Assistants (NCCPA) is the only certifying organization for physician assistants in the United States. Established as a not-for-profit organization in 1974, we are dedicated to assuring the public that board certified PAs meet established standards of clinical knowledge and cognitive skills upon entry into practice and throughout their careers. All U.S. states, the District of Columbia, and the U.S. territories rely on NCCPA certification as one of the criteria for licensure or regulation of PAs.

Location

NCCPA is a hybrid work environment with our headquarters located in Johns Creek, Georgia. NCCPA has determined that the telecommuting status of this position is remote, which means employees in this position may work primarily from home, with required travel for mandatory meetings, including to the NCCPA headquarters. As a matter of policy, NCCPA restricts remote positions to those in which the employee's home office is located in a U.S. jurisdiction in the Eastern or Central time zones.

Why Work at NCCPA

We get to do meaningful work every day, and we enjoy working and having fun together! No wonder we've been ranked #5 out of 62 small businesses in the metro Atlanta area in The Atlanta Journal-Constitution's 2025 Top Workplaces in the Region.

Requirements

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • At least 3 years of experience in ETL development using SSIS or similar tools.
  • At least 3 years of experience with SQL Server and T-SQL development.
  • Strong understanding of data warehousing concepts and dimensional modeling.
  • Experience with performance tuning and troubleshooting of ETL processes.
  • Familiarity with reporting tools such as Power BI or SSRS.
  • Experience with cloud-based data platforms (e.g., Azure Data Factory, Synapse).
  • Knowledge of scripting languages (e.g., Python, PowerShell).
  • Experience with Agile methodologies and DevOps tools (e.g., Azure DevOps).
  • Familiarity with data governance and data quality frameworks.
  • Strong analytical and problem-solving skills.
  • Highly developed time management skills, a systematic approach to organization and planning, and keen attention to detail while managing multiple projects.
  • Excellent verbal and written communication skills.
  • Strong interpersonal skills and the ability to collaborate with cross-functional teams.

Nice To Haves

  • Experience with data lake architecture and implementation.
  • Exposure to data privacy regulations such as GDPR and HIPAA.
  • Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming).
  • Experience with data lineage and metadata management practices.
  • Familiarity with data observability and monitoring tools.
  • An understanding of data mesh or modern data platform concepts.

Responsibilities

  • Design, develop, and maintain complex ETL processes using SQL Server Integration Services (SSIS) and other tools.
  • Build and optimize star and snowflake schema data models to support analytical and reporting needs.
  • Ensure the accuracy, integrity, and performance of data pipelines and warehouse structures.
  • Monitor, troubleshoot, and tune ETL jobs and data warehouse performance.
  • Collaborate with business analysts, data architects, and other stakeholders to gather requirements and deliver data solutions.
  • Develop and maintain technical documentation for ETL processes and data models.
  • Support data quality initiatives and implement data validation and cleansing routines.
  • Participate in code reviews, testing, and deployment activities following SDLC best practices.
  • Monitor and validate nightly ETL jobs and data loads; troubleshoot and resolve failures.
  • Implement alerting and logging for ETL processes and participate in on-call rotations as needed.