About The Position

The Knights of Columbus is modernizing its core data platforms. We are seeking an Associate Data Movement Engineer to lead the design, development, and optimization of our cloud-based data warehouse. This role involves building scalable data models, managing ETL/ELT workflows, and ensuring the delivery of high-quality, secure, and efficient data. The ideal candidate will have strong hands-on experience with Snowflake, Data Vault modeling, and modern orchestration frameworks, along with a passion for data architecture and governance.

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • 1+ years of experience in data warehousing and production support.
  • Strong hands-on experience with Snowflake and Data Vault modeling; Snowflake certification preferred.
  • Proficiency in SQL and data architecture.
  • Proficiency in scripting languages (e.g., Python, Shell).
  • Familiarity with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Talend).
  • Experience with incident tracking systems (e.g., ServiceNow, Jira) and CI/CD practices.
  • Some experience with cloud-based database technologies.
  • Working knowledge of data warehousing concepts, structures and ETL best practices.
  • Strong analytical and problem-solving skills.
  • Must work well independently and be inquisitive.
  • Strong organizational and time management skills.
  • Strong verbal and written communication skills.
  • Strong project management skills to ensure timely delivery.

Responsibilities

  • Complete the full life cycle of ETL/ELT development to address business needs or resolve issues, including design, mappings, data transformations, scheduling, and testing.
  • Translate data movement requirements into technical designs.
  • Develop data extraction and transmission processes for external vendors.
  • Develop test plans and perform unit testing.
  • Create supporting documentation for new processes.
  • Work closely with data analysts to gain an understanding of business processes and corporate data.
  • Determine impacts to data warehouse structures and information flows due to changes in business or technical requirements.
  • Contribute to architectural decisions to support business processes.
  • Provide production support for data solutions.
  • Complete root cause analysis and contribute to remediation planning and implementation.
  • Perform data quality analysis, report data quality issues and propose solutions for data quality management.
  • Learn and expand upon internal controls and participate in customer support.
  • Prepare effort estimates, including researching and estimating the cost of software development and unit testing. May provide estimates for vendor package upgrades and integration with existing systems.
  • Monitor and troubleshoot data warehouse jobs and Snowflake pipelines in production.
  • Investigate and resolve data quality issues, performance bottlenecks, and system failures.
  • Perform root cause analysis and implement long-term solutions to recurring problems.
  • Design and maintain scalable data models using Data Vault 2.0 methodology.
  • Ensure alignment of data models with business requirements and enterprise architecture standards.
  • Collaborate with data architects and business analysts to evolve the data warehouse design.
  • Develop and optimize SQL scripts, stored procedures, and data pipelines in Snowflake.
  • Implement performance tuning strategies for queries and warehouse resources.
  • Manage Snowflake roles, permissions, and data security policies.
  • Implement data validation, reconciliation, and lineage tracking mechanisms.
  • Support data governance initiatives including metadata management and audit compliance.
  • Ensure consistent and reliable data delivery to downstream systems and reporting platforms.
  • Manage and enhance job scheduling and orchestration using Apache Airflow.
  • Automate routine support tasks and improve monitoring capabilities.
  • Work closely with business users, data engineers, and application teams to resolve issues and deliver enhancements.
  • Maintain detailed documentation of data models, workflows, and production support procedures.
  • Contribute to knowledge base and support documentation for recurring issues.
  • On-call and/or after-hours work required.
  • Other related duties as directed.