Senior Data Engineer

Ziply Fiber, Kirkland, WA
Remote

About The Position

The Senior Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines, data models, and infrastructure that support business intelligence, analytics, and operational data needs. This role involves working with various structured and unstructured data sources, optimizing data workflows, and ensuring high data reliability and quality. The ideal candidate will be proficient in modern data engineering tools and cloud platforms, bringing innovative solutions to a fast-paced environment with a diverse data infrastructure.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Minimum of eight (8) years of experience in data engineering, ETL development, or related fields.
  • Strong proficiency in SQL and database technologies (PostgreSQL, MySQL, Oracle, SQL Server, etc.).
  • Familiarity with Linux/Unix and scripting technologies utilized on them.
  • Proficiency in programming languages such as Python for data engineering tasks.
  • Hands-on experience with cloud platforms such as Microsoft Azure, including data services like Azure Data Factory and Azure Synapse Analytics.
  • Experience working with data warehouses such as Snowflake or Azure SQL Data Warehouse.
  • Familiarity with workflow automation tools such as Autosys.
  • Knowledge of data modeling, schema design, and data architecture best practices.
  • Strong understanding of data governance, security, and compliance standards.
  • Ability to work independently in a remote environment across different time zones and collaborate effectively across teams.
  • Exposure to GraphQL and RESTful APIs for data retrieval and integration.
  • Familiarity with NoSQL databases such as MongoDB.
  • Experience with version control software such as GitLab.

Nice To Haves

  • Proven aptitude for independently managing complex procedures, even those encountered infrequently.
  • Proactive approach to learning and optimizing operational workflows.
  • Familiarity with DevOps practices and CI/CD pipelines for data engineering, including Azure DevOps.
  • Proficient in designing, writing, and maintaining complex stored procedures and stored procedure–based ETL workflows for robust data processing.
  • Comfortable working in complex ecosystems with heterogeneous data sources and diverse end-user requirements, adapting solutions to fit unique contexts.
  • Working knowledge of data wrangling and ETL tools, including Alteryx or similar technologies.
  • Understanding of data privacy regulations such as GDPR and CCPA.

Responsibilities

  • Design, develop, and maintain scalable data pipelines for ingestion, transformation, and storage of large datasets.
  • Troubleshoot and resolve data pipeline and ETL failures, implementing robust monitoring and alerting systems.
  • Automate data workflows to increase efficiency and reduce manual intervention.
  • Optimize data models for analytics and business intelligence reporting.
  • Build and maintain data infrastructure, ensuring performance, reliability, and scalability.
  • Implement best practices for data governance, security, and compliance.
  • Work with structured and unstructured data, integrating data from various sources including databases, APIs, and streaming platforms.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and design appropriate solutions.
  • Mentor and train junior engineers, fostering a culture of learning and innovation.
  • Develop and maintain documentation for data engineering processes and workflows.
  • Perform other duties as required to support the business and evolving organization.

Benefits

  • Medical
  • Dental
  • Vision
  • 401(k)
  • Flexible spending account
  • Paid sick leave and paid time off
  • Parental leave
  • Quarterly performance bonus
  • Training
  • Career growth and education reimbursement programs