FlightSafety International · Posted 2 months ago
Columbus, OH
5,001-10,000 employees
Educational Services

The Senior Data Engineer is a hands-on technical expert responsible for designing, building, and maintaining modern data pipelines and architectures in a cloud-based environment. This role supports enterprise analytics by enabling scalable, reliable, and automated data solutions using Azure, Databricks, DBT, and Airflow. The engineer collaborates across teams to deliver high-quality data products that drive business intelligence and advanced analytics.

Responsibilities:
  • Design and develop scalable ETL/ELT pipelines using Azure Data Factory (ADF), Databricks, and DBT
  • Implement real-time and batch data processing using Delta Live Tables (DLT)
  • Orchestrate data workflows using Databricks LakeFlow, Apache Airflow, and ADF pipelines
  • Design and implement Data Vault 2.0 models for cloud-based data warehousing
  • Develop data ingestion and replication solutions using tools such as Fivetran, SQDR, Rivery, or custom Python scripts
  • Write Python and PySpark code for data transformation, cleansing, and automation
  • Monitor and optimize pipeline performance, ensuring data quality and reliability
  • Collaborate with analysts, architects, and business stakeholders to understand data needs and deliver consistent datasets
  • Maintain documentation for data flows, models, and pipeline logic
  • Support data governance, metadata management, and compliance initiatives
  • Participate in Agile ceremonies and contribute to sprint planning, reviews, and retrospectives
  • Troubleshoot and resolve production issues related to data pipelines and integrations
  • Contribute to CI/CD automation and DevOps practices for data engineering components
  • Provide input on architectural decisions and participate in enterprise data strategy discussions
  • Infrequent travel as needed
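As a rough illustration of the data transformation and cleansing work listed above (a minimal sketch in plain Python using only the standard library; the field names and cleansing rules are hypothetical examples, not taken from this posting):

```python
import csv
import io

def cleanse_rows(raw_csv: str) -> list[dict]:
    """Parse CSV text, trim whitespace from every field, drop rows
    missing an id, and normalize the email field to lowercase."""
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        record = {k: v.strip() for k, v in row.items()}
        if not record.get("id"):
            continue  # drop rows with no primary key
        record["email"] = record.get("email", "").lower()
        cleaned.append(record)
    return cleaned

sample = "id,email\n1, Alice@Example.COM \n,missing@id.com\n2,bob@example.com\n"
print(cleanse_rows(sample))
```

In practice this kind of logic would typically live in a PySpark job or DBT model rather than a standalone script, but the shape of the work — parse, standardize, filter, emit — is the same.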

Qualifications:
  • Bachelor's degree from an accredited institution or equivalent industry experience
  • 10+ years of experience in software or data engineering roles
  • 5+ years of experience in enterprise ETL, analytics, and reporting (SSIS, SSAS, SSRS)
  • 3+ years of experience with Azure Data Factory and Azure Data Lake Storage
  • 2+ years of experience with Databricks, Delta Live Tables, and Unity Catalog
  • 2+ years of experience with Python and PySpark
  • Experience with DBT for data modeling and transformation
  • Experience with Apache Airflow for workflow orchestration
  • Experience with Data Vault 2.0 modeling (certification preferred)
  • Experience in Education or Aviation industries is a plus