The candidate will be responsible for designing, building, and testing end-to-end data pipelines, including data ingestion (streaming, events, and batch), data integration, and data curation. Key responsibilities:

- Design, develop, and deploy scalable data pipelines and ETL processes on cloud-based infrastructure using Azure, Snowflake, DBT, Airflow, and Cosmos DB.
- Define and implement automation for jobs and testing; optimize data pipelines across workloads and use cases, supporting mission-critical applications and near real-time data needs.
- Address data and environment issues, perform impact and root cause analysis, and implement corrective, adaptive, and perfective maintenance.
- Implement data models, transformations, and schema designs that support analytical and reporting needs.
- Optimize Snowflake performance, including query optimization, resource management, and scaling strategies.
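To make the core duty concrete, here is a minimal, illustrative sketch of the extract → transform → load pattern the role centers on, written in plain Python. It stands in for the Azure/Snowflake/DBT/Airflow stack named above; all names (`extract`, `transform`, `load`, the sample records) are hypothetical and chosen for illustration only.

```python
def extract():
    # Stand-in for batch ingestion from a source system.
    return [
        {"id": 1, "amount": "10.50", "region": "emea"},
        {"id": 2, "amount": "3.25", "region": "amer"},
        {"id": 2, "amount": "3.25", "region": "amer"},  # duplicate event
    ]

def transform(records):
    # Curation step: deduplicate by id and cast types to fit an analytics schema.
    seen, curated = set(), []
    for r in records:
        if r["id"] in seen:
            continue
        seen.add(r["id"])
        curated.append({
            "id": r["id"],
            "amount": float(r["amount"]),
            "region": r["region"].upper(),
        })
    return curated

def load(records, warehouse):
    # Stand-in for a warehouse load (e.g. Snowflake); here an in-memory table.
    warehouse.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 rows remain after deduplication
```

In the real role, each stage would be an orchestrated task (e.g. an Airflow DAG) with the transform layer expressed as DBT models, but the ingest/curate/load shape is the same.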
Job Type
Full-time
Career Level
Senior