Position Description: Supports and maintains solutions in Cloud service providers, Amazon Web Services (AWS) and Azure. Designs and delivers data lakes, data warehouses, and reporting platforms using SQL. Provides technical solutions for platforms using workflow management tools (Datalake and Airflow). Leverages the data integration processes Extract, Load, Transform (ELT) and Extract, Transform, Load (ETL). Builds Continuous Integration and Continuous Delivery (CI/CD) automation pipelines and software version control using Jenkins, UDeploy, and Concourse. Programs in Python and Java. Deploys applications using services such as PCs, Simple Storage Service (S3), Elastic Compute Cloud (EC2), Relational Database Service (RDS), Azure SQL MI, VNet, and other Cloud service offerings in AWS and Azure.

Primary Responsibilities:
- Guides and helps to design technical solutions.
- Provides long-term solutions for foundational platforms.
- Provides solutions that deliver business value.
- Analyzes information to determine, recommend, and plan computer software specifications on major projects, and proposes modifications and improvements based on user need.
- Develops software system testing and validation procedures, programming, and documentation.
- Develops new functionality and supports existing applications to deliver innovative and cost-effective solutions.
- Writes clean, testable, readable, and easily maintainable code.
- Develops original and creative technical solutions to ongoing development efforts.
- Designs applications or subsystems on major projects and for multiple platforms.
- Develops applications for multiple projects supporting several divisional initiatives.
- Assists in the planning and conducting of user acceptance testing.
- Works on complex assignments and often multiple phases of a project.

Education and Experience: Bachelor's degree in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field (or foreign education equivalent, including a 3-year foreign degree) and three (3) years of experience as a Senior Data Engineer (or closely related occupation) developing and maintaining Cloud-based ETL data pipelines on the AWS platform, using Airflow and Snowflake data warehouses. Alternatively, a Master's degree in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and one (1) year of experience as a Senior Data Engineer (or closely related occupation) developing and maintaining Cloud-based ETL data pipelines on the AWS platform, using Airflow and Snowflake data warehouses.

Skills and Knowledge: Candidate must also possess:
- Demonstrated Expertise ("DE") designing, implementing, testing, and performing code review of ETL data pipelines using Airflow, Azure Data Factory, Informatica, and Python; and automating and scheduling jobs in Control-M and Unix or Linux shell environments.
- DE performing database maintenance and development in Cloud database services (AWS and Microsoft Azure) and in on-premises database platforms (Oracle, Microsoft SQL Server, MySQL, and PostgreSQL), using database tools (Toad, SQL Developer, pgAdmin, and MySQL Workbench).
- DE designing data architecture, engineering, and models; developing and implementing technology applications; and implementing data virtualization techniques to stream data from data sources and vendors, following data engineering best practices: repeatability, modularity, and reliability.
- DE performing DevOps operations using Jenkins, Maven, Docker, Git repositories, Azure DevOps, and AWS DevOps.
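As a sketch of the ETL pattern this role centers on (hypothetical table and column names; the Airflow scheduling and Snowflake/AWS specifics the posting mentions are omitted), a minimal extract-transform-load step in Python using an in-memory SQLite database might look like:

```python
import sqlite3

def extract(conn):
    # Extract: pull raw order rows from a source table (hypothetical schema)
    return conn.execute("SELECT id, amount_cents FROM raw_orders").fetchall()

def transform(rows):
    # Transform: convert cents to dollars and drop non-positive amounts
    return [(oid, cents / 100.0) for oid, cents in rows if cents > 0]

def load(conn, rows):
    # Load: write cleaned rows into the warehouse table
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    # Each stage would typically be a separate task in a workflow scheduler
    load(conn, transform(extract(conn)))

# In-memory demo standing in for a real source system and warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
conn.execute("CREATE TABLE orders_clean (id INTEGER, amount_dollars REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 1250), (2, -300), (3, 4999)])
run_pipeline(conn)
print(conn.execute("SELECT * FROM orders_clean ORDER BY id").fetchall())
# → [(1, 12.5), (3, 49.99)]
```

In the ELT variant also named above, the raw rows would be loaded into the warehouse first and the transformation would run there, typically as SQL.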
Job Type: Full-time
Career Level: Mid Level