Studying the business logic and coordinating with the client to gather requirements
Designing and implementing large-scale ingest systems in a Big Data environment
Optimizing all stages of the data life cycle, from initial planning to ingest, through final display and beyond
Designing and implementing data extraction, cleansing, transformation, loading, and replication/distribution capabilities
Developing custom solutions/code to ingest and exploit new and existing data sources
Working with Sponsor development teams to improve application performance
Providing support to maintain, optimize, troubleshoot, and configure the AWS/Spark/Hadoop environment as needed
Collaborating with teammates, other service providers, vendors, and users to develop new and more efficient methods
Experience with CI/CD pipelines, unit tests, integration, and regression testing
Significant experience with Airflow
Strong ability to manage competing priorities and communicate with multiple stakeholders
Bachelor's degree in Computer Science, Computer Engineering, or a related discipline, with experience in software design and development
Job Type: Full-time
Career Level: Mid Level
Industry: Professional, Scientific, and Technical Services
Number of Employees: 5,001-10,000 employees