Work with and learn data engineering concepts that relate to data science and insurance. Develop data pipelines for customers across the Liberty organization, utilizing tools such as R, Python, Redshift, SAS, AWS EC2, and Power BI. Build all required features and add them to the pipeline to accelerate data scientists' decision-making. Identify data quality issues and communicate the associated risks to stakeholders. Develop ETL data pipelines in Redshift and AWS Glue to eliminate dependency on an outdated warehousing system. Automate entire business workflows, from initial data pipeline construction to final prediction algorithms. Telecommuting permitted up to 100%.
Job Type
Full-time
Career Level
Mid Level