Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Job Location - Fort Mill, SC (Day One Onsite - Hybrid)

About The Role

Snowflake, Snowpark: The candidate should have a deep understanding of the Snowflake data warehousing platform and be proficient in using Snowpark for data processing and analytics.

DBT: Experience with DBT (Data Build Tool) for modeling data and creating data transformation pipelines is a plus.

AWS services (Airflow): The candidate should have hands-on experience with AWS services, particularly Apache Airflow for orchestrating complex data workflows and pipelines.

AWS services (Lambda): Proficiency in AWS Lambda for serverless computing and event-driven architectures is essential for this role.

AWS services (Glue): The candidate should be well versed in AWS Glue for ETL (Extract, Transform, Load) processes and data integration.

Fivetran (HVR): Working knowledge of, and hands-on experience with, Fivetran HVR.

Python: Strong programming skills in Python are required for developing data pipelines, data transformations, and automation tasks.