Engineer

TATA Consulting Services, Irving, TX
$70,000 - $85,000 (posted 27 days ago)

About The Position

Build and implement data ingestion and curation processes using big data tools such as Spark (Scala/Python/Java), Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, and CDP 7.x. The role involves ingesting large volumes of data from various platforms for analytics needs; writing high-performance, reliable, and maintainable ETL code; building data warehouses and data stores for analytics consumption on-premises and in the cloud, for both real-time and batch use cases; and working with business analysts, functional analysts, and a globally distributed team to gather requirements, deliver ETL solutions, and recommend development and performance improvements.
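As a rough illustration of the ingestion and curation work described above, here is a minimal PySpark sketch that reads raw delimited files landed on HDFS, applies basic cleansing, and publishes a curated Hive table. The paths, table name, and column names are hypothetical placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical locations -- placeholders for illustration only.
RAW_PATH = "hdfs:///data/raw/orders/"
CURATED_TABLE = "curated.orders"

spark = (
    SparkSession.builder
    .appName("orders_ingestion")
    .enableHiveSupport()  # allows writing the curated output as a Hive table
    .getOrCreate()
)

# Ingest: read raw CSV files landed on HDFS.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(RAW_PATH)
)

# Curate: basic deduplication, null filtering, and typing before publishing for analytics.
curated = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
)

# Publish: overwrite the curated Hive table, partitioned by order date.
(
    curated.write
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable(CURATED_TABLE)
)
```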

Requirements

  • Should be strong in Python, SQL, and PySpark for data wrangling.
  • Should have extensive experience in ELT workflows across cloud and on-prem platforms.
  • Should have hands-on AWS knowledge and be proficient in Informatica IDMC, IBM DataStage, Airflow, and Autosys (see the orchestration sketch after this list).
  • Should have extensive knowledge of Azure, Salesforce, Snowflake, and AWS integrations.
  • Should have experience supporting ER diagrams using SAP Power Designer.
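As a minimal sketch of how the orchestration tools named above (Airflow in particular) might schedule this kind of work, the DAG below chains two spark-submit steps: ingest, then curate. The DAG id, schedule, and script paths are hypothetical, and the example assumes Airflow 2.x.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical nightly pipeline: ingest raw data, then build the curated layer.
with DAG(
    dag_id="nightly_orders_etl",      # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",    # 02:00 daily
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_orders",
        bash_command="spark-submit /opt/etl/ingest_orders.py",  # hypothetical script path
    )
    curate = BashOperator(
        task_id="curate_orders",
        bash_command="spark-submit /opt/etl/curate_orders.py",  # hypothetical script path
    )
    ingest >> curate  # curation runs only after ingestion succeeds
```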

Responsibilities

  • Building and implementing data ingestion and curation processes using big data tools such as Spark (Scala/Python/Java), Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc., and CDP 7.x.
  • Ingesting large volumes of data from various platforms for analytics needs and writing high-performance, reliable, and maintainable ETL code.
  • Strong analytical skills related to working with unstructured datasets.
  • Strong experience in building and designing data warehouses and data stores for analytics consumption, on-premises and in the cloud, for both real-time and batch use cases (see the streaming sketch after this list).
  • Ability to interact with business analysts and functional analysts to gather requirements and implement ETL solutions.
  • Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.
  • Develop reusable frameworks to reduce development effort and deliver cost savings for projects.
  • Develop quality code with well-considered performance optimizations in place at the development stage.
  • Appetite to learn new technologies and readiness to work on cutting-edge cloud technologies.
  • Work with a team spread across the globe to drive project delivery and recommend development and performance improvements.
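For the real-time side of the use cases listed above, Spark Structured Streaming over Kafka is one common pattern. The sketch below consumes JSON events from a Kafka topic and appends curated Parquet output to HDFS with checkpointing; the broker address, topic, schema, and paths are hypothetical, and it assumes the spark-sql-kafka connector is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

# Hypothetical event schema for the JSON payloads on the topic.
event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read a stream of events from Kafka (broker and topic are placeholders).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Continuously append curated events as Parquet on HDFS; the checkpoint enables recovery.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/curated/orders_stream/")
    .option("checkpointLocation", "hdfs:///checkpoints/orders_stream/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```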

What This Job Offers

Job Type: Full-time
Career Level: Mid Level
Industry: Professional, Scientific, and Technical Services
Education Level: No Education Listed
Number of Employees: 5,001-10,000 employees
