GCP Data Engineer

Infosys
Richardson, TX

About The Position

Infosys is seeking a GCP Data Engineer. In this role, you will enable digital transformation for our clients in a global delivery model, research technologies independently, recommend appropriate solutions, and contribute to technology-specific best practices and standards. You will be responsible for interfacing with key stakeholders and applying your technical proficiency across different stages of the Software Development Life Cycle. You will be part of a learning culture where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued. Required qualifications and preferred skills are listed below.

Requirements

  • Candidate must be located within commuting distance of Richardson, TX or be willing to relocate to the area. This position may require travel within the US.
  • Bachelor’s degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
  • Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
  • At least 4 years of Information Technology experience.
  • Experience working with GCP data engineering technologies such as Dataflow/Airflow, Pub/Sub/Kafka, Dataproc/Hadoop, and BigQuery.
  • ETL development experience with a strong SQL background, plus experience with languages and tools such as Python/R, Scala, Java, Hive, Spark, and Kafka.
  • Strong knowledge of Python program development to build reusable frameworks and enhance existing frameworks (see the sketch after this list).
  • Application build experience with core GCP services such as Dataproc, GKE, and Composer.
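
As an illustration of the Python and BigQuery skills called out above, here is a minimal sketch of a reusable ingestion helper built on the google-cloud-bigquery client library. The project, dataset, and bucket names are hypothetical placeholders and are not part of the posting.

    # Minimal sketch of a reusable ingestion helper.
    # Assumes the google-cloud-bigquery client library; all names below are hypothetical.
    from google.cloud import bigquery

    def load_csv_to_bigquery(uri: str, table_id: str) -> None:
        """Load a CSV file from Cloud Storage into a BigQuery table."""
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
        load_job.result()  # block until the load job completes

    # Example usage (hypothetical bucket and table):
    # load_csv_to_bigquery("gs://example-bucket/orders.csv",
    #                      "example-project.raw_layer.orders")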

Nice To Haves

  • Good knowledge of Google BigQuery, using advanced SQL programming techniques to build BigQuery datasets in the ingestion and transformation layers.
  • Experience in relational modeling, dimensional modeling, and modeling of unstructured data.
  • Knowledge of Airflow DAG creation, execution, and monitoring (see the DAG sketch after this list).
  • Good understanding of Agile software development frameworks
  • Ability to work in teams in a diverse, multi-stakeholder environment comprising Business and Technology teams.
  • Experience and desire to work in a global delivery environment.
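
As an illustration of the Airflow DAG skill mentioned above, here is a minimal DAG sketch, assuming Airflow 2.x as deployed on Cloud Composer; the DAG ID and task callable are hypothetical placeholders.

    # Minimal Airflow 2.x DAG sketch; dag_id and task body are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_load():
        """Placeholder task body; a real DAG would call ingestion logic here."""
        print("extract and load step")

    with DAG(
        dag_id="example_daily_ingestion",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest = PythonOperator(
            task_id="extract_and_load",
            python_callable=extract_and_load,
        )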

Benefits

  • Medical/Dental/Vision/Life Insurance
  • Long-term/Short-term Disability
  • Health and Dependent Care Reimbursement Accounts
  • Insurance (Accident, Critical Illness, Hospital Indemnity, Legal)
  • 401(k) plan and contributions dependent on salary level
  • Paid holidays plus Paid Time Off