Infosys LTD - posted about 1 month ago
Full-time • Mid Level
Bentonville, AR
5,001-10,000 employees
Professional, Scientific, and Technical Services

Infosys Limited is seeking a GCP-certified Big Data Engineer. In this role, you will enable digital transformation for our clients in a global delivery model, research technologies independently, recommend appropriate solutions, and contribute to technology-specific best practices and standards. You will be responsible for interfacing with key stakeholders and applying your technical proficiency across different stages of the Software Development Life Cycle. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.

  • Candidate must be located within commuting distance of Bentonville, Arkansas or be willing to relocate to the area. This position may require travel in the US.
  • Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
  • Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
  • At least 4 years of Information Technology experience
  • Strong expertise in Spark, Python, Scala, SQL databases, and type-safe functional programming.
  • Experience with GCP data engineering technologies such as Dataflow/Airflow, Pub/Sub/Kafka, Dataproc/Hadoop, and BigQuery.
  • ETL development experience with a strong SQL background, using technologies such as Python/R, Scala, Java, Hive, Spark, and Kafka.
  • Strong knowledge of Python development to build reusable frameworks and enhance existing ones.
  • Good experience in end-to-end implementation of data warehouses and data marts.
  • Strong knowledge and hands-on experience in Python and SQL.
  • Knowledge of CI/CD pipelines using Terraform and Git.
  • Good knowledge of Google BigQuery, using advanced SQL programming techniques to build BigQuery datasets in the ingestion and transformation layers (see the illustrative sketch at the end of this posting).
  • Experience in Relational Modeling, Dimensional Modeling and Modeling of Unstructured Data
  • Big Data Technologies: Apache Spark, Dataproc, BigQuery, Snowflake
  • DevOps Tools: Jenkins, Docker, Kubernetes, SonarQube
  • Knowledge of Airflow DAG creation, execution, and monitoring (see the illustrative sketch at the end of this posting).
  • Good understanding of Agile software development frameworks
  • Medical/Dental/Vision/Life Insurance
  • Long-term/Short-term Disability
  • Health and Dependent Care Reimbursement Accounts
  • Insurance (Accident, Critical Illness, Hospital Indemnity, Legal)
  • 401(k) plan and contributions dependent on salary level
  • Paid holidays plus Paid Time Off
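
For illustration only, here is a minimal sketch of the kind of BigQuery work referenced in the qualifications: an advanced-SQL MERGE that upserts rows from an ingestion-layer table into a transformation-layer table, submitted through the BigQuery Python client. The project, dataset, table, and column names are hypothetical placeholders, not taken from this posting.

    # Sketch: upsert from an ingestion-layer table into a transformation-layer
    # table using a BigQuery MERGE. All identifiers below are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # assumed project id

    merge_sql = """
    MERGE `my-gcp-project.transform_layer.orders` t
    USING `my-gcp-project.ingest_layer.orders_raw` s
    ON t.order_id = s.order_id
    WHEN MATCHED THEN
      UPDATE SET status = s.status, updated_at = s.updated_at
    WHEN NOT MATCHED THEN
      INSERT (order_id, status, updated_at)
      VALUES (s.order_id, s.status, s.updated_at)
    """

    job = client.query(merge_sql)  # submit the query job
    job.result()                   # block until the MERGE completes
    print(f"MERGE affected {job.num_dml_affected_rows} rows")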
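
Likewise, a minimal sketch of Airflow DAG creation, assuming Airflow 2.x: a daily two-task pipeline in which an ingestion task must succeed before a transformation task runs. The DAG id, task ids, and callables are illustrative placeholders.

    # Sketch: a daily ingest-then-transform DAG. Names are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest_to_staging(**context):
        # Placeholder: land source data in a staging dataset.
        print(f"Ingesting data for {context['ds']}")


    def transform_to_mart(**context):
        # Placeholder: run transformation SQL against the staging tables.
        print(f"Transforming data for {context['ds']}")


    with DAG(
        dag_id="daily_ingest_and_transform",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest_to_staging",
                                python_callable=ingest_to_staging)
        transform = PythonOperator(task_id="transform_to_mart",
                                   python_callable=transform_to_mart)

        ingest >> transform  # transform runs only after ingestion succeeds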