Infosys · Posted 4 months ago
Phoenix, AZ
5,001-10,000 employees
Professional, Scientific, and Technical Services

Infosys is looking for a Big Data Engineer who is a polyglot with expertise in multiple technologies and can work as a full-stack developer on complex engineering projects. The ideal candidate has experience building products across the stack and a firm understanding of web frameworks, APIs, databases, and multiple back-end languages. The full-stack developer will join a small team that uses new technology to solve challenges across both the front-end and back-end architecture, ultimately delivering amazing experiences for global users.

  • Build products across the stack and understand web frameworks, APIs, databases, and multiple back-end languages.
  • Work as a full-stack developer in complex engineering projects.
  • Lead the migration of enterprise-scale data pipelines and analytics workloads from Google BigQuery to Snowflake (an integrity-check sketch follows this list).
  • Categorize, catalog, cleanse, and normalize datasets.
  • Provide users access to datasets through REST and Python APIs (see the API sketch after this list).
  • Extract, transform, and load data from a variety of data sources using Python, SQL, and AWS technologies (see the PySpark ETL sketch after this list).
  • Translate query logic, optimize performance, and ensure data integrity throughout the migration.
  • Work with orchestration tools and ensure compliance with financial data governance standards.
  • Bachelor's degree or foreign equivalent from an accredited institution is required; three years of progressive experience in the specialty will be considered in lieu of each year of education.
  • Deep expertise in Scala or Python for Spark application development along with Snowflake.
  • Experience with end-to-end project implementation using Cloudera Hadoop, Snowflake, Spark, Hive, HBase, Sqoop, Kafka, Elasticsearch, Grafana, and the ELK stack.
  • Deep expertise in Snowflake architecture, SQL optimization, and data modeling.
  • Hands-on experience with BigQuery schemas, UDFs, and partitioning strategies.
  • Candidates must be authorized to work for any employer in the United States without employer-based visa sponsorship.
  • Experience with data warehousing technologies and ETL/ELT implementations.
  • Sound knowledge of software engineering design patterns and practices.
  • Strong understanding of functional programming.
  • Experience with Ranger, Atlas, Tez, Hive LLAP, Neo4j, NiFi, Airflow, or any DAG-based tools (see the Airflow sketch after this list).
  • Knowledge of and experience with cloud and containerization technologies: Azure, Kubernetes, OpenShift, and Docker.
  • Experience with data visualization tools like Tableau, Kibana, etc.
  • Experience designing and implementing ETL/ELT frameworks for complex data warehouses and marts.
  • Planning and coordination skills.
  • Experience and desire to work in a global delivery environment.
  • Ability to work in a team in a diverse/multiple stakeholder environment.
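
For the BigQuery-to-Snowflake migration item above, one simple way to verify data integrity after a table has been moved is a row-count parity check against both systems. The sketch below is illustrative only, not part of this posting: the table names, warehouse name, and environment variables are assumptions, and it uses the real google-cloud-bigquery and snowflake-connector-python client libraries.

    import os

    from google.cloud import bigquery          # pip install google-cloud-bigquery
    import snowflake.connector                  # pip install snowflake-connector-python

    def bigquery_row_count(table: str) -> int:
        # Run COUNT(*) against the source BigQuery table.
        client = bigquery.Client()  # credentials via GOOGLE_APPLICATION_CREDENTIALS
        rows = client.query(f"SELECT COUNT(*) FROM `{table}`").result()
        return next(iter(rows))[0]

    def snowflake_row_count(table: str) -> int:
        # Run the same COUNT(*) against the migrated Snowflake table.
        conn = snowflake.connector.connect(
            account=os.environ["SF_ACCOUNT"],   # hypothetical env vars
            user=os.environ["SF_USER"],
            password=os.environ["SF_PASSWORD"],
            warehouse="ANALYTICS_WH",           # hypothetical warehouse
        )
        try:
            cur = conn.cursor()
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            return cur.fetchone()[0]
        finally:
            conn.close()

    if __name__ == "__main__":
        src = bigquery_row_count("my_project.sales.orders")   # hypothetical table
        dst = snowflake_row_count("SALES.PUBLIC.ORDERS")      # hypothetical table
        assert src == dst, f"row-count mismatch: BigQuery={src}, Snowflake={dst}"
        print(f"row counts match: {src}")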
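
The "REST and Python APIs" responsibility could be met with a thin read-only service in front of the dataset catalog. Below is a minimal sketch using FastAPI, one reasonable framework choice rather than anything the posting mandates; the in-memory catalog and its fields are hypothetical stand-ins for a real metadata store.

    from fastapi import FastAPI, HTTPException

    app = FastAPI()

    # Hypothetical in-memory catalog; in practice this would query the metadata store.
    CATALOG = {
        "orders": {"rows": 1_200_000, "owner": "sales", "format": "parquet"},
        "customers": {"rows": 85_000, "owner": "crm", "format": "parquet"},
    }

    @app.get("/datasets")
    def list_datasets() -> list[str]:
        # REST entry point: enumerate cataloged dataset names.
        return sorted(CATALOG)

    @app.get("/datasets/{name}")
    def get_dataset(name: str) -> dict:
        # REST entry point: return metadata for one dataset, 404 if unknown.
        if name not in CATALOG:
            raise HTTPException(status_code=404, detail=f"unknown dataset: {name}")
        return CATALOG[name]

    # Run with: uvicorn datasets_api:app --reload   (assumes this file is datasets_api.py)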
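
Several items above call for Python/Spark ETL experience. The following is a minimal PySpark extract-transform-load sketch, assuming a CSV landing zone on S3 and a curated Parquet target; every path and column name is invented for illustration.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw CSV files from a (hypothetical) S3 landing zone.
    raw = spark.read.csv("s3a://my-bucket/landing/orders/", header=True, inferSchema=True)

    # Transform: normalize column names, drop rows missing a key, stamp a load date.
    clean = (
        raw.select([F.col(c).alias(c.strip().lower()) for c in raw.columns])
           .dropna(subset=["order_id"])
           .withColumn("load_date", F.current_date())
    )

    # Load: write partitioned Parquet to the curated zone.
    clean.write.mode("overwrite").partitionBy("load_date").parquet(
        "s3a://my-bucket/curated/orders/"
    )

    spark.stop()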
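
For the orchestration and DAG-based-tools requirement, here is a minimal Airflow DAG sketch using the Airflow 2.x API; the DAG name, schedule, and task bodies are placeholders, not a real pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pulling from source")       # placeholder task body

    def load():
        print("writing to warehouse")      # placeholder task body

    with DAG(
        dag_id="nightly_orders_pipeline",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # Airflow 2.4+; use schedule_interval on older versions
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> load_task          # extract must finish before load starts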