Data Engineer - CETS

American Express · Phoenix, AZ
$89,250 - $150,250 · Hybrid

About The Position

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

We are looking for a highly skilled Data Engineer with solid experience building Big Data and GCP cloud-based ETL pipelines and Spark applications. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.

Joining the CETS team, this role will focus on delivering innovative solutions that satisfy the needs of our business. As an agile team, we work closely with our business partners to understand what they require, and we strive to continuously improve as a team. We pride ourselves on a culture of kindness and positivity and a continuous focus on supporting colleague development to help you achieve your career goals. We lead with integrity, and we emphasize work/life balance for all of our teammates.

How will you make an impact in this role? There are hundreds of opportunities to make your mark on technology and life at American Express. Here's just some of what you'll be doing:

Requirements

  • BS or MS degree in computer science, computer engineering, or other technical discipline, or equivalent work experience.
  • 5+ years of hands-on software development experience with Big Data & Analytics solutions
  • Hadoop (Hive), Spark, Python, shell scripting, and GCP Cloud: BigQuery, Airflow, Dataproc, Pub/Sub.
  • Strong experience designing, developing, and optimizing data pipelines for large-scale data processing, transformation, and analysis using Big Data and GCP technologies (a minimal batch-pipeline sketch follows this list).
  • Proficiency in SQL and database systems, with experience in designing and optimizing data models for performance and scalability.
  • Experience building event-processing pipelines with Kafka or GCP Pub/Sub.
  • Knowledge of distributed (multi-tiered) systems, algorithms & relational databases.
  • Strong Object-Oriented Programming skills and design patterns.
  • Experience with CI/CD pipelines, automated test frameworks, and source code management tools (XLR, Jenkins, Git).
  • Good knowledge of and experience with configuration management tools such as GitHub.
  • Ability to analyze complex data engineering problems, propose effective solutions, and implement them.
  • Ability and willingness to learn, adopt, and build solutions using the enterprise frameworks established for Big Data.
  • Looks proactively beyond the obvious for continuous improvement opportunities.
  • Communicates effectively with product and cross-functional teams.
  • Willingness to learn new technologies and leverage them to their optimal potential.
  • Understanding of various SDLC methodologies and familiarity with Agile and Scrum ceremonies.

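For a flavor of the pipeline work described above, a minimal PySpark sketch of a batch job reading from and writing to BigQuery on Dataproc might look like the following. The project, dataset, table, and bucket names are hypothetical, and the sketch assumes the spark-bigquery connector that Dataproc clusters ship with.

    # Minimal batch-pipeline sketch: read transactions from BigQuery,
    # aggregate daily totals per merchant, and write the rollup back.
    # All project/dataset/table/bucket names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-txn-rollup").getOrCreate()

    # Read the source table (assumes the spark-bigquery connector is on the
    # classpath, as it is by default on Dataproc clusters).
    txns = (
        spark.read.format("bigquery")
        .option("table", "my-project.payments.transactions")
        .load()
    )

    daily_totals = (
        txns.withColumn("txn_date", F.to_date("txn_ts"))
        .groupBy("txn_date", "merchant_id")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("txn_count"))
    )

    # The connector's indirect write path stages data through a GCS bucket.
    (
        daily_totals.write.format("bigquery")
        .option("table", "my-project.analytics.daily_merchant_totals")
        .option("temporaryGcsBucket", "my-temp-bucket")
        .mode("overwrite")
        .save()
    )
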
Nice To Haves

  • Design and development experience with Airflow, Pub/Sub, Kafka, Git, and Jenkins is desirable (see the streaming consumer sketch after this list).
  • Certification in a cloud platform (GCP Professional Data Engineer) is a plus.

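As a small illustration of the event-processing experience mentioned above, a streaming Pub/Sub consumer in Python might be sketched as follows; the project and subscription names are hypothetical and the code uses the standard google-cloud-pubsub client.

    # Minimal streaming-consumer sketch using the google-cloud-pubsub client.
    # Project and subscription names are hypothetical.
    from concurrent.futures import TimeoutError
    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("my-project", "txn-events-sub")

    def handle(message: pubsub_v1.subscriber.message.Message) -> None:
        # Parse, validate, and route the event payload downstream here.
        print(message.data.decode("utf-8"))
        message.ack()

    streaming_pull_future = subscriber.subscribe(subscription_path, callback=handle)
    with subscriber:
        try:
            # Block for a bounded time in this sketch; a production consumer
            # would run until shutdown.
            streaming_pull_future.result(timeout=60)
        except TimeoutError:
            streaming_pull_future.cancel()
            streaming_pull_future.result()
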
Responsibilities

  • Develop modern, high-quality, and robust operational engineering capabilities.
  • Develop software in a technology stack that is constantly evolving and currently includes Big Data platforms such as BigQuery, Airflow, and Dataproc (a minimal orchestration sketch follows this list).
  • Work with Business partners and stakeholders to understand functional requirements, architecture dependencies, and business capability roadmaps.
  • Create technical solution designs to meet business requirements.
  • Define best practices to be followed by the team.
  • Take your place as a core member of an Agile team driving the latest development practices.
  • Identify and drive reengineering opportunities, and opportunities for adopting new technologies and methods.
  • Perform peer code reviews and participate in technical discussions with the team on the best possible solutions.

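To make the orchestration side of the stack concrete, a minimal Airflow DAG that submits a PySpark job (such as the rollup sketched earlier) to an existing Dataproc cluster could look like this. The project, region, cluster, and GCS paths are hypothetical, and the operator comes from the Google provider package.

    # Minimal orchestration sketch: a daily Airflow DAG submitting a PySpark job
    # to an existing Dataproc cluster. Project, region, cluster, and GCS paths
    # are hypothetical. Requires the apache-airflow-providers-google package.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import (
        DataprocSubmitJobOperator,
    )

    PROJECT_ID = "my-project"       # hypothetical
    REGION = "us-central1"          # hypothetical
    CLUSTER_NAME = "etl-cluster"    # hypothetical

    PYSPARK_JOB = {
        "reference": {"project_id": PROJECT_ID},
        "placement": {"cluster_name": CLUSTER_NAME},
        "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/daily_rollup.py"},
    }

    with DAG(
        dag_id="daily_merchant_rollup",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",          # use schedule_interval on older Airflow 2.x
        catchup=False,
    ) as dag:
        DataprocSubmitJobOperator(
            task_id="run_rollup",
            project_id=PROJECT_ID,
            region=REGION,
            job=PYSPARK_JOB,
        )
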
Benefits

  • Competitive base salaries
  • Bonus incentives
  • 6% Company Match on retirement savings plan
  • Free financial coaching and financial well-being support
  • Comprehensive medical, dental, vision, life insurance, and disability benefits
  • Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
  • 20+ weeks paid parental leave for all parents, regardless of gender, offered for pregnancy, adoption or surrogacy
  • Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
  • Free and confidential counseling support through our Healthy Minds program
  • Career development and training opportunities