Lead Software Engineer

Wells Fargo, Irving, TX

About The Position

About this role: Wells Fargo is seeking a Lead Software Engineer to drive innovation and technical excellence within the CTR space. This role will lead the design and optimization of scalable data platforms using technologies such as Apache Spark, ensure robust observability and performance monitoring, and oversee workflow orchestration and DevOps practices, enabling efficient collaboration and delivery across teams. Strong partnership with business stakeholders is essential to deliver timely, impactful solutions to complex operational challenges. The responsibilities of the role are listed below.

Requirements

  • 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
  • 5+ years of experience building data pipelines on Google Cloud Platform (GCP) using Cloud Dataflow, Cloud Dataproc, Cloud Composer, BigQuery, Cloud Storage (GCS), Pub/Sub, or similar GCP-native frameworks
  • 5+ years of experience with workflow orchestration using Cloud Composer
  • 5+ years of hands-on development experience with SQL and Python for data engineering tasks
  • 2+ years of experience with monitoring and observability using Cloud Monitoring, Grafana, or equivalent tools
  • Solid understanding of data modeling, ETL/ELT processes, and data governance best practices
  • Experience with Data Fusion, Dataplex, and Dataprep is a plus
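As a hedged illustration of the pipeline-building skills listed above: the GCP-specific experience (Dataflow, BigQuery, Composer) cannot be shown without a GCP project, but the extract-transform-load shape those requirements describe can be sketched with standard-library stand-ins. Here `csv` plays the role of the ingestion source and `sqlite3` stands in for the warehouse; the table and column names are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: an ingested CSV batch with messy whitespace.
RAW_CSV = "id,name,amount\n1, Alice ,10.5\n2,Bob,3\n"

def extract(text: str) -> list[dict]:
    """Ingest raw CSV rows as dictionaries (stand-in for a GCS/Pub/Sub source)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Cleanse string fields and cast numeric fields to proper types."""
    return [(int(r["id"]), r["name"].strip(), float(r["amount"])) for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load transformed rows into a warehouse table (stand-in for BigQuery)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
```

In a production pipeline each stage would typically be a separate task in an orchestrated DAG so that failures can be retried per stage; this sketch only shows the data flow.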

Nice To Haves

  • Extensive hands-on experience with BigQuery, Cloud Storage, Cloud SQL, Cloud Dataflow, and Cloud Composer
  • Experience with performance monitoring and pipeline observability using GCP-native tools or third-party solutions
  • Proficiency with GCP resource management and workload optimization and cost control strategies
  • Familiarity with BigQuery for large-scale analytics and schema evolution
  • Knowledge of data lakehouse patterns using Apache Iceberg or similar technologies
  • Well-versed in PySpark, Pandas, and Polars for data processing
  • Experience using Jira and Confluence for agile workflows
  • Experience with interactive notebook environments (Jupyter, Zeppelin, Databricks, or similar) for data engineering development

Responsibilities

  • Lead complex technology initiatives including those that are companywide with broad impact
  • Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions for technology engineering disciplines
  • Design, develop, and maintain ETL/ELT pipelines.
  • Automate workflows for ingestion, transformation, and loading of data
  • Integrate data from multiple sources into data lakes or warehouses.
  • Ensure optimal storage strategies balancing cost and performance
  • Cleanse, normalize, and enrich raw data for analytics.
  • Implement robust data quality checks and governance frameworks
  • Monitor and optimize data workflows for scalability and efficiency
  • Make decisions in developing standard and companywide best practices for engineering and technology solutions requiring understanding of industry best practices and new technologies, influencing and leading technology team to meet deliverables and drive new initiatives
  • Collaborate and consult with key technical experts, senior technology team, and external industry groups to resolve complex technical issues and achieve goals
  • Lead projects, teams, or serve as a peer mentor
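A minimal sketch of the "cleanse, normalize, and enrich raw data" and "robust data quality checks" responsibilities above, using only the standard library. The field names and validation rules (customer_id, amount, currency) are hypothetical examples, not Wells Fargo requirements; a real quality gate would load its rules from a governance framework.

```python
import re

def cleanse_record(raw: dict) -> dict:
    """Trim whitespace on string fields (a minimal cleansing step)."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in raw.items()}

def validate_record(rec: dict) -> list[str]:
    """Return a list of rule violations; an empty list means the record passes."""
    errors = []
    if not re.fullmatch(r"\d{8}", str(rec.get("customer_id", ""))):
        errors.append("customer_id must be 8 digits")
    if not isinstance(rec.get("amount"), (int, float)) or rec["amount"] < 0:
        errors.append("amount must be a non-negative number")
    if rec.get("currency") not in {"USD", "EUR", "GBP"}:
        errors.append("currency not in allowed set")
    return errors

def run_quality_gate(records: list[dict]) -> tuple[list, list]:
    """Split a batch into (passed_records, rejected_records_with_reasons)."""
    passed, rejected = [], []
    for raw in records:
        rec = cleanse_record(raw)
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))
        else:
            passed.append(rec)
    return passed, rejected
```

Routing rejects to a quarantine table with their violation reasons, rather than dropping them silently, is what makes a gate like this auditable.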


What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Education Level

No Education Listed

Number of Employees

5,001-10,000 employees
