About The Position

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

We are seeking a highly experienced and visionary Global Platform Big Data Architect to spearhead the design, development, and evolution of our next-generation Data Fabric platform. This pivotal role will define the architectural roadmap, establish best practices, and provide expert guidance to engineering teams building scalable, reliable, and secure data solutions across both Google Cloud Platform (GCP) and Amazon Web Services (AWS). The ideal candidate will possess deep technical expertise in big data technologies and cloud-native data services, along with a proven track record of delivering complex data platforms.

In accordance with our corporate location policies, this role must be based in the Atlanta, GA area and follows our Return to Office (RTO) requirement of three onsite days per week (Tuesday, Wednesday, and Thursday).

Requirements

  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
  • 10+ years of progressive experience in data architecture, big data engineering, or cloud platform engineering.
  • 5+ years of hands-on experience specifically designing and building large-scale data platforms in a cloud environment.
  • Expertise in designing and implementing data lakes, data warehouses, and data marts in cloud environments.
  • Proficiency in at least one major programming language for data processing (e.g., Python, Scala, Java).
  • Deep understanding of distributed data processing frameworks (e.g., Apache Spark, Flink).
  • Experience with various data modeling techniques (dimensional, relational, NoSQL).
  • Solid understanding of DevOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation).
  • Experience with real-time data streaming technologies (e.g., Kafka, Kinesis, Pub/Sub).
  • Strong understanding of data governance, data quality, and metadata management concepts.
  • Excellent communication, presentation, and interpersonal skills with the ability to articulate complex technical concepts to both technical and non-technical audiences.
  • Proven ability to lead and influence technical teams without direct authority.

Nice To Haves

  • Strong, demonstrable experience with Google Cloud Platform (GCP) big data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer, Cloud Functions).
  • GCP certifications (e.g., Professional Data Engineer, Professional Cloud Architect).
  • Strong, demonstrable experience with Amazon Web Services (AWS) big data services (e.g., S3, EMR, Kinesis, Redshift, Glue, Athena, Lambda).
  • AWS certifications (e.g., Solutions Architect Professional, Big Data Specialty).
  • Experience with data mesh principles and implementing domain-oriented data architectures.
  • Familiarity with other cloud platforms (e.g., Azure) or on-premise data technologies.
  • Experience with containerization technologies (e.g., Docker, Kubernetes).
  • Knowledge of machine learning operationalization (MLOps) principles and platforms.
  • Contributions to open-source big data projects.

Responsibilities

  • Define and champion the architectural vision and strategy for our enterprise-wide Data Fabric platform, enabling seamless data discovery, access, integration, and governance across disparate data sources.
  • Lead the design and architecture of highly scalable, resilient, and cost-effective data solutions leveraging a diverse set of big data and cloud-native services in GCP and AWS.
  • Provide expert architectural guidance, technical leadership, and mentorship to multiple engineering teams, ensuring adherence to architectural principles, best practices, and design patterns.
  • Drive the selection, implementation, and continuous improvement of core data platform components, tools, and frameworks.
  • Leverage deep understanding of GCP and AWS data services to design optimal solutions.
  • Architect and implement robust data governance, security, privacy, and compliance measures within the data platform, ensuring data integrity and regulatory adherence.
  • Identify and address performance bottlenecks, optimize data pipelines, and ensure efficient resource utilization across cloud environments.
  • Stay abreast of emerging big data and cloud technologies, evaluate their potential impact, and recommend their adoption where appropriate.
  • Collaborate closely with data scientists, data engineers, analytics teams, product managers, and other stakeholders to understand data requirements and translate them into architectural designs.
  • Develop and maintain comprehensive architectural documentation, standards, and guidelines for data platform development.
  • Lead and execute proof-of-concepts for new technologies and architectural patterns to validate their feasibility and value.

Benefits

  • Comprehensive compensation and healthcare packages.
  • 401k matching.
  • Paid time off.
  • Organizational growth potential through our online learning platform with guided career tracks.

What This Job Offers

  • Job Type: Full-time
  • Career Level: Senior
  • Education Level: Master's degree
  • Number of Employees: 5,001-10,000 employees
