Generative AI Engineer

Equifax · Atlanta, GA

About The Position

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. The responsibilities, required experience, and nice-to-have qualifications for this role are listed in the sections below.

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!
Primary Location: USA-Atlanta-One-Atlantic-Center; USA-Atlanta JV White
Function: Data and Analytics
Schedule: Full time

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity employer, and qualified applicants will receive consideration for employment without regard to race, color, religion, ancestry, age, sex/gender, sexual orientation, gender identity or expression, service in the Armed Forces, protected veteran status, national origin, physical or mental disability, genetic information, citizenship status or any other status protected by law.

For US Applicants: If you'd like more information on your EEO rights under the law, please view our EEO is the Law Declarations and Nondiscrimination Provision. If you need a reasonable accommodation to assist with your job search or application for employment, please contact us by sending an email to [email protected]. In your email, please include a description of the specific accommodation you are requesting and a description of the position for which you are applying. Equifax participates in E-Verify and Right to Work (English and Spanish).

Requirements

  • BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred
  • 5+ years of experience as a data engineer or related role
  • Cloud certification strongly preferred
  • Advanced skills in programming languages such as Python or SQL, and intermediate-level experience with scripting languages
  • Intermediate-level understanding of and experience with Google Cloud Platform and overall cloud computing concepts, as well as basic knowledge of other cloud environments
  • Experience building and maintaining moderately complex data pipelines: troubleshooting issues, and transforming and loading data so that pipeline output is usable by downstream projects (a minimal pipeline sketch follows this list)
  • Experience designing and implementing moderately complex data models, and experience optimizing them to improve performance
  • Advanced Git usage and CI/CD integration skills
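
To illustrate the level this pipeline requirement points at, here is a minimal, hypothetical sketch of a step that cleanses a raw extract and loads it into BigQuery with the google-cloud-bigquery client. The project, bucket, dataset, and column names are placeholders, reading gs:// paths with pandas assumes gcsfs is installed, and nothing here is prescribed by the posting itself.

```python
import pandas as pd
from google.cloud import bigquery

# Placeholder identifiers -- swap in real project/bucket/dataset names.
PROJECT_ID = "my-gcp-project"
SOURCE_URI = "gs://my-bucket/raw/accounts.csv"   # requires gcsfs for pandas to read
DEST_TABLE = "analytics.accounts_clean"

def run_pipeline() -> None:
    # Extract: pull the raw file into a DataFrame.
    df = pd.read_csv(SOURCE_URI)

    # Transform: deduplicate, coerce types, and drop rows that fail basic rules.
    df = df.drop_duplicates(subset="account_id")
    df["opened_at"] = pd.to_datetime(df["opened_at"], errors="coerce")
    df = df.dropna(subset=["account_id", "opened_at"])

    # Load: replace the destination table so downstream consumers see a clean snapshot.
    client = bigquery.Client(project=PROJECT_ID)
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE")
    client.load_table_from_dataframe(df, DEST_TABLE, job_config=job_config).result()

if __name__ == "__main__":
    run_pipeline()
```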

Nice To Haves

  • Master’s degree in a related field is a strong plus
  • Exposure to Vertex AI, GCP AI/ML services (AutoML, BigQuery ML, Cloud Run, etc.), or a similar cloud technology
  • Strong foundational skills in the Linux operating system.
  • Understanding of NLP, deep learning, and generative architectures (Transformers, Diffusion Models, etc.)
  • Background in credit risk, financial data analytics or risk modeling.
  • Experience working with large datasets on a big data platform (e.g., Google Cloud, AWS, Snowflake, Hadoop)
  • Experience in Business Intelligence, data visualization, and customer insights generation.
  • Familiarity with data governance, model bias mitigation, and regulatory frameworks (GDPR, AI Act, SEC compliance).
  • Experience with MLOps practices, model monitoring, and CI/CD for AI workflows.
  • Knowledge of prompt tuning, fine-tuning, and parameter-efficient methods (LoRA, PEFT); a minimal LoRA configuration sketch follows this list.
  • Hands-on experience with RAG, multi-modal AI, and hybrid AI architectures.
  • Contributions to the AI community through publications, open-source projects, or conference presentations.
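
As a rough pointer to what "parameter-efficient methods (LoRA, PEFT)" involve in practice, the hypothetical sketch below wraps a small public model with a LoRA adapter using the Hugging Face peft library. The rank, alpha, and target-module choices are illustrative defaults, not anything specified in this posting, and exact APIs can vary between library versions.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Small public model used purely for illustration.
model = AutoModelForCausalLM.from_pretrained("gpt2")

# LoRA injects trainable low-rank matrices into selected layers while the
# base weights stay frozen, so only a small fraction of parameters is trained.
lora_config = LoraConfig(
    r=8,                        # rank of the low-rank update
    lora_alpha=16,              # scaling applied to the update
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's fused attention projection layer
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total parameters
```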

Responsibilities

  • Apply knowledge of data characteristics and data supply patterns to develop rules and tracking processes that support data quality models.
  • Prepare data for analytical use by building data pipelines to gather data from multiple sources and systems.
  • Integrate, consolidate, cleanse and structure data for use by our clients in our solutions.
  • Perform design, creation, and interpretation of large and highly complex datasets.
  • Stay up-to-date with the latest trends and advancements in GCP and related technologies, actively proposing and evaluating new solutions.
  • Understand best practices for data management, maintenance, reporting and security and use that knowledge to implement improvements in our solutions.
  • Implement security best practices in pipelines and infrastructure.
  • Develop and implement data quality checks and troubleshoot data anomalies (a sketch of such checks follows this list).
  • Provide guidance and mentorship to junior data engineers.
  • Review dataset implementations performed by junior data engineers.
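
The data-quality responsibility above often starts as a small rule set. The sketch below is a hypothetical, pandas-only example with made-up column names and rules, intended only to show the shape of such checks rather than any process Equifax actually uses.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return rule name -> count of violating rows for a few illustrative rules."""
    opened = pd.to_datetime(df["opened_at"], errors="coerce")
    checks = {
        "missing_customer_id": df["customer_id"].isna().sum(),
        "duplicate_customer_id": df.duplicated(subset="customer_id").sum(),
        "negative_balance": (df["balance"] < 0).sum(),
        "future_open_date": (opened > pd.Timestamp.now()).sum(),
    }
    return {name: int(count) for name, count in checks.items()}

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 2, None],
        "balance": [100.0, -5.0, 20.0, 0.0],
        "opened_at": ["2023-01-01", "2031-01-01", "2022-06-15", "2021-03-09"],
    })
    # Surface anomalies for triage instead of letting them flow downstream silently.
    print(run_quality_checks(sample))
```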

Benefits

  • comprehensive compensation and healthcare packages
  • 401k matching
  • paid time off
  • organizational growth potential through our online learning platform with guided career tracks