Data Scientist- Payments Optimization

Worldpay, Cincinnati, OH

About The Position

Are you ready to unleash your full potential? We’re looking for people who are passionate about payments to chart Worldpay’s path to being the largest and most-loved payments company in the world.

About the team

Worldpay, LLC seeks a Data Scientist - Payments Optimization in Cincinnati, OH to work within the payments Strategic Routing team to build, validate, and deploy machine learning (ML) and statistical models that drive smarter debit routing decisions and real-time transaction optimization.

What you will be doing

The Data Scientist - Payments Optimization will develop automated, end-to-end pipelines for data ingestion, feature engineering, and model deployment using Databricks, Snowflake, Apache Spark, and other big data frameworks, along with the additional duties listed under Responsibilities below.

Requirements

  • Bachelor’s degree or foreign equivalent in Information Systems or related field and five (5) years of progressively responsible experience in the job offered or a related occupation: utilizing Python, R, SQL, and at least one JVM language (Scala/Java) for Spark
  • working with ML/data science libraries (scikit-learn, PySpark ML, TensorFlow, etc.)
  • working with cloud data platforms (AWS, GCP, or Azure), Databricks, and Snowflake
  • operating with data modeling, ETL/ELT pipelines, and distributed computing frameworks (Spark, Hadoop)
  • implementing version control & DevOps fundamentals (Git, CI/CD)
  • working with A/B testing, experiment design, and optimization techniques
  • employing Tableau, Power BI, Matplotlib, or R Shiny.

Responsibilities

  • Develop automated, end-to-end pipelines for data ingestion, feature engineering, and model deployment using Databricks, Snowflake, Apache Spark, and other big data frameworks.
  • Perform exploratory analysis on large structured & unstructured datasets and present findings through dashboards and interactive visualizations for both technical and non-technical stakeholders.
  • Design and evaluate A/B tests to measure uplift, optimize routing tables, and inform product strategy.
  • Audit existing analytical workflows and introduce AI-powered automation to improve efficiency, accuracy, and scalability.
  • Translate business challenges into technical solutions, working closely with product, engineering, risk, and compliance teams.
  • Ensure data quality, security, and regulatory compliance; document models and establish monitoring to track performance over time.
  • Balance sophistication with practicality, choosing the right level of complexity to meet performance, latency, and cost goals.
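As an illustration of the A/B-testing duty above, a routing experiment is often evaluated as a two-proportion z-test on approval rates between two debit routes. The sketch below is not part of the posting; the function name and all figures are purely illustrative.

```python
import math

def ab_uplift(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for uplift between two routing variants.

    conv_a / n_a: approved transactions and sample size on the control route.
    conv_b / n_b: the same for the candidate route.
    Returns (absolute uplift, z-statistic). All names are hypothetical.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    uplift = p_b - p_a
    # Pooled approval rate and standard error under the null of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return uplift, uplift / se

# Illustrative numbers: 92.0% vs 93.5% approval on 10,000 transactions each.
uplift, z = ab_uplift(9200, 10000, 9350, 10000)
print(round(uplift, 4), round(z, 2))  # → 0.015 4.09
```

With |z| well above 1.96, the illustrative candidate route's uplift would be significant at the 5% level; a production version would also weigh interchange cost and latency, per the "Balance sophistication with practicality" duty.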

Benefits

  • A competitive salary and benefits
  • A variety of career development tools, resources and opportunities
  • The chance to work on some of the most challenging, relevant issues in the payment industry
  • Time to support charities and give back in your community