Intercontinental Exchange, Inc. (ICE) - Posted 3 months ago
Senior
Sandy Springs, GA
5,001-10,000 employees
Securities, Commodity Contracts, and Other Financial Investments and Related Activities

The Document and Data Automation team is seeking a Senior Data Intelligence Engineer to build and improve AI-driven systems that automate mortgage workflows. We develop document recognition, data extraction, and analytics solutions that shorten loan processing times from weeks to minutes, directly supporting ICE Mortgage Technology's mission to streamline homeownership. As a Senior Engineer, you will design scalable data and machine learning solutions, uncover complex data patterns, and collaborate across engineering, data science, and product teams to integrate GenAI capabilities that accelerate intelligent automation, enhance decision-making, and drive measurable improvements in accuracy and efficiency.

  • Develop Data Intelligence and ML Solutions: Design, develop, and deploy scalable data pipelines, ML models, and GenAI-powered solutions that automate document processing and data extraction with high accuracy and performance.
  • Leverage GenAI building blocks - such as prompting techniques, agentic frameworks, vector databases, and Model Context Protocol (MCP) - to enrich data workflows and enable intelligent decision-making.
  • Conduct comprehensive data analysis to identify actionable insights, inform strategic decisions, and continuously improve system performance.
  • Drive Innovation and Collaboration: Continuously explore and adopt emerging GenAI technologies to enhance automation capabilities.
  • Rapidly build and test new approaches to solve complex business problems and improve operational efficiency.
  • Support junior engineers and peers in applying GenAI concepts effectively: leverage AI-assisted coding for rapid prototyping and feature implementation, and incorporate GenAI building blocks directly into solution architectures to accelerate learning and delivery.
  • Promote Data-Product Mindset: Align data pipeline design with end users' needs by defining clear outputs and business outcomes.
  • Empower stakeholders to engage with data solutions as customers, fostering ownership and feedback-driven iteration.
  • Deliver and Optimize Impactful Results: Continuously monitor deployed solutions to ensure optimal performance, accuracy, scalability, and reliability.
  • Identify, troubleshoot, and resolve performance issues and data anomalies proactively.
  • Track and communicate key performance metrics clearly and regularly to stakeholders (a brief metric-tracking sketch appears at the end of this posting).
  • Bachelor's degree in Data Science, Engineering, Computer Science, or a related field (Master's preferred).
  • 5+ years of software/data-engineering experience with strong understanding of the full SDLC, CI/CD practices, and production-grade system design.
  • Solid understanding of statistics, data analysis, and Machine Learning fundamentals.
  • Practical experience implementing and deploying solutions using GenAI building blocks such as prompting strategies, agentic frameworks, vector databases, RAG pipelines, reasoning models, and MCP (a minimal retrieval-and-prompt sketch appears at the end of this posting).
  • Hands-on experience with natural language processing (NLP) and/or computer vision (CV).
  • Extensive Python programming experience (3+ years), with strong proficiency in libraries such as pandas, scikit-learn, Hugging Face Transformers, MLflow, and Jupyter.
  • Proficiency in SQL and experience with data querying, transformation, and analytics tools.
  • Experience with big data platforms such as Databricks, Snowflake, or similar.
  • Strong analytical skills with a demonstrated ability to perform complex data analysis.
  • Experience with development tools and CI/CD pipelines: Git, Docker, Terraform, and Jenkins.
  • Excellent written and verbal communication skills, with the ability to document complex systems clearly and collaborate effectively across cross-functional teams.
  • Proven experience using AI-assisted development tools such as Cursor and GitHub Copilot is a significant advantage.
  • Practical experience with modern GenAI frameworks and orchestration tools - such as DSPy, LangChain, ADK, LlamaIndex, AWS Bedrock AgentCore, OpenAI Agents SDK, Adept ACT-2, Perplexity - is a significant advantage.
  • Exposure to cloud platforms and services (AWS SageMaker, S3, Lambda, EC2, Step Functions, ECS, or equivalent).
  • Experience developing RESTful APIs and web services.
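
For context on the GenAI building blocks named above, the sketch below shows a minimal retrieval-and-prompt (RAG-style) step for extracting a field from loan document text. It is an illustrative toy under stated assumptions, not this team's implementation: toy_embed stands in for a real embedding model and vector database, and call_llm is a hypothetical placeholder for whatever LLM client is used in production.

    # Illustrative sketch only: toy_embed and call_llm are hypothetical
    # placeholders, not part of any specific library or ICE system.
    import numpy as np

    def toy_embed(text: str, dim: int = 64) -> np.ndarray:
        """Toy embedding: hash character bigrams into a fixed-size vector.
        A real pipeline would use a learned embedding model instead."""
        vec = np.zeros(dim)
        for a, b in zip(text.lower(), text.lower()[1:]):
            vec[hash(a + b) % dim] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm else vec

    def top_k_chunks(question: str, chunks: list[str], k: int = 2) -> list[str]:
        """Rank document chunks by cosine similarity to the question
        (the role a vector database plays at scale)."""
        q = toy_embed(question)
        ranked = sorted(chunks, key=lambda c: float(np.dot(q, toy_embed(c))), reverse=True)
        return ranked[:k]

    def build_extraction_prompt(question: str, context: list[str]) -> str:
        """Compose a grounded prompt that asks the model to extract one field."""
        excerpts = "\n---\n".join(context)
        return (
            "Using only the loan document excerpts below, answer the question.\n"
            f"Excerpts:\n{excerpts}\n\nQuestion: {question}\nAnswer:"
        )

    def call_llm(prompt: str) -> str:
        """Placeholder for a real LLM call (hosted model API, local model, etc.)."""
        return "<model response here>"

    if __name__ == "__main__":
        chunks = [
            "Borrower: Jane Doe. Loan amount: $312,000. Term: 30 years.",
            "Property address: 123 Main St, Sandy Springs, GA.",
            "Interest rate: 6.25% fixed, first payment due 2024-09-01.",
        ]
        question = "What is the loan amount?"
        context = top_k_chunks(question, chunks)
        print(call_llm(build_extraction_prompt(question, context)))

Swapping toy_embed and call_llm for a real embedding model, vector store, and LLM client turns this toy into the kind of building-block workflow the bullet describes.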
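
Similarly, for the metric-tracking responsibility noted earlier, the short sketch below computes field-level extraction accuracy with pandas. The column names and the exact-match accuracy definition are assumptions made for illustration; a production system would pull predictions and reviewed ground truth from its own stores and report through its own dashboards.

    # Illustrative sketch only: column names and the exact-match accuracy
    # definition below are assumptions, not an ICE specification.
    import pandas as pd

    # Extracted values vs. reviewed ground truth for two document fields.
    results = pd.DataFrame(
        {
            "field": ["loan_amount", "loan_amount", "interest_rate", "interest_rate"],
            "predicted": ["312000", "250000", "6.25", "6.25"],
            "actual": ["312000", "255000", "6.25", "6.50"],
        }
    )

    # Field-level extraction accuracy: share of exact matches per field.
    results["correct"] = results["predicted"] == results["actual"]
    accuracy_by_field = results.groupby("field")["correct"].mean()
    overall_accuracy = results["correct"].mean()

    print(accuracy_by_field)
    print(f"Overall extraction accuracy: {overall_accuracy:.0%}")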