IT - Lead AI and Data Engineer

Rialto Capital
Miami, FL

About The Position

Rialto Capital is seeking a Lead AI and Data Engineer to design and build the enterprise AI knowledge bases, Retrieval-Augmented Generation (RAG) pipelines, and data platforms that power investment analysis and LLM-driven applications. The role spans hands-on AI and data engineering, team leadership and collaboration, and continuous innovation, as detailed in the Requirements, Nice To Haves, and Responsibilities sections below.

Working Conditions

While performing the duties of this job, the Associate is occasionally required to stand; walk; sit; use hands to finger, handle, or feel objects, tools, or controls; reach with hands and arms; climb stairs; balance; and stoop. The Associate must occasionally lift and/or move up to 25 pounds. Evening or weekend work may be necessary to meet deadlines. This description outlines the basic tasks and requirements for the position noted; it is not a comprehensive listing of all job duties.

Equal Employment Opportunity

Rialto Capital is committed to the principles of Equal Employment Opportunity. Our policy is to provide equal employment opportunity to all applicants and Associates without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity or expression, national origin, age (40+), disability, veteran status, genetic information (including family medical history), or any other legally protected status. Our company uses E-Verify to confirm the employment eligibility of all newly hired employees. To learn more about E-Verify, including your rights and responsibilities, please visit e-verify.gov.

About Rialto Capital

Rialto Capital is an integrated investment management and asset management platform with a dedicated special servicer. Our mission is to be a world-class, industry-leading organization that creates long-term value for our investors and sustains results across market cycles. Our team of over 260 associates throughout the United States and Europe is guided by our commitment to a culture of social responsibility, inclusion and diversity, and responsible investing. Rialto Capital has made a commitment to create a high-performance culture driven by an entrepreneurial spirit and an absolute commitment to integrity, respect, inclusion, and giving back to the communities in which we work and live. We have focused our vision on becoming a real estate investment, asset management, servicing, and finance company most admired for its people, capabilities, and performance. We are always looking for talent within our Investment Management team. Please click on the Get Started button below to upload your resume and tell us a bit about yourself.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field required.
  • 7+ years of experience across AI engineering, data engineering, backend development, or applied machine learning, with at least 2 years in a senior or technical leadership role.
  • Proven experience building RAG-based systems, AI platforms, and data-driven analytics systems in production.
  • Strong analytical, communication, and interpersonal skills, with an ability to convey complex technical concepts to non-technical stakeholders.
  • AI & LLM Engineering: Deep hands-on experience integrating LLM APIs (Azure OpenAI, OpenAI, Anthropic Claude), with strong command of RAG architecture, embeddings, retrieval strategies, and orchestration frameworks such as LangChain, LlamaIndex, and Semantic Kernel.
  • Data Engineering: Proven expertise designing and operating AI-ready data pipelines using vector databases (Pinecone, Cosmos DB vector, Azure AI Search), relational data stores (SQL / Azure SQL), graph data modeling, and Microsoft Fabric (Lakehouse, pipelines, analytics).
  • Investment Analytics Platforms: Expertise integrating and operationalizing third-party investment data vendors, including entity alignment and enrichment of vendor data for analytics and investment workflows.
  • Backend & Middleware Development: Advanced proficiency in Python (FastAPI) and Node.js for designing secure, scalable APIs, asynchronous processing, job orchestration, and service-to-service communication powering enterprise AI systems.
  • Cloud Platforms, DevOps & Enterprise Delivery: Azure-first experience delivering production-grade AI, data, and analytics platforms, including CI/CD pipelines, observability, performance optimization, governance, and cross-functional collaboration.

Nice To Haves

  • Experience working with investment, financial, or real asset data (real estate, credit, capital markets) is highly preferred.

Responsibilities

  • Design and build enterprise AI knowledge bases that unify structured, semi-structured, and unstructured data including internal data and third-party data sources (market data, property data, economic indicators, benchmarks, research) to power investment analysis and LLM-driven applications.
  • Lead the technical evaluation, integration, and ongoing optimization of third-party data vendors, ensuring data quality, coverage, and seamless integration into AI, analytics, and investment workflows.
  • Architect and operate Retrieval-Augmented Generation (RAG) pipelines using vector databases such as Pinecone and Cosmos DB, including embedding strategies, chunking, metadata modeling, multi-stage retrieval, reranking, grounding, and relevance tuning.
  • Build scalable data ingestion and transformation pipelines for internal and external data using SQL / Azure SQL, Cosmos DB (Core and vector workloads), and Microsoft Fabric (Lakehouse, Data Pipelines, Semantic Models).
  • Develop entity intelligence and feature engineering pipelines for assets, funds, and geographies supporting downstream analytics, modeling, and AI reasoning.
  • Design and maintain graph-based intelligence layers to capture relationships across entities, enabling relationship-aware reasoning, portfolio analysis, and hybrid querying across vector, graph, and relational data.
  • Develop backend services and middleware (Python, FastAPI, Node.js) that expose AI and analytics capabilities via secure APIs, including semantic search, document intelligence, investment Q&A, summarization, classification, and analytical services (a minimal illustrative sketch of such a service follows this list).
  • Integrate and orchestrate LLM workflows using LangChain, LlamaIndex, Semantic Kernel, or custom frameworks, securely connecting enterprise and third-party data with LLM APIs (Azure OpenAI, OpenAI, Anthropic Claude).
  • Apply machine learning techniques where appropriate (classification, similarity, clustering, forecasting, anomaly detection) to enhance data quality, investment insights, and AI-driven analysis focusing on practical, production-ready models.
  • Lead deployment and operations of AI, data, and analytics platforms on Azure, with deep integration into Azure OpenAI, Cosmos DB, Azure SQL, and Microsoft Fabric, ensuring scalability, fault tolerance, performance, and cost efficiency.
  • Establish governance, reliability, and observability for AI and analytics systems, including LLM evaluation frameworks, retrieval quality metrics, CI/CD pipelines, logging, monitoring, access controls, and compliance with data security and ethical AI standards (a small retrieval-metrics sketch also follows this list).
  • Lead and mentor AI and data engineers, setting technical direction across AI platforms, data engineering, and analytics systems.
  • Define architecture standards, design patterns, and documentation for AI, data, and investment analytics platforms.
  • Partner closely with Investment, Asset Management, Research, and Technology teams to translate business problems into scalable AI- and data-driven solutions.
  • Evaluate and adopt emerging technologies in AI, knowledge management, and automation that deliver measurable business impact.
  • Champion a culture that blends strong engineering discipline with modern AI capabilities and continuous improvement, balancing engineering rigor with speed-to-value.
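
To illustrate the kind of retrieval-augmented investment Q&A service referenced in the responsibilities above, here is a minimal sketch of a FastAPI endpoint that embeds a question with Azure OpenAI, retrieves supporting passages from a Pinecone index, and asks a chat model for a grounded answer. The deployment names, the index name, and the `text`/`source` metadata fields are illustrative assumptions, not details from this posting.

```python
# Minimal illustrative RAG Q&A service. Deployment names, the index name,
# and metadata fields are placeholders, not details from this posting.
import os

from fastapi import FastAPI
from pydantic import BaseModel
from openai import AzureOpenAI
from pinecone import Pinecone

app = FastAPI()

# Azure OpenAI client used for both embeddings and chat completions.
aoai = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

# Pinecone index assumed to hold embedded research/market/property documents.
index = Pinecone(api_key=os.environ["PINECONE_API_KEY"]).Index("investment-kb")


class Question(BaseModel):
    text: str
    top_k: int = 5


@app.post("/qa")
def answer_question(q: Question) -> dict:
    # 1) Embed the question (embedding deployment name is a placeholder).
    emb = aoai.embeddings.create(model="text-embedding-3-large", input=q.text)
    vector = emb.data[0].embedding

    # 2) Retrieve the most similar chunks and collect their text metadata.
    hits = index.query(vector=vector, top_k=q.top_k, include_metadata=True)
    context = "\n\n".join((m.metadata or {}).get("text", "") for m in hits.matches)

    # 3) Ask the chat model to answer using only the retrieved context.
    chat = aoai.chat.completions.create(
        model="gpt-4o",  # chat deployment name is a placeholder
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context. "
                        "If the context is insufficient, say so."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {q.text}"},
        ],
    )

    # Return the answer plus source metadata so callers can cite documents.
    return {
        "answer": chat.choices[0].message.content,
        "sources": [(m.metadata or {}).get("source", "") for m in hits.matches],
    }
```

In practice, a service like this would be extended with reranking, hybrid graph and relational lookups, access controls, and evaluation hooks before production use.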
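As a companion to the evaluation responsibilities above, here is a self-contained sketch of two common retrieval-quality metrics, hit rate @ k and mean reciprocal rank (MRR), computed over a small labeled query set. The example queries and document ids are made up for illustration.

```python
# Minimal sketch of retrieval-quality metrics over a labeled evaluation set.
# The data structures and example ids are illustrative, not from this posting.
from dataclasses import dataclass


@dataclass
class EvalCase:
    query: str
    relevant_doc_id: str           # ground-truth document for the query
    retrieved_doc_ids: list[str]   # ids returned by the retriever, ranked


def hit_rate_at_k(cases: list[EvalCase], k: int = 5) -> float:
    """Fraction of queries whose relevant document appears in the top k."""
    hits = sum(1 for c in cases if c.relevant_doc_id in c.retrieved_doc_ids[:k])
    return hits / len(cases)


def mean_reciprocal_rank(cases: list[EvalCase]) -> float:
    """Average of 1/rank of the relevant document (0 when it is missing)."""
    total = 0.0
    for c in cases:
        if c.relevant_doc_id in c.retrieved_doc_ids:
            total += 1.0 / (c.retrieved_doc_ids.index(c.relevant_doc_id) + 1)
    return total / len(cases)


if __name__ == "__main__":
    cases = [
        EvalCase("cap rate trends in Miami office", "doc-12",
                 ["doc-12", "doc-40", "doc-7"]),
        EvalCase("fund II leverage covenants", "doc-88",
                 ["doc-3", "doc-88", "doc-19"]),
    ]
    print(f"hit@5 = {hit_rate_at_k(cases):.2f}, MRR = {mean_reciprocal_rank(cases):.2f}")
```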