Data & AI Engineer – Equities Technology

William Blair – Chicago, IL

About The Position

We’re seeking a Data & AI Engineer to design, build, and maintain intelligent, scalable data and AI‑enabled platforms supporting our Equities business across Trading, Research, and Sales. This role spans the full software development lifecycle and is responsible for moving data and AI solutions from concept through production, while ensuring reliability, security, and compliance with firm standards. The ideal candidate brings strong data engineering fundamentals combined with hands‑on experience integrating AI and LLM-based capabilities into production systems in a regulated environment.

Requirements

  • Bachelor’s degree in Information Technology or a related field
  • 4–6+ years of hands‑on experience with Databricks, Spark, Azure Data Factory, Azure Synapse, Python, ADLS, and Azure Functions
  • Strong experience designing and managing Synapse/ADF pipelines, activities, and linked services
  • Proven ability to build full and incremental data loads from Azure and on‑prem data sources
  • Experience designing reusable ETL/ELT frameworks and orchestrating pipelines across ADF/Synapse/Databricks
  • Experience implementing LLM‑enabled or RAG‑based solutions in production environments
  • Proficiency with REST APIs, data gateways, and third-party system integrations
  • Strong SQL skills with experience in data modeling, analytical storage, and performance tuning
  • Experience with Azure DevOps and YAML‑based CI/CD pipelines
  • Familiarity with Azure Key Vault, automation runbooks, Logic Apps, and cloud security best practices
  • Strong communication, collaboration, and problem-solving skills

Nice To Haves

  • Hands‑on experience with Azure AI Services, including Azure OpenAI and embeddings
  • Familiarity with Microsoft Fabric (future roadmap alignment)
  • Proficiency in ASP.NET, .NET Core, or C#
  • Background in financial services, capital markets, or other regulated environments

Responsibilities

  • Design, build, and maintain scalable data pipelines using Databricks, Azure Data Factory, and Azure Synapse
  • Implement robust ETL/ELT workflows for structured and unstructured financial data across on‑prem and cloud platforms
  • Ensure data quality, lineage, governance, security, and observability across all pipelines and storage layers
  • Design and optimize data models and analytical schemas (star/snowflake, partitioning, distribution strategies)
  • Build reusable ingestion and transformation frameworks to support analytics and AI workloads
  • Build and deploy AI‑enabled services, agents, and workflows supporting equity research, trading, sales, and client service use cases
  • Implement LLM‑based and agentic patterns, including Retrieval‑Augmented Generation (RAG), using proprietary firm data
  • Integrate AI capabilities into existing applications and platforms via APIs, batch jobs, and event-driven workflows
  • Partner with Data Science and business stakeholders to translate AI concepts into production‑ready solutions
  • Productionize AI and Data Science POCs into secure, scalable, and monitored services suitable for regulated environments
  • Optimize prompts, embeddings, orchestration logic, and inference workflows for accuracy, performance, cost, and reliability
  • Ensure AI solutions meet firm standards for security, auditability, explainability, and compliance
  • Establish operational practices for AI solutions, including monitoring, alerting, lifecycle management, and runbooks
  • Support migration of legacy data and application solutions (SQL, SSIS, Synapse, custom ETLs) to modern Azure‑native architectures
  • Implement CI/CD pipelines using Azure DevOps and YAML, following infrastructure‑as‑code and automation best practices
  • Leverage Azure services (Functions, Key Vault, Logic Apps, Automation Runbooks) to build secure, reliable, and maintainable solutions
  • Develop operational dashboards to monitor pipeline health, SLAs, system performance, and cloud spend
  • Work closely with Product Managers, Software Engineers, Data Scientists, and business stakeholders to define functional and technical requirements
  • Participate in Agile ceremonies, sprint planning, and retrospectives
  • Lead testing of new and modified software, analyze issues, and resolve defects efficiently
  • Document technical designs, integrations, and maintain operational playbooks and runbooks
  • Monitor industry trends in data engineering, cloud platforms, and AI, and recommend adoption where aligned with firm strategy

Benefits

  • Medical, dental, and vision coverage
  • Employer-paid short- and long-term disability and life insurance
  • 401(k)
  • Profit sharing
  • Paid time off
  • Maven family & fertility benefit
  • Parental leave (including adoption, surrogacy, and foster placement)
  • Other voluntary benefits