AI Data Engineer - Cortex AI

Techtorch
Remote

About The Position

As an AI Data Engineer, you will design and build modern data pipelines and AI-ready data platforms with Snowflake as the core warehouse, leveraging Snowflake Cortex AI, AWS Bedrock, and other mainstream AI services. Your work will enable advanced analytics, LLM-powered use cases, and AI-driven automation across enterprise environments. This role sits at the intersection of data engineering and applied AI. You will ensure data is high-quality, well-modeled, and accessible for downstream analytics, machine learning, and GenAI applications.

Requirements

  • 4+ years of experience in data engineering, with strong hands-on Snowflake experience
  • Practical experience building AI-enabled data solutions or preparing data for ML/LLM workloads
  • Strong proficiency in SQL and Python for data transformation and pipeline development
  • Experience with Snowflake Cortex AI or similar warehouse-native AI capabilities
  • Hands-on experience with AWS, ideally including AWS Bedrock, Lambda, S3, and IAM
  • Experience with ELT tooling such as dbt, Airflow, or similar orchestration frameworks
  • Solid understanding of data modeling, data warehousing, and performance optimization
  • Comfortable working in cloud-native, enterprise environments with high delivery expectations
  • Strong communication skills and ability to collaborate across technical and business teams

Nice To Haves

  • Experience with vector databases or embedding workflows (e.g., Snowflake, OpenSearch, Pinecone)
  • Familiarity with LLM orchestration frameworks (e.g., LangChain, Bedrock Agents)
  • Experience supporting AI agents, RAG pipelines, or GenAI analytics use cases
  • Exposure to regulated or security-conscious environments

Responsibilities

  • Design, build, and maintain scalable data pipelines using Snowflake as the central data platform
  • Develop AI-ready data models and feature layers to support analytics, ML, and GenAI use cases
  • Leverage Snowflake Cortex AI for embedding, classification, summarization, and AI-assisted analytics (see the first sketch following this list)
  • Integrate and operationalize AI workflows using AWS Bedrock and related AWS services such as Lambda and Step Functions (see the second sketch following this list)
  • Build and optimize ELT pipelines using tools such as dbt, SQL, and Python
  • Integrate data from diverse sources including APIs, SaaS platforms, databases, and event streams
  • Ensure data quality, observability, and governance across pipelines and AI workloads
  • Collaborate with AI engineers, data scientists, and business teams to translate use cases into scalable data solutions
  • Document data models, pipelines, and AI-related design decisions clearly for long-term maintainability
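
For a concrete flavor of the Cortex work described above, here is a minimal, illustrative sketch using the Snowflake Python connector. The connection parameters, table, and column names (support_tickets, ticket_text) are placeholders for illustration, not details of the actual platform.

    # Illustrative sketch only: enrich a staging table with Snowflake Cortex AI
    # functions via the Snowflake Python connector. Table, column, warehouse,
    # database, and schema names are hypothetical placeholders.
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # assumed warehouse name
        database="RAW",             # assumed database
        schema="SUPPORT",           # assumed schema
    )

    enrich_sql = """
    CREATE OR REPLACE TABLE support_tickets_enriched AS
    SELECT
        ticket_id,
        ticket_text,
        -- warehouse-native summarization and sentiment scoring
        SNOWFLAKE.CORTEX.SUMMARIZE(ticket_text)      AS ticket_summary,
        SNOWFLAKE.CORTEX.SENTIMENT(ticket_text)      AS sentiment_score,
        -- embedding column for downstream vector search / RAG use cases
        SNOWFLAKE.CORTEX.EMBED_TEXT_768(
            'snowflake-arctic-embed-m', ticket_text) AS ticket_embedding
    FROM support_tickets
    """

    with conn.cursor() as cur:
        cur.execute(enrich_sql)
    conn.close()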
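
Similarly, the Bedrock integration responsibilities might resemble the following Lambda-style handler sketch; the model ID, prompt, and event fields are assumptions chosen for illustration only.

    # Illustrative sketch only: a Lambda-style handler that classifies a record
    # with an AWS Bedrock model before the result is loaded back into Snowflake.
    # The model ID, prompt, and event field names are assumptions.
    import json
    import boto3

    bedrock = boto3.client("bedrock-runtime")

    MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"  # assumed model choice

    def handler(event, context):
        """Classify the free-text field of an incoming record."""
        body = {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 100,
            "messages": [{
                "role": "user",
                "content": (
                    "Classify this ticket as BUG, BILLING, or OTHER:\n"
                    + event["ticket_text"]
                ),
            }],
        }
        response = bedrock.invoke_model(
            modelId=MODEL_ID,
            body=json.dumps(body),
            contentType="application/json",
            accept="application/json",
        )
        result = json.loads(response["body"].read())
        label = result["content"][0]["text"].strip()
        return {"ticket_id": event["ticket_id"], "label": label}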

Benefits

  • Opportunity to work on AI-first data platforms using Snowflake Cortex and AWS Bedrock
  • High-impact role at the intersection of data engineering and applied AI
  • Exposure to private equity-backed enterprise transformation programs
  • Global, collaborative team with strong technical standards
  • Flexible, remote-first working environment with autonomy and ownership