Senior AI Data Engineer

Cortica
San Diego, CA · Remote

About The Position

Cortica is looking for a Senior AI Data Engineer to join its growing team! The Senior AI Data Engineer will serve as both architect and builder of our data ecosystem. Every initiative will follow a complete engineering lifecycle: gathering stakeholder requirements, designing the solution, building and testing it, and shipping it to production. This role will work across data lakes, analytics pipelines, and lightweight application development, making it the multi-disciplinary data equivalent of a full-stack developer role. The Senior AI Data Engineer will work closely with the data science, finance, and clinical operations teams to design intelligent, automated data solutions that power care decisions, financial planning, and operational efficiency. AI augmentation is not optional; it is the standard working mode.

Cortica is a rapidly growing healthcare company pioneering the most effective treatment methods for children with neurodevelopmental differences. Our mission is to design and deliver life-changing care – one child, one family, one community at a time. Ultimately, we envision a world that cultivates the full potential of every child. At Cortica, every team member is instrumental in helping us achieve our mission!

Our culture and values guide how we work and treat one another. Cortica celebrates diversity and fosters an inclusive environment, seeking ideas and opinions from everyone on the team. We safeguard equal rights and respect for all individuals, regardless of race, color, religion, sex, national origin, age, disability, creed, genetic information, sexual orientation, gender identity or expression, ancestry, veteran status or other applicable, legally protected characteristics. All Cortica employment decisions are made based on an individual’s qualifications and ability to successfully perform the job responsibilities.

Requirements

  • 5+ years of hands-on data engineering experience, including building and operating production data pipelines.
  • Expert-level Python skills for ETL, pipeline orchestration, and automation.
  • Deep SQL proficiency — query optimization, data modeling, stored procedures.
  • 2+ years’ experience working with AI-first development workflows.
  • 4+ years’ experience with AWS big data services (S3, Glue, Lambda, Redshift) and/or their Azure equivalents.
  • 1+ year of experience with Snowflake.
  • 2+ years of experience with orchestration frameworks.
  • 2+ years of Salesforce experience with Apex and configuration.
  • Experienced with Kimball dimensional modeling — you've built star schemas and conformed dimensions in production.
  • Power BI (or equivalent BI tool) experience — data model design and report development.
  • API integration experience — REST, GraphQL, event streaming (Kafka, Kinesis, or similar).
  • Application development literacy — comfortable building lightweight web tooling (Python/Flask, Node, or similar) to complement data products.
  • Reside in one of the following states: CA, TX, NC, WA, ID, NV, AZ, CO, KS, AR, LA, AL, GA, FL, SC, TN, VA, MD, NJ, DE, IL, WI, MI, OH, MA, PA, NH, CT

Responsibilities

  • Engage stakeholders directly to gather, clarify, and document project requirements.
  • Translate requirements into architected data solutions: choose the right storage, pipeline, modeling, and delivery approach for each problem.
  • Own testing end-to-end — unit tests, data quality checks, reconciliation, and integration tests before anything reaches production.
  • Deploy solutions to production and monitor post-deployment health, iterating rapidly based on real-world feedback.
  • Run parallel AI coding sessions (Claude Code, Cursor, Codex) across different facets of a pipeline simultaneously — orchestrate, verify, and integrate the outputs.
  • Build and maintain context files (CLAUDE.md equivalents) for data projects that encode schema conventions, pipeline patterns, and institutional knowledge — making every future AI session smarter.
  • Design verification loops: automated data quality checks, dbt tests, CI hooks, and pipeline monitors that give AI agents concrete feedback on correctness.
  • Build MCP (Model Context Protocol) or equivalent integrations to connect AI agents directly to Snowflake, Amazon Athena, PostgreSQL, MySQL, Power BI APIs, Salesforce, and internal tooling.
  • Prefer frontier models for complex architectural decisions and rely on AI acceleration to dramatically increase engineering throughput.
  • Design and build complex, reliable data pipelines ingesting from AWS, Azure, Salesforce, MuleSoft, and multiple third-party APIs into our AWS Data Lake and Snowflake warehouse.
  • Implement and evolve data models using Kimball methodology to support financial, operational, and clinical analytics.
  • Optimize pipeline performance, manage data quality, and perform root-cause analysis on data anomalies — internal and external.
  • Develop and maintain orchestration workflows in Python and AWS Glue.
  • Continuously evolve the data schema as business and engineering requirements change.
  • Build and support Power BI data models and reports; empower analytics team members to self-serve on a reliable data foundation.
  • Work with data analysts and data scientists to build reusable, well-documented pipeline components they can extend independently.
  • Deliver data products that drive clinical care decisions, financial planning, and operational performance improvements.
  • Build lightweight internal data applications and tooling where needed: data-entry interfaces, operational dashboards, and automation scripts that bridge the gap between data pipelines and end users.
  • Design for agentic workflows: build AI-powered data tools accessible via web interfaces or Slack that surface insights proactively.
  • Integrate with Salesforce Health Cloud and other platforms using APIs and event-driven patterns.
  • Ensure data security and HIPAA compliance in all pipeline and application work.
  • Partner with IT to enforce data governance standards.
  • Document decisions, tradeoffs, and architecture clearly so that future engineers (and AI agents) can build on your work effectively.
  • Collaborate across IT, finance, clinical operations, and data science — acting as the connective tissue between data infrastructure and business outcomes.
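To give a flavor of the verification loops described above, here is a minimal Python sketch of an automated data-quality check: row-count reconciliation between a source extract and its loaded target, plus not-null checks on required columns. All function, field, and column names are illustrative, not part of Cortica's actual stack.

```python
# Illustrative data-quality verification loop for a pipeline step.
# Names (run_quality_checks, "amount", etc.) are hypothetical examples.

def reconcile_counts(source_rows, target_rows):
    """Compare source and target row counts; return (ok, delta)."""
    delta = len(target_rows) - len(source_rows)
    return delta == 0, delta

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def run_quality_checks(source_rows, target_rows, required_columns):
    """Aggregate check results into a report that an orchestrator
    (or an AI agent's verification loop) can act on."""
    ok, delta = reconcile_counts(source_rows, target_rows)
    report = {
        "row_count_ok": ok,
        "row_count_delta": delta,
        "null_violations": {},
    }
    for col in required_columns:
        bad = check_not_null(target_rows, col)
        if bad:
            report["null_violations"][col] = bad
    report["passed"] = ok and not report["null_violations"]
    return report

if __name__ == "__main__":
    source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
    target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
    print(run_quality_checks(source, target, ["id", "amount"]))
```

In practice, checks like these would run as dbt tests or CI hooks gating deployment, with failures blocking promotion to production.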

Benefits

  • medical, dental, and vision insurance
  • a 401(k) plan with company matching and rapid vesting
  • paid holidays and wellness days
  • life insurance
  • disability insurance options
  • tuition reimbursements for professional development and continuing education
  • referral bonuses