Senior Data Integration Engineer (100% Remote)

Vori Health | Nashville, TN
$110,000 - $120,000 | Remote

About The Position

Vori Health is seeking a Senior Data Integration Engineer to design, build, and operate API-driven data services. This role involves connecting operational systems, synchronizing data reliably across boundaries, and landing curated datasets in Snowflake, working closely with Hasura as the operational database and API layer. The engineer will collaborate with Data Engineers and Implementation teams to ensure seamless end-to-end data flow.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data operations, with a strong background in building and optimizing data delivery pipelines and storage.
  • Knowledge of Python, TypeScript, Bash, GraphQL, and SQL.
  • Expertise with AWS-based solutions.
  • Expertise with modern data platforms (e.g., Redshift, Snowflake) and tools (e.g., AWS S3, AWS Lambda).
  • Experience with SCM and CI/CD tools such as GitHub and GitHub Actions.
  • Exceptional problem-solving skills and the ability to work collaboratively in a fast-paced, dynamic startup environment.
  • Excellent communication and project management skills, with a proven track record of delivering projects that provide significant business value.
  • Solid understanding of networking, authentication, and encryption concepts (SSH, SSL/TLS, key management).
  • Understanding of data pipelines and orchestration (Airflow, dbt Cloud jobs, or similar).
  • Familiarity with observability tools (Datadog or equivalent).

Nice To Haves

  • Prior experience with clinical or healthcare data, with a strong understanding of relevant data structures, such as Encounters, Claims, Financial Data, etc.
  • A habit of staying current with emerging data technologies and assessing their applicability to the organization.
  • A track record of driving continuous improvement in data management processes and tools.
  • An AI-first mindset for optimizing and innovating on data architecture.

Responsibilities

  • Own the design and improvement of data delivery pipelines across AWS (S3, Lambda, EventBridge, CloudWatch, etc.).
  • Build and maintain integration platforms and services (REST/GraphQL/event-driven) that expose and consume APIs safely, with clear contracts, versioning, and observability.
  • Lead automation of ingestion, monitoring, and recovery processes — reducing manual effort and improving reliability.
  • Build reusable infrastructure-as-code patterns (Terraform, CloudFormation) for consistent environment provisioning.
  • Develop and maintain shared frameworks and scripts in Python and Bash for ingestion and pre-processing of third-party data files.
  • Design and implement secure, scalable APIs in Node.js and TypeScript to support our virtual care workflows.
  • Partner with Data Engineers to align ingestion mechanisms with warehouse needs and dbt transformations.
  • Design and maintain alerting and observability frameworks for all data workflows.
  • Implement metrics and dashboards to monitor latency, data freshness, file arrivals, and pipeline health.
  • Perform root-cause analysis on failures and drive permanent improvements to prevent recurrence.
  • Establish standards for logging, error handling, and notification integration (e.g., Slack, Datadog).
  • Define and enforce SLAs for ingestion timeliness and system uptime.
  • Oversee secure data exchange protocols (SFTP, HTTPS, AWS Transfer Family).
  • Manage and rotate public/private keys for SFTP authentication with third parties.
  • Partner with Security and IT to enforce least-privilege IAM roles, key policies, and encryption-at-rest/in-transit standards.
  • Maintain audit trails and compliance documentation for all data integrations.
  • Serve as the primary technical contact for external IT teams involved in data transfers.
  • Coordinate with the Implementation team to onboard new partners, manage credentials, and troubleshoot issues.
  • Track which third parties push vs. pull data, and document all mechanisms (SFTP, portals, secure email, APIs).
  • Proactively manage external communication and escalation channels to minimize integration downtime.
  • Communicate technical details clearly and diplomatically across internal and external teams.
  • Identify opportunities to streamline the data-ingestion lifecycle through automation, tooling, or process improvement.
  • Lead technical reviews, mentor DataOps Engineers, and drive adoption of best practices.
  • Contribute to the team’s long-term roadmap, with a focus on observability, security, and automation maturity.
  • Partner with Data Engineers to align data-ingestion reliability with transformation and warehouse stability.
  • Stay informed about new AWS and Snowflake features and advocate for practical adoption where appropriate.

Benefits

  • Competitive salary and equity options
  • Medical, dental, and vision coverage
  • Wellness programs
  • Mental health resources
  • 401(k) plan and Roth options
  • Generous paid time off, including vacation days, holidays, and sick leave
  • Fully Remote Work
  • Professional development stipend, training, and workshops
  • Paid parental leave
  • Employee Assistance Program (EAP) with confidential counseling and support