About The Position

We're seeking an experienced Senior Integration Engineer to join our engineering team and lead the development of critical data integrations for our cold chain logistics platform. You'll be responsible for building and maintaining complex integrations that process shipment tracking data, device telemetry, and temperature monitoring feeds from diverse logistics partners and IoT devices—enabling real-time visibility and proactive risk management for temperature-sensitive pharmaceutical shipments. This role requires someone who thrives on solving complex data challenges, from debugging international geocoding issues to optimizing high-volume telemetry processing workflows and reconciling conflicting data across multiple vendor systems. You'll work directly with major pharmaceutical clients to ensure their critical shipments maintain temperature integrity throughout the supply chain.

Requirements

  • 4+ years building production data integrations with complex transformation requirements
  • Expert-level JavaScript/TypeScript for integration development
  • Strong experience with iPaaS platforms (Prismatic.io strongly preferred)
  • Proficiency with PostgreSQL and modern database patterns (Supabase/PostgREST experience a plus)
  • Experience with geocoding APIs and address normalization challenges
  • Understanding of webhook patterns, REST APIs, and real-time data streaming
  • Frontend experience for building integration monitoring dashboards
  • Experience with cold chain logistics and temperature-controlled shipments (pharmaceutical strongly preferred)
  • Experience with 3PL data formats and tracking systems
  • Experience with IoT device integration and telemetry processing
  • Understanding of GxP compliance and pharmaceutical logistics requirements
  • Knowledge of thermal packaging and temperature excursion management
  • Experience with EDI/AS2 protocols
  • Proven ability to debug complex data quality issues across multiple systems
  • Experience handling inconsistent vendor data formats and missing fields
  • Skills in reverse-engineering undocumented APIs and payload structures
  • Understanding of retry strategies and error recovery patterns
  • Experience with high-volume data processing optimization
  • Obsessive attention to data quality and validation
  • Ability to context-switch between multiple integration issues
  • Strong debugging skills with distributed systems
  • Experience managing vendor relationships and escalations
  • Clear technical documentation and communication skills

Responsibilities

  • Build and maintain production integrations using Prismatic.io, handling complex payload transformations and multi-step workflows
  • Process and normalize shipment tracking data from 3PL partners (UPS, Kuehne+Nagel, World Courier, DHL, FedEx)
  • Implement robust error handling and retry logic for unreliable vendor APIs, ensuring complete data capture
  • Design device selection algorithms to optimize telemetry data processing
  • Create matching logic for automated shipment routing decisions
  • Build integration deviation detection and alerting workflows
  • Process device telemetry streams from multiple IoT providers (Sensitech, Tag-n-Trac, Elpro, Tive, Samsara, Controlant, Roambee, etc.)
  • Calculate packaging thermal life and predict temperature breaches
  • Implement 3PL/carrier milestone processing for shipment status updates
  • Design dashboards for real-time integration health visibility for internal and external stakeholders
  • Debug and resolve geocoding issues with international addresses using Mapbox and other APIs
  • Implement comprehensive data validation for vendor payloads
  • Build data quality monitoring and anomaly detection systems
  • Create reconciliation workflows for mismatched shipment and device data
  • Handle API (REST, GraphQL, SOAP), EDI (AS2), and legacy (email, SFTP) transmission implementations
  • Design and optimize Supabase database schemas for high-volume telemetry data
  • Write complex PostgREST queries for data retrieval and aggregation
  • Implement efficient upsert patterns for real-time data updates
  • Build data archival and retention strategies
  • Create database migration scripts and maintain schema documentation
  • Debug production issues across multiple client environments
  • Optimize integration performance for high-volume data streams
  • Support Linear project management workflows
  • Document PAXAFE integration workflows, schemas, and other specifications
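To illustrate the kind of retry and error-recovery work this role involves, a minimal sketch in TypeScript of exponential backoff with jitter for an unreliable vendor API is shown below. The `fetchOnce` callback is a hypothetical stand-in for a real vendor call, not part of any PAXAFE or Prismatic.io API.

```typescript
// Sketch: exponential backoff with full jitter for a flaky vendor endpoint.
// `fetchOnce` is a hypothetical stand-in for an actual vendor API call.
async function withRetry<T>(
  fetchOnce: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 200,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fetchOnce();
    } catch (err) {
      lastError = err;
      // Full jitter: random delay in [0, base * 2^attempt) to avoid
      // synchronized retry storms against an already-struggling API.
      const delayMs = Math.random() * baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}
```

In production, a pattern like this would typically also distinguish retryable failures (timeouts, 5xx) from permanent ones (4xx validation errors) and route exhausted retries to a dead-letter queue so no payload is silently dropped.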

Benefits

  • Competitive base salary
  • Equity participation
  • Five weeks PTO
  • Full healthcare coverage