Senior Data Engineer - ETL & Integrations

Lasso Informatics Inc, Minneapolis, MN
Onsite

About The Position

This is a senior-level data engineering and systems integration role focused on building and operating production-grade ETL pipelines and integrations. You will work across data ingestion, transformation, and loading, and use a BPMN-based workflow engine to model and manage process flows where appropriate. The role sits at the intersection of data engineering, backend development, and workflow-driven systems, with a clear distinction between ETL responsibilities and workflow orchestration.

Requirements

  • 5+ years of experience in data engineering, ETL, or systems integration roles
  • Strong experience building and operating production ETL pipelines
  • Proficiency in Python and/or Java in backend or data-processing environments
  • Strong PostgreSQL and SQL experience, including performance tuning
  • Hands-on experience with data transformation and loading techniques (ETL vs. ELT, incremental loads, change data capture (CDC) concepts)
  • Experience integrating systems via REST APIs and API gateways
  • Experience working with BPMN-based workflow engines or workflow modeling tools
  • Experience operating distributed systems in production environments
  • Strong troubleshooting, debugging, and operational mindset
  • Familiarity with common architectural patterns (e.g., layered architectures, event-driven systems, integration patterns)
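To give a concrete flavor of the incremental-load experience listed above, here is a minimal sketch of a high-watermark incremental extraction in Python. The table and column names (`orders`, `updated_at`) are illustrative assumptions, not from any specific system; real pipelines would persist the watermark and handle late-arriving data.

```python
from datetime import datetime, timezone

def build_incremental_query(table: str, watermark_column: str,
                            last_watermark: datetime) -> tuple[str, tuple]:
    """Build a parameterized query that pulls only rows changed since the
    last successful run (a simple high-watermark incremental load)."""
    sql = (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_column} > %s "
        f"ORDER BY {watermark_column}"
    )
    # Parameters are passed separately so the database driver can bind
    # them safely instead of interpolating values into the SQL string.
    return sql, (last_watermark,)

# Example: fetch only orders updated since the previous run's watermark.
query, params = build_incremental_query(
    "orders", "updated_at",
    datetime(2024, 1, 1, tzinfo=timezone.utc),
)
```

After each successful run, the pipeline would store the maximum `updated_at` value it saw and pass it as `last_watermark` on the next run.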

Nice To Haves

  • Experience with specific BPMN workflow engines such as Camunda, Zeebe, or Flowable
  • Experience with event-driven architectures or message queues
  • Cloud platforms experience (AWS, GCP, or Azure)
  • Docker and Kubernetes experience
  • CI/CD pipelines for data or backend systems
  • Experience working in regulated or compliance-driven environments

Responsibilities

  • Design, build, and operate end-to-end ETL pipelines and data integrations
  • Develop BPMN-based workflows to model and manage complex process flows
  • Build integration services and transformation logic in Java and Python
  • Integrate internal and external systems using REST APIs, API gateways, and asynchronous messaging
  • Apply appropriate data transformation and loading strategies (batch and near-real-time)
  • Design and optimize PostgreSQL schemas, queries, indexes, and bulk loading mechanisms
  • Work with structured and semi-structured data formats (JSON, CSV, XML, Parquet, Avro)
  • Ensure data quality, consistency, and reliability through validation, deduplication, and idempotency
  • Monitor, troubleshoot, and optimize production ETL pipelines and integration services
  • Collaborate with engineering, product, and external partners on integration contracts and data models
  • Document ETL pipelines, workflows, schemas, and operational procedures
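The deduplication and idempotency responsibilities above can be sketched briefly: deduplicate a batch by business key before loading, then load with a PostgreSQL `INSERT ... ON CONFLICT` upsert so replaying the same batch is harmless. The key and column names here are hypothetical placeholders.

```python
def dedupe_latest(records: list[dict], key: str, version: str) -> list[dict]:
    """Keep only the newest record per business key so that duplicates
    within a batch collapse to a single row."""
    latest: dict = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return list(latest.values())

def upsert_sql(table: str, columns: list[str], key: str) -> str:
    """Build a PostgreSQL upsert statement: re-inserting an existing key
    updates the row instead of creating a duplicate (idempotent load)."""
    cols = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    updates = ", ".join(f"{c} = EXCLUDED.{c}" for c in columns if c != key)
    return (f"INSERT INTO {table} ({cols}) VALUES ({placeholders}) "
            f"ON CONFLICT ({key}) DO UPDATE SET {updates}")

# Example: two versions of order 1 arrive in one batch; keep the newer.
batch = [
    {"id": 1, "status": "pending", "updated_at": 1},
    {"id": 1, "status": "shipped", "updated_at": 2},
    {"id": 2, "status": "pending", "updated_at": 1},
]
clean = dedupe_latest(batch, key="id", version="updated_at")
sql = upsert_sql("orders", ["id", "status", "updated_at"], key="id")
```

Combined, the two steps make the load safe to retry end to end: a re-run re-deduplicates the same batch and the upsert overwrites rather than duplicates.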

Benefits

  • Competitive salary and benefits package
  • In-office work culture with required presence Tuesday through Thursday
  • Opportunities for leadership and professional growth
  • Collaborative team committed to innovation, quality, and scientific impact
  • Access to training resources and ongoing professional development