Data Engineer

Ensono
$114,000 - $148,000

About The Position

At Ensono, our Purpose is to be a relentless ally, disrupting the status quo and unleashing our clients to Do Great Things! We enable our clients to achieve key business outcomes that reshape how our world runs. As an expert technology adviser and managed service provider with cross-platform certifications, Ensono empowers our clients to keep up with continuous change and embrace innovation. We can Do Great Things because we have great Associates. The Ensono Core Values unify our diverse talents and are woven into how we do business. We achieve our purpose by living five core values: Honesty, Reliability, Collaboration, Curiosity, and Passion!

At Ensono, we are evolving into a software-first Managed Services Provider, a place where AI, automation, and human expertise work together to deliver 10x productivity for our clients. Our Envision Operating System is the backbone of this transformation, orchestrating operations across mainframe, distributed, and cloud environments.

Data Engineer (Operational Data & AI Enablement)

The Data Engineer plays a pivotal role in Ensono's evolution toward predictive, zero-touch managed services. This is not a traditional analytics or back-office data role. As an operational data and AI enablement engineer, you will design and build production-grade data pipelines and platforms that power predictive services, anomaly detection, and intelligent automation. From ServiceNow tickets to mainframe and cloud telemetry, you'll turn raw, noisy operational signals into high-quality, AI/ML-ready datasets that enable real-time insights and proactive operations. The work you do directly impacts uptime, cost optimization, and Ensono's ability to move from manual, reactive support to a predictive, automated model.
We are looking for engineers who don’t just architect pipelines but get things done—builders who deliver working solutions, iterate quickly, and collaborate closely with data scientists, ML engineers, and operations teams to ensure models don’t just run in notebooks but meaningfully change how work gets done in production. If you want to be part of the team rewiring managed services for the AI era, this is your role.

Requirements

  • Strong SQL skills and solid data modeling fundamentals
  • Expertise in ELT/ETL pipeline development and orchestration
  • Python (required) plus experience with at least one of Java, Scala, or C++
  • Hands‑on experience with Snowflake or equivalent cloud data warehouse platforms
  • Minimum of 2 years of Snowflake experience required
  • Proven experience extracting, transforming, and operationalizing data from ServiceNow and/or other enterprise operational systems (e.g., monitoring platforms, ITSM, finance, or HR systems such as Workday or Concur)
  • Familiarity with observability tooling and distributed data systems
  • Knowledge of enterprise data governance, compliance, and data lineage practices
  • Experience supporting AI/ML feature pipelines in production environments
  • 7+ years of data engineering experience required
  • Get Stuff Done – Biased toward execution and results over prolonged design cycles
  • Business Impact Driven – You build pipelines that directly improve uptime, cost efficiency, and operational predictability
  • Collaborative Partner – Comfortable working at the intersection of Operations, AI/ML, and business stakeholders
  • Continuous Learner – Actively explores new tools and techniques to accelerate delivery and improve outcomes

Responsibilities

  • Data Pipeline Development: Build, optimize, and maintain ELT/ETL pipelines that ingest, clean, and organize operational data from ServiceNow, mainframe environments, distributed systems, and cloud platforms.
  • ServiceNow Data Integration: Develop robust extraction, transformation, and ingestion patterns for ServiceNow operational data (incidents, alerts, changes, requests), ensuring it is reliable, well‑modeled, and ready for AI/ML use cases.
  • Data Infrastructure & Architecture: Design scalable data models, storage frameworks, and integration layers in Snowflake and related modern data platforms, with an emphasis on performance, reliability, and operational relevance.
  • Data Quality & Governance: Implement data quality standards, monitoring, validation, and lineage to ensure pipelines produce clean, trustworthy, and auditable datasets.
  • Collaboration with AI/ML Teams: Partner with Data Scientists, ML Engineers, and MLOps to deliver and maintain production‑grade feature pipelines, training datasets, and inference‑ready data, supporting predictive models, anomaly detection, and intelligent runbooks.
  • ML Deployment Enablement: Support ML production workflows by enabling model registration, versioning, and lifecycle management within Snowflake, working alongside ML and MLOps teams (model development itself is not the primary responsibility of this role).
  • Snowflake AI & LLM Integration: Integrate Snowflake AI components, including Cortex LLMs, into data workflows for use cases such as enrichment, summarization, and operational intelligence.
  • Automation & Optimization: Identify opportunities to streamline data workflows, reduce manual intervention, and lower operational costs while improving reliability and scalability.
  • Cross‑Functional Enablement: Work with Finance, Procurement, Cloud Operations, Mainframe Operations, and Service Operations teams to ensure data products align with high‑value business outcomes.
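For illustration only: the ServiceNow data-integration work described above typically involves normalizing raw incident records into clean, AI/ML-ready rows. The sketch below is a minimal, hypothetical example of that kind of transformation; the field names (`number`, `opened_at`, `resolved_at`, `priority`) and priority labels are assumptions modeled on common ServiceNow incident exports, not details taken from this posting.

```python
from datetime import datetime
from typing import Optional

# Hypothetical priority labels; real ServiceNow instances vary.
PRIORITY_MAP = {"1 - Critical": 1, "2 - High": 2, "3 - Moderate": 3, "4 - Low": 4}


def transform_incident(raw: dict) -> Optional[dict]:
    """Turn one raw ServiceNow-style incident record into an ML-ready row.

    Drops incomplete or inconsistent records and derives a numeric
    resolution-time feature from the open/resolve timestamps.
    """
    number = raw.get("number")
    opened = raw.get("opened_at")
    resolved = raw.get("resolved_at")
    if not (number and opened and resolved):
        return None  # incomplete signal: exclude from the dataset

    fmt = "%Y-%m-%d %H:%M:%S"
    opened_dt = datetime.strptime(opened, fmt)
    resolved_dt = datetime.strptime(resolved, fmt)
    hours = (resolved_dt - opened_dt).total_seconds() / 3600.0
    if hours < 0:
        return None  # resolved before opened: bad data, exclude

    return {
        "incident_id": number,
        "priority": PRIORITY_MAP.get(raw.get("priority", ""), 4),
        "resolution_hours": round(hours, 2),
    }
```

In production, logic like this would sit inside an orchestrated ELT pipeline feeding Snowflake, with the same validation rules surfaced as data-quality metrics and lineage metadata rather than silent drops.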

Benefits

  • Unlimited Paid Days Off
  • Three health plan options
  • 401k with company match
  • Eligibility for dental, vision, short and long-term disability, life and AD&D coverage, and flexible spending accounts
  • Family Forming Benefit including fertility coverage and adoption/surrogacy reimbursement
  • Paid childbearing and parental leave
  • Education Reimbursement, Student Loan Assistance or 529 College Funding
  • Sabbatical leave
  • Wellness program
  • Flexible work schedule