About The Position

Hexion is seeking a senior, hands-on Azure data engineer to lead the design, implementation, and production support of enterprise data integrations. The role centers on Azure Data Factory, Azure Logic Apps, and Databricks, along with adjacent Azure data services, to deliver governed, reliable, and maintainable pipelines. You will work closely with architecture, security, and infrastructure teams to apply the right guardrails while enabling Hexion’s business objectives.

Requirements

  • Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
  • 7+ years of experience in data engineering, data integration, and production support (cloud and/or hybrid environments).
  • Strong hands-on expertise with Azure data services and integration patterns (ADF, ADLS, Azure SQL, Databricks, Logic Apps).
  • Proven experience designing and implementing Azure Data Factory pipelines and broader Azure data integration solutions.
  • Strong understanding of ETL, data integration, data warehousing, and production support practices.
  • Ability to partner with cloud platform, security, and networking teams to meet the connectivity, identity, and control requirements of data workloads.
  • Working knowledge of Entra ID and Azure IAM concepts (RBAC, managed identities, service principals) as they apply to securing data pipelines and services.
  • Proficiency in SQL, Python, and PySpark.
  • Strong troubleshooting, communication, and collaboration skills, with the ability to operate effectively in fast-moving environments.
  • Azure Data & Integration: Azure Data Factory (ADF), Logic Apps, Databricks, ADLS Gen2, Azure SQL, Power BI
  • Languages: SQL, Python, PySpark
  • Security/Governance: Entra ID, RBAC, Managed Identity, Key Vault
  • DevOps/Operations (as applicable): CI/CD, Azure DevOps/GitHub Actions, monitoring/logging, Terraform/Bicep

Nice To Haves

  • Experience with Azure SQL Database, ADLS, Databricks, and Power BI.
  • Experience with Infrastructure-as-Code (Terraform and/or Bicep) and automating deployments for data platforms.
  • Exposure to AI/ML, agentic AI, or automation-heavy Azure workloads.
  • Experience supporting secure multi-vendor Azure environments.
  • Familiarity with Azure monitoring, logging, and disaster recovery patterns.
  • Experience implementing or operationalizing controls aligned to ISO/IEC 27018 (protection of personally identifiable information in public clouds).
  • Familiarity with SAP platforms (ECC/R/3, S/4HANA, BW, and Datasphere), especially in the context of integrating SAP data into Azure.
  • Azure certifications such as Azure Solutions Architect Expert.
  • Databricks certifications such as Data Engineer Associate or Professional.
  • Databricks accreditations such as Databricks Fundamentals, Azure Platform Architect, or Platform Administrator.
  • AWS experience is a plus.

Responsibilities

  • Lead the design, development, and implementation of Azure Data Factory pipelines, Azure Logic Apps workflows, and related Azure data solutions to meet business requirements.
  • Collaborate with business, architecture, and technical teams to translate data requirements into Azure integration deliverables.
  • Develop and enforce best practices for data pipeline development, ETL processes, data quality, governance, and documentation.
  • Optimize, monitor, and troubleshoot existing pipelines to improve performance, reliability, and maintainability.
  • Support integrations across Azure services such as Azure SQL Database, Azure Data Lake Storage, Azure Databricks, and Power BI.
  • Coordinate with platform teams to onboard data workloads into approved Azure environments (subscriptions/resource groups), ensuring required standards (naming, tagging, logging) are met.
  • Define repeatable environment and deployment patterns for data products (dev/test/prod), including configuration, secrets, and release boundaries for internal teams and vendors.
  • Ensure prerequisite platform capabilities are available for data delivery (e.g., access to ADLS, Azure SQL, Databricks workspaces), partnering with owners to provision when needed.
  • Apply governance guardrails for data workloads (policies/controls, logging, and cost visibility) and validate that deployments remain compliant over time.
  • Monitor and optimize the cost/performance of data services (e.g., ADF, Databricks, storage) using tagging/chargeback practices, budget alerts, and right-sizing recommendations.
  • Define least-privilege access patterns for data services (ADF, ADLS, Azure SQL, Databricks) using Entra ID, managed identities, service principals, and RBAC, then work with administering teams to implement the required changes.
  • Implement secure secret handling (Key Vault), encryption, and credential rotation approaches for pipelines and integrations.
  • Partner with network teams to meet private connectivity requirements (private endpoints, routing, firewall rules) for data sources and targets.
  • Ensure data integrations are production-ready and auditable (logging, lineage/documentation where applicable) and aligned to enterprise security and governance requirements.
  • Implement Infrastructure-as-Code using Terraform and/or Bicep.
  • Create and support CI/CD integration for both infrastructure and application or data deployments.
  • Set up monitoring, logging, alerting, and basic break-fix support using Azure-native tools.
  • Support disaster recovery planning and operational readiness for Azure resources and data services.
  • Support vendor onboarding into Azure environments, including access, permissions, deployment boundaries, and operational guardrails.
  • Ensure external and internal teams can deploy and operate safely without impacting core enterprise workloads.
  • Work closely with Hexion leads, architects, and vendors to unblock delivery and accelerate execution.
  • Mentor junior team members and help drive engineering maturity, documentation quality, and continuous improvement across the environment.