Data Integration Engineering Lead

Pinnacle Group, Inc. — Pasadena, TX

About The Position

We are building a team of trailblazers who embody growth, impact, and excellence. The Data Integration Engineering Lead plays a crucial role in designing, implementing, and maintaining data integration solutions within our organization. The role collaborates with customers and cross-functional teams to ensure seamless data pipelines from customers' systems, and its expertise contributes to the organization's overall strategy and architecture for data acquisition.

Requirements

  • Hands-on experience extracting data from at least two of: OSIsoft PI, SAP PM/EAM, Maximo, eMaint, or similar industrial/operational systems. This is non-negotiable.
  • Direct experience working with customer or client IT teams to negotiate and establish data access (firewall rules, VPN connectivity, service accounts, API credentials).
  • SQL proficiency — specifically the ability to explore unfamiliar database schemas and write extraction queries with little or no documentation.
  • Python for data extraction, transformation, and pipeline automation.
  • Experience with cloud-based data integration (Azure Data Factory, Azure Functions, or comparable).
  • Strong knowledge of data integration patterns, ETL/ELT, APIs, and messaging protocols (REST, SOAP, OPC).
  • Demonstrated experience with enterprise database technologies and data modeling.
  • Excellent communication skills — you'll be the person answering detailed technical emails from client IT directors and leading discovery calls.
  • Expertise in data integration tools and platforms (e.g., Azure Data Factory, Informatica, Talend).
  • Proficiency in big data platforms (Hadoop, Spark, etc.) and analytics tools (Power BI, Tableau).
  • Familiarity with DevOps tools and practices (e.g. Azure DevOps).

Nice To Haves

  • Experience in oil and gas, refining, chemicals, or heavy industry environments.
  • Familiarity with reliability engineering concepts (RBI, CMMS workflows, asset hierarchy management, inspection data).
  • Experience with Cognite Data Fusion (CDF) or similar industrial data platforms.
  • Knowledge of PI Web API, PI SDK, or AF SDK for historian data extraction.
  • Experience with OPC-UA/DA protocols for real-time industrial data.
  • Background in data governance and compliance measures.
  • Understanding of microservices architecture and containerization (Docker, Kubernetes).
  • Experience with DevOps tools and practices (Azure DevOps, CI/CD pipelines).

Responsibilities

Source System Extraction (This Is the Core of the Role)

  • Independently extract data from industrial source systems including OSIsoft PI historians, SAP PM/EAM, Maximo, eMaint, lab/LIMS systems, and other CMMS/ERP platforms.
  • Navigate customer IT environments to establish connectivity — VPNs, service accounts, firewall rules, read-only database access — often with limited or no documentation.
  • Reverse-engineer undocumented or poorly documented source schemas to identify the right data for integration.
  • Build and own the extraction layer: connectors, API calls, direct database queries, file-based ingestion from heterogeneous client environments.
  • Handle the reality that every customer’s data is messy in a different way — inconsistent tag naming, mismatched equipment IDs, unmaintained asset hierarchies.
Data Transformation and Pipeline Development

  • Design, build, and maintain data pipelines that clean, transform, and load extracted data into our reliability platform.
  • Develop integration architecture and blueprints tailored to each customer’s source system landscape.
  • Implement data quality checks, reconciliation processes, and monitoring to ensure ongoing accuracy.
  • Build and maintain master data mapping strategies — including change management processes as clients execute MOCs, add equipment, or decommission assets.
  • Own pipeline monitoring, alerting, and uptime SLAs for all production data extraction and integration systems. These are live production pipelines serving customers — when extraction fails, you are responsible for detecting the failure, diagnosing the root cause, and restoring the data flow within SLA.
Client Communication and Technical Leadership

  • Serve as the primary technical point of contact with customer IT teams for all data access and connectivity matters.
  • Respond to detailed technical inquiries from client IT leadership (architecture questions, data mapping strategies, security concerns) with clarity and confidence.
  • Lead discovery sessions with customers to understand their source systems, data flows, and integration requirements.
  • Create and maintain architecture documentation, integration runbooks, and data dictionaries for each client engagement.
  • Provide technical guidance and mentorship to team members and drive knowledge sharing across the data engineering team.
  • Manage integration project plans, timelines, and deliverables across multiple concurrent client engagements.
  • Drive accountability on milestones, coordinate dependencies with client IT teams, and ensure integrations are completed on schedule.
Strategy and Team Building

  • Lead the enterprise data integration strategy and platform architecture across the organization.
  • Provide new ideas and approaches to the CTO and enterprise architecture team on data acquisition and integration best practices.
  • Drive recruitment to build and grow a high-performing data engineering team.
  • Continuously evaluate and adopt emerging data technologies and practices.


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 501-1,000
