Analyst-Application Developer (Hybrid)

Kinder Morgan, Houston, TX (Hybrid)

About The Position

The IT Operations Support Group (OSG) Integrity team is responsible for the design, implementation, and support of various systems and applications related to the pipeline Integrity Management Program (IMP) for all pipeline business units across Kinder Morgan's operating footprint. This data-focused developer position will be responsible for the continuous improvement and support of in-house data integration pipelines and validation processes that feed systems for pipeline safety risk analysis, long-range integrity assessment planning, and a wide variety of integrated reporting. The primary duties of this role are listed under Responsibilities below.

Requirements

  • 3-5 years' experience working with SQL, Python, or FME in a data analytics, data engineering, or similar role
  • Ability to clearly explain and document processes and data integration architecture
  • Strong written and verbal communication skills with staff at various levels of technical knowledge
  • Demonstrates excellent analytical, critical thinking, and problem-solving skills
  • Demonstrates strong time management and organizational skills
  • Ability to self-motivate, set goals, prioritize and meet deadlines
  • Bachelor's degree or higher in Computer Science, Management Information Systems, Engineering or a related technical field
  • Ability and willingness to perform all essential functions of the position including those listed above
  • Ability to effectively organize, plan, prioritize, document, and complete work independently
  • Ability to successfully work in a team environment
  • Ability to maintain regular, dependable attendance and a high level of performance

Nice To Haves

  • 5+ years' experience designing and implementing system solutions to support business needs
  • Prior experience working with pipeline integrity-related datasets
  • Familiarity with pipeline linear referencing concepts and data models such as PODS
  • Experience with ETL and data warehousing of spatial datasets
  • Experience with Safe Software's FME Desktop (Form) and FME Server (Flow)
  • Prior experience building and delivering BI solutions

Responsibilities

  • Build and maintain reliable data pipelines for data extraction, transformation, and loading (ETL/ELT)
  • Integrate data from various sources, including databases, APIs, and external data providers
  • Combine data from different formats and structures into a unified data model
  • Design data models and schemas to support business requirements
  • Data warehousing of a wide variety of pipeline integrity and risk modeling datasets
  • Create and maintain data catalog and data lineage documentation to support data discovery and audit
  • Conduct data profiling, data cleansing, de-duplication and validation activities
  • Implement data quality monitoring and alerting systems
  • Monitor and optimize data systems and pipelines for performance issues and bottlenecks
  • Troubleshoot and resolve data-related problems promptly
  • Collaborate with cross-functional and cross-domain teams on data-related Pipeline Integrity projects
  • Participate in high-impact business process improvement initiatives across the Pipeline Integrity department
  • Identify and present Pipeline Integrity process improvement opportunities and follow through with implementation once stakeholder agreement is obtained
  • Provide support for other aspects of integrity-related programs, processes, and procedures as required
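For candidates gauging fit, the extraction, validation, de-duplication, and loading duties above can be sketched in miniature. This is an illustrative example only: the dataset, field names, and the use of SQLite as a stand-in warehouse are all hypothetical, not Kinder Morgan's actual systems.

```python
import csv
import io
import sqlite3

# Hypothetical sample extract: pipeline segment records containing one
# duplicate and one invalid row (a missing end milepost).
RAW_CSV = """segment_id,begin_mp,end_mp,max_pressure_psig
SEG-001,0.0,4.2,1440
SEG-002,4.2,9.7,1440
SEG-002,4.2,9.7,1440
SEG-003,9.7,,1200
"""

def extract(text):
    """Extract: parse the raw CSV into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: validate, coerce types, and de-duplicate on segment_id."""
    seen, clean, rejected = set(), [], []
    for row in rows:
        try:
            record = (
                row["segment_id"],
                float(row["begin_mp"]),
                float(row["end_mp"]),        # empty string raises ValueError
                int(row["max_pressure_psig"]),
            )
        except ValueError:
            rejected.append(row)             # route to a data quality report
            continue
        if record[0] in seen:                # de-duplication step
            continue
        seen.add(record[0])
        clean.append(record)
    return clean, rejected

def load(records):
    """Load: write validated records into a warehouse table (SQLite here)."""
    con = sqlite3.connect(":memory:")
    con.execute(
        "CREATE TABLE segments (segment_id TEXT PRIMARY KEY, "
        "begin_mp REAL, end_mp REAL, max_pressure_psig INTEGER)"
    )
    con.executemany("INSERT INTO segments VALUES (?, ?, ?, ?)", records)
    return con

clean, rejected = transform(extract(RAW_CSV))
con = load(clean)
count = con.execute("SELECT COUNT(*) FROM segments").fetchone()[0]
print(count, len(rejected))  # 2 valid unique segments, 1 rejected row
```

In production this pattern would typically run inside an orchestrated FME or Python pipeline with the rejected rows feeding the monitoring and alerting duties described above.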

Benefits

  • Competitive Wages
  • Hybrid and flexible work schedule
  • 401(k) Savings Plan with employer and employee contributions
  • Company-funded Pension Plan
  • Comprehensive Medical/Rx, Dental, and Vision Plans
  • Flexible and Health Spending Accounts
  • Life and Accidental Death and Disability Insurance
  • Supplemental Life and Accidental Death and Disability Insurance for employee and dependents
  • Paid Time Off
  • Paid Holidays
  • Bonus Program
  • Paid Bus Pass or Parking
  • Many voluntary benefit plans