Integration Developer

Permian Resources
Midland, TX
Onsite

About The Position

Permian Resources (NYSE: PR) is seeking an Integration Developer reporting to the Enterprise Data Senior Manager in Midland, Texas. This role sits at the intersection of data engineering and systems integration — responsible for building and maintaining the technical pipelines, APIs, and automated workflows that power our Databricks data platform and connect our enterprise systems. The ideal candidate brings strong technical skills in data pipeline development and software engineering, a genuine passion for building solutions that streamline business processes, and the ability to work effectively in a fast-paced, collaborative oil and gas environment. We are open to mid-level candidates who demonstrate strong fundamentals and clear potential, as well as senior candidates ready to hit the ground running.

Requirements

  • 5+ years of relevant experience in data engineering, software development, or systems integration; mid-level candidates with strong fundamentals and demonstrated potential are encouraged to apply.
  • 3 years of experience working in the oil and gas industry is required.
  • Bachelor's degree (BS/BA) in Computer Science, Engineering, Business Analytics, Statistics, or a related field; equivalent practical experience will be considered.
  • Proficiency in Python and/or other modern languages (Java, C++, R); solid understanding of object-oriented design principles.
  • Hands-on experience with Databricks or a comparable cloud-based data platform (e.g., Snowflake); strong working knowledge of SQL.
  • Experience designing and maintaining REST and/or SOAP APIs; familiarity with authentication and security best practices.
  • Experience with ETL/ELT frameworks and tools (e.g., dbt); comfortable working with JSON, CSV, and XML formats.
  • Strong verbal and written communication skills; ability to translate technical concepts for non-technical stakeholders across all levels of the organization.
  • Capable of working independently and integrating effectively within multi-disciplinary teams in a fast-paced environment.

Nice To Haves

  • Experience working with or exploring AI/ML tools, large language models, or AI-assisted development workflows is a plus.
  • Significant exposure to upstream oil and gas datasets, systems, and operational workflows is strongly preferred.

Responsibilities

  • Design, build, optimize, and maintain ETL/ELT pipelines that move and transform large volumes of data from source systems into our Databricks analytics environment.
  • Build and maintain APIs and integration layers (REST, SOAP) that enable seamless data flow between enterprise applications, working across a range of data formats including JSON, CSV, and XML.
  • Monitor and support development and production environments within Databricks to ensure system availability, reliability, and data quality.
  • Develop data models and pipelines that support reporting, dashboards, and analytical use cases across multiple business teams.
  • Analyze existing integrations and pipelines to identify performance bottlenecks; implement best practices in code quality, data modeling, and architecture.
  • Partner closely with business subject matter experts, data scientists, and IT members to deliver reliable, clean data that meets operational needs.
  • Create and maintain clear technical documentation; adhere to coding standards and governance practices that support long-term maintainability and scalability.
  • Use GitHub or similar version control platforms for collaborative development, code reviews, and repository management.
  • Provide guidance to less experienced team members and contribute to a culture of knowledge-sharing across the team.