Data Engineer

Dolese Bros. Co.
Oklahoma City, OK
Hybrid

About The Position

The Dolese Promise is built on a foundation of integrity driven by our passion for quality, safety, and reliability. We are one of Oklahoma's most respected employee-owned companies because of our people and our values. We strongly believe in positively impacting our communities through our products, our actions, and our financial support. Being a part of the Dolese team affords a unique opportunity to join an organization that rewards its owners through profit sharing. Our employees are one of our most important resources, which is why we promise to deliver.

Dolese Delivers:

  • Stable Foundation
  • Treat with Respect
  • Safe Environments
  • Employee Focus

Grow your career with a company built on Safety, Integrity, Teamwork, and Stewardship.

As a Data Engineer at Dolese, you will design, build, and scale the data and AI foundations that power enterprise reporting, analytics, and automation across the organization. This role focuses on developing reliable, well-governed data pipelines, curated data models, and machine learning / large language model (LLM)-enabled capabilities that support self-service analytics and natural language data access. The Data Engineer partners closely with data analytics and business teams to ensure high-quality, production-ready data products that enable scalable insights, AI-assisted reporting, and operational decision support.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Data Analytics, or related technical field required.
  • Minimum of four (4) years of experience in data engineering, analytics engineering, or software engineering roles supporting analytics or AI workloads.
  • Strong proficiency in SQL and Python, including building production-grade data pipelines and automation.
  • Experience with data warehousing, lakehouse architectures, and ETL/ELT patterns supporting scalable analytics and AI workloads.
  • Ability to translate business needs into scalable, reliable data and AI-enabled solutions that support reporting, automation, and decision-making.
  • Strong data engineering foundation with applied software engineering skills.
  • Excellent communication skills, with the ability to explain data platform concepts and solutions to technical and non-technical audiences.
  • Proven ability to build effective stakeholder partnerships that align analytics to business needs.
  • Demonstrated technical curiosity and continuous-learning mindset, with active interest in emerging AI/ML and BI technologies.
  • Proven attention to detail and commitment to high standards for accuracy.
  • Ability to manage multiple priorities in a fast-paced environment while maintaining focus and quality.

Nice To Haves

  • Master’s degree in Computer Science, Engineering, Data Analytics, or related technical field preferred.
  • Experience with cloud analytics platforms such as Azure Synapse, Databricks, or Snowflake preferred.
  • Experience supporting or implementing applied machine learning or LLM-enabled solutions using Python-based workflows in production or near-production environments preferred.
  • Familiarity with MLOps concepts, model monitoring, and AI governance practices as related to supporting reliable deployment of ML models in production BI environments preferred.
  • Industry experience in construction materials, manufacturing, energy, industrial operations, healthcare, or customizable services preferred.

Responsibilities

  • Design, build, and maintain scalable, reliable data pipelines (ETL/ELT) that support analytics, reporting, and machine learning workloads.
  • Develop and support machine learning and large language model (LLM)–enabled capabilities to automate reporting, insight generation, and natural language data access.
  • Enable AI-driven self-service analytics, including natural language query and AI-assisted reporting tools, allowing business users to interact directly with governed data assets.
  • Ensure data quality, performance, observability, and reliability across production data and AI systems.
  • Support operational readiness of data pipelines and AI services, including monitoring, issue resolution, and continuous improvement.
  • Design and maintain curated data models, feature datasets, and analytical layers that enable BI, advanced analytics, and ML/AI use cases.
  • Support enterprise data governance by ensuring accuracy, consistency, and quality across analytics assets.
  • Collaborate with analytics, BI, and platform teams to improve data pipelines, definitions, and reliability and support scalable modern data architecture (e.g., lakehouse patterns).
  • Enable BI and reporting teams by providing performant, well-modeled, and trusted datasets that serve as the foundation for dashboards and reports.
  • Support integration of AI-powered capabilities within BI platforms (e.g., Copilot, intelligent alerting, embedded forecasting).
  • Work closely with analytics, BI, and business stakeholders to understand reporting, automation, and AI requirements.
  • Partner with business stakeholders to support KPI, metric, and measurement strategy enablement through strong data modeling and governance.
  • Communicate data platform capabilities, limitations, and best practices to technical and non-technical partners.
  • Provide guidance and knowledge sharing on data engineering, automation, AI/ML enablement, and platform best practices.
  • Support continuous improvement of analytics processes, including automation, documentation, and standardization.
  • Apply software engineering best practices, including version control, code reviews, testing, and CI/CD where applicable.
  • Contribute feedback to established analytics standards, templates, and frameworks.

Benefits

  • Profit sharing