Back End Data Ops Engineer

EKI Environment & Water, Inc. | Portland, OR
Posted 45 days ago | Remote

About The Position

EKI Environment & Water, Inc. (EKI) is an employee-owned, full-service engineering and environmental sciences consulting firm serving private- and public-sector clients throughout the United States. If you are interested in improving the environment and advancing your career, then EKI is the place for you. The Back End Data Ops Engineer position reports to the Chief Operating Officer and is responsible for supporting critical IT infrastructure, building scalable data pipelines, and delivering modern analytics and AI-powered decision support tools. You’ll work cross-functionally with business analysts, ML/AI developers, and IT teams to deploy high-impact internal systems and client-facing solutions. This position may be based in one of our physical office locations or in a remote home office.

Requirements

  • Minimum of eight (8) years of experience in back-end development, DevOps engineering, or data engineering roles
  • Proficient SQL skills and strong experience with relational databases (PostgreSQL, SQL Server, Snowflake), including a proven ability to write optimized, complex queries
  • Strong Python development experience, especially for data pipelines (Pandas, SQLAlchemy) and automation tasks
  • Experience leading complex ETL/ELT data pipelines and workflow automation using Microsoft products such as Power Query, Power Apps, and Power Automate
  • Strong experience in supporting back-end development and data modeling for dashboarding tools such as Power BI or Tableau
  • Proven experience with deploying web-based apps such as Streamlit and R Shiny on cloud platforms including AWS (using EC2 and S3) or Azure (using Azure App Service or Azure Container Instances), as well as on internal servers for secure, organization-wide access
  • Exposure to AI/ML tools and techniques, such as RAG, LLMs, LangChain, and Hugging Face
  • Strong understanding of CI/CD, Git workflows, and containerization using Docker
  • Must have effective communication skills, both oral and written
  • Must have ability to multi-task, stay organized, work independently, and collaborate with various project teams
  • Must have a strong desire to grow personally as well as professionally
  • Must have a great attitude and be eager to learn
  • Must have a current valid driver’s license
  • Must be able to work in a fast-paced environment with high-volume workload and frequent short deadlines
  • Must be able to remain in a stationary position, often standing or sitting for prolonged periods
  • Must be able to communicate with others to exchange information
  • Must be able to assess the accuracy, neatness, and thoroughness of the work assigned
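As a minimal sketch of the Python-and-SQL pipeline work described above, the example below pairs SQLAlchemy with Pandas for a small extract-transform-load step. The table and column names are illustrative assumptions, not details from this posting, and an in-memory SQLite database stands in for PostgreSQL, SQL Server, or Snowflake.

```python
# Minimal ETL sketch: extract rows via SQL, transform with Pandas,
# and load the result back into a relational table.
# Table and column names are illustrative, not from the posting.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")  # stand-in for a production database

# Extract: seed a source table (in practice this would already exist).
source = pd.DataFrame({
    "project_id": [101, 101, 202],
    "hours": [8.0, 4.5, 6.0],
})
source.to_sql("timesheets", engine, index=False)

# Transform: aggregate hours per project with an optimized SQL query.
with engine.connect() as conn:
    df = pd.read_sql(
        text("SELECT project_id, SUM(hours) AS total_hours "
             "FROM timesheets GROUP BY project_id"),
        conn,
    )

# Load: write the summary table for downstream dashboards.
df.to_sql("project_hours", engine, index=False, if_exists="replace")
```

In a real pipeline the aggregation would more often be pushed down into the warehouse, with Pandas reserved for transformations SQL handles poorly.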

Nice To Haves

  • Familiarity with Deltek products, Mosaic, and Bamboo HR is a plus
  • Experience with vectorizing databases, embedding models, or multi-agent LLM workflows preferred
  • Prior experience supporting internal analytics platforms or operations teams preferred
  • Comfortable working across multidisciplinary teams (IT, data science, business analysts, etc.) preferred
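As a toy illustration of the vector-retrieval idea behind the embedding and RAG work mentioned above, the sketch below ranks documents by cosine similarity to a query vector. The document names and vectors are hand-made stand-ins; a production system would use a real embedding model (e.g., from Hugging Face) and a vector database.

```python
# Toy vector retrieval: the core ranking step of a RAG pipeline.
# Embeddings here are hand-made 3-d stand-ins for real model output.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

docs = {
    "groundwater report": [0.9, 0.1, 0.0],
    "invoice summary": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]

# Retrieve the document whose embedding is closest to the query.
best = max(docs, key=lambda name: cosine(docs[name], query))
```

The retrieved text would then be inserted into an LLM prompt as grounding context; that step is what frameworks like LangChain orchestrate.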

Responsibilities

  • Design, build, and manage robust data pipelines and ETL/ELT workflows using tools like Python (Pandas, SQLAlchemy), Power Query, or Azure Data Factory
  • Develop and maintain integrations between enterprise-level systems (e.g., accounting, marketing, and HR). Familiarity with Deltek products, Mosaic, and Bamboo HR is a plus
  • Optimize and manage relational databases such as PostgreSQL, SQL Server, and Snowflake for data availability, performance, and governance
  • Support the development and deployment of internal analytics dashboards and apps using Microsoft products including Power BI, Power Automate, and Power Apps
  • Collaborate with stakeholders to translate business needs into automated, visual, and data-driven solutions
  • Optimize client-facing dashboards and data models using Power BI (DAX, M language, Power Query), Tableau, and SQL to enhance performance and usability
  • Contribute to building and deploying AI-powered tools, including LLM-powered applications (using RAG) and vector database solutions
  • Prototype and support Python- or R-based workflows for AI, automation, and analytics innovation
  • Integrate automation solutions using tools like LangChain, Hugging Face, Power Automate, and custom Python scripts to streamline business processes
  • Implement and maintain CI/CD pipelines and deployment workflows for data and analytics tools using Git, GitHub Actions, or Azure DevOps
  • Containerize and deploy applications using Docker and orchestrate environments for internal apps, APIs, and dashboards
  • Ensure production systems are reliable, scalable, and secure by applying best practices in monitoring, version control, and infrastructure automation
  • Other duties as required
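The containerization and deployment responsibilities above can be sketched with a minimal Dockerfile for a Streamlit dashboard. The entry point `app.py`, the `requirements.txt` file, and the base image are assumptions for illustration, not details from this posting.

```dockerfile
# Illustrative container for an internal Streamlit dashboard.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker caches this layer between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code (app.py is an assumed entry point).
COPY . .

# Streamlit serves on port 8501 by default.
EXPOSE 8501
CMD ["streamlit", "run", "app.py", "--server.address=0.0.0.0"]
```

An image built this way can be run on an internal server, AWS EC2, or Azure Container Instances, typically behind a reverse proxy that handles authentication for organization-wide access.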

Benefits

  • EKI provides excellent compensation and comprehensive benefits packages, including career advancement opportunities, outstanding training opportunities, incentive compensation including bonuses, retirement benefits through an Employee Stock Ownership Program (ESOP) and 401(k) profit-sharing contributions, and group medical, vision, and dental benefits.


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 51-100 employees
