Sr. Data Engineer

MCIM by Fulcrum Collaborations
Glen Allen, VA (Remote)

About The Position

MCIM by Fulcrum Collaborations is a Salesforce ISV Partner that develops MCIM (www.mcim24x7.com), a cutting-edge mission-critical information management platform for data centers. Our platform integrates with enterprise systems, including CMMS, ERP, service desk, and building automation systems, to provide data-driven insights and operational intelligence. We are looking for a Data Engineer with DevOps expertise to help build and deliver data products that power analytics, reporting, and AI-driven decision-making. You will be responsible for designing scalable data pipelines, automating deployments, building and deploying REST API endpoints, and optimizing our data infrastructure.

Requirements

  • 5+ years of experience in Data Engineering with DevOps expertise.
  • Strong proficiency in Python, SQL, and data pipeline orchestration tools.
  • Hands-on experience with CI/CD pipelines and containerization.
  • Hands-on experience with REST API development using FastAPI, Flask, Django, or Golang.
  • Expertise in cloud platforms (AWS, Azure, or GCP) for data storage and processing.
  • Experience with Infrastructure-as-Code.
  • Familiarity with data pipeline monitoring and observability tools.

Nice To Haves

  • Experience integrating data from CMMS, ERP, and IoT systems.
  • Understanding of MLOps and AI model deployment.
  • Knowledge of data privacy and compliance standards.

Responsibilities

  • Design, build, and optimize scalable ETL/ELT pipelines to support data analytics.
  • Design, develop, and maintain RESTful APIs using Flask, Django, or similar Python frameworks.
  • Implement authentication, authorization, and security best practices for API access.
  • Develop and manage data lakes, data warehouses, and real-time data processing using tools like Snowflake and AWS services.
  • Ensure data integrity, security, and governance across Fulcrum's data platform.
  • Work with structured and unstructured data from sources such as CMMS and ERP systems.
  • Collaborate with data analysts, product managers, and software engineers to develop data products that provide actionable insights to customers.
  • Create and maintain clear technical documentation for data pipelines, including data lineage, transformation logic, and dependencies.
  • Develop API documentation using industry-standard tools (e.g., Swagger/OpenAPI), including endpoint specs, request/response examples, and error handling.
  • Document data models, schemas, and data dictionaries to enable self-service discovery.
  • Build and maintain CI/CD pipelines for data pipelines, infrastructure deployments, and analytics models using GitHub Actions, Docker, Heroku, AWS ECR/ECS, or GitLab CI.
  • Deploy and manage containerized data processing tools.
  • Implement observability, logging, and monitoring for data workflows.
  • Optimize cloud infrastructure (AWS) to balance performance and cost efficiency.
  • Implement best practices for data security, encryption, and access control.
  • Ensure compliance with GDPR, SOC2, HIPAA, and other regulatory requirements.
  • Define and document data pipeline standards, testing frameworks, and automation best practices.

Benefits

  • Competitive salary
  • Benefits
  • Flexible remote work options

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: None listed
  • Number of Employees: 51-100 employees
