Senior Data Engineer (20187)

La Mesa RV · Phoenix, AZ

About The Position

A family-owned and operated company, La Mesa RV was founded in 1972 at our original location in La Mesa, California, and has recently relocated to Phoenix, AZ. Our business philosophy is that customers and employees are the most important people in the world. Putting this belief into practice has enabled James K, our founder, to guide LMRV on a path of growth and prosperity. Over the years, LMRV has grown to become one of the largest multi-location RV dealerships in the world and is recognized as a leader in the industry. Apply to LMRV! We offer plenty of room to grow internally.

Position Summary

We are seeking a highly skilled Senior Data Engineer to design, build, and optimize enterprise data platforms that enable advanced analytics and AI initiatives. This is a hands-on development role focused on implementing scalable, secure, and high-performing data solutions. The ideal candidate will have deep expertise in data modeling, modern data architecture, and data governance, along with experience setting up data lakes, lakehouses, and data warehouses. Proficiency in Microsoft Fabric and familiarity with Power BI are essential.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field, or equivalent professional experience.
  • 5+ years of experience in data engineering roles.
  • Strong hands-on experience with:
      • Data modeling, schema evolution, and pipeline development.
      • Microsoft Fabric and Azure Data Services.
      • Data migration, ingestion, and database programming.
  • Knowledge of Medallion Architecture, event-driven architecture, data lakehouse concepts, and data governance principles.
  • Experience with AI/ML data pipelines.
  • Proficiency in Python, SQL, and Java for building scalable data solutions.
  • Strong understanding of data structures and algorithms for efficient data processing and optimization.
  • Knowledge of API development and integration for data ingestion and service-oriented architecture.
  • Hands-on experience with CI/CD practices and automated testing for data pipelines.
  • Understanding of streaming and event-driven architectures (e.g., Kafka, Kinesis) for real-time data processing.
  • Familiarity with Power BI integration and optimization.
  • Proven experience working in Agile environments (Scrum/Kanban).
  • Excellent communication and documentation skills.
  • Understanding and practical use of AI coding tools such as Claude Code, Copilot, and Codex.

Nice To Haves

  • Knowledge of SSRS (SQL Server Reporting Services) and SSIS (SQL Server Integration Services) preferred.

Responsibilities

  • Design and implement robust data models (conceptual, logical, physical) to support analytics and AI workloads.
  • Architect and maintain data pipelines leveraging modern architecture concepts for managing structured and unstructured data.
  • Establish and enforce data governance frameworks, including data quality, lineage, metadata management, and compliance.
  • Build and manage data lakes, lakehouses, and warehouses using Microsoft Fabric and Azure services.
  • Develop and optimize ETL/ELT processes for batch and real-time data ingestion.
  • Plan and execute data migration strategies from SQL Server databases to Microsoft Fabric, ensuring data integrity and minimal disruption.
  • Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality datasets.
  • Integrate data solutions with Power BI for reporting and visualization.
  • Ensure compliance with cybersecurity, data privacy, and regulatory requirements.
  • Define and enforce best practices for data performance, scalability, and maintainability.
  • Implement CI/CD pipelines and automated testing for data workflows.

Benefits

  • Competitive pay
  • Healthcare coverage
  • 401(k)