Data & Software Engineer

Blue Heron Companies
Enterprise, NV

About The Position

Overview: The Data & Software Engineer will own the company’s data foundation end to end, including integrations, data quality, data modeling, governance enablement, and analytics delivery. This cross-functional role carries enterprise-wide responsibility for data and serves as the subject matter expert for the standardized, unified data platform. The role partners with data product owners to ensure data is entered and maintained consistently and enables teams to produce reliable, automatically updating dashboards and reports.

Requirements

  • Bachelor’s degree in Computer Science or a related field (or equivalent practical experience).
  • 5+ years building data platforms and analytics solutions using Microsoft technologies such as Power BI, Azure Data Factory/SSIS, Azure Synapse, and Azure Data Lake/SQL (or equivalent modern data stack experience).
  • Proven ability to build and operate automated, production-grade data pipelines and models (ELT/ETL, validation, monitoring).
  • Solid understanding of system integration, including REST APIs and approaches for systems without APIs (vendor exports, file drops, DB connections, etc.).
  • Ability to code in relevant languages for data engineering and software development (e.g., Python, SQL, C#, JavaScript/TypeScript, or similar).
  • Experience improving data quality by working upstream with business teams.
  • Ability to communicate clearly with technical and non-technical stakeholders; comfortable training users and setting standards.
  • High-ownership mindset: you build it, you own it, you improve it.

Nice To Haves

  • Direct experience with Microsoft Fabric (Lakehouse/Warehouse, pipelines, notebooks, semantic models, governance).
  • Application development experience (internal tools, lightweight web apps, workflows).
  • Experience using AI-assisted coding tools productively (accelerating delivery while maintaining code quality).
  • AI app building experience (Azure OpenAI, copilots, RAG patterns, orchestration, etc.).
  • Experience in construction/homebuilding or project-based services data.

Responsibilities

Data Platform Ownership
  • Design, build, operate, and maintain our Microsoft data platform: Fabric / OneLake / Power BI / Azure components as appropriate.
  • Establish automated patterns for ingestion, transformation, modeling, security, monitoring, and cost management.
  • Create and maintain semantic models that support consistent dashboarding and reporting across the business and enable teams to use them effectively.

Software Engineering & Integrations
  • Integrate data across multiple systems using APIs, files, databases, middleware/iPaaS, and approaches for systems without APIs.
  • Build and maintain reliable ingestion pipelines (batch and/or near real-time where needed).
  • Write and maintain production-quality code to support pipelines, transformations, integrations, and internal tooling.
  • Develop supporting tools/services where needed to automate workflows and improve data reliability.

Data Quality, Standards & Governance Enablement
  • Define practical data standards (definitions, naming, required fields, validation rules, ownership, etc.).
  • Partner with business stakeholders and data product owners to improve upstream data entry processes and reduce downstream cleanup.
  • Implement data validation, reconciliation, and lineage so stakeholders can trust the data.

Power BI Enablement & Reporting
  • Build (and enable others to build) core dashboards and reporting patterns in Power BI.
  • Train and guide stakeholders on dashboarding best practices, KPI definitions, and self-service reporting within guardrails.
  • Create reusable templates/standards for metrics, visuals, and report structure.

Data Strategy & Continuous Improvement
  • Serve as the subject matter expert for enterprise data.
  • Think proactively about Blue Heron’s organizational data strategy: prioritizing what matters most, aligning stakeholders, and adapting as systems and processes evolve.
  • Identify opportunities to improve data quality, automation, and reporting maturity in a constantly evolving environment.

Future-State
  • Build internal applications/tools (including custom proprietary platforms for Blue Heron).
  • Prototype and deliver AI-enabled apps/workflows (e.g., copilots, RAG, intelligent search, automated insights).