Data Engineer

HyerTek Inc, Rockville, MD
$125,000 - $140,000

About The Position

HyerTek is seeking a Data Engineer to build ETL pipelines, dataflows, and analytics foundations for federal government clients. You'll extract data from legacy systems, transform it, load it into Microsoft Dynamics 365 and Dataverse, and support the reporting layer in Power BI and Microsoft Fabric. This role covers the full data lifecycle, from source extraction through pipeline development to analytics-ready datasets. You'll work with sources such as Oracle, Jira, Confluence, SharePoint, Planner, and legacy databases, migrating data to modern platforms and making it usable for reporting and insights across the Microsoft stack.

Requirements

  • 4+ years of ETL development and data pipeline experience
  • Strong hands-on experience with Azure Data Factory
  • Experience with Power Query / Dataflows
  • Strong SQL skills
  • Experience building Power BI datasets and semantic models
  • Understanding of data modeling for analytics (star schema, relationships)
  • Ability to troubleshoot pipeline failures and data quality issues
  • This position requires U.S. citizenship and the ability to successfully obtain and maintain a U.S. Department of Defense (DoD) clearance.
  • Candidates must be authorized to work in the United States without the need for employment-based visa sponsorship now or in the future. HyerTek will not sponsor applicants for U.S. work visa status for this opportunity, and no sponsorship is available for H-1B, L-1, TN, O-1, E-3, H-1B1, F-1, J-1, OPT, CPT, or any other employment-based visa.
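One requirement above, dimensional modeling for analytics with a star schema, can be illustrated with a minimal Python sketch. The table names, keys, and values here are hypothetical; in practice the model would live in SQL or a Power BI semantic model, where dimension tables describe entities and a fact table holds keys plus measures:

```python
# Hypothetical dimension tables: descriptive attributes keyed by surrogate key.
dim_customer = {101: {"name": "Acme", "region": "East"}}
dim_date = {20240105: {"year": 2024, "month": 1}}

# Hypothetical fact table: foreign keys into the dimensions, plus a measure.
fact_sales = [
    {"customer_key": 101, "date_key": 20240105, "amount": 250.0},
    {"customer_key": 101, "date_key": 20240105, "amount": 75.0},
]

def enrich(row):
    """Resolve one fact row against its dimensions (the join a BI model performs)."""
    return {**row, **dim_customer[row["customer_key"]], **dim_date[row["date_key"]]}

enriched = [enrich(r) for r in fact_sales]

# Slicing a measure by a dimension attribute, the core star-schema query shape.
total_east = sum(r["amount"] for r in enriched if r["region"] == "East")
print(total_east)
```

The design point the star schema buys you is that measures live in one narrow fact table while descriptive attributes are normalized into dimensions, so filters and relationships stay simple.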

Nice To Haves

  • Experience with Microsoft Fabric (lakehouses, pipelines, warehouses)
  • DAX proficiency
  • Experience with D365 Data Migration Framework (DMF)
  • Experience loading data into Dataverse / Power Platform
  • Familiarity with Jira, Confluence, SharePoint, Oracle APIs
  • Python or PowerShell scripting
  • Federal or DoD project experience
  • Azure/Fabric certifications (DP-203, DP-600, PL-300)

Responsibilities

  • Build and maintain ETL pipelines using Azure Data Factory
  • Develop Power Platform Dataflows for data transformation and loading
  • Extract from diverse sources — Oracle, SQL Server, Jira, Confluence, SharePoint, Planner, REST APIs, flat files
  • Write complex transformations using Data Factory expressions, Power Query M, and SQL
  • Implement incremental load and change data capture patterns
  • Schedule, monitor, and troubleshoot pipeline runs
  • Build semantic models and datasets in Power BI and Microsoft Fabric
  • Develop and optimize dataflows that feed Power BI reports
  • Support report developers with data preparation and modeling
  • Create Fabric lakehouses and data pipelines where applicable
  • Write DAX measures and optimize dataset performance
  • Ensure data quality and consistency across reporting layers
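The incremental-load pattern listed above can be sketched in plain Python, using an in-memory list as a stand-in for a source table and a stored timestamp as the watermark. All names here are illustrative; in Azure Data Factory this is typically done with a watermark column and a lookup-plus-copy activity rather than hand-written code:

```python
from datetime import datetime

# Hypothetical source rows; "modified_at" is the watermark column.
source_rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 1, 5)},
    {"id": 3, "modified_at": datetime(2024, 1, 9)},
]

def incremental_extract(rows, watermark):
    """Return only rows changed since the last successful load,
    along with the new watermark to persist for the next run."""
    new_rows = [r for r in rows if r["modified_at"] > watermark]
    # Advance the watermark only to the max timestamp actually extracted,
    # so a failed run can safely retry from the old value.
    new_watermark = max((r["modified_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

# Last successful load happened on Jan 3, so only rows 2 and 3 qualify.
rows, wm = incremental_extract(source_rows, datetime(2024, 1, 3))
print(len(rows), wm)
```

The same shape generalizes to change data capture: instead of comparing timestamps, the source emits an explicit change feed, but the load step still keeps a persisted high-water mark per table.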


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: None listed
  • Number of Employees: 1-10 employees

© 2024 Teal Labs, Inc