Data Engineer

DLA Piper
Washington, DC
$100,787 - $160,255
Hybrid

About The Position

DLA Piper is, at its core, bold, exceptional, collaborative, and supportive. Our people are the backbone, heart, and soul of our firm. Wherever you are in your professional journey, DLA Piper is a place where you can engage in meaningful work and grow your career. Let's see what we can achieve. Together.

Summary

The Data Engineer, Solutions & Data role designs, builds, and operates data pipelines and data integration processes that translate raw data into trusted, usable datasets for analytics, reporting, and downstream solutions. The role focuses on operationalizing pipelines with governance and service-level expectations (SLAs), improving data quality and reusability, and enabling secure access to integrated data in support of business initiatives. In current initiatives, data engineering includes consolidating data from multiple sources into a central SQL-based integration point and performing field mapping and transformations so that solution teams can consume data consistently.

Requirements

  • Proficiency in SQL and Python.
  • Data pipeline tooling and cloud data services experience (Azure Data Factory, Azure Databricks, Azure Event Hubs, SSIS).
  • Data warehousing experience (Azure Synapse Analytics) and strong fundamentals in data modeling, warehousing, and governance.
  • Scripting/automation skills (PowerShell and related tooling) for platform operations and troubleshooting.
  • 3 years of experience in data engineering and/or data platform engineering (pipelines, integration, and operational support).

Nice To Haves

  • Familiarity with additional programming languages such as Java, Scala, or Go.
  • Experience integrating data from multiple enterprise source systems into a central SQL-based integration layer.
  • Familiarity with DataOps concepts and operating in cross-functional teams that include data engineering personas.

Responsibilities

Data Pipeline Engineering & Integration

  • Build and operationalize data pipelines across heterogeneous environments, aligning with governance principles and service-level expectations (SLAs).
  • Build and maintain ingestion, transformation, and publication pipelines (data engineering practice) to deliver analytics-ready data.
  • Consolidate data from multiple sources into a centralized integration point (e.g., a single SQL Server instance) and manage field mappings and transformations to support consistent downstream consumption.

Data Platform & Storage

  • Design and implement data pipelines using Azure data technologies (e.g., Azure Data Factory, Azure Databricks, Azure Event Hubs, SSIS) to ingest, process, and deliver data from sources such as APIs and other systems.
  • Build and maintain data warehousing capabilities (e.g., Azure Synapse Analytics) to support analytics and reporting workloads.

Data Quality, Reliability & Operations

  • Identify, troubleshoot, and resolve data issues, including data quality, integrity, latency, and security concerns; apply monitoring and operational best practices to keep pipelines reliable and performant.
  • Contribute to data quality and governance practices, including profiling datasets, defining quality rules, and establishing monitoring and remediation approaches.

Collaboration & Delivery (Agile Pod Model)

  • Work cross-functionally with engineers, analysts, and stakeholders to understand requirements and deliver data solutions that support sprint-based delivery.
  • Support pod-level delivery by producing reusable data assets and integration components that can be leveraged across multiple initiatives.

Benefits

  • We offer a comprehensive package of benefits including medical/dental/vision insurance, and 401(k).