Data Solutions Engineer

Eversheds Sutherland · Atlanta, GA
20h · $150,000 - $205,000 · Hybrid

About The Position

We have an exciting opportunity for a Data Solutions Engineer at Eversheds Sutherland (US) LLP. We are searching for someone who wants to be a valued contributor and member of a talented and dynamic team of lawyers, paralegals, and business professionals.

The Data Solutions Engineer is a hands-on technical role responsible for designing and delivering secure, reliable, and scalable data integration and automation solutions across the firm. The role enables the firm to move, synchronize, and operationalize data across enterprise systems, Microsoft 365 (including SharePoint), and third-party platforms, while leveraging the firm’s SaaS-based lakehouse and Operational Data Store (Azure SQL) as the central hub for analytics, reporting, operational intelligence, and AI-ready datasets.

The Data Solutions Engineer focuses on optimizing how the firm uses its data platform by building durable pipelines, integration patterns, automation, and data-ready structures that support business needs. This includes pushing and pulling data into and out of the lakehouse, improving data accessibility and usability, and designing strategies to integrate additional internal and external data sources.

This position blends data integration, SQL engineering, Power Platform development, API and integration engineering, and software engineering practices. The successful candidate will be comfortable working in both low-code and code-first environments and will write business logic and integration code to deliver supportable, enterprise-grade data solutions.

Requirements

  • A Bachelor’s degree in Computer Science, Information Systems, Engineering, or an equivalent field from an accredited college or university is required.
  • A minimum of four (4) years of experience in a role combining data engineering, systems integration, automation, and software development.
  • Experience working with Microsoft-based data, automation, and integration platforms.
  • Strong proficiency in SQL (T-SQL) and relational data concepts.
  • Experience building ETL/ELT pipelines and repeatable integration workflows.
  • Familiarity with API-based integration patterns and authentication mechanisms.
  • Proficiency in Python (or similar language) for business logic, automation, and integration services.
  • Experience with SharePoint, Power Platform, and Azure services.
  • Strong troubleshooting, documentation, and stakeholder communication skills.

Responsibilities

  • Data Integration & Platform Enablement Maintains and enhances integrations to ensure the reliable movement, unification, and availability of data supporting reporting, analytics, operational workflows, and AI initiatives.
  • Publishes curated and governed datasets from the lakehouse to operational systems, business applications, and reporting tools (including Power BI) to support consistent metrics and shared definitions (a minimal publish step is sketched after this list).
  • Partners with system and data owners to recommend improvements to data structures, ingestion patterns, and usage strategies that increase performance, reliability, and usability.
  • Collaborates with business stakeholders to understand reporting and operational data needs and translate them into reusable, reliable data flows and datasets.
  • Enables self-service reporting by ensuring lakehouse outputs are discoverable, understandable, and consumable across teams.
  • Develops integration strategies and implementation plans for onboarding new internal and third-party data sources, balancing speed, governance, and long-term maintainability.
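
As an illustration of the publish step referenced above, the sketch below copies a curated lakehouse extract into the Azure SQL Operational Data Store; the file path, table names, and connection string are placeholders, not firm specifics.

```python
# Illustrative only: the extract path, table name, and connection string are placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical curated extract exported from the lakehouse.
curated = pd.read_parquet("curated/matter_summary.parquet")

# Hypothetical Azure SQL (Operational Data Store) connection.
engine = create_engine(
    "mssql+pyodbc://ods_user:<password>@ods-server.database.windows.net/ods"
    "?driver=ODBC+Driver+18+for+SQL+Server"
)

# Replace the published table so reporting tools always see a complete, governed snapshot.
curated.to_sql("matter_summary", engine, schema="curated", if_exists="replace", index=False)
```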
  • Power Platform & Automation Engineering Builds and maintains automation workflows that reduce manual effort and improve reliability of data movement, validation, notifications, and operational processes.
  • Develops and supports solutions using:
    • Power Automate for workflow orchestration, triggers, approvals, scheduled processes, connector-based integrations, and custom connectors.
    • Power Apps for data capture applications, operational tools, and lightweight front-end solutions supporting structured data collection and workflows.
    • Azure Functions for serverless integration services, scheduled jobs, event-driven synchronization, API endpoints, and transformation services.
  • Implements resilient automation patterns including logging, retry logic, alerting, error handling, and operational runbooks.
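
A minimal sketch of the resilient-automation pattern described in the last item, assuming a hand-rolled retry loop and a placeholder alert channel (in practice the alert would go to email, Teams, or a ticketing system):

```python
# Illustrative retry/alerting pattern; the integration step and alert channel are placeholders.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("nightly_sync")

def send_alert(message: str) -> None:
    """Placeholder for a real alert (email, Teams webhook, ticket, etc.)."""
    log.error("ALERT: %s", message)

def run_with_retries(step, attempts: int = 3, base_delay: float = 30.0) -> None:
    """Run an integration step, retrying transient failures with exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            step()
            log.info("%s succeeded on attempt %d", step.__name__, attempt)
            return
        except Exception:
            log.exception("Attempt %d of %d failed", attempt, attempts)
            if attempt == attempts:
                send_alert(f"{step.__name__} failed after {attempts} attempts")
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))
```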
  • Software Engineering & Code-Based Integration Writes integration code and business logic supporting data workflows and system interoperability using Python and other appropriate languages.
  • Develops API clients, middleware services, and custom integration logic where connector-based solutions are insufficient or enterprise requirements demand code-first approaches.
  • Applies sound software engineering practices including version control, modular design, documentation, code reviews, and maintainable architecture.
  • Builds reusable components, templates, and standard patterns to accelerate integration delivery across systems.
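
One example of such a reusable component is a thin REST client that centralizes authentication headers, timeouts, and error handling so individual integrations do not re-implement them; the base URL, token handling, and endpoint below are assumptions.

```python
# Illustrative reusable REST client; the base URL, token, and endpoint are placeholders.
import requests

class ApiClient:
    """Thin wrapper standardizing auth, timeouts, and error handling for one vendor API."""

    def __init__(self, base_url: str, token: str, timeout: float = 30.0):
        self.base_url = base_url.rstrip("/")
        self.timeout = timeout
        self.session = requests.Session()
        self.session.headers.update({"Authorization": f"Bearer {token}"})

    def get_json(self, path: str, **params) -> dict:
        """GET a JSON resource, raising on HTTP errors so callers can apply a retry policy."""
        response = self.session.get(
            f"{self.base_url}/{path.lstrip('/')}", params=params, timeout=self.timeout
        )
        response.raise_for_status()
        return response.json()

# Usage with hypothetical values:
# client = ApiClient("https://api.example-vendor.com/v1", token="...")
# invoices = client.get_json("invoices", updatedSince="2024-01-01")
```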
  • API & Third-Party Data Source Integration Integrates data from external platforms into the lakehouse using REST APIs, vendor SDKs, file-based feeds, webhooks, and secure managed transfer patterns.
  • Evaluates third-party platforms for integration feasibility, data quality, authentication constraints, and operational reliability.
  • Partners with vendors and internal stakeholders to resolve integration issues and support enhancements to the firm’s data ecosystem.
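
The paginated-pull sketch referenced above pages through a hypothetical vendor REST endpoint and lands the raw records as newline-delimited JSON for the lakehouse to ingest; the endpoint, pagination style, and landing path are all assumptions.

```python
# Illustrative paginated pull into a lakehouse landing zone; all names are placeholders.
import json
import requests

def pull_to_landing(base_url: str, token: str, landing_path: str) -> None:
    """Page through a vendor API and land raw records as newline-delimited JSON."""
    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {token}"})
    url = f"{base_url}/records"
    with open(landing_path, "w", encoding="utf-8") as out:
        while url:
            response = session.get(url, timeout=60)
            response.raise_for_status()
            payload = response.json()
            for record in payload.get("items", []):
                out.write(json.dumps(record) + "\n")
            # Assumed cursor-style pagination: follow the link until it is absent.
            url = payload.get("nextLink")
```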

SQL Engineering, Data Modeling & Data Quality

  • Writes and optimizes SQL queries and transformations supporting ingestion, validation, modeling, and reporting needs.
  • Designs and maintains curated data outputs including views, mapping tables, reference data, and transformation logic.
  • Implements practical data quality checks such as completeness, duplication detection, referential integrity validation, reconciliation, and anomaly detection, with associated monitoring and alerting.
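
For illustration, checks like these can run after each load and feed the same alerting used by the automation workflows; the tables, columns, and connection string are placeholders.

```python
# Illustrative post-load data quality checks; tables, columns, and connection string are placeholders.
import pyodbc

CHECKS = {
    "null_client_ids": "SELECT COUNT(*) FROM curated.matter WHERE client_id IS NULL",
    "duplicate_matter_numbers": """
        SELECT COUNT(*) FROM (
            SELECT matter_number FROM curated.matter
            GROUP BY matter_number HAVING COUNT(*) > 1
        ) d
    """,
    "orphaned_matters": """
        SELECT COUNT(*) FROM curated.matter m
        LEFT JOIN curated.client c ON c.client_id = m.client_id
        WHERE c.client_id IS NULL
    """,
}

def run_checks(conn_str: str) -> list[str]:
    """Return the names of any checks that find offending rows."""
    failures = []
    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        for name, sql in CHECKS.items():
            count = cursor.execute(sql).fetchone()[0]
            if count:
                failures.append(f"{name}: {count} rows")
    return failures
```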
  • Analytics & AI Readiness Enablement Delivers clean, well-documented datasets designed for consumption by analytics teams, BI tools, and AI initiatives.
  • Supports AI-ready data practices including clear entity definitions, consistent keys, usable metadata, and traceable source-to-target logic (a small example contract follows this list).
  • Provides limited support for Power BI enablement, including dataset readiness, performance considerations, and refresh behavior.
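
One lightweight way to capture those definitions alongside the data is a small dataset contract, sketched below with hypothetical entities and column names; in practice this metadata might live in a catalog or governed metadata store rather than in code.

```python
# Illustrative dataset contract; the entity, keys, and lineage values are hypothetical.
from dataclasses import dataclass

@dataclass
class DatasetContract:
    """Minimal metadata published with a curated dataset so analytics and AI consumers
    know its grain, keys, and provenance."""
    name: str
    entity: str                       # business entity the rows describe
    primary_key: list[str]            # consistent keys across publications
    source_to_target: dict[str, str]  # traceable source column -> published column
    description: str = ""

matter_summary = DatasetContract(
    name="curated.matter_summary",
    entity="Matter",
    primary_key=["matter_id"],
    source_to_target={
        "pms.matters.mattnum": "matter_number",
        "pms.matters.clntnum": "client_id",
    },
    description="One row per active matter, refreshed nightly for reporting and AI use.",
)
```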
  • SharePoint & Microsoft 365 Data Integration Develops and maintains a strong working understanding of SharePoint as a data source, including list structures, content types, metadata, permissions, and governance considerations.
  • Builds and supports integrations involving SharePoint Lists and Microsoft 365 data sources, ensuring accurate synchronization, validation, and operational reliability.
  • Designs integration patterns connecting SharePoint data with Azure SQL, Microsoft Fabric, and other systems, addressing common challenges such as schema drift, lookup fields, attachments, and change tracking.
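
A minimal sketch of reading a SharePoint list through Microsoft Graph and flattening it for loading into Azure SQL; the site and list IDs, access token, and column names are placeholders, and lookup-field handling is deliberately simplified.

```python
# Illustrative SharePoint list pull via Microsoft Graph; IDs, token, and columns are placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def read_list_items(site_id: str, list_id: str, token: str) -> list[dict]:
    """Page through a SharePoint list and return flattened field dictionaries."""
    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {token}"})
    url = f"{GRAPH}/sites/{site_id}/lists/{list_id}/items?expand=fields"
    rows = []
    while url:
        response = session.get(url, timeout=60)
        response.raise_for_status()
        payload = response.json()
        for item in payload.get("value", []):
            fields = item.get("fields", {})
            rows.append({
                "sharepoint_item_id": item.get("id"),
                "title": fields.get("Title"),
                # Lookup columns typically surface as <Name>LookupId rather than display values.
                "client_lookup_id": fields.get("ClientLookupId"),
                "last_modified": item.get("lastModifiedDateTime"),
            })
        url = payload.get("@odata.nextLink")  # server-driven paging
    return rows
```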

Benefits

  • Comprehensive benefits such as healthcare
  • Paid time off
  • Discretionary merit bonuses
  • Life and disability insurance
  • Retirement plans
  • Tailored learning opportunities