Data Engineer Lead - Contract

Gallagher, Rolling Meadows, IL
Remote

About The Position

The Data Engineering Lead will be responsible for the design, development, implementation, and support of data initiatives throughout Gallagher, ensuring that optimal data delivery architecture is consistent across ongoing projects. You will lead and mentor data engineers and data scientists, ensuring alignment with business goals and fostering collaboration across multiple teams, systems, and products. You will also oversee deliverables and provide ongoing support to ensure project success and operational excellence. If the prospect of optimizing or even re-designing our company's integration and data architecture to support our next generation of products and data initiatives excites you, we should explore this together.

Please note the additional position details below: This is a Temp-To-Hire, W-2 position; we are not able to do 1099 or C2C. It is a fully remote role that must be based in the U.S. You must meet our U.S. eligibility requirements for work authorization as noted under "Additional Information" at the bottom of the job description.

Requirements

  • A bachelor's degree in Information Technology or a related technical field.
  • Proven experience in leading cross-functional teams, managing business users, and driving alignment between technical and business objectives.
  • 3+ years of experience leading a technical team.
  • 7+ years of data engineering experience leveraging technologies such as Snowflake, Azure Data Factory, ADLS Gen 2, Logic Apps, Azure Functions, Databricks, Apache Spark, Scala, Synapse, SQL Server.
  • Experience with scripting tools and languages such as PowerShell, Python, Scala, Java, and XML.
  • Understanding of the pros, cons, and best practices of implementing a data lake using Microsoft Azure Data Lake Storage.
  • Experience structuring a data lake for reliability, security, and performance.
  • Experience implementing ETL for Data Warehouse and Business Intelligence solutions.
  • Ability to read and write effective, modular, dynamic, parameterized, and robust code, and to follow established coding standards and the ETL framework.
  • Strong analytical, problem-solving, and troubleshooting abilities.
  • Good understanding of unit testing, software change management, and software release management.
  • Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code fundamentals.
  • Experience performing root cause analysis on data and processes to answer specific business questions and identify opportunities for improvement.
  • Experience working within an agile team.
  • Exceptional communication and interpersonal skills, with the ability to influence and guide stakeholders at all levels.
  • Demonstrated ability to manage relationships with business users, address their concerns, and ensure their needs are met through effective communication and collaboration.
  • Experience in navigating ambiguity and uncertainty in projects, with a track record of delivering successful outcomes.
  • Proven experience managing deliverables across multiple teams and providing ongoing support for data systems and integrations.

Responsibilities

  • Lead requirements gathering, scope definition, and technical design of integration workflows, ensuring alignment with business objectives and providing strategic direction to the team.
  • Manage and mentor a team of Data Engineers (onsite and offshore).
  • Proactively identify risks and uncertainties in data initiatives, and develop strategies to mitigate them while ensuring project success.
  • Provide leadership, direction, and coordination for development and support teams, including globally located resources, ensuring effective communication and collaboration.
  • Act as a liaison between technical teams and business users, ensuring that data solutions meet business needs and addressing concerns or uncertainties effectively.
  • Manage deliverables across multiple teams, ensuring timely completion and alignment with business priorities.
  • Seek out, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
  • Build the infrastructure required for optimal ETL/ELT pipelines to ingest data from a wide variety of data sources using Microsoft Azure technologies such as Azure Data Factory and Databricks.
  • Construct and maintain enterprise-level integrations using the Snowflake platform, Azure Synapse, Azure SQL, and SQL Server.
  • Create data tools that help data analytics and data science team members build and optimize our product into an innovative industry leader.
  • Lead troubleshooting efforts, drive root-cause analysis, and coordinate with infrastructure teams to resolve incidents while maintaining transparency with stakeholders.
  • Design analytics tools that utilize the data pipeline to deliver actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Provide ongoing support for data systems and integrations, ensuring reliability and performance while addressing issues promptly.

Benefits

  • Medical/dental/vision plans, which start from day one!
  • Life and accident insurance
  • 401(K) and Roth options
  • Tax-advantaged accounts (HSA, FSA)
  • Educational expense reimbursement
  • Paid parental leave
  • Digital mental health services (Talkspace)
  • Flexible work hours (availability varies by office and job function)
  • Training programs
  • Gallagher Thrive program – elevating your health through challenges, workshops and digital fitness programs for your overall wellbeing
  • Charitable matching gift program