Senior Data Engineer (Hartford, CT)

The Hartford | Hartford, CT

About The Position

Staff Data Engineer - GE07CE

We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.

Hartford Fire Insurance Company in Hartford, CT has the following opening for a Senior Data Engineer. A summary of the duties of the role appears under Responsibilities below.

Requirements

  • Position requires a Bachelor’s degree in Computer Science, Engineering, or a related field and 6 years of progressive, post-baccalaureate experience in the job offered or in a related occupation.
  • Programming or systems analysis developing integration solutions
  • Web technologies, data analysis, or data manipulation
  • C#, SQL Server, Visual Studio, SSMS, SSIS, SSRS, SSAS, SVN, GitHub, Rally, JSON, XML, or REST
  • Working with Agile development teams, methodologies, and toolsets
  • Developing unit test suites and test-driven development
  • Application performance testing and tuning

Responsibilities

  • Provide technical leadership by enabling the vision of the application architecture and safeguard the integrity of the application environment.
  • Assist architects in designing and implementing application integration involving a range of applications, from third-party off-premises cloud applications to on-premises legacy applications.
  • Take responsibility for the end-to-end technical solution, working across team boundaries to ensure the success of the overall solution.
  • Work closely with vendor software providers to drive optimal solutions.
  • Develop application components and oversee technical deliverables from junior developers throughout the software development life cycle.
  • Identify and validate internal and external data sources for availability and quality.
  • Work with SMEs to describe and understand data lineage and suitability for a use case.
  • Design and develop SSIS, SSRS, SSAS, and SQL Server packages, scripts, reports, and cubes.
  • Create and support data assets and build data pipelines that align to modern software development principles for further analytical consumption.
  • Perform data analysis to ensure quality of data assets.
  • Monitor and address production issues proactively.
  • Create summary statistics/reports from SQL Server and Snowflake databases.
  • Support the conversion of SSIS, SSRS, and SSAS jobs onto the new tech stack (AWS S3, EMR, PySpark, and Snowflake); see the sketch after this list.
  • Extract data from source systems and data warehouses, and deliver it in a pre-defined format using standard database query and parsing tools.
  • Understand ways to link or compare information already in our systems with new information.
  • Perform preliminary exploratory analysis to evaluate nulls, duplicates, and other issues with data sources.
  • Produce code artifacts and documentation using GitHub for reproducible results and hand-off to other data science teams.
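
The conversion and data-quality duties above imply a PySpark workflow that reads raw extracts from S3, profiles them for nulls and duplicates, and loads the result into Snowflake. The following is a minimal sketch of that kind of job, not a description of The Hartford's actual pipelines: the bucket path, the claim_id key, the table and database names, and the Snowflake connection options are illustrative placeholders, and the Spark-Snowflake connector is assumed to be available on the cluster.

# Hypothetical illustration only: paths, keys, and connection options
# are placeholders, not details from the posting.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("claims-profile-and-load")   # hypothetical job name
    .getOrCreate()
)

# Read a raw extract previously landed in S3 (placeholder path).
claims = spark.read.parquet("s3://example-bucket/raw/claims/")

# Preliminary exploratory checks: null counts per column.
null_counts = claims.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in claims.columns]
)
null_counts.show(truncate=False)

# Duplicate check on an assumed business key.
dupes = (
    claims.groupBy("claim_id")
    .count()
    .filter(F.col("count") > 1)
)
print(f"duplicate claim_id rows: {dupes.count()}")

# Load the deduplicated result into Snowflake via the Spark-Snowflake
# connector (connection options are placeholders).
sf_options = {
    "sfURL": "example.snowflakecomputing.com",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "STAGE",
    "sfWarehouse": "ETL_WH",
    "sfUser": "etl_user",
    "sfPassword": "********",
}

(
    claims.dropDuplicates(["claim_id"])
    .write.format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "CLAIMS_STAGE")
    .mode("overwrite")
    .save()
)

On EMR, a job like this would typically be launched with spark-submit, with the Snowflake connector JARs supplied via --packages or preinstalled on the cluster.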