Senior Data Engineer

Fidelity Investments
Durham, NC (Hybrid)

About The Position

This is a Data Engineer position for a development Chapter in ET. The role is for a hardworking, highly motivated Lead Engineer with strong expertise in analyzing existing solutions, designing new solutions, and helping team members build robust and scalable solutions using the best software design and development practices. In this role, you will be responsible for leading, engineering, and developing quality software components and applications for brokerage products. You will build, modernize, and maintain Core & Common tools and Data Solutions, and you will apply and adopt a variety of cloud-native technologies for these products. In addition to building software, you will have an opportunity to help define and implement development practices, standards, and strategies. This position can be located in Durham (NC), Merrimack, Smithfield, or Westlake.

Fidelity is passionate about making financial expertise broadly accessible and effective. A privately held company, Fidelity values creating a work environment that attracts the best talent, reflects its commitment to associates, and is proud of its diverse and inclusive workplace.

Requirements

  • Strong expertise in analyzing existing solutions, designing new solutions, and helping team members build robust and scalable solutions.
  • Strong experience with Oracle databases and Snowflake data warehousing.
  • Proficiency in Unix and shell scripting.
  • Hands-on experience with AWS Data Engineering services including AWS Batch, S3, EMR, EC2.
  • Expertise in Informatica for data integration and ETL processes.
  • Solid understanding of data modeling and data architecture principles.
  • Strong SQL skills for data querying and manipulation.
  • Familiarity with CI/CD tools such as Jenkins, Git, and Artifactory.
  • Knowledge of data governance and data quality best practices.
  • Knowledge of Python for data processing and automation.
  • Solid understanding of data warehousing concepts and ETL processes.
  • Excellent analytical and problem-solving skills.
  • Ability to work independently and as part of a team.
  • Strong communication and collaboration skills, including the ability to convey data insights to non-technical stakeholders.
  • Ability to translate business requirements into data-driven solutions.
  • Ability to analyze large datasets to identify trends and insights.
  • Detail-oriented with a focus on delivering high-quality work.
  • Adaptability to work in a fast-paced and dynamic environment.
  • Collaborative mindset with a willingness to share knowledge and mentor team members.
  • Ability to work effectively in cross-functional teams and contribute to team success.
  • Open to feedback and continuous improvement.

Nice To Haves

  • Experience with big data technologies (e.g., Hadoop, Spark).
  • Proficiency in developing and maintaining dashboards and reports using data visualization tools (e.g., Tableau, Power BI).

Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using Oracle, Snowflake, AWS Batch, and Python.
  • Analyze system requirements and identify system specifications.
  • Implement and manage data integration solutions to ensure seamless data flow across systems.
  • Develop and optimize Unix shell scripts for automation and data processing tasks.
  • Utilize AWS Batch for scheduling and executing ETL pipelines and batch processing jobs.
  • Implement CI/CD pipelines using tools like Jenkins, Git, and Artifactory to automate deployment and integration processes.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
  • Ensure data quality, integrity, and security across all data platforms.
  • Troubleshoot and resolve data-related issues in a timely manner.
  • Document data engineering processes and maintain comprehensive technical documentation.
  • Help define and implement development practices, standards, and strategies.
  • Lead the engineering and development of quality software components and applications for brokerage products.
  • Build, modernize, and maintain Core & Common tools and Data Solutions.
  • Apply and adopt a variety of cloud-native technologies for these products.

Benefits

  • Diverse and inclusive workplace where we respect and value our associates for their unique perspectives and experiences.
  • Reasonable accommodations for applicants with disabilities who need adjustments to participate in the application or interview process.