Sr. Data Engineer

Beyond Finance
Chicago, IL

About The Position

As a Senior Data Engineer, you will build modern data pipelines that integrate data from a variety of sources into our analytics platform using scalable, extensible pipeline technologies. You will own small- to medium-sized projects that ultimately help the business grow and make better-informed decisions.

Requirements

  • Bachelor's degree in Computer Science or another technical field, or equivalent work experience
  • At least 5 years of experience in development, data pipelines, or DevOps, with at least two years working in a cloud environment
  • 3+ years of relevant experience with data tools such as Airflow, AWS Lambda, Kafka, Spark, Presto, MapReduce, AWS Glue, or similar platforms (see the sketch after this list for a flavor of this work)
  • Experience with cloud analytics platforms such as Google BigQuery, AWS Redshift, or Snowflake, and with standard relational databases such as Postgres, MySQL, or other transactional databases
  • Experience developing both batch systems and real-time streaming platforms, and a sense of the benefits of each approach
  • Strong SQL skills, including the ability to work through complex queries
  • Solid Python or Java skills; UNIX shell scripting is helpful
  • Experience setting up configuration-as-code tools such as Ansible, Terraform, or Chef
  • Experience with continuous integration, testing, and deployment using tools such as Git and Jenkins
  • Expertise in the design, creation, management, and business use of large datasets.
  • Excellent organizational, interpersonal, analytical, and problem-solving skills
  • Strong multitasking skills with the ability to balance competing priorities
  • Ability to work in a fast-paced environment where continuous innovation is desired and ambiguity is the norm
  • Experience with agile or other rapid application development methods
  • Demonstrated experience in coaching and mentoring other team members as appropriate
  • A willingness to participate in an on-call rotation
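
For a flavor of the pipeline work described above, here is a minimal, purely illustrative Airflow DAG. The DAG, task, and table names are hypothetical and do not describe Beyond Finance's actual stack; it simply sketches a daily extract-and-load job of the kind this role would own.

    # Minimal, illustrative Airflow DAG: extract from a source, load to a target.
    # All names here (orders_pipeline, extract_orders, load_orders) are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Pull raw records from a hypothetical upstream source.
        rows = [{"order_id": 1, "amount": 42.0}]  # placeholder payload
        context["ti"].xcom_push(key="rows", value=rows)

    def load(**context):
        # Write the extracted rows to a hypothetical analytics table.
        rows = context["ti"].xcom_pull(key="rows", task_ids="extract_orders")
        print(f"loading {len(rows)} rows into analytics.orders")

    with DAG(
        dag_id="orders_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",  # a batch cadence; a streaming design would differ
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract_orders", python_callable=extract)
        load_task = PythonOperator(task_id="load_orders", python_callable=load)

        extract_task >> load_task  # load runs only after a successful extract

Standardizing on a small pattern like this is what the Responsibilities section below means by building pipelines that need minimal per-pipeline customization.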

Nice To Haves

  • Some Docker experience

Responsibilities

  • Research various data tools and build consensus on a tool that will serve not only a particular use case but other upcoming use cases as well.
  • Build data pipelines, and think hard about how pipelines can be standardized and tooling built so that each new pipeline needs minimal customization.
  • Self-manage projects, plan ahead, and think through second- and third-order consequences while creating project milestones.
  • Understand the overall need, design a solution, and develop a data pipeline to integrate data from a source to a target.
  • Develop automated test cases to validate data integrity and consistency (see the sketch after this list).
  • Participate in the on-call rotation and troubleshoot issues to find the root cause.
  • Think about how to handle security issues surrounding access, PII, and business-sensitive data.
  • Design for scalability and availability, ensuring data pipelines either work or surface a clear error when they cannot proceed.
  • Be a team player in contributing your thoughts and ideas to the overall goals of the team.
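
To make the validation and fail-fast bullets above concrete, here is a minimal sketch of an automated data-integrity check. The orders table, its columns, and the checks themselves are hypothetical; it uses Python's built-in sqlite3 module so it runs standalone, though in practice the same queries would target Postgres, Redshift, Snowflake, or BigQuery.

    # Minimal, illustrative integrity checks that fail fast with a clear error.
    # The `orders` table and its columns are hypothetical.
    import sqlite3

    class DataIntegrityError(Exception):
        """Raised when a pipeline's output fails a validation check."""

    def check_orders(conn: sqlite3.Connection) -> None:
        cur = conn.cursor()

        # Consistency: every order must have a non-null, positive amount.
        bad = cur.execute(
            "SELECT COUNT(*) FROM orders WHERE amount IS NULL OR amount <= 0"
        ).fetchone()[0]
        if bad:
            raise DataIntegrityError(f"{bad} orders have a missing or non-positive amount")

        # Integrity: order IDs must be unique.
        dupes = cur.execute(
            "SELECT COUNT(*) - COUNT(DISTINCT order_id) FROM orders"
        ).fetchone()[0]
        if dupes:
            raise DataIntegrityError(f"{dupes} duplicate order_id values found")

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.99), (2, 20.00)])
        check_orders(conn)  # raises DataIntegrityError if any check fails
        print("orders table passed all checks")

Run as a pipeline task downstream of a load step, a check like this turns silent data corruption into a loud, attributable failure.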

Benefits

  • Considerable employer contributions for health, dental, and vision programs
  • Generous PTO, paid holidays, and paid parental leave
  • 401(k) matching program
  • Merit advancement opportunities
  • Career development & training
  • Our team spirit and culture! We cultivate an environment of community, connection, and belonging across our entire organization.