Data Engineer (Intermediate)

ampliFI, Naperville, IL
$95,000 - $110,000 | Hybrid

About The Position

The Intermediate Data Engineer is responsible for designing, building, and maintaining scalable, high-performance data pipelines and solutions. This role works closely with business stakeholders and development teams to transform complex data into actionable insights. The position requires hands-on experience with Big Data technologies, cloud platforms (especially AWS), and multiple programming languages. The ideal candidate is analytical, detail-oriented, and capable of architecting solutions that drive business value.

Requirements

  • 3+ years of experience in data engineering, ETL development, or a related field using Big Data methodologies.
  • Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related technical field.
  • Proficiency in Python, Pandas, Spark, and Java.
  • Hands-on experience with AWS Big Data services such as EC2, S3, EMR, Kinesis, DynamoDB, Athena, AWS Glue, and Redshift.
  • Experience with Unix/Linux environments.
  • Knowledge of source code management systems like Git.
  • Strong understanding of programming best practices, testing, and version control.
  • Experience building scalable, high-performance data pipelines and data processing workflows.
  • Ability to communicate complex technical concepts to both technical and non-technical stakeholders.

Nice To Haves

  • Master’s degree in Computer Science, Data Engineering, Information Technology, or related field.
  • Experience with Terraform or other infrastructure-as-code (IaC) tools.
  • Familiarity with Agile development methodologies and software lifecycle management.
  • Experience with data visualization tools and reporting frameworks.
  • Exposure to machine learning pipelines or analytics platforms.
  • Knowledge of cloud security, data privacy, and compliance best practices.

Responsibilities

  • Design, develop, and maintain modular, reusable, and efficient code to solve complex, real-world data problems.
  • Conduct regular peer code reviews to ensure quality, maintainability, and compliance with industry best practices.
  • Collaborate with cross-functional teams to understand business requirements and ingest diverse data sources.
  • Research, experiment, and implement leading Big Data technologies on AWS.
  • Develop and maintain ETL pipelines, data warehouses, and streaming data solutions.
  • Assist in defining and enforcing data architecture, governance, and platform standards.
  • Participate in developing thought leadership and best practices for data engineering at ampliFI.
  • Transform and visualize data for both ad hoc analyses and automated product-level solutions.
  • Troubleshoot, optimize, and monitor data workflows and pipelines for performance, reliability, and security.
  • Collaborate with internal teams and stakeholders to communicate technical concepts clearly and manage project interdependencies.

Benefits

  • Competitive pay plus 401(k) with employer match
  • Medical, dental, vision, and life insurance
  • Voluntary cafeteria plans, including voluntary life, accident, hospital, critical care, and parking/transit options
  • Tuition reimbursement
  • Paid time off, company holidays, and parental leave
  • Employee Assistance Program
  • Hybrid work environment with flexible hours
  • Onsite perks including gym access and snacks
  • Employee recognition programs celebrating milestones and achievements
  • Growth opportunities within a supportive, team-oriented environment