Data Engineer

Barton Associates, Peabody, MA

About The Position

The Data Engineer will embed with Barton’s Data Engineering team to work on a variety of data-related projects. This role will be exposed to the full data lifecycle, including, but not limited to, understanding and documenting requirements, working with transactional system data, ensuring data quality, building ETL packages, integrating new sources into the data warehouse, and creating reporting and business intelligence. A successful applicant will have a natural curiosity about data and systems, enjoy solving complex challenges, and love all things data.

Requirements

  • Strong communication (written and verbal) and interpersonal skills.
  • 1–3 years of professional experience as a Data Engineer, BI Developer, or in a similar role.
  • Experience with scripting languages such as Python.
  • Experience architecting and building data pipelines and optimizing them for speed (of imports, exports, and queries) and reliability.
  • Understanding of ETL, Data Modeling, and large-scale data processing concepts is a must.
  • Must be able to work independently as well as part of cross-functional teams.
  • Extreme proficiency in writing performant SQL against large data volumes.
  • Experience with data visualization tools and software such as Tableau, Domo, or Looker.
  • Bachelor’s degree in engineering, computer science, mathematics, economics, or a relevant field.

Nice To Haves

  • Experience working directly with business stakeholders to translate between data and business needs.
  • Working knowledge of cloud-native data engineering infrastructure.
  • Experience with Salesforce, Domo, Databricks, Snowflake, dbt, and Airflow.

Responsibilities

  • Work with internal stakeholders to deliver business intelligence, from requirements documentation through data pipeline creation and report/dashboard delivery.
  • Work with APIs to ingest and process data.
  • Translate business use cases into a set of technical deliverables.
  • Build data pipelines and data ingestion processes using SQL and Python.
  • Work with industry-standard ETL and data orchestration tools such as Airflow and Informatica.
  • Build and scale a robust data infrastructure that serves the entire organization.
  • Work with stakeholders, including the Executive, Product, and Customer teams, to support their data infrastructure needs while assisting with data-related technical issues.
  • Keep current on big data and data visualization technology trends; evaluate technologies, build proofs of concept, and make recommendations based on their merit.
  • Support data engineering efforts, including database and API design, data extraction/transformation/load, and data aggregation/integration.

Benefits

  • Vibrant and energetic team environment
  • Consistent Monday–Friday schedule
  • Paid time off
  • Paid holidays
  • Team events and fundraisers
  • 401k with match
  • Excellent health insurance (low deductible PPO, dental, vision) with discounted gym membership
  • Promote-from-within philosophy