Data Engineer

Massachusetts Bay Transportation Authority
Boston, MA
Onsite

About The Position

The Data Engineer will be responsible for expanding, enhancing, and maintaining the MBTA's data warehouse and related infrastructure for effective use by internal and external stakeholders. The position will be a vital partner in implementing the MBTA's broader data strategy. The role involves data warehouse management, including SQL-based programming to structure data flows and data sources, providing input on data architecture, and managing the Extract, Transform, and Load (ETL) process, automated data flows, error handling, and other related tasks. It also involves working closely with other members of the Information Management and Technology team to promote effective data utilization. This individual will ensure a positive end-user experience and application availability, address errors efficiently, and oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.

Requirements

  • Bachelor’s Degree from an accredited institution in Computer Science, Information Systems, Business Administration, or another related field.
  • Three (3) years of experience in data engineering or a related role.
  • Experience working within an application or database production support role, and/or experience with scripting and object-oriented programming languages, and/or experience with relational databases.
  • Experience using ETL tools in a data analysis, data engineering, or similar role.
  • Business Intelligence (BI) knowledge with exposure to BI from an ETL perspective.
  • Demonstrable experience with SQL and core data management principles (such as ETL/ELT, data warehousing architecture, query design, and performance tuning).
  • Experience defining/modeling/implementing an Enterprise Data Warehouse using best practices.
  • Experience with database schemas, object management, data modeling and architecture, and data warehouse design.
  • Experience with interactive reporting platforms (Tableau, Power BI, etc.).
  • Proven track record of self-starting and growth in related field(s).
  • Experience developing and maintaining formal documentation that describes data and data structures, including data models.
  • Superior attention to detail and an ability to think critically and conceptually.
  • Team-oriented and flexible, with a proven track record of collaborating with multiple stakeholders.
  • Strong verbal and written communication skills.
  • Ability to quickly learn new technologies.
  • A High School Diploma or GED with an additional seven (7) years of directly related experience substitutes for the Bachelor’s degree requirement.
  • An Associate’s degree from an accredited institution with an additional three (3) years of directly related experience substitutes for the Bachelor’s degree requirement.
  • A Master’s degree in a related subject substitutes for two (2) years of general experience.
  • A nationally recognized certification, or statewide/professional certification in a related field substitutes for one year of experience.

Nice To Haves

  • Hands-on experience and expertise in advanced SQL, advanced Python programming, AWS services (such as S3, Redshift, Glue ETL, IAM, DMS, AppFlow, VPC, and others), Snowflake, Fivetran, Informatica, Kafka, Airflow, Oracle Cloud, and other open-source tools.
  • Six (6) years of data warehouse (DWH) experience.
  • Three (3) years of exposure to Business Intelligence (BI) from an ETL perspective.
  • Six (6) years of ETL development experience.
  • Three (3) years of cloud-based technology experience.
  • Proficiency with interactive reporting platforms (Tableau, Power BI, etc.).
  • Proven success documenting processes, distributing documentation to end-users, and supporting those users.
  • Five (5) years' experience with advanced data management systems (e.g., PostgreSQL, etc.).
  • Deep expertise in data modeling and structuring using Python/R.
  • Systems migration experience is highly desirable.
  • Experience in high volume data environments.

Responsibilities

  • Design, develop, and document ETL processes, technical architecture, data pipelines, and performance.
  • Develop integration process data flows and data mapping analyses.
  • Address application and data-related problems regarding systems integration, compatibility, and multiple-platform integration.
  • Develop and promote data integration methodologies and standards.
  • Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of application and data architecture.
  • Identify and develop opportunities for code and data reuse, migration, or retirement of systems.
  • Work independently and collaboratively to identify issues and perform root-cause analysis, leading to proper and timely issue resolution.
  • Take ownership of data flow errors, communicating about them and working to resolve them as efficiently as possible.
  • Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement.
  • Work closely with the rest of the Business Intelligence team and other internal stakeholders to understand the needs of the business and build data resources to best meet those needs.
  • Perform all other duties and projects that may be assigned.

Benefits

  • Accrued paid sick leave
  • Monthly transportation pass