Analytics Engineer 1

Community Transit, Everett, WA (Hybrid)

About The Position

As an Analytics Engineer, you will bridge data architecture, data engineering, data analysis, and business intelligence in one role, building robust, scalable data infrastructure that puts the right data in the right place, in the right format, at the right time. Under limited supervision, you will build reports, run analyses, develop insights, and provide business recommendations to guide strategic decision-making across the agency. The position has a wide and deep scope, requiring extensive knowledge and skill across all four domains, as well as broad knowledge of several agency subject areas, such as operations, employees, customers, digital, maintenance, and finance.

Requirements

  • Three years of experience managing and querying data in a relational database environment (Snowflake, SQL Server, Oracle, etc.).
  • Three years of experience in data analysis using advanced tools (R, Python, etc.).
  • Two years of experience in Microsoft Power BI and Fabric.
  • Two years of experience modeling data in Star Schema.
  • Bachelor’s degree in statistics, computer science, mathematics, engineering, or a similar field.
  • An equivalent combination of education and experience to successfully perform the job duties is also accepted.
  • Strong knowledge of assigned department’s work program and data sources.
  • Sufficient knowledge of two other departments' work programs and data sources to provide review and backup assistance as needed.
  • Basic understanding of DevOps processes.
  • Fostering an inclusive workplace by upholding Community Transit’s core values in support of the organization’s vision and mission.
  • Inspiring team commitment, pride, and trust while promoting cooperation and motivating members to achieve shared goals.
  • Demonstrating courtesy, sensitivity, and respect in all interactions.
  • Ability to frame business questions into data analysis problems and develop solutions.
  • Proficient in intermediate data manipulation (data validation, window or custom functions, pivot/unpivot) and analysis using Excel, SQL, DAX, and Python/R.
  • Strong skills in Power BI report creation and Microsoft Fabric infrastructure (data lakes, workspaces, permissions, pipelines, etc.).
  • Data cleaning, validation, and review techniques.
  • Intermediate statistical skills for more complex data analysis (regression, clustering, hypothesis testing).
  • Strong problem-solving skills and critical thinking abilities.
  • Ability to work with limited supervision.
  • Ability to write technical documentation for standard processes.
  • Ability to communicate complex technical information to various audiences through reports and presentations.
  • Willingness to stay updated with the latest developments in the field of data analysis and engineering, and to continue learning and improving skills.
  • Strong organization and prioritization skills for handling multiple tasks and projects.
  • Team player: encouraging, communicative, supporting, contributing, and enthusiastic.
  • Certain knowledge and skill requirements may be learned or further developed after employment begins.
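As a hedged illustration of the intermediate data manipulation skills the requirements name (window-style calculations and pivot/unpivot), the sketch below uses Python with pandas; the dataset and column names are hypothetical, not taken from the agency's actual data sources:

```python
import pandas as pd

# Hypothetical ridership data; the schema is illustrative only.
df = pd.DataFrame({
    "route": ["201", "201", "202", "202"],
    "month": ["2024-01", "2024-02", "2024-01", "2024-02"],
    "boardings": [12000, 12500, 8000, 7900],
})

# Window-style calculation: total boardings per route, broadcast back
# to every row of that route, then each row's share of the total.
df["route_total"] = df.groupby("route")["boardings"].transform("sum")
df["share"] = df["boardings"] / df["route_total"]

# Pivot: one row per route, one column per month.
wide = df.pivot(index="route", columns="month", values="boardings")
print(wide)
```

The equivalent SQL would use a window function such as `SUM(boardings) OVER (PARTITION BY route)` followed by a `PIVOT`; the pandas `transform`/`pivot` pair expresses the same two steps.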

Responsibilities

  • Data engineering: design, implement, and maintain data processing workflows within the data warehouse, including developing new transformations, defining dependencies, and ensuring efficient, reliable execution across the pipeline.
  • Develop and optimize data models that transform and prepare data for high performance and alignment with business use cases.
  • Establish monitoring, testing, and validation practices, including automated data quality checks, to proactively detect and resolve errors.
  • Partner with managers across departments to design and deliver curated, actionable reports and dashboards that translate business requirements into scalable data solutions.
  • Develop and publish interactive reports in Power BI and Snowflake Streamlit, including creating compelling visualizations (charts, graphs, dashboards) tailored to stakeholder needs.
  • Prepare, validate, and analyze data to ensure accuracy and derive insights that inform strategic and operational decision-making.
  • Explore complex datasets to uncover trends, patterns, and actionable insights, communicating findings in a clear and business-focused manner.
  • Apply statistical and analytical techniques with tools such as SQL, Python, and R to interpret data trends and generate meaningful recommendations.
  • Enforce enterprise-wide standards for data consistency, integrity, and accuracy across all platforms and business domains.
  • Adhere to governance frameworks for version control, documentation, and lifecycle management.
  • Ensure work is in alignment with organizational architecture and compliance standards.
  • Continuously monitor and validate reporting products, ensuring long-term fitness for business use.
  • Document processes, codebases, data lineage, and reporting logic.
  • Discover, document, and manage data access policies that follow security, business, and regulatory requirements.
  • Enable, support, and encourage the use of data in decision-making.
  • Integrate with teams across the enterprise to understand subject matter and improve data maturity and culture change.
  • Perform other duties of a similar nature or level.
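The automated data quality checks mentioned in the responsibilities could take many forms; one minimal sketch, assuming a hypothetical trips table and illustrative rule set, is a validation function run before a table is published:

```python
import pandas as pd

# Hypothetical trips table; column names are assumed for illustration.
trips = pd.DataFrame({
    "trip_id": [1, 2, 3],
    "route": ["201", "202", "201"],
    "boardings": [45, 0, 30],
})

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures; an empty list means all checks pass."""
    failures = []
    if df["trip_id"].duplicated().any():
        failures.append("duplicate trip_id values")
    if df["boardings"].lt(0).any():
        failures.append("negative boardings")
    if df["route"].isna().any():
        failures.append("missing route values")
    return failures

print(check_quality(trips))
```

In practice such checks would run as an automated step in the pipeline, blocking publication and alerting the team when the returned failure list is non-empty.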

Benefits

  • Employees (and their families) are covered by medical, dental, vision, basic life and disability insurance.
  • Employees participate in the Public Employees Retirement System (PERS) and have the option to enroll in the agency’s deferred compensation plan.
  • In addition to WA Paid Sick Leave, employees in this position accrue 24 days of Paid Time Off (192 hours) in their first year and receive ten (10) paid holidays throughout the calendar year.
  • A full list of benefits and details can be found here.