Data Engineer

DEVCO MANAGEMENT COMPANY LLC · Bellevue, WA
Posted 12 days ago · $90,000 - $105,000 · Hybrid

About The Position

DevCo is seeking a full-time Data Engineer to join our team in Bellevue, WA in a hybrid in-office capacity. The Data Engineer owns data ingestion, transformation, and warehouse architecture across DevCo’s operational and financial systems. This role designs and maintains scalable, production‑grade pipelines that integrate data from third‑party platforms and internal systems, ensuring high‑quality, reliable datasets for analytics, BI, and decision‑making.

About The Company

DevCo Residential Group is an integrated development and investment company focused on multi-family communities. Founded in 1994, the company and its affiliates develop, own, and manage over 14,000 affordable and market-rate apartment units throughout the United States. Headquartered in Bellevue, Washington, DevCo is one of the largest providers of affordable housing in Washington State.

Mission: DevCo Residential Group’s mission is to develop, construct, and manage high-quality multifamily housing that provides stability, fosters growth, and delivers long-term value to our residents and stakeholders.

Vision: DevCo’s vision is to be a leading developer, builder, and manager of quality multifamily housing throughout the western US.

Values:
  • Quality: We deliver excellence in every aspect of our work.
  • Commitment: We honor our promises with unwavering dedication.
  • Teamwork: We achieve more together through collaboration and respect.
  • Integrity: We uphold the highest ethical standards in all we do.

Requirements

  • 2+ years of experience in data engineering or analytics engineering.
  • Strong SQL skills and experience building production data pipelines.
  • Experience integrating with APIs and third‑party SaaS platforms.
  • Hands‑on experience with cloud data warehouses (Snowflake, BigQuery, Redshift, Azure Synapse).
  • Familiarity with orchestration and transformation tools (e.g., Airflow, Prefect, dbt, Fivetran, Stitch).
  • Strong understanding of dimensional and analytical data modeling best practices.
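For candidates gauging the level expected here, the layered, dimensional modeling called out above can be sketched in miniature: land raw source rows as-is, then derive a deduplicated, surrogate-keyed dimension table for analytics. This is only an illustration; the table and column names are invented, and SQLite stands in for a real cloud warehouse.

```python
import sqlite3

# Raw layer: rows exactly as they arrive from a (hypothetical) source system.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE raw_leases (lease_id INTEGER, property TEXT, rent REAL)")
cur.executemany(
    "INSERT INTO raw_leases VALUES (?, ?, ?)",
    [(1, "Maple Court", 1450.0), (2, "Maple Court", 1500.0), (3, "Cedar Flats", 1200.0)],
)

# Curated / analytics-ready layer: a deduplicated dimension with a surrogate key.
cur.execute(
    "CREATE TABLE dim_property (property_key INTEGER PRIMARY KEY, property_name TEXT UNIQUE)"
)
cur.execute(
    "INSERT INTO dim_property (property_name) "
    "SELECT DISTINCT property FROM raw_leases ORDER BY property"
)

rows = cur.execute(
    "SELECT property_key, property_name FROM dim_property ORDER BY property_key"
).fetchall()
print(rows)  # [(1, 'Cedar Flats'), (2, 'Cedar Flats'... -> [(1, 'Cedar Flats'), (2, 'Maple Court')]
```

In production the same pattern would typically live in dbt models or warehouse SQL rather than Python, but the raw → curated separation is the core idea.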

Responsibilities

Data Integration & Pipeline Engineering
  • Design, build, and maintain automated data pipelines consuming data via APIs, SFTP, flat files, databases, and third‑party connectors.
  • Own integrations with Yardi, Procore, Smartsheet, Northspyre, HappyCo, and other operational platforms.
  • Implement modern ELT/ETL workflows using orchestration and transformation frameworks.

Data Warehouse Architecture
  • Architect and maintain the enterprise data warehouse, including schema design, partitioning strategy, indexing, and performance optimization.
  • Develop layered data models (raw, curated, analytics‑ready) that support enterprise reporting and BI.

Data Quality, Reliability & Observability
  • Establish data quality checks, reconciliation rules, freshness monitoring, and anomaly detection.
  • Build logging, alerting, and monitoring to ensure pipeline reliability and SLA adherence.
  • Manage scalability and performance as data volume and usage grow.

Documentation & Governance
  • Document data sources, pipeline logic, schemas, lineage, and ownership.
  • Support data governance standards, including security, access controls, and handling of sensitive data.

Collaboration
  • Partner closely with the Data Analyst to ensure the warehouse supports semantic modeling and reporting needs.
  • Collaborate with Finance, Development, Construction, Property Management, and Accounting teams to validate business logic and requirements.
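The freshness-monitoring and SLA-adherence responsibilities above reduce to a simple recurring question: which sources have not loaded within their agreed window? A minimal sketch of such a check follows; the source names, timestamps, and 24-hour SLA are all invented for illustration.

```python
from datetime import datetime, timedelta, timezone

def stale_sources(last_loaded: dict, sla: timedelta, now: datetime) -> list:
    """Return the names of sources whose most recent load breaches the SLA."""
    return sorted(name for name, ts in last_loaded.items() if now - ts > sla)

# Hypothetical last-successful-load timestamps per source system.
now = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
loads = {
    "yardi": now - timedelta(hours=2),       # within a 24h SLA
    "procore": now - timedelta(hours=30),    # breaches a 24h SLA
    "smartsheet": now - timedelta(hours=5),  # within a 24h SLA
}
print(stale_sources(loads, timedelta(hours=24), now))  # ['procore']
```

In practice this logic would run inside an orchestrator (e.g., an Airflow sensor or a dbt source freshness check) and feed the alerting described above, rather than as a standalone script.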