Senior Software Engineer, Data

Attain Data · Redwood City, CA
56d · Hybrid

About The Position

Built for consumers and companies alike. In a world driven by data, we believe consumers and businesses can coexist. Our founders had a vision to empower consumers to leverage their greatest asset, their data, in exchange for modern financial services. Built with this vision in mind, our platform allows consumers to access savings tools, earned wages, and rewards without cost or hidden fees. In exchange, they give permission to use their real-time data for research, insights, and targeted advertising. At Attain, your contribution will help us build a more equitable and efficient data-sharing ecosystem, whether helping consumers access modern financial services or helping businesses leverage data to achieve better outcomes. You'll have the opportunity to work directly with hands-on leaders and mission-driven individuals every day.

Attain Office Hybrid Schedule (where applicable):

  • Redwood City, CA: Mondays in-office (for stand-ups and all-hands) and choice of three days between Tuesday and Friday
  • Chicago, IL & New York, NY: 4 days in-office; 1 day remote

You'll be a great fit for the role if you:

  • Enjoy building reliable data systems that power high-impact decisions and products
  • Care deeply about data quality, semantics, and business logic, not just pipelines
  • Proactively identify when data is misleading, incomplete, or broken
  • Communicate clearly across technical and non-technical audiences
  • Embrace feedback and thrive in a collaborative, fast-paced environment

Requirements

  • 6+ years of experience in backend systems or data engineering roles
  • Proficiency in Python and familiarity with data orchestration tools (e.g., Airflow, GCP Workflows)
  • Experience working with relational databases and large-scale cloud data warehouses
  • Exposure to cloud infrastructure (GCP, AWS, or Azure)

Nice To Haves

  • Familiarity with gRPC, Protobuf, or GraphQL
  • Experience in AdTech, MarTech, or high-volume data environments is a plus
  • Experience proactively working with modern AI-enhanced development tools (e.g., Cursor, Windsurf, Cody, etc.) and a curiosity for emerging AI workflows

Responsibilities

Build and Scale Our Data Infrastructure

  • Design and maintain data pipelines and orchestration layers using modern tools (e.g., Airflow, GCP Workflows)
  • Own data workflows across cloud warehouses (BigQuery, Snowflake) and transactional stores
  • Develop monitoring, alerting, and testing for data health and anomalies

Ensure Data Integrity and Meaning

  • Partner with product, analytics, and engineering teams to validate business logic and definitions
  • Define and maintain a clear, documented source of truth for key datasets
  • Own the semantic correctness of our data, not just its arrival

Collaborate and Contribute Across the Stack

  • Write thoughtful, quality code that is readable, testable, and easily maintainable
  • Participate in architecture reviews, agile ceremonies, and cross-functional planning
  • Help improve our data modeling, cataloging, and governance practices

What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Industry

Securities, Commodity Contracts, and Other Financial Investments and Related Activities

Education Level

No Education Listed
