Senior Data Engineer - Data Infrastructure and Architecture

C-4 Analytics | Wakefield, MA
$120,000 - $180,000 | Hybrid

About The Position

C-4 Analytics is looking for an experienced Senior Data Engineer with expertise in data infrastructure and architecture to help shape the future of our data-driven digital marketing platforms. As part of our growing Product, Engineering, and AI team, you'll play a critical role in identifying and bringing together our diverse data sources and orchestrating intelligent systems. We need you to lead with an AI-forward mindset, designing and managing the pipelines, platforms, and orchestration technologies that transform information into actionable insights. We're not just processing data; we're transforming it into organizational intelligence. As our Data Engineering Virtuoso, you'll build enterprise-grade AI pipelines, turning unstructured data into decision-making gold by creating intelligent data platforms at scale. The ideal candidate will have a strong background in ETL/ELT pipelines, data warehouse connectivity, data cleaning, normalization, database architecture, database optimization, and orchestration processes. This role will be responsible for orchestrating and delivering enterprise-grade AI pipelines: connecting disparate data sources to Snowflake (our data warehouse), ensuring the cleanliness and normalization of data, and implementing database architecture best practices.

Requirements

  • Academically Grounded: Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
  • Seasoned Practitioner: 5+ years of experience in data engineering, with a focus on data infrastructure, architecture, and database management.
  • Code Craftsperson: Fluent in Python and SQL, expressing complex logic with elegant simplicity.
  • Database Strategist: Knows when to deploy relational, vector, graph, or document data models, with a strong understanding of database architecture principles, including sharding, replication, indexing, and optimization techniques.
  • Data Driven: Proficiency in designing and developing ETL/ELT pipelines for data integration and transformation.
  • Cloud Navigator: Confidently guiding projects through the AWS ecosystem and hands-on experience with Snowflake or similar cloud-based data warehouse platforms.
  • Dynamic Collaborator: An adept problem-solver with keen attention to detail and the ability to work in a fast-paced, collaborative environment. Your data and pipelines will directly power the core platforms our client teams rely on to make million-dollar decisions.
  • Infrastructure Poet: Expressing infrastructure needs as clear, reproducible code; packaging or containerizing applications for consistency across environments. Knowledge of information architecture best practices, data modeling, and data governance.

Nice To Haves

  • Familiarity with data manipulation and experience with Python libraries such as Flask, FastAPI, Pandas, PySpark, and PyTorch, to name a few
  • Proficiency in statistics and/or machine learning libraries like NumPy, matplotlib, seaborn, scikit-learn, etc.
  • Experience in building ETL/ELT processes and data pipelines with platforms like Airflow, Dagster, or Luigi
  • Visualization Artist: Creating compelling visual narratives from complex data patterns
  • Statistical Thinker: Grounding engineering decisions in mathematical rigor
  • Framework Explorer: Experience with web frameworks that extend data's utility
  • Security Mindful: Navigating enterprise security with confidence and care

Responsibilities

  • Design, develop, and maintain proof-of-concepts using cutting-edge technologies, then refine them into production-ready solutions.
  • Craft intuitive tools that elevate data scientists and analysts to their highest potential.
  • Collaborate with cross-functional teams to ensure that data storage and organization align with business needs and objectives.
  • Implement database architecture best practices, including database sharding, replication strategies, indexing, and optimization techniques to enhance data performance.
  • Orchestrate enterprise-grade AI pipelines for complex data flows, bringing harmony to disparate sources through batch and streaming pipelines.
  • Evaluate and optimize data storage and retrieval systems based on relationships, data access patterns, cost-effectiveness, and performance requirements.
  • Design elegant solutions and document your vision so others can follow your path.
  • Provide leadership and guidance on information architecture decisions, ensuring that data is stored, organized, and accessed in the most efficient and effective manner.

Benefits

  • Career development programs
  • Unlimited paid time off
  • Competitive compensation commensurate with experience and qualifications
  • Cold brew on tap
  • Free beverages
  • Stocked snack pantry
  • Free weekly lunch
  • Monthly wellness events like Yoga and Paint & Sip