About the position
We are seeking a Data Engineering Lead to join the Technology team at CAIS, a fintech firm building an open marketplace for alternative investments. This role is responsible for developing and maintaining the solution architecture that makes data central to CAIS. The Data Engineering Lead will collaborate with analytics and product teams to improve data models, increase data accessibility, and foster data-driven decision making. Ideal candidates have strong cross-functional communication skills, sharp problem-solving abilities, and a drive to deliver on critical projects.
Responsibilities
- Design, develop and automate secure, robust, and high-performing data platform infrastructure to drive CAIS business growth
- Use modern approaches to process data into Snowflake using SQL and code-based ELT (and/or ETL)
- Develop production-ready pipelines and robust, enterprise-class solutions
- Analyze production workloads and develop data workflow strategies that optimize the Snowflake warehouse for scale and efficiency
- Design data platform capabilities, pipelines and models for optimal storage and retrieval that represent the product entities and meet business requirements
- Leverage best-of-breed technologies, including open-source and cloud solutions, to build sophisticated features that enhance platform capabilities and support business analytics
- Build and support pipeline orchestration via Airflow
- Provide technical leadership and data architecture expertise, with a preference for Snowflake
Requirements
- Technical leadership and data architecture experience (Snowflake preferred)
- Degree in Engineering, Mathematics, or another technical field
- Strong understanding of AWS and how to create secure infrastructure and data pipelines
- Architecture and technical design experience with AWS services such as Lambda and S3, and with Kafka
- 7+ years of relevant professional experience
- 7+ years of strong programming experience
- Expertise in extracting and transforming data
- Object-oriented programming skills
- Ability to write easy-to-scale, high-quality code
- 2+ years of experience with Python
- 10+ years of data warehousing experience
- Examples of complex SQL/ETL/ELT data workflows
- 7+ years of experience as a technical lead covering design, development, and delivery
- Experience with Apache Airflow
- 7+ years of experience with schema design and dimensional data modeling
- Experience with integration tools such as Fivetran or Airbyte
- Experience with Kubernetes
- Experience with IaC (Terraform)
- Experience with or knowledge of Agile Software Development methodologies
- Experience deploying ML pipelines / MLOps (e.g., MLflow)
Preferred qualifications
- Prior experience working within Alternative Investments and/or Financial Services
- Good knowledge of financial markets and financial instruments
- AWS Certified Solutions Architect (Associate) or similar certification
- Experience with GitHub Actions
- Experience in developing integrations with Tableau (or similar Business Intelligence tool)
- Experience developing REST APIs
- Experience with distributed computing
- Experience with Athena
- Experience with dbt
- Experience with functional programming
- Experience with the JVM stack (Java or Kotlin)
- Experience with machine learning