This job is closed
We regret to inform you that the job you were interested in has now been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.
About the position
As a Senior Data Engineer at Arcadia, you will be responsible for designing, developing, and maintaining the data infrastructure and pipelines. Collaborating with the Data team and other engineering and operations teams, you will ensure the smooth flow and availability of high-quality data for analysis and reporting purposes. Your expertise will be crucial in optimizing data workflows, ensuring data integrity, and scaling the data infrastructure to support the company's growth. This is an exceptional opportunity to work with cutting-edge technology, influence the development of a world-class data ecosystem, and be part of a fast-paced, tightly knit team.
Responsibilities
- Design, develop, and maintain scalable and efficient data pipelines in an AWS environment, using Snowflake, Fivetran, Prefect, and dbt.
- Collaborate with data analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions.
- Design, build, and maintain tooling for data platform interaction, including CI/CD pipelines, testing frameworks, and command-line tools.
- Implement best practices for data ingestion, transformation, and storage to ensure data integrity and accuracy.
- Optimize and tune data pipelines for improved performance, scalability, and reliability.
- Monitor data pipelines and address any issues or bottlenecks to ensure uninterrupted data flow.
- Develop and maintain documentation for data pipelines and facilitate knowledge sharing.
- Implement data governance and security measures to ensure compliance with industry standards.
- Stay updated with emerging technologies and trends in data engineering and recommend their adoption as appropriate.
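The ingestion-and-integrity responsibilities above can be sketched in plain Python. This is a minimal, hypothetical illustration only (the function and field names are invented); Arcadia's actual pipelines run on Snowflake, Fivetran, Prefect, and dbt.

```python
# Hypothetical sketch: a single pipeline step that validates raw records
# before loading, tracking how many rows were accepted vs. rejected.
from dataclasses import dataclass

@dataclass
class LoadResult:
    rows_loaded: int
    rows_rejected: int

def load_orders(raw_rows):
    """Validate and load raw order records, rejecting rows that
    fail basic integrity checks (missing key, negative amount)."""
    loaded, rejected = [], 0
    for row in raw_rows:
        if row.get("order_id") is None or row.get("amount", -1) < 0:
            rejected += 1
            continue
        loaded.append(row)
    return loaded, LoadResult(len(loaded), rejected)

rows = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": None, "amount": 5.0},  # missing key -> rejected
    {"order_id": 2, "amount": -1.0},    # negative amount -> rejected
]
clean, result = load_orders(rows)
```

In a production setting the rejected-row count would feed the monitoring and alerting described above, so bad upstream data surfaces as a metric rather than a silent gap.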
Requirements
- 5+ years of experience as a Data Engineer, a data-adjacent Software Engineer, or a member of a small, do-everything data team, with a focus on building and maintaining data pipelines
- Strong Python skills, especially in the context of data orchestration
- Strong understanding of database management and design, including experience with Snowflake or an equivalent platform
- Familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts
- Experience with workflow orchestration tools such as Prefect or Airflow
Benefits
- Competitive compensation based on market standards
- Hybrid work model
- Flexible leave policy
- Office located in the heart of the city
- Medical insurance covering the employee and up to five family members
- Quarterly team-engagement activities, rewards, and recognition
- L&D programs for professional growth
- Commitment to diversity and equal employment opportunity
- Accommodation for individuals with disabilities during the application process and employment
Dev & Engineering