You will be responsible for developing, implementing, and maintaining scalable data pipelines and data models, ensuring high data quality and consistency. Your work will support analytical and reporting needs across the organization.
Develop and maintain robust data pipelines using Fivetran to ingest data from multiple sources.
Implement and manage data transformation processes using dbt, ensuring consistency and quality.
Design, build, and optimize data models in Snowflake to support analytics and reporting.
Collaborate with senior data engineers to troubleshoot data issues and deliver high-quality solutions.
Contribute to best practices in data engineering and pipeline development.
Support dashboard and report development in Power BI to surface insights for internal teams.
This role is focused on technical execution and does not involve direct stakeholder engagement.
Bachelor’s degree in Data Engineering, Computer Science, Information Systems, or a related field, or equivalent work experience.
1–2 years of hands-on experience with Snowflake or similar cloud data warehouse technologies.
Proven experience building and maintaining data integration pipelines (using Fivetran or similar tools).
Practical experience with dbt for data transformation and modeling.
2+ years of SQL proficiency, including writing complex queries and building materialized tables and views via dbt.
Working knowledge of Power BI for data visualization and reporting.
Able to work effectively as part of a technical data engineering team.
Experience building and deploying dashboards and reports in Power BI.
Familiarity with additional data tooling and platforms (e.g., Azure Data Factory, Tableau).
Strong problem-solving skills and attention to data quality.
Fully remote role based in Poland with minimal travel.
Focused technical responsibilities with opportunity to deepen data engineering expertise.