About The Position

Artefact is a new generation of data service provider, specializing in data consulting and data-driven digital marketing, dedicated to transforming data into business impact across the entire value chain of organizations. We are proud to say that we’re enjoying skyrocketing growth.

Our broad range of data-driven solutions in data consulting and digital marketing is designed to meet our clients’ specific needs, always conceived with a business-centric approach and delivered with tangible results. Our data-driven services are built upon the deep AI expertise we’ve acquired with our 300+ client base around the globe. We have over 2,000 employees across 16 offices focused on accelerating digital transformation, thanks to a unique mix of company assets: state-of-the-art data technologies, lean and agile AI methodologies for fast delivery, and cohesive teams of the finest business consultants, data analysts, data scientists, data engineers, and digital experts, all dedicated to bringing extra value to every client.

We are looking for a Jr. Data Engineer to join our dynamic team. This role is ideal for someone with a solid understanding of data engineering and a track record of working on data projects in a fast-paced environment.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 2+ years of industry experience in data engineering with a strong technical proficiency in SQL, Python, and big data technologies.
  • Experience with cloud services such as Azure Data Factory and AWS Glue.
  • Excellent problem-solving skills and the ability to work under tight deadlines.
  • Strong communication and interpersonal skills.

Nice To Haves

  • Certifications in Azure, AWS, or similar technologies.
  • Certifications in Databricks, Snowflake, or similar technologies.
  • Experience leading large-scale data engineering projects.

Responsibilities

  • Design, build, and maintain scalable and robust data pipelines using SQL, Python, Databricks, Snowflake, Azure Data Factory, AWS Glue, Apache Airflow, and PySpark.
  • Lead the integration of complex data systems and ensure consistency and accuracy of data across multiple platforms.
  • Implement continuous integration and continuous deployment (CI/CD) practices for data pipelines to improve efficiency and quality of data processing.
  • Work closely with data architects, analysts, and other stakeholders to understand business requirements and translate them into technical implementations.
  • Oversee and manage a team of data engineers, providing guidance and mentorship to ensure high-quality project deliverables.
  • Develop and enforce best practices in data governance, security, and compliance within the organization.
  • Optimize data retrieval and develop dashboards and reports for business teams.
  • Continuously evaluate new technologies and tools to enhance the capabilities of the data engineering function.
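To give candidates a concrete feel for the pipeline work described above, here is a deliberately minimal extract-transform-load sketch in plain Python using the standard-library sqlite3 module. The table and column names are hypothetical, and production pipelines at this scale would of course be built on the stacks named in the responsibilities (PySpark, Airflow, Databricks, etc.); this only illustrates the shape of the work: land raw data, aggregate it, and expose a reporting table.

```python
import sqlite3

def run_pipeline(conn):
    """Tiny illustrative ETL step: stage raw rows, aggregate for reporting.

    Hypothetical schema for illustration only -- not a real Artefact pipeline.
    """
    cur = conn.cursor()
    # Extract: raw events land in a staging table.
    cur.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, region TEXT)")
    cur.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, 120.0, "EU"), (2, 80.0, "US"), (3, 200.0, "EU")],
    )
    # Transform + load: aggregate into the kind of reporting table a
    # dashboard or business team would read.
    cur.execute(
        """
        CREATE TABLE region_totals AS
        SELECT region, SUM(amount) AS total
        FROM raw_orders
        GROUP BY region
        """
    )
    conn.commit()
    return dict(cur.execute("SELECT region, total FROM region_totals"))

totals = run_pipeline(sqlite3.connect(":memory:"))
print(totals)  # {'EU': 320.0, 'US': 80.0}
```

In a real deployment each stage would be an orchestrated task (an Airflow DAG node or an ADF/Glue activity) with CI/CD checks around it, as the responsibilities above describe.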

Benefits

  • We are united by our values and strengthened by our hybrid expertise.
  • There is always a way: We're from the breed of doers, of diggers, of makers. Because ideas are valuable only if executed.
  • Client trust is won on the field: Addressing client needs flows better hands on at their side.
  • If not used, it is useless: Our love for technology translates into a steep desire for adoption, true brilliance is about impact.
  • If not shared, our work is not done: Sharing knowledge is the best way to button up a mission, benefitting clients and colleagues.
  • We learn every day: Tech is a land where everything moves at the speed of light, so you'd better be ready to challenge yourself.