Data Engineer

V4C.ai
Remote – United States

About The Position

V4C.ai is seeking a motivated Data Engineer to join our remote team in the United States. In this role, you will support the design, development, and maintenance of data solutions using Databricks, helping clients and internal teams process, transform, and analyze data effectively. You'll work on building reliable data pipelines and workflows in a collaborative environment, gaining hands-on experience with modern data engineering tools and cloud technologies.

Requirements

  • Bachelor's degree in Computer Science, Data Science, Engineering, Information Systems, or a related field (or equivalent practical experience).
  • 1-2 years of professional experience in data engineering, data processing, analytics engineering, or a closely related role (internships, co-ops, or academic projects with relevant tools count toward this).
  • Hands-on experience building basic data pipelines or transformations.
  • Proficiency in Python and SQL.
  • Basic understanding of cloud platforms such as Azure, AWS, or GCP (e.g., working with storage, compute, or data services).
  • Solid analytical and problem-solving skills with attention to detail and a focus on writing clean, maintainable code.
  • Strong communication skills and ability to work collaboratively in a remote team environment.
  • Eagerness to learn, take ownership of tasks, and grow within data engineering.

Nice To Haves

  • Experience with Scala.

Responsibilities

  • Collaborate with team members and stakeholders to understand data requirements and contribute to building scalable data pipelines and workflows in Databricks.
  • Develop and implement ETL/ELT processes using Databricks, Python, SQL, and related tools to ingest, transform, and prepare data.
  • Assist in optimizing data workflows for better performance, reliability, and cost-efficiency within Databricks environments.
  • Support the creation and maintenance of data models, tables, and integrations in cloud platforms (Azure, AWS, or similar).
  • Work closely with cross-functional teams (data analysts, scientists, and engineers) to deliver clean, accessible data for analytics and reporting.
  • Monitor data pipelines, troubleshoot basic issues, and contribute to documentation and best practices.
  • Stay curious about new Databricks features and data engineering trends to support ongoing improvements.