Sr. Data Engineer

Copart
Dallas, TX

About The Position

Copart, Inc. is a technology leader and the premier online vehicle auction platform globally. With over 200 facilities located across the world, Copart links vehicle sellers to more than 750,000 buyers in over 190 countries. We believe in providing an unmatched experience, every day and everywhere, driven by our people, processes, and technology.

The Sr. Data Engineer will be part of the Data & AI Team, which works closely with all aspects of the company's applications and data pipelines. We are looking for a Sr. Data Engineer to design, develop, and optimize the flow of data throughout the organization, enabling end users to derive valuable insights from data across Copart. In this role, your work will broadly influence the company's data consumers, executives, and analysts.

Requirements

  • Bachelor’s degree or higher in Computer Science, Engineering, or related field.
  • 3–5 years of experience designing, developing, testing, and implementing scalable, high-performing data warehouse and BI solutions.
  • Enterprise development experience with databases (SQL Server, Oracle, MySQL, Vertica, MemSQL, Netezza, Redshift, Google BigQuery).
  • Hands-on experience with real-time data pipelines (Pub/Sub, Kinesis, Kafka) and database architecture (including MPP).
  • Experience with ETL/ELT and data transformation tools (DBT, DataFlow, Informatica).
  • Hands-on experience with workflow orchestration platforms (Apache Airflow, Prefect, Dagster, Cloud Composer).
  • Proficiency in BI/Analytical tools (Tableau, Looker, OBIEE, Power BI).
  • Strong analytical skills to translate complex business problems into actionable insights.
  • Exceptional communication skills, with a demonstrated ability to engage both technical and business audiences.
  • Programming: Python, SQL, Java, Scala
  • Cloud Platforms: AWS, Google Cloud Platform, Azure
  • Data Warehousing: BigQuery, Redshift, Snowflake
  • Stream Processing: Apache Kafka, AWS Kinesis, Google Pub/Sub
  • Workflow Orchestration: Apache Airflow, Prefect, Dagster, Cloud Composer
  • BI Tools: Tableau, Pentaho, Looker, Power BI, Thoughtspot

Responsibilities

  • Design and build the next-generation AI-ready data platform.
  • Develop and automate data processing systems leveraging DBT to deliver data insights at enterprise scale.
  • Design and implement workflow orchestration using Airflow / Cloud Composer for seamless data pipeline execution and scheduling across multiple systems (a minimal sketch follows this list).
  • Develop logging, metrics, and alerts for active monitoring of processes.
  • Collaborate with Product Managers and Application teams to create data models and schemas for easy access to complex datasets.
  • Maintain data integrity in production systems.
  • Balance and prioritize multiple conflicting requirements with high attention to detail.
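As a point of reference for the DBT and Airflow responsibilities above, here is a minimal sketch of the pattern they describe: an Airflow DAG that builds DBT models and then runs their tests. The DAG id, schedule, and project path are hypothetical illustrations, not details taken from this posting.

    # Minimal sketch. Assumptions: the DAG id, daily schedule, and the
    # /opt/dbt/project path are hypothetical, not Copart specifics.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dbt_daily_transform",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",              # one run per day
        catchup=False,                  # do not backfill past runs
    ) as dag:
        # Build the DBT models.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/project",
        )

        # Validate the freshly built models with DBT's tests.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/project",
        )

        dbt_run >> dbt_test  # tests run only after a successful build

Because Cloud Composer is Google's managed Airflow service, a DAG like this runs there unchanged; only the deployment target differs.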