Data Engineer (@Remote, Chile and Colombia)
OfferUp
·
Posted:
August 17, 2023
·
Remote
About the position
We are seeking an experienced Data Engineer to build and scale OfferUp's data processing platforms and tools. This role involves designing and developing applications to process large amounts of critical information, resolving data pipeline issues, and collaborating with multiple teams to understand their data needs. The successful candidate will have a strong background in distributed systems, proficiency in SQL and Python, and experience with open-source data infrastructure projects. Additionally, familiarity with cloud environments like AWS and GCP, as well as operational tools, is a plus. This is an opportunity to contribute to the largest mobile marketplace and help drive data-driven capabilities throughout the organization.
Responsibilities
- Design and develop applications to process large amounts of critical information for analytics and user-facing features.
- Monitor and resolve data pipeline or data integrity issues.
- Work with multiple teams to understand their data needs.
- Maintain and expand the data infrastructure.
Requirements
- 3+ years of professional software development experience
- Strong ability in distributed systems for processing large-scale data
- Ability to communicate technical information effectively to technical and non-technical audiences
- Proficiency in SQL and Python
- Experience leveraging open-source data infrastructure projects, such as Airflow, Kafka, Avro, Parquet, Hadoop, Hive, HBase, Presto, or Druid
- Experience building scalable data pipelines and real-time data streams
- Experience building software in AWS or a similar cloud environment
- Experience with AWS services like Kinesis, Firehose, Lambda, and SageMaker is a big plus
- Experience with GCP services like BigQuery, Cloud Functions is a big plus
- Experience with operational tools like Terraform, Datadog, and PagerDuty is a big plus
- Computer Science or Engineering degree required; Master's degree preferred
- Excellent communication skills, both written and spoken (fluency in English required)
Benefits
- Highly visible opportunity to build systems that drive the end-user experience
- Directly support backend engineers, data analysts, and data scientists, as well as business intelligence dashboards and reports
- Opportunity to work on building, operating, and scaling data processing platforms and tools
- Opportunity to work on data-driven capabilities throughout the entire organization
- Opportunity to work on unique data challenges
- Opportunity to leverage the latest developments in data infrastructure
- OfferUp provides equal employment opportunities and complies with applicable state and local laws governing nondiscrimination in employment
- Workplace harassment of any kind is prohibited