At Cotality, we are driven by a single mission: to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity, and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society.

Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills, and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up, and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.

Job Description
We are seeking a self-driven Data/Pipeline Engineer with a strong background in building scalable batch and streaming data pipelines. The ideal candidate will have deep expertise in Google Cloud Platform (GCP), particularly in developing solutions using Dataflow, BigQuery, Java, and Python. Familiarity with orchestration tools such as Apache Airflow is essential. In this role, you will serve as the anchor engineer for a team focused on creating generic pipeline capabilities that are leveraged across Cotality. You will be instrumental in designing and implementing reusable, robust, and efficient data processing frameworks that support a wide range of business applications.
Job Type
Full-time
Career Level
Mid Level