Role summary:
We are seeking a skilled Data Engineer to design, develop, and maintain data pipelines for processing and analyzing large-scale datasets. The role focuses on orchestrating data workflows in a containerized environment, primarily using Python, to build scalable ETL solutions. The ideal candidate has experience working with large-scale datasets, optimizing data workflows, and integrating with distributed computing frameworks.

About the team:
Our UEBA team is composed of Data Scientists and Analysts whose main goal is to mitigate attack risk through proactive threat detection. To accomplish this, we leverage tools including advanced analytics and machine learning while working closely with teams across Information Security.

What you'll do:
You will design, develop, and maintain scalable data pipelines to process and analyze log data, leveraging workflow orchestration tools in a containerized environment. Using your advanced expertise in Python and SQL, you will build and optimize ETL processes for efficient log parsing, transformation, and storage. You will work with distributed computing frameworks and integrate with both on-premises and cloud infrastructure to support high-volume data ingestion. You will also work closely with data scientists, security analysts, and engineers to provide clean, structured data that supports analytics and decision-making.
Job Type
Full-time
Career Level
Mid Level