Sr. Data Engineer

Visa
Foster City, CA
Hybrid

About The Position

As a Senior Data Engineer, you'll join our Value Added Services – Digital Marketing & Engagement organization. You will help design, enhance, and build our data infrastructure and pipelines within an agile development environment, collaborating with colleagues who will support and challenge you daily. The role draws on containerization technologies such as Docker and Kubernetes; Java, Python, and both relational and non-relational databases; and big data tooling including Hadoop, Spark, and Scala. Key responsibilities and essential functions are listed in the Responsibilities section below.

This is a hybrid position, alternating between remote and office work. The expectation of days in the office will be confirmed by your hiring manager.

Requirements

  • 2+ years of relevant work experience and a Bachelor's degree, OR 5+ years of relevant work experience

Nice To Haves

  • 3 or more years of work experience with a Bachelor's degree, or more than 2 years of work experience with an advanced degree (e.g., Master's, MBA, JD, MD)
  • Demonstrated leadership in delivering high-quality, large-scale, enterprise-class data applications.
  • Solid experience in big data engineering, with knowledge of Hadoop, Apache Spark, Python, and SQL.
  • Expertise in Java, REST APIs, and container-based technologies (Docker, Kubernetes).
  • Proficiency in creating and managing large-scale data pipelines and ETL processes.
  • Experience developing and maintaining Spark pipelines and productizing AI/ML models.
  • Proficient in technologies like Kafka, Redis, Flink, TensorFlow, Triton, and AWS services.
  • Skilled in Unix/Shell or Python scripting and scheduling tools like Airflow and Control-M.
  • Strong experience with data storage technologies and databases (e.g., MySQL, PostgreSQL, MongoDB).
  • Familiarity with Agile development, TDD, CI/CD, and various data warehousing solutions.
  • Proven track record of building reliable, scalable, and operable data applications.
  • Ability to manage component security analysis and collaborate with security teams.
  • Strong work ethic, focus on immediate goals, and proven experience as a technical leader.
  • Passion for mentoring and helping junior engineers grow professionally.
  • Excellent communication and interpersonal skills, and a strong team player.

Responsibilities

  • Design, develop, and maintain robust and scalable data pipelines and ETL processes.
  • Ensure data services are highly available, secure, scalable, and resilient.
  • Drive innovation to differentiate our data products and accelerate time-to-market delivery.
  • Utilize containerization technologies such as Docker and Kubernetes, and expertise in Java, Python, and both relational and non-relational databases.
  • Apply your data engineering skills with Hadoop, Spark, and Scala.
  • Represent the team in various technical forums and build deep partnerships with product management.
  • Analyze business requirements to architect highly secure, robust, and scalable data solutions.
  • Lead internal proof of concept initiatives and quickly design and implement prototypes.
  • Champion efforts to design and implement components of our global data processing systems.
  • Follow and create software best practices and processes.
  • Mentor team members and create an atmosphere of mutual accountability.
  • Play a key role in meetings and discussions with cross-functional and non-technical teams.
  • Collaborate with customers to understand their requirements and build solutions that deliver real value.

Benefits

  • Medical
  • Dental
  • Vision
  • 401(k)
  • FSA/HSA
  • Life Insurance
  • Paid Time Off
  • Wellness Program