Broadcom Corporation • Posted about 2 months ago
Full-time • Mid Level
San Diego, CA
5,001-10,000 employees
Computer and Electronic Product Manufacturing

We're a dynamic and innovative tech company looking for a talented Software Engineer to join our team. We're on a mission to build robust, scalable data pipelines and infrastructure that power our cutting-edge applications. If you're passionate about data, automation, cloud technologies, and working with a variety of modern tools, we want to hear from you.

Responsibilities:
  • Design, develop, and maintain scalable and reliable data pipelines using Python and Go.
  • Develop and maintain robust CI/CD pipelines using Jenkins to automate the testing and deployment of data applications.
  • Work within a Linux environment to manage and optimize data processing jobs.
  • Automate complex infrastructure and deployment tasks through effective scripting (e.g., shell scripts).
  • Build, schedule, and monitor complex data workflows using Apache Airflow.
  • Process large datasets efficiently with distributed computing frameworks like Apache Spark.
  • Manage and configure infrastructure as code using Ansible for automation and consistency.
  • Package and deploy applications on Kubernetes using Helm charts for streamlined versioning and management.
  • Work with relational databases, specifically PostgreSQL, for data storage and retrieval.
  • Analyze and manipulate data using popular Python libraries such as Pandas, Polars, and Matplotlib.

Qualifications:
  • Bachelor's degree and 12+ years of related experience required.
  • Proven experience as a senior-level Software Developer, Data Engineer, or DevOps Engineer, or in a similar role.
  • Strong programming skills in Python, Go and C++.
  • Expertise in Linux and command-line tools.
  • Hands-on experience with cloud platforms, particularly Google Cloud Platform (GCP) and its data services.
  • Solid experience building and managing CI/CD pipelines with Jenkins.
  • Proficiency with containerization (Docker) and orchestration (Kubernetes), including application deployment with Helm charts.
  • Experience with workflow orchestration tools like Apache Airflow.
  • Solid understanding of big data technologies, particularly Apache Spark.
  • Familiarity with configuration management tools like Ansible.
  • Proficiency in working with Postgres or other relational databases.
  • Experience with data analysis libraries such as Pandas, Polars, or Matplotlib.

Benefits:
  • Collaborative and supportive work environment.
  • Opportunities for professional growth and skill development.
  • Competitive salary and benefits package.
  • The chance to work on exciting and challenging projects that make a real impact.
  • Medical, dental and vision plans
  • 401(k) participation, including company matching
  • Employee Stock Purchase Program (ESPP)
  • Employee Assistance Program (EAP)
  • Company-paid holidays
  • Paid sick leave and vacation time