Data Engineer

Amgen
Tampa, FL

About The Position

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Mid-Level Data Engineer

What You Will Do

Let’s do this. Let’s change the world. In this vital role you will build and optimize our data infrastructure. As a key contributor, you will collaborate closely with cross-functional teams to design and implement robust data pipelines that efficiently extract, transform, and load data into our AWS-based data lake and data warehouse. Your expertise will be instrumental in empowering data-driven decision making through advanced analytics and predictive modeling.

Requirements

  • Master’s degree; OR Bachelor’s degree and 2 years of data engineering experience; OR Associate’s degree and 6 years of data engineering experience; OR High school diploma / GED and 8 years of data engineering experience.
  • Proficient in SQL for extracting, transforming, and analyzing complex datasets from both relational and columnar data stores.
  • Proven ability to optimize query performance on big data platforms.
  • Proficient in leveraging DataWeave to access and transform data across various formats (JSON, XML, CSV, etc.)
  • Proficient in leveraging Python, PySpark, and Airflow to build scalable and efficient data ingestion, transformation, and loading processes (a brief illustrative sketch follows this list).
  • Ability to learn new technologies quickly.
  • Strong problem-solving and analytical skills.
  • Excellent communication and teamwork skills.
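
For context only, here is a minimal sketch of the kind of Airflow-plus-PySpark pipeline the proficiency bullet above refers to. The DAG id, bucket paths, and column names are hypothetical placeholders, not anything specific to Amgen’s environment.

    # Minimal sketch: an Airflow DAG that runs a simple PySpark transform.
    # All identifiers and paths below are hypothetical examples.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def load_daily_extract():
        """Read a raw CSV extract, deduplicate, cast a date column, write Parquet."""
        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("daily_extract").getOrCreate()
        raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders.csv")
        cleaned = (
            raw.dropDuplicates(["order_id"])
               .withColumn("order_date", F.to_date("order_date"))
        )
        cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
        spark.stop()


    with DAG(
        dag_id="example_daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="load_daily_extract", python_callable=load_daily_extract)

In practice the Spark job would typically be submitted to a managed cluster (for example via a Databricks or EMR operator) rather than executed inside the Airflow worker itself.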

Nice To Haves

  • Experienced with SQL/NoSQL databases and vector databases for large language models
  • Experienced with data modeling and performance tuning for both OLAP and OLTP databases
  • Experienced with Apache Spark, Apache Airflow
  • Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
  • Experienced with AWS, GCP or Azure cloud services
  • Experienced with Java technology
  • AWS Certified Data Engineer preferred
  • Databricks certification preferred
  • Excellent analytical and troubleshooting skills.
  • Strong verbal and written communication skills
  • Ability to work effectively with global, virtual teams
  • High degree of initiative and self-motivation.
  • Ability to manage multiple priorities successfully.
  • Team-oriented, with a focus on achieving team goals
  • Strong presentation and public speaking skills.

Responsibilities

  • Develop and maintain a back-end API using Python and the FastAPI framework on the Databricks platform (a minimal sketch follows this list).
  • Manage and maintain the API in MuleSoft environments.
  • Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring.
  • Maintain system uptime and optimal performance.
  • Work closely with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Explore and implement new tools and technologies to enhance ETL platform performance.
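
As a purely illustrative sketch of the first responsibility, the snippet below shows a small FastAPI endpoint. The route, model fields, and in-memory data are hypothetical stand-ins for results that, per the description above, would be served from Databricks.

    # Minimal sketch of a FastAPI back-end endpoint; all names are hypothetical.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="example-data-api")


    class DatasetStats(BaseModel):
        table_name: str
        row_count: int


    # Hypothetical stand-in for a result that would come from a Databricks SQL query.
    _EXAMPLE_STATS = {
        "orders_curated": 125_000,
        "orders_raw": 130_500,
    }


    @app.get("/datasets/{table_name}/stats", response_model=DatasetStats)
    def get_dataset_stats(table_name: str) -> DatasetStats:
        """Return row-count stats for one table, or 404 if the table is unknown."""
        if table_name not in _EXAMPLE_STATS:
            raise HTTPException(status_code=404, detail="unknown table")
        return DatasetStats(table_name=table_name, row_count=_EXAMPLE_STATS[table_name])

This can be run locally with uvicorn; in a real deployment the handler would query Databricks (for example through the Databricks SQL connector) instead of a hard-coded dictionary.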

Benefits

  • A comprehensive employee benefits package, including a Retirement and Savings Plan with generous company contributions, group medical, dental and vision coverage, life and disability insurance, and flexible spending accounts
  • A discretionary annual bonus program, or for field sales representatives, a sales-based incentive plan
  • Stock-based long-term incentives
  • Award-winning time-off plans
  • Flexible work models where possible.