Varite · Posted 2 months ago
$54 - $68/Yr
Jacksonville, FL
251-500 employees
Professional, Scientific, and Technical Services

We are looking for an astute, determined professional to fill a Data Engineering role within our Technology Solutions Group. As a Cloud Data Engineer Developer, you will be responsible for designing, developing, and maintaining data pipelines and data infrastructure on cloud platforms. You will work closely with cross-functional teams to ensure the efficient extraction, transformation, loading, and analysis of data from various sources into our cloud-based data systems.

  • Design and implement scalable and efficient data architectures on cloud platforms such as Amazon Web Services (AWS) and Microsoft Azure.
  • Develop and maintain robust and scalable data pipelines to extract, transform, and load (ETL) data from diverse sources into cloud-based data systems.
  • Apply data transformation techniques to clean, normalize, and enrich data for analysis and reporting purposes.
  • Implement data quality checks, data profiling, and data validation processes to ensure data accuracy, completeness, and consistency.
  • Design and implement data warehousing solutions on cloud platforms, using technologies such as Amazon Redshift and Snowflake.
  • Implement data security measures, access controls, and encryption mechanisms to ensure the confidentiality and integrity of sensitive data.
  • Monitor and optimize the performance of data pipelines and data processing workflows.
  • Collaborate closely with cross-functional teams, including data architects, data analysts, and business stakeholders.
  • Document data engineering processes, data pipelines, and data infrastructure architectures.
  • Stay up to date with the latest cloud technologies, data engineering tools, and best practices.
  • Bachelor's degree in Computer Science, Software Engineering, or a related field (Master's degree preferred).
  • Proven experience as a Data Engineer or similar role, with a focus on cloud-based data engineering.
  • Strong expertise in the AWS cloud platform and related services such as S3, EC2, Lambda, and Glue.
  • Proficiency in Python for data engineering and scripting tasks.
  • Experience with data integration and ETL tools such as Apache Spark, dbt, or cloud-native solutions.
  • Familiarity with data warehousing concepts and technologies like Redshift, Snowflake, or Synapse Analytics.
  • Solid understanding of data modeling, data governance, and data quality principles.
  • Knowledge of SQL, database systems, and query optimization techniques.
  • Experience working with Hadoop or other big data platforms.
  • Exposure to deploying code through automated deployment pipelines.
  • Good exposure to container technologies such as Docker or Amazon ECS.
  • Direct experience supporting multiple business units on foundational data work, with a sound understanding of capital markets, particularly Fixed Income.
  • Knowledge of Jira, Confluence, the SAFe development methodology, and DevOps practices.
  • Excellent analytical and problem-solving skills, with the ability to think quickly and offer alternatives, both independently and within teams.
  • Proven ability to work quickly in a dynamic environment.