Data Integration Engineer

Norwegian Cruise Line Holdings Ltd. • Miami, FL

About The Position

At Norwegian Cruise Line Holdings (NCLH), the company aims to attract and retain top talent to deliver exceptional vacation experiences through its brands: Norwegian Cruise Line, Oceania Cruises, and Regent Seven Seas Cruises. The Data Integration Engineer role is crucial for developing, implementing, and supporting robust data integration solutions: designing, building, and maintaining data pipelines across a range of technologies; supporting the Data Warehouse team; optimizing performance; collaborating with cross-functional teams; ensuring data quality and compliance; and staying current with technical innovations.

NCLH is a leading global cruise company operating 32 ships, employing over 35,000 crew members, and visiting approximately 700 port destinations annually, with plans to expand to 34 ships and add 13 more through 2036. The company values People Excellence, Innovation, Collaboration, Transparency, and Passion.

Requirements

  • Minimum 5 years of relational database experience in data integration required, including backend integration using batch ETL, APIs, and data-streaming technologies such as Kafka, Striim, Fivetran, Informatica, Matillion, and SSIS.
  • Strong background in database programming and system integrations.
  • Experience with relational database technologies, T-SQL, and PL/SQL, and thorough experience with the SQL Server or Oracle RDBMS platforms.
  • Experience in troubleshooting OS and network issues impacting data integration processes and platforms.
  • Bachelor's (BS) degree in Computer Science, Engineering, or a related field of study, plus 2 to 4 years of related experience and/or training, or an equivalent combination of education and experience.
  • Mid-to-advanced level in developing and implementing system integrations using database platforms (e.g., SQL Server, Oracle, Snowflake, Redshift, Hadoop).
  • Experience in object-oriented programming languages such as C# and Java, and scripting languages like Python and PowerShell.
  • Experience with technologies and standards such as Microsoft Message Queue, Kafka, .NET, JSON, ASP.NET.
  • Familiarity with Agile SDLC.
  • Ability to read, analyze, and interpret common scientific and technical journals, financial reports, and legal documents.
  • Ability to respond to common inquiries or complaints from customers, regulatory agencies, or members of the business community.
  • Ability to effectively present information to cross-functional teams and stakeholders.
  • Ability to apply advanced mathematical concepts such as exponents, logarithms, quadratic equations, and permutations.
  • Ability to apply mathematical operations to such tasks as frequency distribution, determination of test reliability and validity, analysis of variance, correlation techniques, sampling theory, and factor analysis.
  • Ability to apply principles of logical or scientific thinking to a wide range of intellectual and practical problems.
  • Ability to deal with nonverbal symbolism (formulas, scientific equations, graphs, musical notes, etc.) in its most difficult phases.
  • Ability to deal with a variety of abstract and concrete variables.
  • Proven track record of working in several organizational models, such as matrixed teams.
  • Expertise in data analytics skill sets, including data mining, regression analysis, and data extraction.
  • Excellence in a variety of competencies, including teamwork/collaboration, analytical thinking, communication and influencing skills, and technical expertise.

Responsibilities

  • Develop, Implement, and Support Data Integration Solutions: Design, build, and maintain robust data integration pipelines using batch ETL, API, and data-streaming technologies such as Kafka, Striim, Fivetran, Informatica, Matillion, and SSIS.
  • Ensure these solutions meet business requirements and integrate seamlessly with existing systems.
  • Provide 24/7 on-call support on a rotating schedule.
  • Data Warehouse Support: Work closely with the Data Warehouse team to implement data ingestion solutions, ensuring efficient and reliable data availability from source systems.
  • Performance Monitoring and Optimization: Work closely with Ops and DevOps teams to improve data integration performance, identify bottlenecks, and implement optimizations to ensure efficient data processing and movement.
  • Collaboration and Cross-Functional Support: Collaborate with Data DevOps & Ops teams to support the daily operation of the data integration platform.
  • Work closely with direct and indirect EDM leadership roles, such as Product and QA, to align data strategies with organizational goals.
  • Data Quality and Compliance: Ensure the highest standards of data quality and compliance with relevant data governance and privacy policies.
  • Technical Innovation and Continuous Improvement: Stay abreast of emerging technologies and industry trends in data integration.
  • Recommend and implement improvements to enhance system performance and data integration capabilities.
  • Perform other job-related functions as assigned.
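For candidates unfamiliar with the batch-ETL work described above, the core extract/transform/load pattern can be sketched as follows. This is purely an illustration of the pattern; all names and data are hypothetical and do not reflect any NCLH system:

```python
# Minimal batch-ETL sketch: extract source rows, normalize and
# de-duplicate them, then load them into a target (here, a list
# standing in for a warehouse table). Illustrative only.

def extract():
    # Hypothetical rows, e.g. pulled from a booking source system.
    return [
        {"booking_id": "B001", "guest": " Alice ", "fare": "199.00"},
        {"booking_id": "B002", "guest": "Bob", "fare": "249.50"},
        {"booking_id": "B001", "guest": " Alice ", "fare": "199.00"},  # duplicate
    ]

def transform(rows):
    # Normalize field formats and de-duplicate on the business key.
    seen, out = set(), []
    for row in rows:
        if row["booking_id"] in seen:
            continue
        seen.add(row["booking_id"])
        out.append({
            "booking_id": row["booking_id"],
            "guest": row["guest"].strip(),
            "fare": float(row["fare"]),
        })
    return out

def load(rows, target):
    # A real pipeline would write to a warehouse table here.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In production, the same three stages are typically expressed in a tool such as SSIS, Matillion, or Informatica rather than hand-written code, but the shape of the work is the same.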