Sr. Data Integration Engineer

Red Lobster, Orlando, FL
Remote

About The Position

The Senior Data Integration Engineer brings advanced expertise in Informatica, SQL, and Python to design, develop, and maintain enterprise-grade ETL solutions that support scalable, secure, and high-performing data pipelines. This role plays a critical part in integrating both structured and unstructured data across on-premises and cloud platforms using a blend of traditional ETL design and modern data engineering practices.

As the organization’s Informatica subject matter expert and administrator, this engineer owns the full Informatica environment—production, testing, and development. They are responsible for monitoring and troubleshooting pipelines, optimizing performance using Informatica Administrator and other diagnostic tools, and implementing enhancements that align with architectural standards and business needs. The engineer is also a key contributor to data governance efforts, ensuring data integration processes are well-documented, quality-checked, and built to scale.

They collaborate with the broader Business Intelligence (BI), MuleSoft Integration, Database Administration, and other IT teams to align integration work with enterprise data warehouse strategies, enabling trustworthy analytics and reporting. This individual partners closely with business stakeholders to understand data requirements, determine what data should be warehoused, and design integration solutions that support data-driven decision-making. They work cross-functionally with Systems Engineers, Data Architects, and MuleSoft developers, and participate in the Architecture Review Committee (ARC) to validate that data pipeline changes comply with enterprise architecture and data management policies.

The Sr. Engineer follows Software Development Life Cycle (SDLC) best practices, leads code reviews, and mentors junior team members, fostering a culture of technical excellence and continuous improvement.
They are expected to proactively share knowledge, advocate for integration best practices, and promote the intelligent use of enterprise data. Success in this role is measured by the delivery of high-quality data solutions, adherence to timelines, increased operational efficiency, and improved stakeholder satisfaction through reliable and accessible data services.

Requirements

  • Capacity to apply experience, technical knowledge, and business sense to resolve issues and questions, and to plan and accomplish BI data goals.
  • Ability to develop process documentation and design solutions using diagrams informed by customer collaboration.
  • Translate business objectives and requirements into technical specifications.
  • Minimum six (6) years’ experience developing ETL solutions that integrate into data warehouses for large, global organizations, preferably using Informatica.
  • Minimum six (6) years’ experience programming in SQL.
  • Minimum six (6) years’ experience managing data warehouses.
  • Minimum five (5) years’ experience performance tuning.
  • Minimum five (5) years’ experience with stored procedures.
  • Experience integrating data from a variety of data sources (flat files, APIs and relational databases).
  • Knowledge of best practices gained through relevant experience across data-related disciplines and technologies, particularly enterprise-wide data architectures and data warehousing/business intelligence.
  • Understanding of all forms of data storage, including relational databases such as PostgreSQL and dimensional models such as snowflake and star schemas.
  • Ability to work independently, take ownership of tasks and follow through to implementation/resolution.
  • Demonstrated ability to analyze, verify, and document the accuracy of the developed BI pipelines through self-directed testing.
  • Resolve end user data problems through collaboration with both technical and functional personnel in a team environment.
  • Ability to prepare design documentation to express necessary details for review and ongoing maintenance.
  • Ability to prioritize and multi-task across numerous work streams.
  • Bachelor’s Degree in MIS, Computer Science, Business, Mathematics or Engineering.
  • Experience with SQL, including stored procedures, functions and triggers.
  • Experience with Informatica Intelligent Data Management Cloud (IDMC), especially the Cloud Data Integration (CDI) component.
  • Excellent problem-solving, communication and documentation skills.
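The requirements above center on integrating data from flat files, APIs, and relational databases into a warehouse. As an illustrative sketch only (the role's actual tooling is Informatica IDMC; the inline sources, table names, and SQLite target here are stand-ins invented for the example), the pattern looks like this in Python:

```python
import csv
import io
import json
import sqlite3

# --- Extract: the three source types named in the requirements ---

# Flat file (inline CSV stands in for a file on disk)
FLAT_FILE = io.StringIO("store_id,city\n101,Orlando\n102,Tampa\n")

# API payload (inline JSON stands in for an HTTP response body)
API_BODY = json.loads(
    '[{"store_id": 101, "sales": 2500.0}, {"store_id": 102, "sales": 1800.0}]'
)

# Relational source (an in-memory SQLite table stands in for Oracle/MS SQL)
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE managers (store_id INTEGER, manager TEXT)")
src.executemany("INSERT INTO managers VALUES (?, ?)", [(101, "Ada"), (102, "Lin")])

# --- Transform: join all three sources on store_id ---
stores = {int(r["store_id"]): r for r in csv.DictReader(FLAT_FILE)}
sales = {r["store_id"]: r["sales"] for r in API_BODY}
managers = dict(src.execute("SELECT store_id, manager FROM managers"))

rows = [
    (sid, stores[sid]["city"], sales.get(sid), managers.get(sid))
    for sid in sorted(stores)
]

# --- Load: write the conformed rows into a warehouse staging table ---
dw = sqlite3.connect(":memory:")
dw.execute(
    "CREATE TABLE stg_store_sales (store_id INTEGER, city TEXT, sales REAL, manager TEXT)"
)
dw.executemany("INSERT INTO stg_store_sales VALUES (?, ?, ?, ?)", rows)
dw.commit()

print(list(dw.execute("SELECT * FROM stg_store_sales ORDER BY store_id")))
# [(101, 'Orlando', 2500.0, 'Ada'), (102, 'Tampa', 1800.0, 'Lin')]
```

In an Informatica CDI mapping the same join-and-load would be modeled declaratively, but the extract/transform/load shape is identical.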

Nice To Haves

  • Experience with Informatica in large, complex environments.
  • Deep experience with Informatica Cloud (IDMC), including Informatica certification.
  • Working experience with revision control systems, the Linux platform, shell scripting, Python, and large database systems (Oracle and MS SQL preferred).
  • Experience with UNIX operating systems.
  • Experience with customer/guest data and systems; knowledge of restaurant/hospitality business.
  • Experience with various visual representations that would provide the most informative view of data.
  • Experience with JAMS scheduling tool, Azure Data Factory, Azure Scheduler or similar.
  • Experience with Erwin Data Modeler, ER/Studio, Talend or similar data modeling tool.
  • Experience with cloud platforms such as Azure, AWS, GCP or OCI (Oracle).
  • Experience with cloud data platforms such as Snowflake, Redshift, BigQuery, and storage services (S3, blob storage, etc.).
  • Experience building data pipelines with MuleSoft.
  • Agile/Scrum experience.

Responsibilities

  • Collaborate with key stakeholders and project team members to gather, prioritize, and translate business requirements into functional and technical specifications.
  • Design, develop, and maintain scalable ETL pipelines using Informatica Intelligent Cloud Services (CDI/IDS/IDMC), Python, SQL stored procedures, PowerShell, and other relevant tools.
  • Create and maintain comprehensive technical specifications and design documentation, setting standards for business and functional requirements.
  • Anticipate future needs to ensure solutions are flexible, extensible, and aligned with long-term goals.
  • Integrate data from a wide range of sources, including APIs, relational databases, file systems, MuleSoft payloads, and cloud services such as AWS, Azure, and Google Cloud Platform (GCP).
  • Provide hands-on development and programming expertise to ensure data pipelines meet business and technical requirements.
  • Prepare and deliver technical documentation to support long-term maintainability.
  • Optimize Informatica workflows and SQL-based transformations to improve data processing efficiency and performance.
  • Ensure all code and configurations are thoroughly unit-tested.
  • Participate in and support system/integration testing and perform code reviews to ensure quality and adherence to standards.
  • Maintain operational stability and reliability of data pipelines and BI systems, including proactive monitoring and auditing to ensure data integrity, accuracy, and timely delivery.
  • Serve as the Informatica application administrator, ensuring proper management of the platform across development, testing, and production environments.
  • Participate in the team’s 24/7/365 on-call support rotation to provide timely resolution for critical production incidents.
  • Prepare change requests for any production updates, including documentation of the change process, implementation steps, and successful testing and validation.
  • Proactively identify and resolve technical and data-related issues; recommend and implement appropriate solutions with minimal supervision.
  • Adhere to organizational architecture guidelines and contribute to the evolution of data architecture principles and standards.
  • Accurately estimate work efforts, track progress, and report status updates to project managers and leadership as needed.
  • Utilize Azure DevOps to document code changes, manage version control, and track revisions in alignment with SDLC best practices.
  • Leverage Azure DevOps to manage and track project tasks, features and workflows in alignment with agile sprint cycles, ensuring visibility, accountability and timely delivery.
  • Maintain open, clear communication with stakeholders throughout the project lifecycle, ensuring expectations are managed and issues are resolved promptly.
  • Continuously improve technical skills and domain knowledge by researching emerging technologies, reading industry publications, participating in professional communities, and maintaining a strong personal learning network.
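One responsibility above is proactive monitoring and auditing of pipelines for data integrity, accuracy, and timely delivery. As a minimal sketch of what such post-load audit checks might look like (the table, column names, and thresholds are assumptions for illustration; in practice these checks would run against the warehouse after each Informatica load):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Staging table standing in for a freshly loaded warehouse batch
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stg_orders (order_id INTEGER, total REAL, loaded_at TEXT)")
now = datetime.now(timezone.utc).isoformat()
db.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [(1, 42.0, now), (2, 17.5, now), (3, 99.9, now)],
)

def audit(conn):
    """Return (check_name, passed) pairs for the loaded batch."""
    results = []

    # Integrity: the batch must not be empty
    (count,) = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()
    results.append(("row_count > 0", count > 0))

    # Accuracy: no NULL keys or negative totals
    (bad,) = conn.execute(
        "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL OR total < 0"
    ).fetchone()
    results.append(("no bad rows", bad == 0))

    # Timeliness: the newest row landed within the last 24 hours
    (latest,) = conn.execute("SELECT MAX(loaded_at) FROM stg_orders").fetchone()
    fresh = datetime.fromisoformat(latest) > datetime.now(timezone.utc) - timedelta(hours=24)
    results.append(("loaded within 24h", fresh))

    return results

for check, passed in audit(db):
    print(f"{check}: {'PASS' if passed else 'FAIL'}")
```

A failing check would typically page the on-call engineer via the team's scheduling/alerting tool rather than just printing.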