Analyst, Information Technology (Data Engineer)

Oriental Bank
Charlotte, PR
Hybrid

About The Position

OFG Bancorp is looking for an experienced Mid-Level Data Engineer to design, build, and operate scalable data pipelines and data platforms supporting banking and financial services data domains, including customer, account, transaction, and product data. This role operates within a regulated financial services environment and partners closely with data architecture, analytics, risk, and technology teams to deliver secure, reliable, and analytics-ready data products using Snowflake and AWS. This position works hybrid and reports to San Juan, PR.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field required.
  • Three (3)+ years of professional experience in Data Engineering or a related role required.
  • The minimum education and experience requirements may be substituted with an equivalent combination of education, training, and experience that provides the required knowledge, skills, and abilities.
  • Strong hands‑on experience with Snowflake, including data ingestion, transformations, and performance optimization.
  • Experience working within AWS cloud environments, particularly with S3 and serverless services.
  • Advanced SQL skills and experience working with large, complex datasets.
  • Proficiency in Python for data processing and pipeline automation.
  • Experience with modern data engineering tools and frameworks (e.g., dbt, Airflow, Spark/PySpark, or equivalent).
  • Familiarity with Git, CI/CD practices, and production support workflows.
  • Must be legally authorized to work in the US.
  • This position is of indefinite duration and requires candidates to have permanent or ongoing work authorization.
  • The employee is responsible for maintaining eligible work authorization throughout their tenure with the organization.

Nice To Haves

  • Experience working in banking, financial services, fintech, or other regulated industries, as well as financial data domains and data governance concepts.
  • Experience supporting analytics or BI tools (e.g., Tableau, Looker, Amazon QuickSight).
  • Exposure to streaming or near‑real‑time data (Kafka, Kinesis, or event‑driven architectures).
  • Certifications in Snowflake, AWS, Databricks, or dbt preferred.

Responsibilities

  • Design, develop, and maintain ELT/ETL pipelines for batch and event‑driven data ingestion using Snowflake and AWS.
  • Engineer and optimize Snowflake data models (tables, views, secure data shares) to support analytics, reporting, and downstream consumption.
  • Develop scalable data processing solutions using SQL and Python.
  • Integrate with AWS services such as S3, Lambda, Glue, and IAM to support data ingestion, processing, and access management.
  • Build and maintain pipelines handling financial transactions and customer data, ensuring accuracy, completeness, and auditability.
  • Apply data quality checks, reconciliation logic, and monitoring appropriate for regulated financial data.
  • Partner with data architects to align pipelines with enterprise data models, standards, and governance expectations.
  • Support production deployments using Git‑based version control, CI/CD pipelines, and infrastructure‑as‑code practices.
  • Collaborate in Agile delivery teams, contributing to sprint planning, design reviews, and technical documentation.
  • Other duties may be assigned.