Fintech
Full-time • Entry Level
Tampa, FL
Real Estate

Join Fintech in Tampa as a Junior Data Engineer! We are seeking a talented Junior Data Engineer to join our data platforms team. The ideal candidate will have a strong background in programming and data engineering and will work closely with the data engineering lead and other developers to ensure projects are completed on time and meet stakeholder requirements. The Junior Data Engineer also assists Fintech clients, partners, third parties, and internal team members with development requests, troubleshooting, and technical inquiries.

Responsibilities:

  • Design, develop, and optimize scalable ETL/ELT pipelines using Snowflake or Databricks for batch and streaming data processing (an illustrative sketch follows this list).
  • Develop and maintain data models that support analytical and operational workloads.
  • Implement best practices for performance tuning, cost optimization, and query efficiency in Snowflake or Databricks.
  • Optimize storage, compute, and data-sharing strategies for cost-effective and performant solutions.
  • Leverage platform-specific features (e.g., Snowflake's Time Travel, Zero-Copy Cloning, and Streams & Tasks, or Databricks' Delta Lake, Apache Iceberg, Delta Sharing, and Unity Catalog).
  • Ensure data security, access control, and governance frameworks are implemented and adhered to.
  • Monitor data quality and performance, identifying areas for optimization and improvement.
  • Ensure code quality and implement test cases for data pipeline code.
  • Stay updated with industry trends in cloud data platforms and evaluate their application to improve our data ecosystem.
  • Contribute to the design of data architecture and delivery of innovative and engaging data engineering solutions.
  • Continuously learn and develop your skills to become a more proficient and valuable member of the development team.
  • Conduct data discovery activities and make recommendations for the remediation of data quality issues.
  • Carry out incident investigations to identify and report root causes.
  • Learn and apply best practices in data engineering, data governance, and data security.
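
The sketch below is purely illustrative of the kind of batch ETL work described above: a minimal PySpark job that reads raw files, applies light cleansing, and writes a Delta table. It assumes a Databricks/Delta Lake environment, and all paths, table names, and columns are hypothetical, not part of this posting.

```python
# Minimal, illustrative batch ETL sketch (hypothetical paths, tables, and columns).
# Assumes PySpark with Delta Lake available, e.g. on Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

# Extract: read raw order events from cloud storage.
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Transform: basic typing, de-duplication, and filtering.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

# Load: write a Delta table partitioned by order date.
(orders.withColumn("order_date", F.to_date("order_ts"))
       .write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .saveAsTable("analytics.orders_clean"))
```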

Qualifications:

  • Bachelor's degree in Computer Science or a related technical field, or commensurate experience, is required.
  • 2+ years of experience in data engineering with a focus on large-scale cloud-based data processing.
  • Expertise in Snowflake or Databricks, including performance tuning, cost optimization, and best practices.
  • Experience in batch and streaming data processing.
  • Experience in Python/PySpark for data processing and automation.
  • Experience in data modeling for both Operational and Analytical data.
  • Hands-on experience with data security, access control, and governance frameworks in Snowflake or Databricks.
  • Experience with Apache Druid, MSSQL, MySQL, MongoDB, Oracle, or PostgreSQL.
  • Experience with Graph Databases to manage complex relationships within data.
  • Experience working with the open-source community.
  • Familiarity with cloud platforms such as Azure.
  • Experience with source control tools such as Git.
  • Knowledge of big data technologies such as Spark and Kafka.
  • Familiarity with Agile development methodologies.
  • Understanding of software testing methodologies and tools.
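
As a purely illustrative example of the testing practices mentioned above, the sketch below shows a pytest-style unit test for a small, hypothetical pipeline helper; the function name and values are assumptions, not part of this posting.

```python
# Illustrative only: a pytest-style unit test for a hypothetical pipeline helper.
import pytest


def normalize_amount(raw: str) -> float:
    """Parse a currency string such as '$1,234.50' into a float."""
    return float(raw.replace("$", "").replace(",", ""))


@pytest.mark.parametrize("raw, expected", [
    ("$1,234.50", 1234.50),
    ("99", 99.0),
    ("$0.00", 0.0),
])
def test_normalize_amount(raw, expected):
    assert normalize_amount(raw) == expected


def test_normalize_amount_rejects_garbage():
    # Non-numeric input should surface as a ValueError rather than bad data downstream.
    with pytest.raises(ValueError):
        normalize_amount("not-a-number")
```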

Benefits:

  • Employer Matched 401K
  • Company Paid Medical Insurance Option for Employee and Dependent Children
  • Company Paid Dental Insurance for Employee
  • Company Paid Vision Insurance for Employee
  • Company Paid Long and Short-Term Disability
  • Company Paid Life and AD&D Insurance
  • 18 Paid Vacation Days a Year
  • Six Paid Holidays
  • Employee Recognition Programs
  • Incentive Compensation
  • Community Outreach Opportunities
  • Business Casual Dress Code