Senior Data Engineer
Apex Fintech Solutions
Full-time • Mid Level
Hybrid • Austin, TX
1,001-5,000 employees

Apex Fintech Solutions (AFS) powers innovation and the future of digital wealth management by processing millions of transactions daily to simplify, automate, and facilitate access to financial markets for all. Our robust suite of fintech solutions enables us to support clients such as Stash, Betterment, SoFi, and Webull, and more than 20 million of our clients' customers. Collectively, AFS creates an environment in which companies with the biggest ideas in fintech are empowered to change the world. As a global organization, we have offices in Austin, Dallas, Chicago, New York, Portland, Belfast, and Manila. If you are seeking a fast-paced and entrepreneurial environment where you'll have the opportunity to make an immediate impact, and you have the guts to change everything, this is the place for you.

AFS has received a number of prestigious industry awards, including:

  • 2021, 2020, 2019, and 2018 Best Wealth Management Company - presented by Fintech Breakthrough Awards
  • 2021 Most Innovative Companies - presented by Fast Company
  • 2021 Best API & Best Trading Technology - presented by Global Fintech Awards

ABOUT THIS ROLE

Apex Fintech Solutions is seeking a Senior Data Engineer to join our data engineering team, responsible for designing, developing, and maintaining robust data solutions that empower decision-making and enhance business processes. You will leverage your expertise in Python, MS SQL, BigQuery SQL, and cloud platforms (GCP, AWS, or Azure) to optimize data pipelines, enhance data models, and contribute to analytical insights. Proficiency in Airflow will be critical for orchestrating workflows efficiently. While this is not a direct management position, you will play a leadership role by guiding junior data engineers, promoting best practices, and fostering technical excellence in the team. You will also collaborate with cross-functional teams to deliver scalable and reliable solutions aligned with business objectives.

RESPONSIBILITIES

  • Data Engineering and Pipeline Development: Design and implement scalable data pipelines to extract, transform, and load (ETL) structured and unstructured data across multiple platforms. Utilize Airflow to orchestrate complex workflows and ensure efficient task execution. Optimize data workflows using Python, MS SQL, and BigQuery SQL for performance and reliability. (A minimal Airflow orchestration sketch appears after this list.)
  • Cloud Technologies and Infrastructure: Build and optimize data solutions leveraging cloud platforms, including GCP (Google Cloud Platform), AWS, or Azure. Drive adoption of cloud-based tools and techniques for scalability, reliability, and cost-effectiveness.
  • Data Modeling and Database Management: Develop and maintain data models and schemas that align with business needs and support analytics initiatives. Troubleshoot and resolve database performance issues, collaborating with DBAs when necessary.
  • Data Quality and Analysis: Conduct detailed analysis of datasets to validate their accuracy, completeness, and logical consistency. Design automated data validation and quality checks to ensure consistent data integrity. (A small validation-check sketch appears after this list.)
  • Leadership and Mentorship: Provide mentorship and technical guidance to junior data engineers, ensuring their development aligns with team goals and standards. Promote best practices in coding, data engineering, and technical documentation through active collaboration and code reviews.
  • AI Tools and Data Innovation: Leverage AI-powered tools to enhance productivity, automate data engineering workflows, and drive innovation within the team. Stay informed about emerging trends in AI and data technologies to implement cutting-edge solutions.
  • Cross-Functional Collaboration: Collaborate with product managers, analysts, and other technical teams to identify data requirements and craft impactful solutions. Partner with software engineers to integrate and scale data pipelines across broader product architectures.
  • Problem Solving and System Reliability: Investigate and resolve data and pipeline-related issues in production environments, minimizing downtime and ensuring system reliability. Implement recovery and monitoring procedures for pipelines, leveraging cloud-based tools.
  • Continuous Learning and Process Improvement: Stay updated with industry best practices, trends, and emerging technologies related to data engineering and cloud computing. Proactively recommend process and tooling improvements to enhance development methodologies and team capabilities.
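
To make the pipeline-development bullet above concrete, here is a minimal sketch of Airflow orchestration for a daily ETL run, assuming a recent Airflow 2.x installation with the TaskFlow API. The DAG name, schedule, sample records, and extract/transform/load steps are hypothetical placeholders for illustration, not Apex's actual pipelines.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def daily_transactions_etl():  # hypothetical DAG name
        @task
        def extract():
            # Placeholder for pulling raw rows from a source system (e.g., MS SQL).
            return [{"id": 1, "amount": 125.50}, {"id": 2, "amount": 89.99}]

        @task
        def transform(rows):
            # Placeholder for cleaning/enriching the extracted rows with Python.
            return [{**r, "amount_cents": int(round(r["amount"] * 100))} for r in rows]

        @task
        def load(rows):
            # Placeholder for writing the transformed rows to the warehouse (e.g., BigQuery).
            print(f"loaded {len(rows)} rows")

        # Chain the tasks so each step runs only after the previous one succeeds.
        load(transform(extract()))


    daily_transactions_etl()

In practice the extract and load steps would typically use provider hooks or operators (for example, the Google provider's BigQuery operators) plus retries and alerting; they are stubbed here to keep the sketch self-contained.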
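
Similarly, for the data-quality bullet above, here is a small sketch of an automated validation check in Python with pandas. The column names, rules, and sample frame are hypothetical and only illustrate the kind of completeness, consistency, and uniqueness checks described; they are not Apex's actual rules.

    import pandas as pd

    REQUIRED_COLUMNS = ["account_id", "trade_date", "amount"]  # hypothetical schema


    def validate(df: pd.DataFrame) -> list[str]:
        """Return a list of human-readable data-quality failures (empty list means pass)."""
        failures = []

        # Completeness: every required column must exist and contain no nulls.
        for col in REQUIRED_COLUMNS:
            if col not in df.columns:
                failures.append(f"missing column: {col}")
            elif df[col].isna().any():
                failures.append(f"null values in column: {col}")

        # Logical consistency: transaction amounts should not be negative.
        if "amount" in df.columns and (df["amount"] < 0).any():
            failures.append("negative values in column: amount")

        # Uniqueness: an account should not repeat the same trade date.
        if {"account_id", "trade_date"}.issubset(df.columns):
            if df.duplicated(subset=["account_id", "trade_date"]).any():
                failures.append("duplicate (account_id, trade_date) rows")

        return failures


    if __name__ == "__main__":
        sample = pd.DataFrame({
            "account_id": [1, 2, 2],
            "trade_date": ["2024-01-02", "2024-01-02", "2024-01-02"],
            "amount": [100.0, -5.0, None],
        })
        for problem in validate(sample):
            print("FAIL:", problem)

A check like this can run as its own task between the transform and load steps so that a failing batch blocks the load instead of landing bad data in the warehouse.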

QUALIFICATIONS

  • Bachelor’s degree in Computer Science, Engineering, Data Analytics, or a related field (or equivalent professional experience); advanced degree preferred.
  • 5+ years of experience in data engineering or related roles with expertise in Python, MS SQL, BigQuery SQL, and Airflow.
  • Experience designing and optimizing data pipelines and ETL workflows for large datasets in cloud environments (GCP, AWS, or Azure).
  • Proven ability to mentor and guide junior engineers while contributing as an individual contributor to team success.
  • Experience with automated testing frameworks for data integrity and workflow reliability.
  • Experience with CI/CD pipelines and modern data engineering tools.
  • Expertise in Airflow for workflow orchestration and pipeline management.
  • In-depth knowledge of data engineering processes, including ETL, data modeling, and cloud solutions (GCP, AWS, or Azure).
  • Strong proficiency in SQL-based technologies and ability to write and optimize sophisticated queries.
  • Proficiency in Python for data processing and task automation.
  • Excellent troubleshooting and problem-solving abilities in production environments.
  • Effective communication and collaboration skills, with the ability to align cross-functional teams on data-driven solutions.
  • Knowledge of Agile methodologies (e.g., Scrum, Kanban) and tools like Jira for project tracking.
  • A financial services background is a bonus but not required.

BENEFITS

  • We offer a robust package of employee perks and benefits, including healthcare benefits (medical, dental and vision, EAP), competitive PTO, 401(k) match, parental leave, and HSA contribution match.
  • We also provide our employees with a paid subscription to the Calm app and offer generous external learning and tuition reimbursement benefits.
  • At AFS, we offer a hybrid work schedule for most roles, giving employees the flexibility to work both from home and from one of our primary offices.