Dimensional Fund Advisors · Posted 4 months ago
Mid Level
Austin, TX
Funds, Trusts, and Other Financial Vehicles

Dimensional was built around a set of ideas bigger than the firm itself. With confidence in markets, deep connections to the academic community, and a focus on implementation, we go where the science leads and continue to pursue new insights, both large and small, that can benefit our clients. The Technology Department at Dimensional leverages the rapidly evolving state of the art to engineer the platforms that power innovative, research-driven financial and technical products that improve our clients' financial lives.

As a Senior Python Engineer on the Data Distribution team, you will help manage Dimensional's enterprise investment data warehouse, which supports Research, Portfolio Management, Trading, and Analytics functions. You will have the opportunity to understand our clients' needs, collaborate on the design of solutions, and work with emerging data engineering tools and best practices. In this role, you will design, develop, document, and test multiple application services, focusing on building a scalable data platform and services, and you will expand and optimize our data and data pipeline architecture. A successful candidate will demonstrate strong technical and analytical ability across multiple tech stacks and bring a passion for optimizing and building data applications from the ground up.

Responsibilities:

  • Build and deliver investment data technology solutions in support of the Research, Portfolio Management, Trading, Analytics, and Reporting functions.
  • Formulate, design, develop, test, and deliver data technology solutions with a balanced focus on speed and quality.
  • Collaborate with business analysts, product owners, and project managers to develop user stories, estimates, and work plans.
  • Work with minimal supervision, advise business clients and IT management on technology capabilities, and recommend strategies to maximize the benefits of new technologies.
  • Identify, design, and implement changes to data pipelines at every stage, including data ingestion, validation and quality control, integration, storage, management, and delivery (a sketch of one such pipeline step follows this list).
  • Write unit/integration tests, contribute to engineering wiki, and write detailed documentation.
  • Build high-performance, scalable data-transfer toolsets that reliably move datasets between endpoints within established SLAs.
  • Focus on data consistency, refresh rates, and caching requirements while keeping data current across a variety of interfaces.
  • Build and enhance CI/CD pipelines and develop supportable solutions. Participate in code and design reviews.
  • Provide technical troubleshooting and support for production systems.
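To make the pipeline responsibilities above concrete, here is a minimal sketch of a validate-before-load step in Python. The record shape, field names, and quality rules are illustrative assumptions for this posting, not Dimensional's actual schema or tooling.

```python
# A minimal ingest-validate step: split a batch into clean rows and
# human-readable rejection reasons before delivery downstream.
# Field names and rules are illustrative assumptions.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class PriceRecord:
    security_id: str
    as_of: date
    close: float


def validate(records: list[PriceRecord]) -> tuple[list[PriceRecord], list[str]]:
    """Return (clean rows, rejection reasons) for a raw batch."""
    clean, errors = [], []
    for r in records:
        if not r.security_id:
            errors.append(f"{r}: missing security_id")
        elif r.close <= 0:
            errors.append(f"{r}: non-positive close price")
        else:
            clean.append(r)
    return clean, errors


if __name__ == "__main__":
    batch = [
        PriceRecord("AAPL", date(2024, 1, 2), 185.64),
        PriceRecord("", date(2024, 1, 2), 10.0),     # rejected: no id
        PriceRecord("XYZ", date(2024, 1, 2), -1.0),  # rejected: bad price
    ]
    clean, errors = validate(batch)
    print(f"loaded {len(clean)} rows, rejected {len(errors)}")
```

In a real pipeline, the rejected rows and reasons would feed the quality-control and monitoring stages rather than being printed.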
Requirements:

  • Bachelor's degree in engineering, math, computer science, or a related field, or equivalent work experience.
  • 4-5 years of programming experience in Python or an equivalent open-source language.
  • Proficiency in building RESTful APIs and web services (a sketch follows this list).
  • 4-5 years of SQL experience.
  • Proven track record of applying SOLID principles and Domain-Driven Design to drive successful outcomes.
  • Experience in high performance and high availability data applications including expertise in performance optimization and tuning.
  • Experience with automated acceptance testing and ability to write unit-tested, maintainable code.
  • Strong understanding of cybersecurity risks and a demonstrated ability to design and build highly secure applications.
  • Experience working in a dynamic and interactive team environment to build world-class software implementations.
  • Knowledge of best practices and IT operations in an always-up, always-available service.
  • Experience working with both Agile/Scrum and waterfall methodologies with a software development and integration focus.
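As a reference point for the RESTful API requirement above, here is a minimal read-only data-delivery endpoint. FastAPI, the route, and the `Price` model are assumptions for illustration; the posting specifies only Python and RESTful web services.

```python
# A minimal read-only endpoint serving price data. FastAPI is assumed;
# the in-memory dict stands in for the investment data warehouse.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="prices-api")


class Price(BaseModel):
    security_id: str
    close: float


# Stand-in for the warehouse; illustrative data only.
_PRICES = {"AAPL": Price(security_id="AAPL", close=185.64)}


@app.get("/prices/{security_id}", response_model=Price)
def get_price(security_id: str) -> Price:
    price = _PRICES.get(security_id.upper())
    if price is None:
        raise HTTPException(status_code=404, detail="unknown security")
    return price
```

Run locally with `uvicorn prices_api:app --reload` (assuming the file is named `prices_api.py`) and query `GET /prices/AAPL`.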
Preferred Qualifications:

  • Master's degree in engineering, math, computer science, or a related field.
  • Proficiency with NoSQL database implementation and optimization.
  • Ability to work on multiple programming languages and platforms is strongly preferred.
  • Financial services industry knowledge or experience.
  • Experience with Kafka.
  • Experience with Airflow (a DAG sketch follows this list).
  • Experience with PostgreSQL.
  • Experience with Ansible.
  • Experience with Elastic Stack.
  • Experience with RabbitMQ.
  • Experience with Redis.
  • Experience with Docker.
  • Experience with Okta, OAuth2, PlainID.
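Several of the preferred tools above compose naturally into an orchestrated pipeline. Below is a minimal sketch wiring an ingest-validate-publish flow with Airflow's TaskFlow API (Airflow 2.x); the DAG name, schedule, and stubbed task bodies are illustrative assumptions, and the Kafka/PostgreSQL integrations are omitted.

```python
# A minimal Airflow 2.x TaskFlow DAG: ingest -> validate -> publish.
# Task bodies are stubs; real tasks would read vendor feeds and write
# to the warehouse or a message bus.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def nightly_prices():
    @task
    def ingest() -> list[dict]:
        # Pull the raw vendor batch (stubbed here).
        return [{"security_id": "AAPL", "close": 185.64}]

    @task
    def validate(rows: list[dict]) -> list[dict]:
        # Drop rows that fail basic quality checks.
        return [r for r in rows if r["security_id"] and r["close"] > 0]

    @task
    def publish(rows: list[dict]) -> None:
        # Hand off to downstream consumers (stubbed).
        print(f"publishing {len(rows)} rows")

    publish(validate(ingest()))


nightly_prices()
```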