BlackRock • Posted 8 months ago
Full-time • Entry Level
Remote • Atlanta, GA
Funds, Trusts, and Other Financial Vehicles

When BlackRock was founded in 1988, its founders envisioned a company that combined the best of financial services with cutting-edge technology: one that would provide financial services to clients as well as technology services to other financial firms. The result of their vision is Aladdin, our industry-leading, end-to-end investment management platform.

Data is at the heart of Aladdin and of everything we do, and increasingly, our ability to consume, store, analyze, and gain insight from data is a key component of our competitive advantage. The Data Engineering team is responsible for the data ecosystem within BlackRock: we build and maintain a cutting-edge data platform that provides quality data for investors, operations teams, data scientists, and all other users of the platform. We engineer high-performance data pipelines, provide a fabric to discover and consume data, and continually evolve our data storage capabilities. We believe in writing small, testable code with a focus on innovation. We are committed to open source, and we regularly contribute our work back to the community.

We are seeking top-tier Cloud Native DevOps Platform Engineers to augment our Enterprise Data Platform team. Our objective is to extend our data lifecycle management practices to cover structured, semi-structured, and unstructured data. This role requires a breadth of individual technical capabilities and competencies; most important, though, is a willingness and openness to learn new things across multiple technology disciplines. This role is for practitioners, not researchers.

  • Work alongside our systems engineers and UI developers to help design and build scalable, automated CI/CD pipelines.
  • Help prove out and productionize infrastructure and tooling to support scalable cloud-based applications.
  • Unlock myriad generative AI/ML use cases for Aladdin Data and, by extension, for BlackRock.
  • Comfortable reading and writing Python code for data acquisition and ETL/ELT.
  • Experience orchestrating data pipelines with Airflow and/or Argo Workflows.
  • Experience implementing and operating telemetry-based monitoring, alerting, and incident response systems.
  • Experience supporting databases and datastores (e.g., MongoDB, Redis, Cassandra, Ignite, Hadoop, S3, Azure Blob Storage) and messaging/streaming platforms such as NATS or Kafka.
  • Knowledge of the Kubernetes (K8s) APIs with a strong focus on stateful workloads.
  • Understanding of the K8s Operator and Controller Patterns.
  • Templating with Helm and Kustomize.
  • Comfortable reading and writing Go code for integrating with and extending K8s via Operators and controllers.
  • GitOps with ArgoCD and CIOps via Ansible and Terraform.
  • Comfortable building atop K8s native frameworks including service mesh (Istio), secrets management (cert-manager, HashiCorp Vault), log management (Splunk), observability (Prometheus, Grafana, AlertManager).
  • Experience creating and evolving CI/CD pipelines with GitLab or GitHub following GitOps principles.
  • Experience with NLP coding tasks like tokenization, chunking, tagging, embedding, and indexing supporting subsequent retrieval and enrichment.
  • Experience with basic prompt engineering, LLM fine-tuning, and chatbot implementations in modern Python SDKs such as LangChain and/or Transformers.
  • Experience moving AI-based data products and/or agentic systems to production.
  • Strong retirement plan.
  • Tuition reimbursement.
  • Comprehensive healthcare.
  • Support for working parents.
  • Flexible Time Off (FTO).