BlackRock • posted 7 months ago
$99,750 - $120,000/Yr
Full-time • Entry Level
Remote • Atlanta, GA
Funds, Trusts, and Other Financial Vehicles

At BlackRock, technology has always been at the core of what we do - and today, our technologists continue to shape the future of the industry with their innovative work. We are not only curious but also collaborative and eager to embrace experimentation as a means to solve complex challenges. Here you'll find an environment that promotes working across teams, businesses, regions, and specialties - and a firm committed to supporting your growth as a technologist through curated learning opportunities, tech-specific career paths, and access to experts and leaders around the world.

The Data Engineering team is responsible for building and maintaining a cutting-edge data platform that provides quality data for investors, operations teams, data scientists, and all users of the platform. We are seeking top-tier cloud-native DevOps platform engineers to augment our Enterprise Data Platform team. Our objective is to extend our data lifecycle management practices to cover structured, semi-structured, and unstructured data. This role requires a breadth of individual technical capabilities and competencies, though most important is a willingness and openness to learn new things across multiple technology disciplines. This role is for practitioners, not researchers.

Responsibilities:

  • Work alongside our systems engineers and UI developers to help design and build scalable, automated CI/CD pipelines.
  • Help prove out and productionize infrastructure and tooling to support scalable cloud-based applications.
  • Unlock myriad generative AI/ML use cases for Aladdin Data and, by extension, for BlackRock.
Qualifications:

  • Comfortable reading and writing Python code for data acquisition and ETL/ELT.
  • Experience orchestrating data pipelines with Apache Airflow and/or Argo Workflows.
  • Experience implementing and operating telemetry-based monitoring, alerting, and incident response systems.
  • Experience supporting databases or datastores (e.g., MongoDB, Redis, Cassandra, Ignite, Hadoop, S3, Azure Blob Storage) and messaging/streaming platforms such as NATS or Kafka.
  • Knowledge of the Kubernetes (K8s) APIs with a strong focus on stateful workloads.
  • Understanding of the K8s Operator and Controller Patterns.
  • Templating with Helm and Kustomize.
  • Comfortable reading and writing Go code for integrating and extending K8s, Operators, and controllers.
  • GitOps with ArgoCD and CIOps via Ansible and Terraform.
  • Comfortable building atop K8s-native frameworks, including service mesh (Istio), certificate and secrets management (cert-manager, HashiCorp Vault), log management (Splunk), and observability (Prometheus, Grafana, Alertmanager).
  • Experience creating and evolving CI/CD pipelines with GitLab or GitHub following GitOps principles.
  • Experience with NLP coding tasks such as tokenization, chunking, tagging, embedding, and indexing to support subsequent retrieval and enrichment.
  • Experience with basic prompt engineering, LLM fine-tuning, and chatbot implementations in modern Python SDKs such as LangChain and/or Transformers.
  • Experience moving AI-based data products and/or agentic systems to production.
Benefits:

  • Strong retirement plan.
  • Tuition reimbursement.
  • Comprehensive healthcare.
  • Support for working parents.
  • Flexible Time Off (FTO).
© 2024 Teal Labs, Inc