Ness Digital Engineering (India) Private Limited · Posted 3 months ago
$150,000 - $170,000/Yr
Full-time • Senior
1,001-5,000 employees

Ness is a full-lifecycle digital engineering firm offering digital advisory through scaled engineering services. Combining our core competence in engineering with the latest in digital strategy and technology, we seamlessly manage Digital Transformation journeys from strategy through execution to help businesses thrive in the digital economy. As your tech partner, we help engineer your company’s future with cloud and data. For more information, visit www.ness.com.

We are problem-solvers, architects, strategists, implementers, and lifelong learners. We collaborate with each other and with our clients to help them meet their short- and long-term technology goals. Our culture is open, transparent, challenging, and fun. We hire smart self-starters who thrive in an open-ended environment, figure out what needs to be done, and take ownership of delivering quality results.

Responsibilities:

  • Design and implement data lakehouse solutions on AWS using Medallion Architecture (Bronze/Silver/Gold layers), as sketched below this list.
  • Build and optimize real-time and batch data pipelines leveraging Apache Spark, Kafka, and AWS Glue/EMR.
  • Architect storage and processing layers using Parquet and Iceberg for schema evolution, partitioning, and performance optimization.
  • Integrate AWS data services (S3, Redshift, Lake Formation, Kinesis, Lambda, DynamoDB) into enterprise solutions.
  • Ensure data governance, lineage, cataloging, and security compliance in line with financial regulations (Basel III, MiFID II, Dodd-Frank).
  • Partner with business stakeholders (trading, risk, compliance) to translate requirements into technical architecture.
  • Provide technical leadership and guidance to engineering teams.
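
As referenced above, here is a minimal PySpark sketch of one Bronze-to-Silver promotion under a Medallion Architecture, writing an Iceberg table on S3. The bucket, schema, table, and catalog names are illustrative assumptions rather than details of this role, and it presumes the Iceberg Spark runtime and AWS Glue catalog integration are available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative sketch only: paths, table names, and catalog settings are
# assumptions, not values taken from this posting.
spark = (
    SparkSession.builder
    .appName("medallion-bronze-to-silver")
    # Register an Iceberg catalog backed by AWS Glue (assumed setup; requires
    # the iceberg-spark-runtime and iceberg-aws bundles on the classpath).
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lakehouse", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lakehouse.catalog-impl",
            "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lakehouse.warehouse", "s3://example-lakehouse/warehouse")
    .getOrCreate()
)

# Bronze: raw trade events landed as Parquet on S3 (hypothetical path and schema).
bronze = spark.read.parquet("s3://example-lakehouse/bronze/trades/")

# Silver: deduplicate, enforce types, and derive a partition column.
silver = (
    bronze
    .dropDuplicates(["trade_id"])
    .withColumn("trade_ts", F.to_timestamp("trade_ts"))
    .withColumn("trade_date", F.to_date("trade_ts"))
)

# Write as an Iceberg table so schema evolution and partition changes are
# handled by the table format rather than by S3 directory layout.
(
    silver.writeTo("lakehouse.silver.trades")
    .partitionedBy(F.col("trade_date"))
    .createOrReplace()
)
```

The Iceberg table format is what supports the schema evolution and partitioning points above; the same pattern extends from Silver to curated Gold tables.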
Requirements:

  • Strong hands-on skills in AWS Data Services (S3, Redshift, Glue, EMR, Kinesis, Lake Formation, DynamoDB).
  • Expertise in Apache Kafka (event streaming) and Apache Spark (batch and streaming); a streaming-ingestion sketch follows this list.
  • Proficiency in Python for data engineering and automation.
  • Strong knowledge of Parquet, Iceberg, and Medallion Architecture.
  • Experience with trading systems, market data feeds, risk analytics, and regulatory reporting.
  • Familiarity with time-series data, reference/master data, and real-time analytics.
  • Exposure to Delta Lake, DBT, Databricks, or Snowflake.
  • AWS Certifications (Solutions Architect – Professional, Data Analytics Specialty).
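
A hedged sketch of the kind of real-time ingestion the Kafka and Spark items above imply: Spark Structured Streaming reading a market-data topic and landing it as partitioned Parquet in a Bronze layer on S3. The broker address, topic, tick schema, and paths are placeholders, not details from this posting, and the job assumes the spark-sql-kafka connector package is available.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("market-data-ingest").getOrCreate()

# Assumed shape of a market-data tick; a real feed's schema would differ.
tick_schema = StructType([
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "market-data.ticks")            # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka values arrive as bytes; parse the JSON payload into typed columns.
ticks = (
    raw.select(F.from_json(F.col("value").cast("string"), tick_schema).alias("t"))
    .select("t.*")
)

# Land the stream as partitioned Parquet in the Bronze layer on S3.
query = (
    ticks.writeStream
    .format("parquet")
    .option("path", "s3://example-lakehouse/bronze/market_data/")
    .option("checkpointLocation", "s3://example-lakehouse/checkpoints/market_data/")
    .partitionBy("symbol")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what lets the stream recover after restarts; in a lakehouse setup this Bronze output would then be promoted to Silver as in the earlier sketch.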
What We Offer:

  • Exciting and challenging projects across a diverse range of industries.
  • Opportunity to collaborate with a group of forward-thinking, capable partners around the globe.