Software Engineering III

Equitable
Charlotte, NC · Hybrid

About The Position

Equitable Financial Life Insurance Company seeks a Software Engineering III for its Charlotte, NC location. Duties and qualifications are detailed in the Requirements and Responsibilities sections below.

Requirements

  • Requires a Master’s or foreign equivalent degree in Computer Science, Electrical Engineering, or a related technical field, plus at least 5 years of experience as a Software Engineer or in a related occupation involving the development of Big Data solutions, including design, data ingestion, and pipeline development in a Hadoop DataLake.
  • Experience must include: sourcing and ETL development support to build multiple data products for analytical and actuarial purposes; Hadoop technologies (Sqoop, Python, Databricks); Azure data platform handling; building predictive models that use sentiment scores to forecast market trends and assess their correlation with market movements; HDFS, Hive, Impala, Sqoop, Spark, Python, and Azure; and ETL, Vertica, MapReduce, Spark, Kafka, Hive, Impala, Flume, Storm, ZooKeeper, Java, PL/SQL, Oracle, Teradata, Scala, MySQL, and Eclipse.

Responsibilities

  • Provide Big Data solutions for DataLake design, data ingestion, and processing on a Hadoop cluster using Spark, MapReduce, Hive/HBase, Flume, Kafka, and Syncsort, along with programming languages such as Python and Scala.
  • Work with the Customer Data Product Owner to gather and translate business requirements, and assess their priority and criticality to advise on an order of deliverables that keeps the focus on the customer's key deliverables intact.
  • Work on the Databricks platform to execute and optimize data pipelines end to end.
  • Responsible for data gathering and analysis; systems design and implementation; logical design; detailed design; ensuring data security in the design; and system evaluation, integration, vetting, modification, troubleshooting, and optimization.
  • Serve as subject matter expert (SME) for DataLake infrastructure and services.
  • Maintain current DataLake applications and develop procedures, where necessary, to improve the environment. Comply with all security and audit standards.
  • Provide technical expertise for the development and implementation of DataLake solutions.
  • Liaise with business unit customers and vendors, depending on assignment, and interact with senior IT executives.
  • Responsible for design specifications of one or more large or critical applications or systems.
  • Provide technical, functional and systems design for all work related to a system development project.
  • Lead the process of compiling, analyzing, designing, testing and prioritizing system design components and implementation.
  • Assist with technical testing, ensuring that system and unit tests are performed, and review the test results.
  • Provide production support for new/existing systems of high complexity and scope.
  • Use Linux, Hadoop, Sqoop, Hive, Impala, Tableau, Python, and Databricks to carry out job duties.
© 2024 Teal Labs, Inc