ICW Group · Posted 2 days ago
$121,625 - $217,711/Yr
Full-time • Mid Level
Hybrid • San Diego, CA
1,001-5,000 employees

Are you looking to make an impactful difference in your work, yourself, and your community? Why settle for just a job when you can land a career? At ICW Group, we are hiring team members who are ready to use their skills, curiosity, and drive to be part of our journey as we strive to transform the insurance carrier space. We're proud to have been in business for over 50 years, and it's change agents like you who will help us continue to deliver our mission to create the best insurance experience possible.

Headquartered in San Diego with regional offices throughout the United States, ICW Group has been named a Top 50 performing P&C organization for ten consecutive years, offering the stability of a large, profitable, and growing company combined with a focus on all things people. It's our team members who make us an employer of choice and the vibrant company we are today. We strive to make both our internal and external communities better every day! Learn more about why you want to be here!

PURPOSE OF THE JOB

The Data Engineer III will design, develop, and implement data pipelines, data integration, and data storage solutions such as data warehouses, data lakes, and relational and non-relational databases. The role will partner closely with IT Managers, Enterprise Business Intelligence, Data Governance, and Data Science teams to solve business-significant data problems and enable data-driven decision-making, automation, and optimization. In the process, the role will have the opportunity to act as a Data Advisory member for the Actuarial, Business Intelligence (BI), Data Science, and Data Engineering teams, as well as a collaborator and contributor to Enterprise Data Architecture. Data is central to ICW Group's business strategy and digital evolution, and the Data Engineer ensures optimal data delivery architecture is consistent across all projects.

ESSENTIAL DUTIES AND RESPONSIBILITIES

  • Builds data solutions that ensure data integrity and information usability for enterprise-wide digital solutions and decision making.
  • Leads the design and build of scalable data pipelines and solutions (batch and/or streaming) that make the best use of traditional and cloud platforms (AWS or similar) by understanding the business, technology, and data landscape including real time data processing.
  • Analyzes complex data elements, systems, and data flows; develops conceptual, logical, and physical data models; and verifies and implements ETL/ELT mappings and transformation logic.
  • Designs, supports and peer reviews the data models and schemas for new and existing data sources for the data warehouse.
  • Conducts unit, integration, and system tests on our data sources to validate data against source systems, and continuously optimize performance to improve query speed and reduce cost.
  • Builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using ‘big data’ technologies and tools.
  • Leads data initiatives to ensure pipelines are reliable, efficient, testable, and maintainable.
  • Maintains a strong understanding of the current landscape and proactively leads the analysis of the current environment to detect data deficiencies, gaps, and opportunities.
  • Drives the improvement of engineering team processes via data architecture, engineering, test, and operational excellence best practices.
  • Collaborates in implementing components of data strategy, such as Master Data Management (MDM) and data virtualization.
  • Partners with various teams in delivering overall data solutions.
  • Works closely with BI and Data Science teams in implementing various data streams.
  • Partners with Enterprise Architecture, Technology, and Project teams to ensure consistency of solutions approach while maintaining data governance requirements.
  • Contributes to data governance and data quality best practices including design reviews, unit testing, code reviews, and continuous integration and deployment.
  • Collaborates with data architects to ensure the output of the physical models meets required needs, e.g., collaborates on data definition, data structure, data content, and data usage.
  • Evaluates and conducts proofs of concept (POCs) on new technologies to assess their fit in the next-generation data platform ecosystem.
  • Collaborates with the Enterprise Architecture team to drive tooling and standards to improve the productivity and quality of output for data engineers across the company.
QUALIFICATIONS

  • Bachelor's degree in Computer Science, Applied Mathematics, Engineering, or another technology-related field required, or an equivalent combination of education and experience.
  • Minimum 6 years of experience in a data integration (Cloud/Traditional) engineering related role required.
  • Experience with data practices (security, data management and governance) preferred.
  • Expertise in database and data warehouse design using MS SQL Server, PostgreSQL, etc.
  • Knowledge of modeling and designing database schemas for read and write performance.
  • Working knowledge of API or stream-based data extraction processes.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Background in data modeling and performance tuning in relational and NoSQL databases.
  • Experience with AWS technologies like Redshift, S3, EC2, Glue, EMR, Kinesis, Lambda, DynamoDB, etc.
  • Experience with modern data architectures and modern data platforms like Snowflake, Databricks etc.
  • Experience with data technologies such as Hadoop, Spark, Kafka, Spark Streaming and Kafka Streams, Python, Scala, Talend, etc.
  • Knowledge of and experience with data movement tools such as SSIS, Profisee, Alteryx, and Informatica.
  • Working knowledge of multiple data management domains such as data modeling, integration, warehousing, data quality, security, and governance.
  • A self-starter mentality that thrives in a rapidly changing, fast-paced environment and tolerates ambiguity while demonstrating problem-solving with limited supervision.
  • Strong analytical and time management skills.
  • Self-motivated and able to handle tasks with minimal supervision.
  • Must be organized, detail oriented, and able to multi-task.
  • Ability to work well under pressure and deliver results with tight deadlines and under changing priorities.
  • Ability to collaborate across multiple teams and offer value-added solutions to meet objectives.
  • Ability to motivate and inspire a team for maximum performance.
  • Strong verbal and written communication skills.
WORK ENVIRONMENT AND PHYSICAL DEMANDS

  • This job operates in a professional office environment.
  • While performing the duties of this job, the employee is regularly required to talk or hear.
  • The employee frequently will sit, stand, walk, and bend during working hours.
  • Requires manual and finger dexterity and eye-hand coordination.
  • Required to lift and carry relatively light materials.
  • Requires normal or corrected vision and hearing corrected to a normal range.
  • Ability to work additional hours, as required.
  • This position operates in an office environment and requires the frequent use of a computer, telephone, copier and other standard office equipment.
  • We are currently not offering employment sponsorship for this opportunity.

PREFERRED QUALIFICATIONS

  • Experience in operations research, machine learning, or optimization a plus.
  • Insurance experience a plus.
  • Data architecture or data engineering related certification strongly desired.
  • AWS Cloud Practitioner or more advanced AWS certification preferred.
  • DAMA certifications preferred.
BENEFITS

  • We offer a competitive benefits package, with generous medical, dental, and vision plans as well as 401(k) retirement plans and a company match
  • Bonus potential for all positions
  • Paid Time Off with an accrual rate of 5.23 hours per pay period (equal to 17 days per year)
  • 11 paid holidays throughout the calendar year
  • Want to continue learning? We’ll support you 100%