eSimplicity · Posted 4 months ago
Columbia, MD
51-100 employees

eSimplicity is a modern digital services company that works across government, partnering with our clients to improve the lives and ensure the security of all Americans, from soldiers and veterans to kids and the elderly, and to defend national interests on the battlefield. Our engineers, designers, and strategists cut through complexity to create intuitive products and services that equip Federal agencies with solutions to courageously transform today for a better tomorrow for all Americans.

Responsibilities:
  • Develop, expand, and optimize data and data pipeline architecture.
  • Optimize data flow and collection for cross-functional teams.
  • Support software developers, database architects, data analysts and data scientists on data initiatives.
  • Ensure optimal data delivery architecture is consistent throughout ongoing projects.
  • Create new pipelines and maintain existing ones.
  • Update existing Extract, Transform, Load (ETL) processes and create new ETL features.
  • Build proofs of concept (PoCs) with Redshift Spectrum, Databricks, Amazon EMR, SageMaker, and similar services.
  • Engineer large datasets, applying data augmentation, data quality analysis, data analytics, data profiling, and data maturity models.
  • Operate large-scale data processing pipelines and resolve business and technical issues.
  • Assemble large, complex sets of data that meet non-functional and functional business requirements.
  • Identify, design, and implement internal process improvements.
  • Build required infrastructure for optimal extraction, transformation and loading of data from various data sources using AWS and SQL technologies.
  • Build analytical tools to utilize the data pipeline, providing actionable insight into key business performance metrics.
  • Work with stakeholders across data, design, product, and government teams.
  • Write unit and integration tests for all data processing code.
  • Work with DevOps engineers on continuous integration (CI), continuous delivery (CD), and infrastructure as code (IaC).
  • Read specs and translate them into code and design documents.
  • Perform code reviews and develop processes for improving code quality.
  • Perform other duties as assigned.

Required Qualifications:
  • All candidates must pass a Public Trust clearance through the U.S. Federal Government.
  • Minimum of 8 years of data engineering or hands-on software development experience, with at least 4 of those years using Python, Java, and cloud technologies for data pipelining.
  • A Bachelor’s degree in Computer Science, Information Systems, Engineering, Business, or other related scientific or technical discipline, or 10 years of general IT experience with at least 8 years of specialized experience.
  • Expert data pipeline builder and data wrangler who enjoys optimizing data systems.
  • Self-sufficient and comfortable supporting the data needs of multiple teams, systems, and products.
  • Experienced in designing data architecture for shared services, scalability, and performance.
  • Experienced in designing data services, including APIs, metadata, and data catalogs.
  • Experienced in data governance processes for ingesting, curating, and sharing data.
  • Ability to build and optimize data sets, big data pipelines and architectures.
  • Ability to perform root cause analysis on processes and data.
  • Excellent analytical skills for working with unstructured datasets.
  • Ability to build processes that support data transformation, workload management, data structures, dependency management, and metadata.
  • Demonstrated understanding and experience using big data tools like Spark and Hadoop, relational databases, workflow management tools, and AWS cloud services.
  • Flexible and willing to accept a change in priorities as necessary.
  • Ability to work in a fast-paced, team-oriented environment.
  • Experience with Agile methodologies, including test-driven development.
  • Experience with GitHub and Atlassian Jira/Confluence.
  • Excellent command of written and spoken English.

Preferred Qualifications:
  • Federal Government contracting work experience.
  • Certifications such as Databricks Certification, Google Professional Data Engineer, or IBM Certified Data Engineer – Big Data.
  • Centers for Medicare & Medicaid Services (CMS) or health care industry experience.
  • Experience with healthcare quality data including Medicaid and CHIP provider data, beneficiary data, claims data, and quality measure data.

Benefits:
  • Highly competitive salaries.
  • Full healthcare benefits.