rockITdata • Posted about 1 month ago
Full-time • Mid Level
Remote • Arlington, VA
101-250 employees

Driven by Innovation and built on Trust, rockITdata is a unique SDVOSB services company that partners with leading commercial healthcare and life sciences organizations on cutting-edge innovations such as AI, automation, and data transformation. We then bring those commercially tested solutions to government entities to deliver predictable, measurable impact for the American taxpayer and consumer.

We are seeking a talented and experienced Full Stack Data Engineer to join our team. The ideal candidate will have a strong background in both data engineering and software development, with expertise in building end-to-end data solutions. This role offers an exciting opportunity to work on diverse data projects and contribute to the development of innovative data-driven applications. This is a remote position.

Responsibilities:

  • Data Ingestion and Integration:
      • Design and implement scalable data ingestion pipelines to efficiently collect and process data from various sources.
      • Integrate data from different systems and platforms to create unified datasets for analysis and reporting.
  • Data Storage and Management:
      • Develop and maintain data storage solutions such as data lakes, data warehouses, and NoSQL databases.
      • Optimize data storage and retrieval mechanisms for performance, scalability, and cost-effectiveness.
  • Data Processing and Transformation:
      • Implement data processing workflows for cleaning, transforming, and enriching raw data into usable formats.
      • Apply data transformation techniques such as ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes.
  • Data Modeling and Optimization:
      • Design and implement data models to support analytical and reporting requirements.
      • Optimize data models for query performance, data integrity, and storage efficiency.
  • Software Development and Integration:
      • Build software applications and APIs to expose data services and functionality to other systems and applications.
      • Integrate data engineering workflows with existing software systems and platforms.
  • Monitoring and Maintenance:
      • Establish monitoring and alerting mechanisms to track the health and performance of data pipelines and systems.
      • Conduct regular maintenance activities to ensure the reliability, availability, and scalability of data infrastructure.
  • Documentation and Collaboration:
      • Document data engineering processes, architectures, and solutions to facilitate knowledge sharing and collaboration.
      • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand requirements and deliver solutions.
Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • Proficiency in programming languages such as Python, Java, or Scala for data engineering and software development.
  • Strong understanding of database concepts, data modeling techniques, and SQL programming.
  • Hands-on experience with cloud platforms such as AWS, Azure, or GCP for building and deploying data solutions.
  • Knowledge of data warehousing concepts and technologies (e.g., Redshift, BigQuery, Snowflake).
  • Familiarity with version control systems (e.g., Git) and software development best practices (e.g., Agile, CI/CD).
  • Experience building solutions for commercial clients in the Pharma, Biotech, CPG, Retail, or Manufacturing industries.
  • Experience with containerization technologies such as Docker and orchestration tools like Kubernetes.
  • Knowledge of streaming data processing frameworks (e.g., Apache Flink, Apache Kafka Streams).
  • Familiarity with data governance and security practices for protecting sensitive data.
  • Strong problem-solving skills and the ability to troubleshoot complex data engineering issues.
  • Excellent communication skills and the ability to collaborate effectively in a team environment.