Hiring DevOps Engineer with Data Engineering Experience | Scispot

DevOps Engineer with Data Engineering Experience


Date listed

1 month ago

Employment Type

Full time



About Scispot

Scispot is building the world's first scientific ETL for unstructured, semi-structured, and structured data. The Scispot staging lakehouse connects with popular R&D sources such as Benchling, Quartzy, and lab instruments, and converts their data into machine-readable form to support downstream pipelines.

The Opportunity:

We're searching for a seasoned DevOps Engineer with Data Engineering experience. The right candidate will have the opportunity to grow into a senior architect role as we move quickly and scale our infrastructure. In this role, you'll be instrumental in ensuring the scalability and stability of our systems as we continue to grow. The position balances maintaining robust AWS infrastructure, implementing DevOps practices, and handling ETL tasks.

What You'll Do:

  • Design, manage, and optimize a scalable, secure AWS infrastructure focusing on services such as EC2, ECS, S3, and Lambda.
  • Implement strategies and solutions for scaling and stabilizing infrastructure.
  • Maintain high-availability systems, targeting 99.99% uptime.
  • Drive the adoption of Infrastructure as Code (IaC) practices.
  • Work with pub-sub architectures and RabbitMQ.
  • Develop and maintain ETL processes using open-source tools such as Apache Airflow, Apache NiFi, and Airbyte.
  • Contribute to AI infrastructure, including ML pipelines built with Nextflow.
  • Collaborate with cross-functional teams to translate their needs into technical requirements.

What You'll Need:

  • Proven experience as a DevOps Engineer with a focus on infrastructure management and scalability.
  • Experience with bioinformatics pipelines.
  • Extensive experience with AWS infrastructure including services like EC2, ECS, API Gateway, S3, and Lambda.
  • Strong familiarity with maintaining high-availability systems.
  • Proficiency in Infrastructure as Code (IaC) practices.
  • Experience in a Data Engineering or similar role involving ETL processes.
  • Experience with RabbitMQ and pub-sub architectures.
  • Familiarity with ETL tools, specifically Apache Airflow, Apache NiFi, and Airbyte.
  • Experience building AI infrastructure with ML pipelines, particularly with Nextflow.
  • Experience with Elasticsearch.
  • Familiarity with Java and Spring Boot is a plus.
