Data Engineer

| Diemen

Date listed

2 weeks ago

Employment Type

Full time

Found on:

Github

Everyday people employed through Randstad make sure your luggage gets onto the plane, your takeaway into your kitchen and that our most vulnerable are cared for. Every time we place someone into a new role, they make an impact on our society, and we make an impact on their future and their employer's. As an increasingly data-driven organisation, we see data as a vital part of delivering on the promises we make to our candidates and customers.

Randstad, Yacht and Tempo-Team possess an enormous quantity of data: the characteristics of millions of candidates and business data on clients, prospects and our own organisation. Your team, DataHub, is responsible for providing the basis of our data infrastructure. Our data engineers work on a self-service data platform, making sure our data makes its way from a vast array of sources to the right place, enabling our Data Scientists and Operational Insights teams to, for example, build AI solutions.


The DataHub team is an agile team of 8 data engineers. The data lake we maintain is partly in Redshift and partly in S3. We created a Django-based metadata catalog that serves both as a portal for monitoring the data and as a place to provide services to our consumers.

For general usage we offer data subscriptions: scheduled unloads to a project space on S3. Furthermore, we offer tools for working with machine learning models using SageMaker.
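As an illustration, such a scheduled unload can boil down to generating a Redshift UNLOAD statement targeting the subscriber's S3 project space. A minimal sketch, in which the table, bucket and IAM role names are purely hypothetical:

```python
def build_unload_sql(table: str, project: str, run_date: str,
                     bucket: str = "datahub-projects",
                     iam_role: str = "arn:aws:iam::123456789012:role/unload") -> str:
    """Build a Redshift UNLOAD statement that exports one table snapshot
    to a project-specific S3 prefix, partitioned by run date."""
    s3_path = f"s3://{bucket}/{project}/{table}/{run_date}/"
    return (
        f"UNLOAD ('SELECT * FROM {table}') "
        f"TO '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT PARQUET ALLOWOVERWRITE"
    )
```

A scheduler then only has to run the generated statement against Redshift once per subscription interval.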

The Data Engineer will design and set up infrastructure on AWS for the platform's expanding services and develop Airflow DAGs that represent our data pipelines. Most of our coding is done in Python.
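Conceptually, each such pipeline is a DAG: a set of tasks plus an execution order. A stdlib-only sketch of that idea, with made-up task names (this is not Airflow's API; Airflow adds scheduling, retries and monitoring on top):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each key runs after the tasks in its dependency set, mirroring how a
# pipeline orders its extract -> transform -> load steps.
pipeline = {
    "extract_candidates": set(),
    "extract_clients": set(),
    "transform": {"extract_candidates", "extract_clients"},
    "load_to_redshift": {"transform"},
}

def run_order(dag: dict) -> list:
    """Return one valid execution order for the task graph,
    with every task placed after all of its dependencies."""
    return list(TopologicalSorter(dag).static_order())
```

Running `run_order(pipeline)` yields both extract tasks first, then the transform, then the load.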


You will be part of an agile Data Engineer team and play a vital role in the design and development of a cloud-based data platform.

You will build and manage the DataHub frontend, which includes:

  • A data catalog;
  • DataHub users and projects management;
  • Data subscriptions;
  • etc.

Develop ways to improve self-service data consumption and data publishing:

  • Build and manage ETL pipelines in Airflow, which are responsible for ingesting the data and making it available to users;
  • Develop standard ways to deliver data in the DataHub;
  • Develop CI/CD pipelines for data-consuming teams so they can develop their products;
  • etc.

The Data Engineer will be responsible for producing high-quality code and reusable components. Using containerization, CI/CD and other automation technologies, you will create a backend that offers high availability and scalability while remaining easily deployable, manageable and secure.

Together with the rest of the team you will be involved in the full product development process, from design and implementation to testing, documentation and automated deployment.

Respond to and resolve operational incidents, performing root cause analysis and managing the changes required to prevent future occurrences.

In this team you will have a wide range of responsibilities and should be willing to adapt to many different challenges.

Discuss requirements and future improvements with the users of the platform, but also bring them proposals on how to use it.

Manage and develop our data persistence environments (data lake, storage, etc.) to ensure that data is properly available to users and secure.

Monitor systems for uptime and performance.


  • Python;
  • ECS, Docker;
  • CloudFormation, Jenkins;
  • Airflow;
  • AWS (S3, EC2, ECS, Lambda, RDS, Redshift, EMR/Spark, Athena, Glue).


  • Our small, agile teams are just that: agile. You should be able to adapt to a variety of challenges and like quick feedback loops;
  • Experience working with our tech stack, including Python, Linux, AWS cloud, Jenkins and Git;
  • Experience with scripting/automation of tasks;
  • Experience with common devops and CI/CD practices to make your own work as easy as possible and guarantee the quality of our products;
  • Be comfortable working in test-driven development and know what this means;
  • Like working in a team, but take responsibility for your own tasks;
  • If you really want to impress us, you can do so with experience in: containerization platforms like ECS, EKS and Docker; Redshift; AWS serverless services (Lambda, API Gateway, SNS); SQL databases; Django; Airflow.


  • Plenty of training and development opportunities within the group. A significant share of our employees have held several roles in their years in the business, with RGN giving you the tools you need to challenge and develop yourself;
  • A monthly salary between €2,666 and €5,406, depending on your experience;
  • 8.5% holiday allowance;
  • A generous monthly benefit budget on top of your salary and holiday pay that you can choose to spend on extra time off, perks such as a bike, tablet or gym subscription, or simply have paid out;
  • 25 days of holiday with the option to buy an additional 25 days off;
  • A generous sabbatical program;
  • A good mobility scheme, laptop and everything you need to perform your job well;
  • An attractive bonus scheme and the option to earn an outperformance bonus twice per year.

Does this sound like the right next step for you? Fantastic! We are looking forward to your application.


Findwork Copyright © 2021

