Data Engineer (Apache Airflow)

As a Data Engineer, you will get the opportunity to work in the beating heart of data at Rabobank. You will work together with people who have an insatiable curiosity about technology, data, and self-development. You will invest heavily in your career. You will make memories.

You can make a difference 

Within the Tribe Data & Analytics you will work at the centre of the data-driven enterprise. The tribe consists of six areas:
•    Global Data Platform 
•    Analytics Platform 
•    Data & Factory Services 
•    Data Science  
•    Business Intelligence 
•    Customer Analytics 
 
Around 40 squads are divided over these areas, and every squad has a dedicated business analyst. We need your help to design, build and run ground-breaking solutions that are as valuable as possible to our 7 million customers.
 
As a Data Engineer within the Global Data Platform (GDP) area, you will work as a member of the squad that develops and maintains the Apache Airflow application for all of Rabobank's engineers working on the data lake. As part of your daily tasks, you will be responsible for:

•    Running, maintaining and extending the Apache Airflow application on Kubernetes
•    Building and running the Apache Airflow scheduling and orchestration tool and its supporting components, such as GitSync, the Windmill API and the notification app.
•    Building new features, such as new workflow sensors and interfaces with Azure Data Factory, Azure Synapse and Azure Databricks (see the sensor sketch after this list).
•    Creating Python-based applications that automate business processes. Examples include a REST API (Python, FastAPI) for creating new user accounts, an Azure Function that sends e-mails through the Microsoft Graph API, and an event-driven Azure Function for code synchronisation between a git repository and a target application (see the API sketch after this list).
•    Implementing everything you do using a CI/CD pipeline which includes static code analysis, static security tests and automated unit and integration tests.
•    Managing the application in the Azure environment using Azure Kubernetes Service, Azure Functions, Azure WebApp and Azure PostgreSQL
•    Monitoring and running your own applications together with your team in a true DevOps fashion
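
As an impression of the sensor work mentioned above, here is a minimal sketch of a custom Apache Airflow sensor, assuming Airflow 2.4+; the FileLandedSensor class, the DAG and the marker-file path are hypothetical illustrations, not Rabobank code:

    import os
    from datetime import datetime

    from airflow import DAG
    from airflow.sensors.base import BaseSensorOperator


    class FileLandedSensor(BaseSensorOperator):
        """Waits until a marker file appears in a (hypothetical) landing zone."""

        def __init__(self, *, path: str, **kwargs):
            super().__init__(**kwargs)
            self.path = path

        def poke(self, context) -> bool:
            # Airflow calls poke() repeatedly; returning True completes the task.
            return os.path.exists(self.path)


    with DAG(dag_id="example_landing_zone",
             start_date=datetime(2023, 1, 1),
             schedule=None) as dag:
        wait_for_file = FileLandedSensor(
            task_id="wait_for_file",
            path="/data/landing/ready.flag",  # hypothetical marker file
            poke_interval=60,                 # check every 60 seconds
            mode="reschedule",                # free the worker slot between checks
        )

And a minimal sketch of the kind of Python/FastAPI automation mentioned above, assuming FastAPI and uvicorn are installed; the endpoint, the NewUser model and the in-memory store are hypothetical stand-ins for a real account-provisioning backend:

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()
    _users: dict[str, str] = {}  # hypothetical stand-in for a real account store


    class NewUser(BaseModel):
        username: str
        email: str


    @app.post("/users", status_code=201)
    def create_user(user: NewUser) -> dict:
        # A real version would call the account-provisioning backend instead.
        _users[user.username] = user.email
        return {"created": user.username}

Such a service would typically be served with uvicorn, e.g. uvicorn main:app.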
 
Requirements 
Above all, we are looking for new colleagues with an insatiable curiosity about data, technology and self-development, at a medior/senior level.

•    Programming experience in Python (object-oriented)
•    Kubernetes experience, either as a developer or system administrator
•    Good knowledge of CI/CD with Azure DevOps, the Azure CLI and/or PowerShell
•    Microsoft Certified: Azure Administrator Associate (AZ-104)
•    At least certified in Azure Fundamentals (AZ-900) and Azure Data Engineer Associate (DP-203)
  
Competences 
•    Strong communication skills  
•    Critical thinker 
•    Open communication   
•    Proactive 
•    Working together  
•    Providing feedback   
•    Willing to develop further in Azure  
•    Strong information/data analysis skills  
•    A customer-focused mindset and a structured way of working 
•    Quick learner 
•    Curiosity 

What do we offer? 
We would love to help you achieve this by focusing firmly on your growth and development, and by investing in an environment where you keep learning every day. We give you the space to innovate and take initiative. In this way, we offer you numerous opportunities to grow, to exceed your expectations, to do the right thing exceptionally well, and therefore to grow as a professional. In addition, with us (based on a 36- or 40-hour working week), you can also expect: 


•    Based on your experience: up to € 6,200 gross per month (scale 9), based on a 40-hour working week
•    A guaranteed thirteenth month's salary and 8% holiday allowance 
•    An extra budget of 11% of your gross salary to be used at your discretion: buy extra holiday hours, add more to your pension savings, or have part of the extra budget paid out 
•    A budget of € 750 to set up your home working space and a monthly net home-working allowance of up to € 40 
•    A personal development budget of € 1,400 
•    A combination of working from home and at the office (hybrid)
•    100% reimbursement of commuting costs if you travel by public transport 
•    A pension scheme to which you contribute 5.5% 
•    168 hours of holiday per year  
 

This is a selection of the terms of employment for a Data Engineer based on a 36/40-hour working week. You can find all terms of employment on rabobank.jobs/en/conditions-of-employment. 

You and the job application process 
Apply to the vacancy for Data Engineer at Rabobank. Any questions about working at Rabobank and the application process? Contact Jonathan Moreno Gonzalez, IT Recruiter, at Jonathan.Moreno.Gonzalez@rabobank.nl.
•    We will hold the interviews through a video call.
•    You can find answers to the most frequently asked questions on rabobank.jobs/nl/veelgestelde-vragen.
•    A security check is part of the process.
•    We respect your privacy.
 

Contract type: Full-time
Field: IT / Data & Analytics
Location: Utrecht (Netherlands)
Workplace: Hybrid
Reference number: JR_00080654
Publication date: 24 January 2023
