Position and missions
Are you passionate about Data and DevOps? Do you have a strong interest in development and data plumbing using the latest cloud technologies? If you are eager to work on Data Lake, Lakehouse, and Data Mesh solutions exploiting huge volumes of data, you should probably keep on reading!
As a Data Engineer, you will use an analytical, data-driven approach to support business cases.
You will work with clients and partners to design data processing systems, data pipelines and deliver insightful analytics while ensuring data integrity. You might also perform builds and automate key components of our infrastructure.
Thanks to experience acquired on projects and continuous training through our Data Academy program, you will have the opportunity to grow your Data Engineering competencies and specialize in either DataOps or Data Warehousing.
In addition, if you have a flair for teaching, we will offer you the possibility to give training in our Data Academy.
What we've done:
- BRIC.BRUSSELS: Building a data platform (Snowflake, DBT, Mulesoft) for the Brussels Region to improve citizens' daily life and embrace tomorrow's city challenges.
- Pharma company: Using public genetics data (Snowflake, microservices, Azure) in research for drug development.
- Metal industry company: Building a data analytics platform centralizing manufacturing data in near real time (Azure Databricks, Azure Data Factory, Power BI).
And many more!
If you recognize yourself in some of the following attributes, we encourage you to apply directly. See you soon!
- Master’s or bachelor’s degree in IT, a related technical field, or equivalent practical experience.
- At least one relevant professional experience in IT projects, preferably data-related.
- A data-oriented mindset, with good knowledge of data architecture and/or data infrastructure.
- Strong SQL skills.
- Fluent in English, ideally with a very good level of either Dutch or French.
- Conceptual and creative thinking.
- Experience with a cloud provider or data processing tools (Databricks, Snowflake, BigQuery, etc.).
- Experience with at least one programming language: Python, Scala, Java, etc.
- Experience working with deployment and orchestration technologies (e.g., Docker, Kubernetes, Terraform, Airflow).
- Understanding of object-oriented programming concepts, data structures, and algorithms.