Position and missions
As a Data Architect, you will join B&D's Data Engineering & Architecture Faction. Your strong development and data modeling/architecture skills will allow you to work on data pipeline implementation, storage, and data crunching and manipulation.
We are a knowledge company operating at the forefront of the Cloud revolution. Through our Data Academy you have access to numerous self-paced online courses and classroom training sessions. We encourage you to learn and get certified in technologies that are relevant for you and our clients.
Besides data modeling, which should be second nature to you, you will source, capture, transform and prepare data using the latest technologies. Some examples are Kafka, Spark, DBT, Snowflake, BigQuery, Synapse, Databricks, Airflow, Elasticsearch… and many others!
What we've done:
- BRIC.BRUSSELS: Building a data platform (Snowflake, DBT, MuleSoft) for the Brussels Region to improve citizens' daily life and embrace tomorrow's city challenges.
- Pharma company: Using public genetics data (Snowflake, microservices, Azure) in research for drug development.
- Metal industry company: Building a data analytics platform that centralizes manufacturing data in near real time (Azure Databricks, Azure Data Factory, Power BI).
And many more!
Profile
We challenge you to apply right away if you recognize yourself in some of the following attributes. See you soon!
Degree in Computer Science, Engineering or any related field
At least 4 years of professional experience with data-related ICT projects
SQL holds no secrets for you
Excellent data modeling skills (Kimball, normalization)
Experience with at least one Cloud platform (AWS, Azure, Google Cloud)
Experience with Cloud data warehouses (Snowflake, BigQuery, Synapse Analytics, Redshift)
Able to define a vision with the client and guide them in becoming a data-driven company
Able to coach Data Engineers and guide them in the execution of various tasks
Experience with at least one programming language: Python, Scala, Java, …
Knowledge of continuous development/deployment (CI/CD) concepts & technologies (Git, Azure DevOps, GitLab, ...)
Knowledge of Unix environments
You are fluent in English and have good communication skills
Conceptual and creative thinking
Nice to have
Knowledge of Docker and container orchestration platforms (Kubernetes, OpenShift, AKS, GKE, ...)
Experience with the Hadoop ecosystem (Spark, Databricks, Kafka, HDFS, Hive, HBase, …)
Experience with microservices application development
Knowledge of scripting languages such as Bash, PowerShell, …
Experience with NoSQL databases (MongoDB, Cassandra, Neo4j)
Experience with the Data Vault 2.0 modeling methodology
Experience with or interest in Data Mesh
Additional talents? Strange hobby? Weird sense of humor? Our data heroes are just like you: they share respect, modesty and excellence as key values, and they know how to have fun at work and after.