Data Architect

Business & Decision, an Orange Business Services affiliate since 2018, is looking for consultants to reinforce its Data Intelligence Chapter. Are you passionate about data? Do you have a strong interest in data modeling, ETL/ELT scripting, and transforming data with the newest cloud or on-premises technologies? If you are eager to master the organization, storage and exploitation of petabytes of data, you should probably keep on reading!

Position and missions

As a Data Architect, you will join B&D's Data Engineering & Architecture Faction. Your strong development and data modeling skills will allow you to work on data pipeline implementation, storage, and data crunching and manipulation.

We are a knowledge company operating at the forefront of the cloud revolution. Through our knowledge center you have access to numerous self-paced online courses and classroom training sessions. We encourage you to learn and get certified in the technologies that are relevant for you and our clients.

Besides data modeling, which is second nature to you, you will source, capture, transform and prepare data using the newest technologies. Some examples are Kafka, Spark, dbt, Snowflake, BigQuery, Synapse, Databricks, Airflow, Elasticsearch… and many others!

Profile

We challenge you to apply directly if you recognize yourself in some of the following attributes. See you soon!

  • Master’s degree in Computer Science, Engineering or any related field 
  • Excellent data modeling skills (Kimball, normalization)
  • Strong interest in data manipulation (storage, crunching, pipeline implementation)
  • SQL holds no secrets for you
  • At least 4 years of professional experience with data-related ICT projects
  • Strong development experience in Java, Scala, Python
  • Conceptual knowledge of continuous integration / continuous deployment concepts & technologies (Git, Azure DevOps, GitLab, CircleCI, ...)
  • Knowledge of Unix environments
  • Good communication skills 
  • Conceptual and creative thinking  

Nice to have

  • Experience with cloud platforms (AWS, Azure, Google Cloud)
  • Experience with Cloud data warehouses (Snowflake, BigQuery, Synapse Analytics, Redshift)
  • Experience with Hadoop ecosystem (Spark, Kafka, HDFS, Hive, HBase, …)
  • Knowledge of Docker and orchestration platforms (Kubernetes, OpenShift, AKS, GKE, ...)
  • Experience with microservices application development
  • Knowledge of scripting languages such as Bash, PowerShell…
  • Experience with NoSQL databases (MongoDB, Cassandra, Neo4j)
  • Experience with the Data Vault 2.0 modeling methodology

Additional talents? A strange hobby? Weird humor? Our data heroes are just like you: they share respect, modesty and excellence as key values, and they know how to have fun at work and afterwards.