GCP Cloud Data Engineer
Within DataSight, you will join a team of technology experts from across the business intelligence spectrum. Together, you will help clients harness their data to create insights that help them make informed decisions, improve business outcomes and accelerate their performance.
- Eager to work as a team
- Ready to take on a challenge
- Curious about innovative technologies
- Fond of data
- Familiar with GCP Cloud Storage, BigQuery and Cloud Functions
- Experienced in building data pipelines with streaming and batch ETL
- Experienced in automating builds, deployments, testing and configuration
- Knowledge of programming languages like Java/Scala and Python
- Knowledge of Spark (Scala/Python)
- Docker, Apache Airflow/Beam
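To make the batch-ETL requirement above concrete, here is a minimal sketch of the extract-transform-load pattern in plain Python. All names and records are hypothetical; in a real GCP setup the source and sink would typically be Cloud Storage and BigQuery rather than in-memory lists.

```python
# Minimal batch ETL sketch (illustrative only): extract raw records,
# transform them, and load them into an in-memory "sink".

def extract():
    # Hypothetical raw order records, as they might land in a storage bucket.
    return [
        {"order_id": "1", "amount": "19.99", "country": "be"},
        {"order_id": "2", "amount": "5.00", "country": "nl"},
    ]

def transform(records):
    # Cast types and normalise values before loading.
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "country": r["country"].upper()}
        for r in records
    ]

def load(rows, sink):
    # Append the cleaned rows to the target table.
    sink.extend(rows)

warehouse = []  # stand-in for a warehouse table such as BigQuery
load(transform(extract()), warehouse)
```

The same extract/transform/load split carries over to streaming pipelines, where frameworks like Apache Beam apply the transform step to each element as it arrives.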
Meet your future colleague Bart
On a day-to-day basis, I connect different systems using tools like Fivetran or NiFi, and crunch data with Spark or Kafka transformations. To further unlock a variety of data sources, I enrich the data using dbt or Python. I build a staging area where I store the data in a Data Vault model, making it accessible to data scientists. From there, I create a dimensional data model or a data mart, depending on the reporting requirements.
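The last step of the workflow above can be sketched in a few lines: splitting staged records into a dimension table and a fact table, the core of a dimensional model. The table names, columns and data are hypothetical, chosen only to illustrate the idea of surrogate keys linking facts to dimensions.

```python
# Illustrative sketch: building a tiny dimensional model (a customer
# dimension plus a sales fact table) from staged records.

staged = [
    {"customer": "Alice", "city": "Ghent",  "amount": 120.0},
    {"customer": "Bob",   "city": "Leuven", "amount": 80.0},
    {"customer": "Alice", "city": "Ghent",  "amount": 40.0},
]

# Dimension: one row per distinct customer, with a surrogate key.
dim_customer = {}
for row in staged:
    if row["customer"] not in dim_customer:
        dim_customer[row["customer"]] = {
            "customer_key": len(dim_customer) + 1,
            "name": row["customer"],
            "city": row["city"],
        }

# Fact: one row per sale, referencing the dimension by surrogate key.
fact_sales = [
    {"customer_key": dim_customer[row["customer"]]["customer_key"],
     "amount": row["amount"]}
    for row in staged
]
```

In practice this split is expressed as SQL models in a tool like dbt, but the shape of the result, narrow fact rows keyed to wider dimension rows, is the same.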
What’s in it for you?
- Challenging projects with a variety of technologies
- Continuous (peer) learning
- The opportunity to contribute at a tactical level within our company
- Open and informal company culture
- Great colleagues