Data Architect, Big Data Engineering, contract, work from home
Experience in:
- Big Data architecture on Google Cloud
- Hadoop ecosystem
- Python, Scala or Java programming
- NoSQL databases (MongoDB, Cassandra, HBase, DynamoDB, Bigtable etc.)
- Big data ingestion tools (Sqoop, Flume, NiFi etc.), distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub etc.)
- Data processing frameworks, e.g. Spark (Core, Streaming, Spark SQL, PySpark), Storm, Flink etc.
- Scalable data models
- Performance tuning and optimization
- Pipelines for data solutions
- Containerization, orchestration and Kubernetes Engine
- Big data cluster security
- Orchestration tools: Oozie, Airflow, Control-M or similar
- Multi-dimensional modeling, e.g. star schema, snowflake schema
- Data governance
#dataarchitect #kubernetes #contractjobs #googlecloudplatform #bigquery #googledatastudio #gcp #aws #hadoop #spark #hbase #dataengineering #dataengineeringjobs #datapipelines #staffworx #recruitmentpartner #collibra #pyspark
This advert was posted by Staffworx Limited – a UK-based recruitment consultancy supporting the global digital, e-commerce, software & consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.
Staffworx operates a referral scheme of £500 or a new iPad for each successfully referred candidate. If you know of someone suitable, please forward their details for consideration.
https://www.cloud.google.com/bigquery https://www.staffworx.co.uk/vacancies/