Java Spark Big Data Engineer, excellent rates, work from home UK wide – new workstreams on digital banking Google Cloud transformation programme
Experience in many of the following:
 - Building data pipelines using Hadoop technologies
 - Java 8 or above and Apache Spark
 - Apache Hadoop, Kafka, Apache Spark
 - GCP BigQuery, Spanner, Pub/Sub, and Kafka
 - Strong SQL
 - Python (nice to have)
 - Software development life cycle and experience running an Agile team
 - Google Cloud Platform Data Studio
 - Unix/Linux platforms
 - Version control tools (Git, GitHub), automated deployment tools
 
6-month initial contract, likely long-term extensions
INSIDE IR35 – this assignment falls within the scope of IR35 legislation and an appropriate umbrella company should be used for this assignment.
#applyatstaffworx https://www.staffworx.co.uk/job/java-spark-big-data-engineer/
#dataengineer #sparkstreaming #apachespark #pyspark #apacheairflow #contractjobs #workfromhome #remote #java #kafka #dataengineering #hadoop #python #bigdata
Other suitable opportunities are available at www.staffworx.co.uk/vacancies
This advert was posted by Staffworx Limited – a UK based recruitment consultancy supporting the global E-commerce, software & consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.
Staffworx operates a referral scheme of £500 or a new iPad for each successfully referred candidate; if you know of someone suitable, please forward their details for consideration.