GCP DATA ENGINEER
Java Spark Big Data Engineer, excellent rates, work from home UK wide

Experience in many of the following:

• Experience building data pipelines using Hadoop technologies
• Experience with Java 8 or above and Apache Spark
• Apache Hadoop, Kafka, Apache Spark
• GCP BigQuery, Spanner, Pub/Sub, and Kafka
• Strong SQL
• Python (nice to have)
• Understanding of the software development life cycle and experience running a team using Agile
• Google Cloud Platform Data Studio
• Unix/Linux Platform
• Version control tools (Git, GitHub), automated deployment tools

6 months initial, with likely long-term extensions
INSIDE IR35 – this assignment falls within the scope of IR35 legislation and an appropriate umbrella company should be utilised for the duration of the assignment.

#dataengineer #sparkstreaming #apachespark #pyspark #apacheairflow #contractjobs #workfromhome #remote #java #kafka #dataengineering

This advert was posted by Staffworx Limited – a UK-based recruitment consultancy supporting the global e-commerce, software & consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.

Staffworx operate a referral scheme of £500 or a new iPad for each successfully referred candidate. If you know of someone suitable, please forward their details for consideration.

Country United Kingdom
Location Work from home, London HQ
Job Type Contract
Industries IT & Telecommunications
Reference JSswx5341PYSPARK
Start Date ASAP
Duration 6 months initial, long-term programme
Rate/Salary Excellent rates
Visa Requirement Applicants must be eligible to work in the specified location
