PySpark Big Data Engineer

  • Contract
  • London
  • Solutions consulting
  • 800/day
  • Ref: 5156pyspark

PySpark Big Data Engineer – contract, work from home, PySpark experience required

Experience in many of the following:

  • PySpark data engineering experience
  • Python, Spark, SQL and NoSQL experience
  • Data engineering technologies
  • Big Data, Apache Airflow workflow automation for data pipelines
  • AWS ideally, or GCP/Azure
  • AWS services, including AWS EMR clusters
  • DevOps culture: Jenkins, Kubernetes (k8s), Terraform, Helm, Docker
  • TDD/BDD methodologies, Git
  • Data ingestion and data pipeline development (Apache Airflow workflow automation)

6 months initially, with likely long-term extensions

INSIDE IR35 – this assignment falls within the scope of IR35 legislation, and an appropriate umbrella company should be used for its duration.

#dataengineer #sparkstreaming #apachespark #pyspark #apacheairflow #contractjobs #workfromhome #remote

This advert was posted by Staffworx Limited – a UK based recruitment consultancy supporting the global E-commerce, software & consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.

Staffworx operates a referral scheme offering £500 or a new iPad for each successfully referred candidate; if you know someone suitable, please forward their details for consideration.
