PySpark AWS Data Engineer, excellent rates, home-based
Experience in the following:
- PySpark data engineering experience
- AWS services, including AWS EMR clusters
- Python, Spark, SQL & NoSQL experience
- Big data and wider data engineering tooling
- Data ingestion and data pipelines, with Apache Airflow for workflow automation
- DevOps culture: Jenkins, Kubernetes (k8s), Terraform, Helm, Docker
- TDD/BDD methodologies, Git
3-6 month initial contract, with likely long-term extensions
INSIDE IR35 – this assignment falls within the scope of IR35 legislation, and an appropriate umbrella company should be used for the duration of the assignment.
#dataengineer #sparkstreaming #apachespark #pyspark #apacheairflow #contractjobs #workfromhome #remote
This advert was posted by Staffworx Limited – a UK-based recruitment consultancy supporting the global e-commerce, software & consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.
Staffworx operate a referral scheme of £500 or a new iPad for each successfully referred candidate; if you know someone suitable, please forward their details for consideration.