Big Data Engineer ( Hadoop / Spark / Kafka / MongoDB )

Big Data & Data Science | £550 - £650 per day | England, Greater London, London

6 Month Contract

£550 – £650 per day (dependent on experience)

London

A Big Data Engineer ( Hadoop / Spark / Kafka / MongoDB ) is required to join a leading technology company within the financial services industry to help drive the growth of its big data technologies.

The company combines the industry's most advanced technologies with big data to predict and anticipate future movements in financial markets.

The Big Data Engineer ( Hadoop / Spark / Kafka / MongoDB ) will work alongside the DevOps team, data architects and data scientists to evaluate and implement new big data frameworks, as well as recommend ways to improve existing tools.

Key responsibilities for the Big Data Engineer ( Hadoop / Spark / Kafka / MongoDB ) include:

  • Designing, developing and testing big data solutions using the Hadoop ecosystem.
  • Building data pipelines on top of Kafka.
  • Identifying inefficiencies in data systems and suggesting relevant solutions as an accomplished team player.
  • Adopting big data technologies to build high-quality, complex data systems with a focus on scalability and security.

Essential Skills for the Big Data Engineer ( Hadoop / Spark / Kafka / MongoDB ) include:

  • Hands-on experience with Kafka / Kafka streaming.
  • Hands-on experience with programming languages such as C#, C++, Scala & Python.
  • NoSQL experience: HBase / Cassandra / MongoDB.
  • Experience with Storm, Flume & Splunk.
  • 3+ years of experience in Big Data technologies.
  • Capable of explaining technical information to non-technical senior stakeholders.
  • A good understanding and knowledge of basic network protocols (IP, UDP, TCP).

Desirable Skills for the Big Data Engineer ( Hadoop / Spark / Kafka / MongoDB ) include:

  • Strong database experience within a Big Data / Hadoop ecosystem.
  • Existing knowledge and experience of Docker and Kubernetes.
  • Cloud technologies experience in GCP, AWS or Azure.
  • Consultancy experience is preferable.

Must have:

  • You must have the legal right to work in the UK.

If you’d like to be considered for the role, please apply within.

Apply Now