
Big Data Engineer

Job Title: Big Data Engineer
Contract Type: Contract
Location: Basel, Switzerland
Salary: CHF 700 - CHF 750 per day
Reference: HQ00070951_1523018471
Contact Name: William Wakefield
Contact Email: w.wakefield@lawrenceharvey.com
Job Published: April 06, 2018 13:41

Job Description

Big Data Engineer - Basel, Switzerland - 3-month rolling contract

I am seeking an experienced Big Data Engineer to join a financial client in Basel, Switzerland. You need to be an experienced implementer of production-grade, business-rule- and metadata-driven Big Data solutions using Apache Hadoop and Spark.

The successful Big Data Engineer should have demonstrable experience implementing similar solutions, be capable of working independently, and be comfortable working within an agile project team. The ability to explain complex data problems to business users, and vice versa, is crucial.

Responsibilities
*Interact with the architecture team to refine requirements
*Work within the project team to implement solutions on the existing Hadoop platform
*Work with Platform Engineers to ensure dependent components are provisioned
*Provide input in defining the design / architecture of the envisaged solution
*Implement business rules for streamlining data feed(s)
*Implement a rule-based framework to abstract complex technical implementation into reusable, generic components

Required
*Proven experience implementing solutions to process large amounts of data in a Hadoop ecosystem, utilising Apache Spark
*Experience implementing generic components for the ingestion, validation, and structuring of disparate data sources into a distributed data platform
*Experience implementing complex event processing patterns
*Strong programming and scripting skills; e.g. Java, Scala, Python, R
*Strong understanding of Hadoop technologies; e.g. MapReduce, HDFS, HBase, Hive, Sqoop, Flume, Kafka…
*Software Engineering background, with experience using common SDLC tools and practices for agile, ideally including Continuous Delivery
*Experience in BRMS driven solutions
*Experience with master data management (MDM) including ontology curation and data cleansing
*Experience with end-user notebook tools; e.g. Jupyter, Zeppelin

Lawrence Harvey is acting as an Employment Business in regard to this position.
Visit our website www.lawrenceharvey.com and follow us on Twitter @lawharveyjobs for all live vacancies.