
Big Data Engineer/Architect

Job Title: Big Data Engineer/Architect
Contract Type: Contract
Location: Basel, Switzerland
Industry: ERP, CRM, HCM
Salary: CHF 900 - CHF 950 per day
Start Date: May 2018
Reference: HQ00070987_1523636306
Contact Name: Tom Francis
Contact Email: t.francis@lawrenceharvey.com
Job Published: April 13, 2018 17:18

Job Description

My well-recognised financial client is seeking an experienced Big Data Engineer/Architect to lead the implementation of solutions on a greenfield POC project!

You will be an experienced implementer of production-grade, business-rule- and metadata-driven Big Data solutions using Apache Hadoop and Spark. The successful candidate should have demonstrable experience implementing similar solutions, be capable of working independently, and be comfortable working within an agile project team. The ability to explain complex data problems to business users, and vice versa, is crucial.

The Job:

Interact with the architecture team to refine requirements
Work within the project team to implement solutions on the existing Hadoop platform
Work with Platform Engineers to ensure dependent components are provisioned
Provide input in defining the design/architecture of the envisaged solution
Implement business rules for streamlining data feed(s)
Implement a rule-based framework to abstract complex technical implementations into reusable, generic components
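To make the last point concrete, here is a minimal, purely illustrative sketch of what a rule-based framework with reusable, generic components might look like. All names are hypothetical, and a production system of the kind described would typically apply such rules to Spark DataFrames rather than plain Python dicts:

```python
# Illustrative sketch: business rules defined as data (metadata-driven),
# applied by one generic, reusable validation component.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A reusable business rule: a name plus a predicate over a record."""
    name: str
    predicate: Callable[[dict], bool]

def apply_rules(record: dict, rules: list[Rule]) -> list[str]:
    """Generic component: return the names of all rules the record violates."""
    return [r.name for r in rules if not r.predicate(record)]

# Example rules supplied as configuration rather than hard-coded logic
rules = [
    Rule("amount_positive", lambda rec: rec.get("amount", 0) > 0),
    Rule("currency_present", lambda rec: "currency" in rec),
]

print(apply_rules({"amount": -5}, rules))
# → ['amount_positive', 'currency_present']
```

The point of the pattern is that new data feeds only require new rule definitions, not changes to the framework itself.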

We are looking for candidates with:

Proven experience implementing solutions to process large amounts of data in a Hadoop ecosystem, utilising Apache Spark
Experience implementing generic components for the ingestion, validation, and structuring of disparate data sources into a distributed data platform
Experience implementing complex event processing patterns
Strong programming and scripting skills; eg Java, Scala, Python, or R
Strong understanding of Hadoop technologies; eg MapReduce, HDFS, HBase, Hive, Sqoop, Flume, Kafka
Software Engineering background, with experience using common SDLC tools and practices for agile, ideally including Continuous Delivery
Experience with BRMS-driven solutions
Experience with master data management (MDM) including ontology curation and data cleansing
Experience with end-user notebook tools; eg Jupyter, Zeppelin
Ability to operate in a global team and coordinate activities with remote resources
Capable of documenting solutions so they can be re-used
Skills in exploratory statistics to relate to data analysts and statisticians

Nice to have:

Economics or Macroeconomic domain knowledge
Experience in econometrics
Experience with data visualisation tools, eg Tableau

Lawrence Harvey is a preferred supplier for this client! Apply now with an up-to-date CV to be considered for the interview slots being arranged.

Lawrence Harvey is acting as an Employment Business in relation to this position.
Visit our website www.lawrenceharvey.com and follow us on Twitter for all live vacancies @lawharveyjobs