
Big Data Engineer (Hybrid Remote)

Xcede · EMEA (Remote) · Switzerland · Hybrid

About the job

Big Data Engineer - Fully Remote in Africa / EU

As a Big Data Engineer, you will have a strong understanding of big data technologies and an exceptional ability to code. You will provide technical leadership, work closely with the wider team, and focus on delivering high-quality, clean code in line with project goals and deliverables. You will also work closely with other teams and stakeholders to deliver rapid prototypes and to productionise code, ensuring high accessibility standards are upheld.

You must be familiar with modern frameworks and languages, as well as working practices such as Clean Code, TDD, BDD, continuous integration, continuous delivery, and DevOps.

You will, among other tasks:

  • Define, design, and develop services and solutions around large-scale data ingestion, storage, and management, such as RDBMS, NoSQL databases, log files, and events.
  • Define, design, and run robust data pipelines/batch jobs in a production environment.
  • Architect highly scalable, highly concurrent, and low-latency systems.

Essential (minimum 5 years' experience):

  • Following Clean Code/SOLID principles.
  • Adhering to and using TDD/BDD.
  • Highly Proficient in either Functional Java or Scala
  • In-depth knowledge of the Hadoop technology ecosystem: HDFS, Spark, Impala, HBase, Kafka, Flume, Sqoop, Oozie, Avro, Parquet.
  • Other languages (Python, JavaScript, Clojure, Kotlin, etc.).
  • Other NoSQL databases, such as Neo4j, Cassandra, Redis, etc.
  • Mocking frameworks (any of Mockito, ScalaTest, Spock, Jasmine, Mocha).
  • Build tools (one of SBT, Gradle, Maven).
  • Knowledge of AWS Big Data/Analytics services - S3, EMR, Glue, Redshift, QuickSight, Kinesis.

Desirable Skills

  • Experience of big data environments (including advising the Analytics team on best practices and new technologies).
  • Experience of handling large data sets and scaling their handling and storage.
  • Experience of storing data in systems such as Hadoop HDFS, S3, and Kafka.
  • Experience of designing, setting up and running big data tech stacks such as Hadoop, Spark
  • Knowledge of relational and non-relational database systems
  • Understanding of continuous integration and delivery.

This vacancy is open to those in the UK, Africa, and the EU. You must have a minimum of 5 years' experience with the Essential technologies listed above.