Datatonic

Data Engineer (Python)

Vertical: Data
Department: Data Developer
Does real-time processing of millions of rows per minute, working with petabytes of data and running large-scale machine learning algorithms on thousands of CPUs sound like something you want to do? Do you enjoy a dynamic working environment with different challenges on a regular basis? Then become Datatonic’s next Data Engineer!

You will work on the data engineering and architecture side of our consulting projects, building robust pipelines and wrangling data so that it is easy to visualise and ready to feed into our machine learning models. In addition, you will help us build out the data engineering side of our next-generation machine learning products.

Furthermore, you will:

  • Work with the most innovative and scalable data processing technologies
  • Build innovative state-of-the-art solutions with our customers
  • Work closely with our tech partners: Google Cloud Platform, Tableau, Looker
  • Work in an agile, dynamic environment as part of a small team alongside our data scientists, machine learning experts, data analysts and data engineers

REQUIREMENTS

  • Strong programming and architectural experience, ideally in Python and SQL
  • 2 years of experience building (big) data solutions
  • Working experience with Google Cloud Platform (GCP) or Amazon Web Services (AWS)
  • Extremely passionate about data and analytics
  • Experience with ETL tools, Hadoop-based technologies (e.g. Spark) and/or data pipelines (e.g. Beam, Flink)
  • Experience building scalable, high-performance code
  • Experience in producing tested, resilient and well documented applications
  • The ability to take end-to-end ownership and find creative solutions
  • Experience in architecting, building, maintaining and troubleshooting cloud infrastructure
  • Excellent interpersonal skills, verbal and written communication skills; a team player and keen learner who loves building great things together
  • BSc or MSc degree in Computer Science or a related technical field

Bonus Points:

  • Love for the command line and, ideally, an affinity for Linux scripting
  • Experience building scalable REST APIs using Python or similar technologies
  • Experience with Agile methodologies such as Scrum
  • Basic knowledge of and ideally some experience with data science topics like machine learning, data mining, statistics, and visualisation
  • Contributions to open source projects

BENEFITS

  • 25 days holiday plus bank holidays
  • Pension scheme
  • Situated in the innovation hub of Canary Wharf
  • Laptop of your choice
  • Monthly social events and team offsites
  • Generous desk budget
  • Free fruit, cookies, tea/coffee throughout the week
  • Regular networking events, mentoring events and conferences
  • Exposure to experts from a number of industries
  • Freedom to explore the latest tools and technologies
  • Knowledge-sharing activities