Data Scientist

Website Mojio

Mojio: Connected Car Cloud Platform

Want to be at the forefront of a leading, highly scalable connected car platform?

The team at Mojio works together to bring real-time data from your vehicle directly to the palm of your hand… and our R&D teams are in the front seat.

Mojio is backed by Deutsche Telekom, Amazon, Relay Ventures, 500 Startups, BDC Capital, and the BC Tech Fund. We are fast-growing, fast-paced, and technically driven while remaining flexible and collaborative. After all, we’re disrupting one of the largest and most established industries in the world!

We launched with T-Mobile in the US in Nov ’16 and T-Mobile in the Czech Republic in Dec ’16, which has led to unprecedented growth and a need to expand our team. We are looking for star server/platform developers to help get our platform ready for its next phase of expansion.

Who you are

As a Data Scientist at Mojio, you will be a part of new product features from conception through deployment. You will have the opportunity to apply your knowledge of statistics and your analytical skills to mine data at scale and develop large-scale Machine Learning (ML) models that reveal customer value in data. You will support feature prototyping and use industry best practices to write production-grade code. You will build data pipelines, implement ML-based analytical algorithms, and work closely with our software development team to set up the back-end systems and interfaces that will deliver next-generation analytics.

Specific job duties may include:

  • Writing or modifying big data pipelines to process and mine historical time-series data
  • Providing data insight from massive amounts of data using data cleaning, data visualization, and statistical analysis tools and techniques
  • Prototyping and validating advanced ML and Deep Learning (DL) models and algorithms that transform big data into actionable information
  • Setting up and maintaining databases supporting analytics research and feature prototyping
  • Writing production code to deliver analytics feature content as an Internet-of-Things (IoT) solution

Required Skills and Experience:

  • An advanced degree (M.S. or Ph.D.) in Computer Science (CS), Electrical Engineering (EE), Operations Research (OR) and optimization, physics, applied mathematics, or a comparable analytical field from an accredited institution, OR a Bachelor’s degree plus 5 years of relevant industry experience.
  • 5+ years’ experience with, or demonstrated fluency in, Python and at least one other programming language (Scala, Java, C/C++/C# a plus)
  • Expertise in data mining, machine learning, deep learning, statistical modeling, and data visualization techniques using data-oriented tools and languages such as Python, R, and MATLAB
  • 2+ years’ experience writing queries for SQL and NoSQL databases
  • Experience dealing with large amounts of unstructured data (experience with GIS data is a bonus)
  • Experience setting up and using large-scale distributed data-processing frameworks such as Apache Spark and Hadoop MapReduce
  • Experience working with enterprise-grade cloud computing platforms such as Microsoft Azure and Amazon Web Services (AWS)
  • Demonstrated ability to develop high-quality code adhering to industry best practices (e.g., code review, unit tests, revision control)
  • Experience designing experiments and collecting data for the purpose of deriving data analytics insights and solutions
  • Understanding of error propagation and the limitations of data subject to measurement uncertainties
  • Work/project history reflective of a self-motivated professional who excels when given open-ended problems and broadly defined goals, with an innate desire to discover the patterns and relationships in data that can be leveraged to provide business value

All qualified applicants will receive consideration for employment without regard to race, sex, color, religion, national origin, protected veteran status, gender identity, sexual orientation, or disability.

To apply for this job please visit boards.greenhouse.io.