
Technical Lead

Quantexa - Melbourne, VIC

Source: uWorkin


Founded in 2016 with only a handful of individuals, Quantexa was built on the belief that, through a greater understanding of context, better decisions can be made. Five years and 300+ employees later, we still believe that today. We connect the dots within our customers' data using dynamic entity resolution and advanced network analytics to create context, empowering businesses to see the bigger picture and drive real value from their data.

Due to our continued success and high demand from our customers, we are looking for a Technical Lead with a proven track record in technical delivery to join the Quantexa family.

Typical responsibilities include:

  • Manage, transform and cleanse high-volume data.
  • Write defensive, fault-tolerant and efficient code for data processing.
  • Automate data processing to enable ongoing alerts on high-risk activity.
  • Present project results to clients, both face to face and virtually.
  • Work closely with data scientists to ensure efficient and effective delivery of solutions.
  • Use leading open-source big-data tools such as Spark, Hadoop, Scala and Elasticsearch; you should be comfortable working with high-profile clients on their sites.
  • Work with our expert software development team to produce reusable applications.
  • Use emerging and open-source technologies such as Spark, Hadoop and Scala.
  • Collaborate on scalability issues involving access to massive amounts of data and information.
  • Take on ad-hoc tasks as required for the running of a small yet rapidly expanding business.
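As a rough illustration of the "defensive, fault-tolerant" data-processing style the responsibilities above describe, here is a minimal sketch in Python (one of the languages the role names). The record format, field names and risk threshold are purely hypothetical, not taken from the posting: malformed rows are skipped rather than allowed to crash the pipeline, and well-formed rows above a threshold are surfaced as alerts.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional

@dataclass
class Transaction:
    account: str
    amount: float

def parse_record(line: str) -> Optional[Transaction]:
    """Defensively parse one 'account,amount' record; return None on bad input."""
    parts = line.strip().split(",")
    if len(parts) != 2:
        return None
    account, raw_amount = parts
    if not account:
        return None
    try:
        amount = float(raw_amount)
    except ValueError:
        return None
    return Transaction(account=account, amount=amount)

def high_risk(records: Iterable[str], threshold: float = 10_000.0) -> Iterator[Transaction]:
    """Yield transactions at or above the threshold, skipping malformed rows."""
    for line in records:
        tx = parse_record(line)
        if tx is not None and tx.amount >= threshold:
            yield tx

# Hypothetical input: two well-formed rows, three malformed or sub-threshold ones.
raw = ["acc1,25000.50", "garbage-row", "acc2,99.95", ",12000", "acc3,not-a-number"]
alerts = list(high_risk(raw))
# → only the acc1 transaction is flagged
```

In a real deployment this parsing-and-filtering logic would run inside a distributed framework such as Spark rather than over an in-memory list; the defensive pattern (validate every field, never let one bad record stop the batch) is the point of the sketch.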


You should have the following:

  • Proven big-data experience, from either an implementation or a data-science perspective.
  • Excellent technical skills, including expert knowledge of at least one big-data technology such as Spark, Hadoop or Elasticsearch.
  • Experience building data-processing pipelines for use in production “hands-off” batch systems, including traditional ETL pipelines, analytics pipelines, or preferably both.
  • Strong coding experience in the likes of Scala, Java or Python.
  • Strong client-facing, communication and presentation skills.
  • Strong academic qualifications: a high-quality degree (2:1 or above) or equivalent.
  • Enthusiasm to learn and develop emerging technologies and techniques.
  • Strong technical communication skills, with demonstrable experience of working in rapidly changing client environments.
  • Strong analytical and problem-solving skills, and the ability to debug and solve technical challenges with sometimes unfamiliar technologies.

Ideal candidates will also:

  • Have worked on a variety of complex data-oriented projects for financial-services clients.
  • Have a good understanding of computer science, preferably from a software-engineering background or another scientific degree incorporating IT modules (e.g. Maths/Physics).
  • Have exposure to Agile, especially Scrum.
  • Be open to short- to medium-term international travel.
  • Have experience working with a variety of modern development tooling (e.g. Git, Gradle, Jenkins, Nexus), as well as technologies supporting automation and DevOps (e.g. Ansible, Chef, Puppet, Docker and a little good old Bash scripting).
  • Have an excellent appreciation of what makes a high-quality, operationally stable system, and of how to streamline all areas of development, release and operations to achieve it.


What we offer:

  • Competitive salary
  • Company bonus
  • Additional benefits