Qube Research & Technologies

Data Engineer - Tooling

London

Job Description

Join our Data team to architect and build the tools used to create new data sources and monitor key production systems. The role offers hands-on exposure to the core datasets used by QRT, and you will partner closely with other technology teams, quants, and traders to build systems that guarantee the integrity of our data and prevent trading losses.

 

Your future role within QRT
We’re looking for a hands-on Data Engineer to join QRT. In this role you’ll build out the tooling that manages and monitors production datasets, keeping them flowing, observable, and reliable. You’ll work closely with other engineering teams and business stakeholders to deliver a mission-critical platform.

 

  • Data Quality Tools
    Your key deliverable is to develop and maintain a platform that ensures the quality of our data pipelines and alerts stakeholders to critical production issues.
  • Library Maintenance
    Maintain and improve shared libraries that are used by developers across the data function and facilitate the rapid onboarding of new datasets.
  • Cross-team Integrations
    Design, build, and document our integrations with other teams' processes, alerting them to relevant issues with our data pipelines.
  • Observability & Monitoring
    Build out the observability stack (logs, metrics, tracing) to monitor our data assurance system, along with other tools for managing the data-pipeline lifecycle.
  • Platform Support
    Support users in best-practice usage of our tooling, migrate code to the latest standards, and investigate, debug, and resolve issues with our platform.

 

Your present skillset
You thrive at the intersection of software engineering, data operations, and DevOps. You’re comfortable both architecting solutions and writing the code to deploy and maintain them, and you relish collaborating with diverse stakeholders to turn requirements into production-grade systems.

 

  • 5+ years of hands-on experience in data engineering or data-adjacent engineering roles
  • Strong software engineering fundamentals and proficiency in Python
  • Experience with workflow orchestration frameworks (e.g. Argo, Celery, Airflow)
  • Experience designing and implementing RESTful APIs and internal service integrations
  • Experience with observability tools (e.g. Prometheus, Grafana, CloudWatch)
  • Familiarity with Kubernetes, container orchestration, and cloud infrastructure (AWS preferred)
  • Exposure to front-end development, particularly for custom monitoring dashboards, is a plus
  • Knowledge of data-quality frameworks (e.g. Soda, Great Expectations) is a plus
  • Knowledge of ETL/ELT pipelines and metadata management is a plus
  • Excellent communication skills and a collaborative mindset to work effectively with other engineers, quants, and traders
  • Self-starter mentality with the ability to work autonomously within a globally distributed team

 

QRT is an equal opportunity employer. We welcome diversity as essential to our success. QRT empowers employees to work openly and respectfully to achieve collective success. In addition to supporting professional achievement, we offer initiatives and programs to help employees achieve a healthy work-life balance.
