Squarepoint Capital
Software Developer - Data Pipelines (Python)
🌎 London, Montreal, Madrid, Bangalore

Job Description

Squarepoint is a global investment management firm that utilizes a diversified portfolio of systematic and quantitative strategies across financial markets, seeking to achieve high-quality, uncorrelated returns for our clients. We have deep expertise in trading, technology and operations, and we attribute our success to rigorous scientific research. As a technology- and data-driven firm, we design and build our own cutting-edge systems, from high-performance trading platforms to large-scale data analysis and compute farms. With offices around the globe, we emphasize true global collaboration by aligning our investment, technology and operations teams functionally around the world.

Role: Software Developer - Data Pipelines

Team: Alpha Data

Department: Development

Location: London, Montreal, Madrid, Bangalore

We are seeking an experienced Python developer to join our Alpha Data team, which is responsible for delivering vast quantities of data to users worldwide. You will be a cornerstone of a growing Data team, becoming a technical subject matter expert and developing strong working relationships with quant researchers, traders, and colleagues across our Technology organisation.

Alpha Data teams deploy valuable data to the rest of the Squarepoint business at speed. Ingestion pipelines and data transformation jobs are resilient and highly maintainable, while data models are carefully designed in close collaboration with our researchers for efficient query construction and alpha generation.
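
By way of illustration only, here is a minimal sketch of the kind of ingestion job this describes: fetch a CSV from object storage, apply a light transformation, and load it into Postgres. The bucket name, table name, column names and connection string below are hypothetical placeholders rather than anything from the posting.

import csv
import io

import boto3
import psycopg2


def fetch_csv(bucket: str, key: str) -> list[dict]:
    """Download a CSV object from S3 and parse it into rows."""
    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()
    return list(csv.DictReader(io.StringIO(body.decode("utf-8"))))


def transform(rows: list[dict]) -> list[tuple]:
    """Example transformation: keep rows with a positive price, typed for loading."""
    return [
        (r["symbol"], r["trade_date"], float(r["price"]))
        for r in rows
        if float(r["price"]) > 0
    ]


def load(records: list[tuple], dsn: str) -> None:
    """Insert the transformed records into a (hypothetical) Postgres table."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO vendor_prices (symbol, trade_date, price) VALUES (%s, %s, %s)",
            records,
        )


if __name__ == "__main__":
    rows = fetch_csv("example-vendor-bucket", "daily/prices.csv")
    load(transform(rows), "dbname=marketdata user=ingest")

A production pipeline would add the resilience the posting emphasises (scheduling, retries, monitoring, idempotent loads); this sketch only shows the basic fetch-transform-load shape.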

We achieve economies of scale by building new frameworks, libraries, and services that improve the team's quality of life, throughput, and code quality. Teamwork and collaboration are encouraged, excellence is rewarded, and diversity of thought and creative solutions are valued. Our emphasis is on a culture of learning, development, and growth.

Position Overview:

  • Take shared ownership of our ever-growing estate of data pipelines,
  • Propose and contribute to new abstractions and improvements, making a real, positive impact across our team globally,
  • Design, implement, test, optimize and troubleshoot our data pipelines, frameworks, and services,
  • Collaborate with researchers to onboard new datasets,
  • Regularly take the lead on production support operations (during normal working hours only).

Must haves

  • 3+ years of experience coding to a high standard in Python,
  • Bachelor's degree in a STEM subject,
  • Experience with and knowledge of SQL and one or more common RDBMSs (we mostly use Postgres),
  • Practical knowledge of common protocols and tools for transferring data (e.g. FTP, SFTP, HTTP APIs, AWS S3),
  • Excellent communication skills.

Nice to haves

  • Experience with big data frameworks, databases, distributed systems, or cloud development.
  • Experience with any of these: C++, kdb+/q, Rust.