Our client, a top systematic hedge fund, is seeking an experienced Python developer to join its Data Team (Alpha Data), which is responsible for delivering vast quantities of data to users worldwide and owns a large share of the firm's data pipelines.
This role involves becoming a technical subject-matter expert and developing strong relationships with quant researchers, traders, and colleagues across the Technology organisation. The Data teams deliver valuable data quickly, ensuring ingestion pipelines and data transformation jobs are resilient and maintainable, with data models designed in collaboration with researchers to support efficient query construction and alpha generation. The team builds frameworks, libraries, and services to improve quality of life, throughput, and code quality. It values teamwork, collaboration, excellence, diversity of thought, and creative solutions, emphasizing a culture of learning, development, and growth.
Responsibilities:
- Manage Data Pipelines: Take shared ownership of the expanding estate of data pipelines, ensuring they are robust, efficient, and scalable.
- Innovate and Improve: Propose and contribute to new abstractions and improvements, leveraging your Python expertise to make a significant positive impact across the team globally.
- Technical Development: Design, implement, test, optimize, and troubleshoot data pipelines, frameworks, and services, utilizing your knowledge of SQL and RDBMSs such as Postgres (see the illustrative sketch after this list).
- Collaborate with Researchers: Work closely with researchers to onboard new datasets, ensuring seamless integration and efficient data flow.
- Lead Production Support: Regularly take the lead on production support operations during normal working hours, applying your practical knowledge of data transfer protocols and tools such as FTP, SFTP, HTTP APIs, and AWS S3.
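To give a flavour of the day-to-day work, here is a minimal, purely illustrative sketch of the kind of ingestion step described above: pulling a raw file from AWS S3 and bulk-loading it into Postgres. The bucket, schema, table, and column names are hypothetical placeholders, not the client's actual stack.

```python
import csv
import io

import boto3     # AWS SDK for Python, for pulling raw files from S3
import psycopg2  # Postgres driver, for loading the validated rows

def ingest_daily_prices(bucket: str, key: str, dsn: str) -> int:
    """Download a CSV from S3 and bulk-load it into a Postgres staging table."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

    # Validate the payload before touching the database: a resilient pipeline
    # fails fast on malformed input rather than half-loading it.
    rows = list(csv.reader(io.StringIO(body)))
    if not rows or rows[0] != ["symbol", "trade_date", "close_px"]:
        raise ValueError(f"unexpected header in s3://{bucket}/{key}")

    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        # COPY is far faster than row-by-row INSERTs for bulk loads.
        cur.copy_expert(
            "COPY staging.daily_prices (symbol, trade_date, close_px) "
            "FROM STDIN WITH (FORMAT csv, HEADER true)",
            io.StringIO(body),
        )
    return len(rows) - 1  # number of data rows loaded
```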
Required Skillset:
- 4+ years of experience coding to a high standard in Python.
- Bachelor's degree in a STEM subject.
- Proficiency in SQL and familiarity with one or more common RDBMSs (primarily Postgres).
- Practical knowledge of commonly used protocols and tools for data transfer (e.g., FTP, SFTP, HTTP APIs, AWS S3).
- Excellent communication skills.
- Experience with big data frameworks, databases, distributed systems, or cloud development.
- Nice to have: experience with any of C++, kdb+/q, or Rust.
If you feel the above is a good match for your experience, apply today!