Data Architecture Lead
We are hiring on behalf of a multi-strategy, multi-manager hedge fund that is looking for a Data Architecture Lead to join its team. The firm trades globally across multiple asset classes and investment approaches. The primary focus of this role will be the buildout and operation of a next-generation security master system, in addition to designing and implementing robust data architectures. The successful candidate will lead the security master development team, overseeing the quality, performance, and scalability of the team's deliverables.
Key Responsibilities –
- Data Architecture and Design: Lead the design and implementation of the enterprise data architecture, ensuring alignment with the overall business strategy and objectives.
- Develop and maintain the data architecture for the security master system (secmaster), including data models, metadata, and data flow diagrams.
- Team Leadership: Lead and manage the secmaster team of Python/SQL engineers, providing guidance, mentorship, and support to team members.
- Establish and maintain data governance practices to ensure data quality, consistency, and security. Define data management policies and procedures, including data lineage, data cataloguing, and data retention.
- Monitor and optimize data processes to ensure efficient data integration, transformation, and storage.
- Technology Leadership: Evaluate and recommend data management tools and technologies to enhance the data architecture.
Qualifications –
- Minimum of 7-10 years of experience in data architecture, data modelling, and data management, preferably within the financial services industry.
- Deep experience with security master systems (secmaster) in a hedge fund or investment management context. Deep knowledge of financial instruments in a variety of asset classes: equities, options, futures, credit products, etc.
- Technical Skills: Strong experience with the SQL Server platform and T-SQL querying. Familiarity with cloud data platforms (e.g., AWS, Azure) and big data technologies (e.g., Kafka, Spark, Databricks, Snowflake).
- Expert-level knowledge of building ETL processes in Python. Expert-level knowledge of performance tuning and optimization in a Python/SQL Server environment.
This role offers highly competitive compensation as well as a flexible hybrid working model.