Job Description
The Opportunity
Re-architecting Our Core Data. We are embarking on a multi-year product modernization effort culminating in the 2027 season. We are seeking a skilled Backend/API Engineer with strong data expertise to contribute to the redesign and performance optimization of our core analytical data warehouse and related APIs.
About the Role
This is a hands-on role where you will design and build high-performance APIs and data infrastructure features, working directly with our engineering team to implement our future-state architecture. Your work will directly impact PFF's ability to deliver cutting-edge analytics to our global client base.
Responsibilities
- API Development: Design, develop, and maintain high-performance, low-latency APIs (e.g., REST, GraphQL) that serve analytical data to client-facing applications and internal tools.
- Data Modeling and Implementation: Collaborate with Senior Engineers to implement new data models by translating architectural designs into efficient PostgreSQL schemas, focusing on balancing normalization with analytical read performance.
- Database Optimization: Implement and maintain advanced PostgreSQL strategies, including robust indexing, partitioning, caching, and query optimization, to ensure fast data access for APIs.
- Data Pipeline Contribution: Design, implement, and maintain scalable data pipelines and ETL/ELT functions across distributed systems to ensure data reliability and freshness.
- System Reliability: Contribute to the overall stability and scalability of our backend services and data infrastructure through monitoring, testing, and continuous deployment practices.
Qualifications
- 5+ years of professional experience in a Backend Engineering, API Development, or Data Engineering role working with high-scale, read-heavy data systems.
Required Skills
- Strong proficiency in backend development in a modern programming language (e.g., Python, Go, Node.js, Ruby, Java) and deep experience building and consuming robust APIs.
- Expert-level proficiency in PostgreSQL: Strong practical experience with schema design, query optimization, and performance tuning (e.g., analyzing query plans, configuration tuning, index management).
- Solid understanding of data warehousing concepts: Familiarity with the principles of transforming highly normalized Online Transaction Processing (OLTP) data into efficient analytical/Online Analytical Processing (OLAP) models.
- Experience with data ingestion: Practical experience building and managing data ingestion pipelines and ETL/ELT processes in a modern cloud environment.
- Effective Communication: Clear, articulate, and collaborative communication style, comfortable working with engineering and product stakeholders.
Preferred Skills
- Familiarity with Elixir/Phoenix for API development.
- Prior professional experience working with data systems in sports analytics, betting, or digital media.
- Experience operating within a globally distributed team spanning the US and UK time zones.
Apply for this Position
Ready to join? Click the button below to submit your application.