The Team
The Events team owns Algolia’s customer-facing Events platform, the entry point for sending user interaction data into our system. These events drive improvements in analytics, personalization, and search relevance for thousands of customers. We are continuing to expand and improve this system, which means you’ll play a critical role in getting up to speed on a mature product, helping a newly formed team grow in confidence, and shaping the future of this high-volume, real-time data pipeline.
The role
As a Senior Data Engineer, you’ll help scale and evolve the backbone of Algolia’s Events platform. This means:
- Designing and maintaining reliable pipelines for ingesting and processing both real-time and batch data from diverse external sources (including Segment, Google Analytics, and direct customer integrations).
- Owning and optimizing systems that run at massive scale, ensuring low-latency event delivery and high reliability.
- Quickly getting up to speed with an established production system, and helping your teammates do the same.
- Partnering with backend, frontend, and product teams to align technical decisions with customer-facing needs.
- Contributing to architectural improvements that make our event ingestion platform more robust, efficient, and easy to extend.
- Sharing knowledge across the team and mentoring new engineers to help them grow.
You might be a fit if you have:
Must-haves
- Solid experience with data pipelines and event-driven architectures at scale.
- Proficiency in Go or another backend language, with the ability to quickly adapt to new codebases.
- Strong knowledge of distributed systems, APIs, and messaging platforms like Pub/Sub.
- Hands-on experience with BigQuery or similar data warehouses for analytics.
- A track record of collaborating with cross-functional teams and contributing to production-critical systems.
Nice-to-haves
- Familiarity with GCP and Kubernetes in production environments.
- Exposure to web application stacks (React, Rails) and how they interact with backend data pipelines.
- Experience integrating with customer-facing APIs or analytics connectors.
- Background in onboarding to inherited systems and driving re-architecture where needed.
Team’s current stack
Go backend on GCP and Kubernetes, pipelines built with BigQuery and Pub/Sub, integrating with a React + Rails frontend.
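To give a concrete feel for how these pieces fit together, here is a minimal, illustrative sketch of a Go service that consumes events from a Pub/Sub subscription and streams them into a BigQuery table. This is not the team’s actual code: the project ID, subscription, dataset, table, and Event schema are placeholder assumptions.

```go
package main

import (
	"context"
	"encoding/json"
	"log"

	"cloud.google.com/go/bigquery"
	"cloud.google.com/go/pubsub"
)

// Event is a hypothetical minimal event payload; real schemas will differ.
type Event struct {
	UserToken string `bigquery:"user_token" json:"userToken"`
	EventName string `bigquery:"event_name" json:"eventName"`
	Timestamp int64  `bigquery:"timestamp" json:"timestamp"`
}

func main() {
	ctx := context.Background()

	// Project, subscription, dataset, and table names are placeholders.
	psClient, err := pubsub.NewClient(ctx, "example-project")
	if err != nil {
		log.Fatalf("pubsub client: %v", err)
	}
	defer psClient.Close()

	bqClient, err := bigquery.NewClient(ctx, "example-project")
	if err != nil {
		log.Fatalf("bigquery client: %v", err)
	}
	defer bqClient.Close()

	inserter := bqClient.Dataset("events").Table("raw_events").Inserter()
	sub := psClient.Subscription("events-sub")

	// Pull messages and stream each decoded event into BigQuery.
	err = sub.Receive(ctx, func(ctx context.Context, msg *pubsub.Message) {
		var ev Event
		if err := json.Unmarshal(msg.Data, &ev); err != nil {
			log.Printf("dropping malformed event: %v", err)
			msg.Ack() // don't redeliver payloads that can never parse
			return
		}
		if err := inserter.Put(ctx, ev); err != nil {
			log.Printf("bigquery insert failed, nacking for retry: %v", err)
			msg.Nack()
			return
		}
		msg.Ack()
	})
	if err != nil {
		log.Fatalf("receive: %v", err)
	}
}
```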