Algolia was built to help users deliver an intuitive search-as-you-type experience on their websites and mobile apps. We provide a search API used by thousands of customers in more than 100 countries. Billions of search queries are answered every month thanks to the code we push into production every day.
The Team
The Events team owns Algolia’s customer-facing Events platform, the entry point for sending user interaction data into our system. These events drive improvements in analytics, personalization, and search relevance for thousands of customers. We are continuing to expand and improve this system, which means you’ll play a critical role in mastering a mature product, helping a newly built team grow in confidence, and shaping the future of this high-volume, real-time data pipeline.
The Role
As a Senior Data Engineer, you’ll help scale and evolve the backbone of Algolia’s Events platform. This means:
- Designing and maintaining reliable pipelines for ingesting and processing both real-time and batch data from diverse external sources (including Segment, Google Analytics, and direct customer integrations).
- Owning and optimizing systems that run at massive scale, ensuring low-latency event delivery and high reliability.
- Quickly getting up to speed with an established production system, and helping your teammates do the same.
- Partnering with backend, frontend, and product teams to align technical decisions with customer-facing needs.
- Contributing to architectural improvements that make our event ingestion platform more robust, efficient, and easy to extend.
- Sharing knowledge across the team and mentoring new engineers to help them grow.
You might be a fit if you have:
Must-haves
- Solid experience with data pipelines and event-driven architectures at scale.
- Proficiency in Go or another backend language, with the ability to quickly adapt to new codebases.
- Strong knowledge of distributed systems, APIs, and messaging platforms such as Google Cloud Pub/Sub.
- Hands-on experience with BigQuery or similar data warehouses for analytics.
- A track record of collaborating with cross-functional teams and contributing to production-critical systems.
Nice-to-haves
- Familiarity with GCP and Kubernetes in production environments.
- Exposure to frontend systems (React, Rails) and how they interact with backend data pipelines.
- Experience integrating with customer-facing APIs or analytics connectors.
- Background in onboarding to inherited systems and driving re-architecture where needed.
We’re looking for someone who can live our values:
- GRIT – Problem-solving ability and perseverance in an ever-changing, fast-growing environment.
- TRUST – Willingness to trust our co-workers and to take ownership.
- CANDOR – Ability to give and receive constructive feedback.
- CARE – Genuine care for other team members, our clients, and the decisions we make in the company.
- HUMILITY – Aptitude for learning from others and putting ego aside.
Team’s current stack
Go backend on GCP and Kubernetes, pipelines built with BigQuery and Pub/Sub, integrating with a React + Rails frontend.