Why This Job is Featured on The SaaS Jobs
Within SaaS, product telemetry, customer interactions, and operational events quickly become core assets. This Data Platform Engineering specialist role stands out because it is anchored in building the lakehouse backbone—AWS plus Databricks—behind a large-scale service software portfolio, with emphasis on both streaming (Kafka/Spark) and governed enterprise data flow.
For a long-term SaaS career, the work maps closely to how modern subscription businesses run analytics and decisioning: reliable CDC pipelines, real-time processing, and repeatable platform patterns that multiple teams can use. Ownership of observability, CI/CD, and infrastructure-as-code also builds the platform engineering discipline that increasingly separates ad-hoc data engineering from durable, internal “data product” delivery across SaaS organizations.
This position suits an engineer who prefers systems thinking over dashboard building, and who enjoys tuning distributed workloads, establishing standards, and reducing operational risk through automation. It will fit someone comfortable partnering with product, security, and regulatory stakeholders, and interested in the practical trade-offs of governance, privacy, and performance in a multi-tenant SaaS data environment.
The section above is editorial commentary from The SaaS Jobs, provided to help SaaS professionals understand the role in a broader industry context.
Job Description
Company Description
Organizations everywhere struggle under the crushing costs and complexities of “solutions” that promise to simplify their lives. To create a better experience for their customers and employees. To help them grow. Software is a choice that can make or break a business. Create better or worse experiences. Propel or throttle growth. Business software has become a blocker instead of a way to get work done.
There’s another option. Freshworks. With a fresh vision for how the world works.
At Freshworks, we build uncomplicated service software that delivers exceptional customer and employee experiences. Our enterprise-grade solutions are powerful, yet easy to use, and quick to deliver results. Our people-first approach to AI eliminates friction, making employees more effective and organizations more productive. Over 72,000 companies, including Bridgestone, New Balance, Nucor, S&P Global, and Sony Music, trust Freshworks’ customer experience (CX) and employee experience (EX) software to fuel customer loyalty and service efficiency. And, over 4,500 Freshworks employees make this possible, all around the world.
Fresh vision. Real impact. Come build it with us.
Job Description
The Specialist – Data Platform Engineering is responsible for designing, building, and maintaining scalable data pipelines and systems that enable analytics, insights, and business intelligence across the organization. This role will own the enterprise Data Lake and Databricks environment, ensuring secure, high-performance data flow across MySQL, Kafka, and Spark. The ideal candidate combines strong technical expertise in distributed data systems with hands-on experience in modern data engineering practices.
Roles & Responsibilities
Data Infrastructure Ownership
- Own and manage the enterprise Data Lake infrastructure on AWS and Databricks, ensuring scalability, reliability, and governance.
- Design, develop, and optimize data ingestion and transformation pipelines from MySQL to Kafka (CDC pipelines) and from Kafka to Databricks using Spark Structured Streaming (see the sketch after this list).
- Build and maintain robust batch and real-time data pipelines to support high-volume, high-velocity data needs.
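A minimal sketch of the Kafka-to-Databricks leg described above, assuming a Debezium-style CDC topic and a Delta Lake bronze table; the broker address, topic name, payload schema, and S3 paths are illustrative placeholders rather than details from the posting:

```scala
// Minimal sketch: consume MySQL CDC events from Kafka and land them in a Delta table.
// Topic, schema, broker, and paths are assumptions for illustration only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object CdcIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("mysql-cdc-to-lakehouse")
      .getOrCreate()

    // Assumed shape of the CDC payload arriving on the Kafka topic.
    val changeSchema = new StructType()
      .add("op", StringType)                       // c = create, u = update, d = delete
      .add("ts_ms", LongType)
      .add("after", MapType(StringType, StringType))

    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")   // placeholder broker
      .option("subscribe", "mysql.orders.cdc")             // placeholder topic
      .option("startingOffsets", "latest")
      .load()

    // Parse the Kafka value bytes into structured change-event columns.
    val changes = raw
      .select(from_json(col("value").cast("string"), changeSchema).as("change"))
      .select("change.*")

    // Append raw change events to a bronze Delta table; downstream jobs merge them.
    changes.writeStream
      .format("delta")
      .option("checkpointLocation", "s3://bucket/checkpoints/orders")  // placeholder path
      .outputMode("append")
      .start("s3://bucket/lakehouse/bronze/orders")                    // placeholder path
      .awaitTermination()
  }
}
```

In a sketch like this the bronze table simply accumulates raw change events; a downstream job would typically merge them into a curated table to produce the current state.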
Data Processing & Optimization
- Design and implement efficient MapReduce jobs to process and transform large-scale datasets across distributed systems (illustrated in the sketch after this list).
- Optimize MapReduce workflows for performance, scalability, and fault tolerance in big data environments.
- Develop metadata-driven frameworks for processing consistency, lineage, and traceability.
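The map/reduce bullets above describe a processing pattern rather than a specific codebase. A minimal sketch of that pattern, expressed on Spark's RDD API rather than Hadoop MapReduce, with placeholder input/output paths and an assumed CSV layout whose first column is a tenant ID:

```scala
// Minimal sketch of a map/reduce-style aggregation on Spark's RDD API.
// Bucket paths and field positions are illustrative assumptions.
import org.apache.spark.sql.SparkSession

object EventCountsByTenant {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("event-counts").getOrCreate()
    val sc = spark.sparkContext

    sc.textFile("s3://bucket/raw/events/*.csv")            // placeholder input
      .map(_.split(","))
      .filter(_.length >= 2)                               // drop malformed rows
      .map(fields => (fields(0), 1L))                      // map: emit (tenantId, 1)
      .reduceByKey(_ + _)                                  // reduce: sum counts per tenant
      .saveAsTextFile("s3://bucket/derived/event_counts")  // placeholder output

    spark.stop()
  }
}
```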
System Reliability & Automation
- Implement observability and monitoring systems using Prometheus, Grafana, or equivalent tools to ensure proactive detection and resolution of issues (see the sketch after this list).
- Apply best practices in code quality, CI/CD automation (Jenkins, GitHub Actions), and Infrastructure-as-Code (IaC) for consistent deployments.
- Continuously optimize system performance and reliability through monitoring, tuning, and fault-tolerant design.
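One way to make the observability bullet concrete: Spark Structured Streaming exposes per-batch progress through StreamingQueryListener, which a metrics agent (for example, a Prometheus JMX or push-gateway exporter) could pick up. A minimal sketch, with plain logging standing in for a real metrics sink:

```scala
// Minimal sketch of streaming-pipeline observability: a Spark listener that surfaces
// per-batch throughput. The println calls are an illustrative stand-in for pushing
// metrics to a registry that Prometheus scrapes.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.StreamingQueryListener
import org.apache.spark.sql.streaming.StreamingQueryListener._

class ThroughputListener extends StreamingQueryListener {
  override def onQueryStarted(event: QueryStartedEvent): Unit =
    println(s"query started: ${event.name}")

  override def onQueryProgress(event: QueryProgressEvent): Unit = {
    val p = event.progress
    // Emit throughput signals per micro-batch; in production these would feed alerts
    // on falling processing rates or growing batch durations.
    println(s"query=${p.name} batch=${p.batchId} " +
      s"inputRows/s=${p.inputRowsPerSecond} processedRows/s=${p.processedRowsPerSecond}")
  }

  override def onQueryTerminated(event: QueryTerminatedEvent): Unit =
    println(s"query terminated: ${event.id}, exception=${event.exception}")
}

object ListenerRegistration {
  // Attach the listener once per SparkSession so every streaming query reports progress.
  def register(spark: SparkSession): Unit =
    spark.streams.addListener(new ThroughputListener)
}
```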
Collaboration & Compliance
- Work cross-functionally with Product, Regulatory, and Security teams to ensure compliance, data privacy, and quality across the data lifecycle.
- Collaborate with multiple teams to design and deliver end-to-end lakehouse solutions that integrate diverse data sources.
- Stay current with emerging technologies in data engineering, streaming, and distributed systems, and contribute to continuous improvement initiatives.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 4-6 years of experience in data engineering, data platform management, or related domains.
- Strong programming expertise in one or more of the following: Scala, Spark, Java, or Python.
- Proven experience building event-driven or CDC-based pipelines using Kafka (Confluent or Apache).
- Hands-on experience with distributed data processing frameworks such as Apache Spark, Databricks, or Flink.
- Deep understanding of AWS cloud services (S3, Lambda, EMR, Glue, IAM, CloudWatch).
- Solid experience deploying and managing workloads on Kubernetes (EKS preferred).
- Experience designing and managing data lakehouse architectures and implementing data governance principles.
- Familiarity with CI/CD pipelines (Jenkins, GitHub Actions) and monitoring frameworks (Prometheus, Grafana, ELK stack).
- Excellent problem-solving, communication, and collaboration skills.
Skills Inventory
- Data Lakehouse Architecture (AWS, Databricks)
- Real-time Data Streaming (Kafka, Spark Structured Streaming)
- Distributed Data Processing (Spark, MapReduce, Flink)
- Programming (Scala, Python, Java)
- CI/CD Automation & Infrastructure as Code
- Kubernetes (EKS)
- Monitoring & Observability (Prometheus, Grafana)
- Data Governance & Metadata Management
- Cloud Infrastructure (AWS)
- Cross-Functional Collaboration & Problem Solving
Additional Information
At Freshworks, we have fostered an environment that enables everyone to find their true potential, purpose, and passion, welcoming colleagues of all backgrounds, genders, sexual orientations, religions, and ethnicities. We are committed to providing equal opportunity and believe that diversity in the workplace creates a more vibrant, richer environment that boosts the goals of our employees, communities, and business. Fresh vision. Real impact. Come build it with us.