Accelerate Your Career in Cybersecurity
As a leader in Automated Security Validation, we help businesses around the world safely emulate real-world attacks to uncover their vulnerabilities. At Pentera, you will be at the forefront of cybersecurity innovation, working on advanced tools that challenge organizations’ defenses and push the limits of security testing.
With over 400 team members and 1,100+ customers in more than 50 countries, Pentera is a growing company supported by top investors like Insight Partners, K1, and The Blackstone Group.
If you are looking to grow your skills, make a difference, and be part of an innovative team, Pentera is the place for you.
About the Role:
We’re looking for a highly skilled and motivated Senior Data Engineer to join the Resolve (formerly DevOcean) team at Pentera. In this role, you’ll design, build, and optimize the data infrastructure that powers our SaaS platform. You’ll play a key role in shaping a cost-efficient, scalable data architecture while building robust data pipelines that serve analytics, search, and reporting needs across the organization.
You’ll work closely with our backend, product, and analytics teams to ensure our data layer remains fast, reliable, and future-proof. This is an opportunity to influence the evolution of our data strategy and help scale a cybersecurity platform that processes millions of findings across complex customer environments.
Responsibilities:
- Design, implement, and maintain data pipelines to support ingestion, transformation, and analytics workloads.
- Collaborate with engineers to optimize MongoDB data models and identify opportunities for offloading workloads to analytical stores (ClickHouse, DuckDB, etc.).
- Build scalable ETL/ELT workflows to consolidate and enrich data from multiple sources.
- Develop data services and APIs that enable efficient querying and aggregation across large multi-tenant datasets.
- Partner with backend and product teams to define data retention, indexing, and partitioning strategies that reduce cost and improve performance.
- Ensure data quality, consistency, and observability through validation, monitoring, and automated testing.
- Contribute to architectural discussions and help define the long-term data platform vision.