Ready to become a Wriker?
We're looking for an enthusiastic Data Engineer to build reliable, scalable infrastructure and to help design and maintain clean data sources for business analysis.
In this role, you will be responsible for conceptualizing, architecting, and constructing data pipelines and services, while helping to enhance our data platform as a key driver of business decision-making across Wrike. Based on your experience and interests, you may work on data warehousing, integrating SaaS applications, developing tools, or implementing practical AI solutions. We encourage candidates with a wide range of interests and are happy to support you in expanding your expertise into new areas.
Job Scope and Accountabilities:
- Pipelines / ETL – architecting and building advanced streaming and batch data pipelines
- DWH – creating, developing, and overseeing robust data warehouse components
- Data Quality – designing and implementing frameworks for validation, monitoring, and alerting
- Data Governance – managing data catalogs and data lineage for development and operations
- Data Protection – researching, developing, and deploying solutions and techniques to enhance the protection of PII and sensitive information
Experience Requirements:
- Good level of written and verbal English
- Work experience building and maintaining data pipelines in data-heavy environments (Data Engineering, Backend with an emphasis on data processing, Data Science with an emphasis on infrastructure)
- Strong communication and analytical skills
- Solid understanding of SQL
- Experience with Python (Data transformation pipelines, API & database integrations)
- Hands-on experience with Data Warehousing platforms (BigQuery, Redshift, Snowflake, Vertica or similar)
- Familiarity with data pipeline orchestration tools (Airflow, Dagster, Prefect, or similar)
- Understanding of CI/CD and containerization
You will stand out with:
- Good understanding of database architecture and experience with data modelling
- Experience developing, testing, and deploying data-intensive applications, and ensuring their reliability
- Familiarity with Data Streaming and CDC (Pub/Sub, DataFlow, Kafka, Flink, Spark Streaming or similar)
- Understanding of Kubernetes
- Experience with major B2B vendor integrations (Salesforce/CPQ, NetSuite, Marketo, etc.)
- Experience with Data Quality Tools, Monitoring and Alerting
Perks of working at Wrike
- 25 days of holidays
- Cafeteria bonuses (Benefit plus)
- Meal vouchers (220 CZK/working day)
- Sick leave compensation
- Private healthcare membership (Canadian Medical)
- Pension plan
- Mobile tariffs
- "Lítačka" annual transportation pass reimbursement
- Multisport card
- Parental leave
Your recruitment buddy will be Alexandra Vorobyova, Lead Recruiter.
#LI-AV1