Senior Data Engineer at Reaktor
📋 Description
- Design scalable batch and streaming data pipelines (ETL/ELT).
- Develop infrastructure, integrations and APIs for data storage and delivery.
- Collaborate with clients and domain experts to define data needs.
- Work with data scientists and ML engineers on end-to-end data products.
- Build end-to-end pipelines for impression data and recommendations.
- Explore data lakes, semantic search and ML pipelines.
🎯 Requirements
- Hands-on data engineering experience.
- Database experience (SQL and NoSQL): PostgreSQL, DynamoDB, Redshift, BigQuery.
- Data platforms/lakes/warehouses: Databricks, Snowflake.
- Python required, including PySpark and pandas; Java or TypeScript is a bonus.
- Cloud providers: AWS, Azure or GCP.
- Infrastructure-as-code: Terraform, CloudFormation or CDK.
🎁 Benefits
- Influence over how you work: teams choose their own approach and technologies.
- Strong supportive community.
- Team that wants you to succeed.
- Sustainable work-life balance and perks like car share and family support.
- Internal training, events and Cloud Academy opportunities.
- Commitment to DEI and an inclusive culture.
