Senior Data Engineer at Hostelworld



Hostelworld is Hiring

Job Info:
  • Company: Hostelworld
  • Position: Senior Data Engineer
  • Location: Remote, Porto (PT)
  • Source: LandingJobs
  • Published: November 19, 2025
  • Category: Data
  • Type: Full-Time
  • Experience: Senior


Job Description

WHO YOU'LL WORK WITH

You will play a key role in a diverse, highly talented team managing cloud-native data engineering systems, ensuring the timely, accurate and secure production and delivery of data from the Hostelworld platform.
You will work closely with other technology groups and business owners to refine quality standards and processes for data-processing and pipeline design, development and deployment. The underlying purpose is to ensure we have an efficient, robust, secure and performant data service to support our business growth.

WHAT YOU'LL DO

We’re seeking a Senior Data Engineer to lead the integration and modernization of a recently acquired company’s data ecosystem. You’ll collaborate closely with our new colleagues to learn, document, and understand their existing data product — currently based on Python and Airtable — and work to evolve it into our modern Google Cloud Platform (GCP) stack.

This is a hands-on, strategic role that combines architecture, delivery excellence, and collaboration. You’ll bring strong software engineering discipline to our data workflows, ensuring all pipelines are built with testability, version control, and CI/CD best practices in mind.

Bridge and Modernize Systems

  • Learn and document the acquired company’s current data ecosystem, including their Airtable structures, Python scripts, and API integrations.
  • Translate that understanding into clear technical documentation and communicate findings back to our internal data and engineering teams.
  • Design and lead a modernization roadmap to integrate their systems into our GCP-based medallion architecture.
  • Champion pragmatic migration strategies that balance business continuity with long-term scalability.

Design and Deliver Robust Data Pipelines

  • Build, orchestrate, and maintain scalable ELT pipelines using Astro (Airflow), DBT, and BigQuery.
  • Ingest data from APIs and third-party systems into our unified data model.
  • Embed data validation, testing, and observability into every stage of the pipeline.
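The last bullet — embedding validation at every pipeline stage — can be sketched in plain Python. This is an illustrative sketch only: the field names and schema are hypothetical, and in practice this gating would live inside the Astro (Airflow)/dbt/BigQuery stack named above rather than standalone code.

```python
from dataclasses import dataclass
from typing import Any

# Hypothetical schema for an incoming record; these field names are
# illustrative, not Hostelworld's real data model.
REQUIRED_FIELDS = {"booking_id": str, "hostel_id": int, "amount": float}


@dataclass
class ValidationResult:
    valid: list[dict[str, Any]]
    rejected: list[tuple[dict[str, Any], str]]  # (record, reason)


def validate(records: list[dict[str, Any]]) -> ValidationResult:
    """Gate records between the ingest and load stages: every record
    must carry all required fields with the expected types. Rejected
    records are kept with a reason, so they can be logged or routed
    to a dead-letter table for observability."""
    result = ValidationResult(valid=[], rejected=[])
    for rec in records:
        reason = next(
            (
                f"bad field {name!r}"
                for name, typ in REQUIRED_FIELDS.items()
                if not isinstance(rec.get(name), typ)
            ),
            None,
        )
        if reason is None:
            result.valid.append(rec)
        else:
            result.rejected.append((rec, reason))
    return result


# Example batch: one well-formed record, one missing a required field.
batch = [
    {"booking_id": "b1", "hostel_id": 7, "amount": 42.0},
    {"booking_id": "b2", "amount": 10.0},
]
checked = validate(batch)
```

In a dbt-based ELT flow the same idea is usually expressed declaratively (e.g. `not_null` and type tests in a schema file) rather than imperatively; the point is that validation runs at the boundary of each stage, not only at the end.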

Apply Engineering Best Practices

  • Drive excellence in the software development lifecycle (SDLC) — from design and peer review to automated testing, deployment, and monitoring.
  • Implement and improve CI/CD pipelines for data workflows (e.g., Astro Cloud, GitHub Actions, Terraform).
  • Promote a “data-as-code” mindset, ensuring reproducibility, versioning, and auditability across environments.

Collaborate and Lead in an Agile Environment

  • Act as a senior contributor in the team's Scrum ceremonies (planning, review, retrospectives), driving transparency and continuous improvement.
  • Communicate findings, technical designs, and migration recommendations clearly and proactively to internal stakeholders.
  • Partner with global data and software teams to align on design standards, delivery milestones, and business priorities.
