Cross Border Talents

Senior Data Platform Engineer | Poland-Based

Job Description

Lakehouse Engineering | Cloud-Native Data Platforms

  • Location: Remote (Poland-based)

  • Job Type: Full-time | Senior

  • Compensation: PLN 208,000 – PLN 312,000 + bonus + long-term incentives

  • Our Client: Enterprise SaaS Data Platform Environment

Work Authorization

  • Must reside in Poland
  • Must have the legal right to work in Poland
  • Visa sponsorship not available

About the Opportunity

Our client is building a modern, cloud-native lakehouse platform to power advanced analytics at enterprise scale. This role focuses on designing and maintaining distributed data systems using Spark, Delta Lake, and Iceberg to enable reliable, secure, and high-performance analytics workflows.

You will combine strong software engineering principles with deep data expertise to deliver scalable and future-ready data infrastructure.

What You'll Do

  • Design and implement scalable Spark-based data pipelines

  • Build and maintain lakehouse capabilities with Delta Lake and Iceberg

  • Apply CI/CD, automated testing, and infrastructure-as-code best practices

  • Integrate dbt for SQL transformations on distributed compute engines

  • Optimize performance and cost across Databricks and Snowflake

  • Implement governance, lineage, and compliance controls

  • Enable self-service analytics through curated, analytics-ready datasets

  • Contribute to observability and reliability of data platforms

  • Participate in team on-call rotations

Tech Environment

  • Python and SQL

  • Apache Spark

  • Delta Lake and/or Apache Iceberg

  • dbt

  • Databricks and Snowflake

  • Kubernetes and Docker

  • CI/CD and Infrastructure-as-Code

What We're Looking For

  • Strong Python and SQL programming skills

  • Hands-on Spark experience in production environments

  • Expertise in Delta Lake and/or Apache Iceberg

  • Experience building scalable cloud-native data platforms

  • Familiarity with dbt for analytics modeling

  • Experience with Databricks and/or Snowflake

  • Understanding of data governance, lineage, and compliance

  • Solid software engineering fundamentals

Nice to Have

  • Java, Scala, or Rust experience

  • Exposure to event-driven architectures

  • Experience supporting self-service analytics

  • Performance tuning and cost optimization expertise

  • Observability and data quality framework experience

Why This Role Stands Out

  • Strategic investment in modern lakehouse technology

  • Enterprise-scale data environment

  • Strong focus on governance and advanced analytics

  • Remote flexibility within Poland

  • Competitive compensation and long-term incentives

If you enjoy building scalable Spark-based platforms that power advanced analytics at enterprise scale, we'd love to meet you.