SwingDev, a Hippo company, is the Polish branch of Hippo Insurance, a fast-growing InsurTech product company. We're here to make home insurance smarter and more proactive, using tech to bring fresh solutions. We also create digital products that support our customers, always focusing on what really makes an impact.
We put a lot of effort into hiring top-tier devs, designers, PMs, and QAs — proof that we care deeply about technical experience, attitude, a human approach, and what we could call "culture fit".
"SwingDev is all about people" – yes, it may sound a bit cliché. But whether we're writing code or just hanging out, we know that people are at the heart of everything we do. We like to have a good time and keep things light, even when we're tackling big projects. We could brag about what makes us special, but we've boiled it down to two key ingredients: mature, companionable people who, rather than compete, prefer to inspire and have each other's backs; and a culture of trust, empathy, and positivity that keeps us together, lets us interact as teammates and friends, and helps us truly enjoy the ride.
So if you’re looking to shake things up and have a good time while you’re at it, you’ve come to the right place. 🚀
What will you do?
Lead a multidisciplinary data organization, hiring, developing, and establishing career frameworks while bridging legacy expertise with modern platform capabilities.
Collaborate closely with analytics, product, engineering, and business teams to provide the platform foundation they depend on.
Own the architecture and delivery of the enterprise data platform across cloud environments, supporting batch, streaming, structured, and unstructured data.
Define and govern enterprise-level data architecture patterns and data modeling standards, applying them pragmatically across domains.
Develop integration strategies for source systems to enhance data quality, prevent technical debt, and optimize access to reliable data.
Establish the data product model with clear ownership, SLAs, versioning, and discoverability, shifting the team’s output from pipelines to trusted, consumable data assets.
Lead the design and build of the enterprise semantic layer with canonical metrics and business definitions that serve BI tooling, AI grounding, and operational consumers from a single source of truth.
Embed governance in the engineering lifecycle through data contracts, policy-as-code, lineage automation, quality checks, and access controls with auditable versioning.
Build the data infrastructure required for AI/ML workloads, including feature pipelines, vector stores, embedding pipelines, and model-ready data products across all enterprise data assets.
We might be a match if you…
Bring experience in data engineering, architecture, and data platforms, along with a track record of leading technical teams and delivering modern, cloud-based data solutions.
Have built data products with real users in mind, including managing SLAs and incorporating continuous feedback loops.
Have a hands-on background in semantic layer design and apply strong analytics engineering practices.
Understand how to implement governance-as-code, including data contracts, lineage, quality, and access as core engineering artifacts.
Have working knowledge of AI data infrastructure, such as feature stores, vector databases, and pipelines for unstructured data.
Bring deep data modeling expertise across multiple paradigms and know how to apply the right approach depending on the context.
Have experience leading teams with diverse skill sets across different geographies.
Are proficient in the modern data stack, including cloud platforms (AWS, GCP, Azure), as well as data lakes and data warehouses such as Snowflake or BigQuery, and understand how tooling is evolving to support AI use cases.
Recruitment process:
Send us your CV – it's the best way for us to get to know you.
Meet Katrina, our Recruitment Manager.
Join for a 60-minute Manager Interview with Robin, Chief Data and Analytics Officer.
Technical Interview with Tomek and John.
Data & Product Interview with Margot, Lead Product Manager, Data.