WalletConnect is one of the core infrastructure teams in Web3 — we build the connectivity layer that lets wallets and apps communicate securely across blockchains. Since launching in 2018, we’ve grown into a network of 75,000+ apps, 700+ wallets, and 50+ million users. Our mission is to power the financial internet by making digital ownership and payments interoperable and accessible.
We’ve recently launched WalletConnect Pay — a payments solution that lets merchants and payment providers leverage blockchain rails for new payment experiences, like stablecoin checkout, payouts, and deposits.
Backed by $38M from investors like Union Square Ventures, Shopify, Coinbase Ventures, Circle Ventures, and Uniswap Labs, we’re a global, remote-first team that values openness, simplicity, innovation, and ownership.
We’re entering our most ambitious chapter yet. WalletConnect Pay is an end-to-end crypto and stablecoin payment method built on the world’s largest wallet network — already embedded in Stripe, Coinbase Commerce, Shopify, MoonPay, Shift4, and BitPay, with a landmark partnership with Ingenico bringing stablecoin payments to 40M+ terminals across 120+ countries.
As WalletConnect scales into payments, data is becoming foundational — not optional. From wallet connection success rates and transaction health to financial reconciliation, merchant reporting, and operational monitoring, our data systems must be as reliable as the payment flows they track. We’re building a next-generation data platform to support real-time payments infrastructure, developer tooling, and product analytics. This hire joins as we move from MVP pipelines to a scalable production architecture, and we need a senior engineer who can help design and operate the data foundation behind a global payments network.
This is a Senior Data Engineer position on the Data Engineering team. You’ll work closely with the Data Engineering Lead to design and scale the data platform powering WalletConnect’s payments infrastructure and ecosystem analytics.
This role sits at the intersection of data engineering and backend systems. You’ll build and operate event-driven data pipelines, maintain real-time processing systems, and ensure data quality across high-volume transactional workflows. You’ll also build backend services that expose data through internal platforms and APIs so teams across the company can consume it.
Day-to-day, you’ll work with Python, SQL, ClickHouse, Airflow, and dbt within an event-driven architecture. You’ll own the systems you build end-to-end — including monitoring, reliability, and data correctness — and collaborate with product, engineering, and infrastructure teams to support new data use cases as the platform grows.
Requirements
Data Platform & Pipeline Engineering
Data Quality & Observability
Backend Services & Data Access
Ownership & Operations
Our current stack includes Python, SQL, ClickHouse, Airflow, dbt, Grafana, and Preset / Deepnote for data visualization. The architecture is event-driven with near real-time processing. Rust is a plus but not required. We’re also experimenting with AI-assisted monitoring and data tooling as part of our platform evolution.
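As a flavor of the work, and purely as an illustration (not WalletConnect’s actual code — the event shape, field names, and thresholds here are hypothetical), here is a minimal Python sketch of the kind of data-quality gate an event-driven payments pipeline might apply before events reach downstream reporting:

```python
from dataclasses import dataclass

# Hypothetical payment-event shape; a real pipeline would define this
# in shared schema contracts, not inline.
@dataclass
class PaymentEvent:
    event_id: str
    amount_usd: float
    status: str  # e.g. "confirmed", "pending", "failed"

def validate(event: PaymentEvent) -> bool:
    """Basic quality gate: reject events with impossible values."""
    return event.amount_usd > 0 and event.status in {"confirmed", "pending", "failed"}

def success_rate(events: list[PaymentEvent]) -> float:
    """Share of events passing the quality gate that reached a confirmed state."""
    valid = [e for e in events if validate(e)]
    if not valid:
        return 0.0
    return sum(e.status == "confirmed" for e in valid) / len(valid)

events = [
    PaymentEvent("a1", 25.0, "confirmed"),
    PaymentEvent("a2", 10.0, "failed"),
    PaymentEvent("a3", -5.0, "confirmed"),  # dropped by the quality gate
]
print(success_rate(events))  # 0.5
```

In practice, checks like these would run continuously inside the pipeline (e.g. as dbt tests or Airflow tasks), with metrics exported to monitoring so data correctness is observable end-to-end.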
Must-Haves
Nice-to-Haves
Benefits