Apollo.io

Senior Data Engineer

Job Description

**This is a permanent EoR (Employer of Record) role, not a B2B contract**

Your Role & Mission

As a Senior Data Engineer, you will be responsible for maintaining and operating the data warehouse and connecting Apollo’s data sources to it.

Daily Adventures and Responsibilities

  • Develop and maintain scalable data pipelines and build new integrations to support continuing increases in data volume and complexity.
  • Implement automated monitoring, alerting, and self-healing (restartable jobs, graceful failure handling) while building the consumption pipelines.
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available.
  • Write unit/integration tests, contribute to the engineering wiki, and document your work.
  • Define company data models and write the jobs that populate them in our data warehouse.
  • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.

Competencies

  • Excellent communication skills to work with engineering, product, and business owners to define key business questions and build the data sets that answer them.
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized, diligent, and great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open; loves learning
  • Critical thinking and proven problem-solving skills

Skills & Relevant Experience

Required:

  • 5+ years of experience in data engineering or in a data-facing role
  • Experience in data modeling, data warehousing, and building ETL pipelines
  • Deep knowledge of data warehousing with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical Science, Computer Science, Engineering, Mathematics, or Statistics)
  • Proven experience leveraging AI tools, with fluency in integrating AI-driven solutions into your workflow and a willingness to stay current with emerging AI technologies

Preferred:

  • Experience using the Python data stack
  • Experience deploying and managing data pipelines in the cloud (preferably AWS or GCP)
  • Experience working with technologies like Airflow, Hadoop, and Spark
  • Understanding of streaming technologies like Kafka and Spark Streaming



Why You’ll Love Working at Apollo

At Apollo, we’re driven by a shared mission: to help our customers unlock their full revenue potential. That’s why we take extreme ownership of our work, move with focus and urgency, and learn voraciously to stay ahead.

We invest deeply in your growth, ensuring you have the resources, support, and autonomy to own your role and make a real impact. Collaboration is at our core—we’re all for one, meaning you’ll have a team across departments ready to help you succeed. We encourage bold ideas and courageous action, giving you the freedom to experiment, take smart risks, and drive big wins.

If you’re looking for a place where your work matters, where you can push boundaries, and where your career can thrive—Apollo is the place for you.