Apollo.io

Staff Data Engineer (Remote, United States)

Job Description

As a Staff Data Engineer, you will maintain and operate the data platform that supports machine learning workflows and analytics and powers some of the products offered to Apollo customers.

Daily adventures/responsibilities

  • Develop and maintain scalable data pipelines, and build new integrations to support continued growth in data volume and complexity.
  • Develop and improve the data APIs used in machine learning and AI product offerings.
  • Build automated monitoring, alerting, and self-healing (restartable tasks, graceful failure handling) into the consumption pipelines.
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available.
  • Write unit/integration tests, contribute to the engineering wiki, and document work.
  • Define company data models and write jobs to populate them in our data warehouse.
  • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.

Competencies

  • Customer driven: Attend to our internal customers’ needs and strive to deliver a seamless, delightful experience in data processing, analytics, and visualization.
  • High impact: Understand the most important customer metrics and make the data platform and datasets an enabler for other teams to improve them.
  • Ownership: Take ownership of team-level projects/platforms from start to finish, ensure high-quality implementation, and move fast to find the most efficient ways to iterate.
  • Team mentorship and sharing: Share knowledge and best practices to help up-level the engineering team.
  • Agility: Organized and able to effectively plan and break down large projects into smaller tasks that are easier to estimate and deliver. Can lead fast iterations.
  • Speak and act courageously: Not afraid to fail, challenge the status quo, or speak up for a contrarian view.
  • Focus and move with urgency: Prioritize for impact and move quickly to deliver experiments and features that create customer value.
  • Intelligence: Learn quickly and efficiently absorb new codebases, frameworks, and technologies.

Qualifications

Required:

  • 8+ years of experience as a data platform engineer, big data engineer, or software engineer working in data.
  • Experience in data modeling, data warehousing, APIs, and building data pipelines.
  • Deep knowledge of databases and data warehousing with an ability to collaborate cross-functionally.
  • Bachelor's degree in a quantitative field (Physical/Computer Science, Engineering, Mathematics, or Statistics).

Preferred:

  • Experience using the Python data stack.
  • Experience deploying and managing data pipelines in the cloud.
  • Experience working with technologies like Airflow, Hadoop, FastAPI, and Spark.
  • Understanding of streaming technologies like Kafka and Spark Streaming.

Why You’ll Love Working at Apollo

At Apollo, we’re driven by a shared mission: to help our customers unlock their full revenue potential. That’s why we take extreme ownership of our work, move with focus and urgency, and learn voraciously to stay ahead.

We invest deeply in your growth, ensuring you have the resources, support, and autonomy to own your role and make a real impact. Collaboration is at our core—we’re all for one, meaning you’ll have a team across departments ready to help you succeed. We encourage bold ideas and courageous action, giving you the freedom to experiment, take smart risks, and drive big wins.

If you’re looking for a place where your work matters, where you can push boundaries, and where your career can thrive—Apollo is the place for you.