Sayari

Data Engineer Intern - Web Crawling

Job Description

Internship Description:
Sayari is looking for a Data Engineer Intern specializing in web crawling to join its Data Engineering team! Sayari has developed a robust web crawling project that collects hundreds of millions of documents every year from a diverse set of sources around the world. These documents serve as source records for Sayari’s flagship graph product, a global network of corporate and trade entities and relationships. As a member of Sayari's data team, your primary objective will be to maintain and improve Sayari’s web crawling framework, with an emphasis on scalability and reliability. You will work with our Product and Software Engineering teams to ensure our crawling deployment meets product requirements and integrates efficiently with our ETL pipelines.

This is a remote, paid internship with an expected workload of 20-30 hours per week.


Job Responsibilities:
  • Investigate and implement web crawlers for new sources
  • Maintain and improve existing crawling infrastructure
  • Improve metrics and reporting for web crawling
  • Help improve and maintain ETL processes
  • Contribute to development and design of Sayari’s data product

Required Skills & Experience:
  • Experience with Python
  • Experience managing web crawling at scale with any framework; Scrapy is a plus (a minimal spider sketch follows this list)
  • Experience working with Kubernetes
  • Experience working collaboratively with git
  • Experience working with selectors such as XPath, CSS, and JMESPath
  • Experience with browser developer tools (Chrome/Firefox)
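
For context, a minimal sketch of the kind of spider this work involves, assuming Python with Scrapy; the target URL, selectors, and field names below are hypothetical placeholders, not an actual Sayari source:

    import scrapy

    class CompanyRegistrySpider(scrapy.Spider):
        # Hypothetical spider crawling a fictional company registry.
        name = "company_registry"
        start_urls = ["https://example.com/companies"]  # placeholder source

        def parse(self, response):
            # CSS selector picks out each record row in the listing table
            for row in response.css("table.registry tr.record"):
                yield {
                    # XPath selectors extract the individual fields
                    "name": row.xpath("./td[1]/text()").get(),
                    "jurisdiction": row.xpath("./td[2]/text()").get(),
                }
            # Follow pagination, if the page exposes a "next" link
            next_page = response.css("a.next::attr(href)").get()
            if next_page is not None:
                yield response.follow(next_page, callback=self.parse)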

Desired Skills & Experience:
  • Experience with Apache projects such as Spark, Avro, NiFi, and Airflow (see the DAG sketch after this list)
  • Experience with datastores such as Postgres and/or RocksDB
  • Experience working on a cloud platform like GCP, AWS, or Azure
  • Working knowledge of API frameworks, primarily REST
  • Understanding of or interest in knowledge graphs
  • Experience with *nix environments
  • Experience with reverse engineering
  • Proficiency in bypassing anti-crawling techniques
  • Experience with JavaScript
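
For the ETL side, a minimal sketch of an Airflow DAG wiring extract, transform, and load steps, assuming a recent Airflow 2.x; the DAG id, schedule, and task bodies are hypothetical placeholders, not Sayari's actual pipeline:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull newly crawled documents from storage
        ...

    def transform():
        # Placeholder: normalize raw records into the product schema
        ...

    def load():
        # Placeholder: write entities and relationships to the datastore
        ...

    with DAG(
        dag_id="crawl_etl_example",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Run the three steps in sequence
        extract_task >> transform_task >> load_task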