Our Data Engineering team is responsible for building, maintaining, and enhancing the entire data infrastructure, including real-time and batch data ingestion, transformation, and serving, to support a wide range of business functions. As a Senior Platform Engineer, you will be at the forefront of designing and developing scalable, high-performance data platforms that power critical data-driven applications. You will work closely with engineers, data scientists, analysts, and security teams to ensure the infrastructure is reliable, performant, and capable of evolving with the needs of the business.
Candidates can be remote within Canada but must work hours that align with core EST hours (i.e., 9am to 5pm EST).
What you'll be doing:
Designing, building, maintaining, and monitoring a large-scale distributed system serving millions of people per day
Building new and updating existing microservices on AWS in Go and Python
Maintaining and scaling our event platform powered by Kafka (ingesting 200K messages/sec)
Improving reliability and performance of core components and existing systems
Configuring and maintaining the infrastructure that runs those systems
Writing unit and service tests for all your code
Researching new technology to solve tomorrow’s scaling issues
What we're looking for:
You love learning and have a growth mindset
8+ years of experience in software development roles
You are highly experienced in Go and/or Python
You enjoy both building and maintaining complex back-end systems that operate 24/7 at high scale
Interest or experience in information retrieval systems (e.g., Elasticsearch, time-series databases, vector databases)
Strong interest in or familiarity with big data technologies such as (but not limited to) Airflow, EMR, Dataproc, AWS DMS, AWS Glue, Spark, Redshift, Flink, and Kafka
You write clean, readable code that communicates its intents and methods clearly to future engineers (but you can also deal with code that isn’t and doesn’t)
You have some understanding of database fundamentals and experience working on data platforms
We value candidates with an interest in DevOps, infrastructure, and related technologies (e.g., Kubernetes)
We value experience planning and leading technical initiatives or projects of various scopes and sizes
You have mentored or are currently mentoring other developers
Bonus Points:
Exposure to and some working experience with data warehouses like Redshift
You have deep knowledge of Docker and exposure to container orchestration tools like Kubernetes or ECS
Experience with Kafka
Experience with the Apache Spark data processing framework
You have exposure to Infrastructure-as-code tools such as Terraform
What we offer:
Career development; we believe in mentorship and investing in your learning, supporting you in achieving your goals
Top industry health benefits, including vision and dental
Your own health/wellness spending account to use each year