Data Engineer, Data Platform

  • Glass Lewis Europe Limited

Job Description

Company Overview

Glass Lewis is the world’s choice for governance solutions. We enable institutional investors and publicly-listed companies to make sustainable decisions based on research and data. We cover 30,000+ meetings each year across approximately 100 global markets.

Our customers include the majority of the world’s largest pension plans, mutual funds, and asset managers, collectively managing over $40 trillion in assets. We have teams located across the United States, Europe, and Asia-Pacific, giving us global reach with a local perspective on important governance issues. Founded in 2003, Glass Lewis is headquartered in San Francisco, California, with additional offices in Kansas City, Missouri; Paris, France; Limerick, Ireland; Karlsruhe, Germany; Sydney, Australia; Tokyo, Japan; and Manila, Philippines.

Position

We are hiring Engineers (2 positions) to join our Data Engineering team in Timisoara, Cluj, or Bucharest, Romania. Candidates must have broad experience working in a cloud-based, data-centric environment.

Responsibilities

    • Design, develop, and maintain data pipelines for a medallion architecture using Azure/Microsoft Fabric components (Data Factory, Synapse Analytics, Purview, etc.).
    • Build observability tools for data quality, consistency, and security throughout the data pipeline.
    • Optimize data pipeline performance and efficiency.
    • Stay up-to-date with the latest platform developments and trends.
    • Ensure projects are delivered on time, within budget, and with high quality.
    • Report on project progress and status to stakeholders.
    • Manage and mentor team members, enabling them to progress in their careers.

Requirements

    • Degree in computer science or equivalent with a minimum of 5 years’ experience in software development.
    • Strong proficiency in Azure/Microsoft Fabric components (Data Factory, Synapse Analytics, Purview, etc.) or AWS/GCP equivalents.
    • Experience with data pipeline development and optimization.
    • Familiarity with front-end toolkits for data stewards and observability.
    • Familiarity with cloud computing and DevOps practices.
    • Understanding of data quality and governance principles.
    • Excellent communication and interpersonal skills.
    • Strong problem-solving and decision-making abilities.
    • Experience working with Azure DevOps and Git.
    • Strong documentation and communication skills.