We are looking for a Developer who is passionate about technology and experienced in building and maintaining robust, scalable data pipelines. If you have a deep understanding of data engineering, mastery of tools such as Databricks and Azure, and a passion for working with data, this is the perfect opportunity for you!
Your responsibilities will be:
- Develop and maintain Python code with a focus on object-oriented programming, following good data engineering practices.
- Design and implement complex, efficient data pipelines using Databricks and Azure Data Factory, following the Medallion architecture pattern.
- Own the ETL process, ensuring data quality at every stage, from extraction to loading.
- Collaborate with different business areas to understand data needs and translate them into effective solutions.
- Implement unit and integration tests to ensure the quality of code and data pipelines.
- Orchestrate complex data pipelines using workflows and Azure Data Factory.
- Communicate clearly and efficiently with the team, contributing to a collaborative and high-performance environment.
Essential requirements for the position:
We need a Developer specialized in Python, with an emphasis on Azure Pipelines and Databricks, and fluent in English; the combination of these four pillars is essential for a good fit. This position follows a staff augmentation model, meaning you will work daily with the client's multicultural team. If you have these skills and are looking for a challenging project with opportunities for growth and recognition, this position is for you!