Senior Data Engineer

Job Description

🌟 Accomplished Tech Visionary: 
Embark on an exciting journey into the realm of software development with 3Pillar!

We extend an invitation for you to join our team and gear up for a thrilling adventure. At 3Pillar, our focus is on leveraging cutting-edge technologies that revolutionize industries by enabling data-driven decision-making. As a Senior Data Engineer, you will hold a crucial position within our dynamic team, actively contributing to thrilling projects that reshape data analytics for our clients, providing them with a competitive advantage in their respective industries.

If you have a passion for data analytics solutions that make a real-world impact, consider this your pass to the captivating world of Data Science and Engineering! 🔮🌐


Key Responsibilities:
  • Build shippable software following established engineering standards.
  • Build and maintain key Engineering blocks that other teams can rely upon (such as APIs and Big Data implementations).
  • Support the current stack and be able to extend it with new features.
  • Work on ad-hoc R&D projects.
  • Work closely with TMG’s business intelligence users, operations, and development teams on projects and CRs, encouraging a data-driven and pragmatic approach to tackling challenges and problems.
  • Ensure deliveries are on time and of the required quality.
  • Maintain the company’s data assets at the required quality levels.
  • Help to design and build solid, efficient, stable APIs.
  • Help to maintain our high standard of code.
  • Keep up to date with the latest technologies and methodologies.
  • Ensure a globally robust and highly scalable approach to development to support our growing number of global users and services.
  • Enforce best practices in terms of code quality and design of processes.

Essential Skills:
  • Python development skills
  • Ability to implement ETL data pipelines in Python
  • Creating REST APIs
  • Advanced SQL scripting knowledge
  • Experience with Google Cloud Platform, AWS or Azure
  • 2+ years of experience in data or software development
  • Knowledge of big data platforms
  • Knowledge of relational databases
  • Knowledge of Git, Docker, and Bash
  • Ability to propose, design, and implement simple ETL solutions, both batch and real-time
  • Understanding of what a continuous delivery pipeline is, and the ability to design one
  • Ability to pick the right technology for each task

Desirable Skills:
  • Experience with dbt (data build tool) for developing data pipelines
  • Experience with data streams in Google Cloud Dataflow or Apache Beam
  • Experience using Airflow
  • Experience with NoSQL databases such as Redis or Elasticsearch