Principal Data Platform Engineer (Dublin, 1-2 days in office) The Xcede Recruitment team are delighted to be partnering with a fantastic tech org in Dublin to hire a new Principal Data Platform Engineer for its Data unit.
The unit is responsible for managing the infrastructure of a data platform, along with pipelines and services that deliver data extracts and analytics.
This includes processing and transforming a variety of datasets, such as metrics from online platforms, social media, media product usage, metadata, and historical data used to create financial forecasts.
These efforts support various business teams by providing data solutions and self-service tools.
The team is working on building a self-service data mesh to give analysts better tools for exploring datasets and ensuring their accuracy.
The unit aims to build fantastic tools for users and internal stakeholders alike, so this is a highly impactful role.
Responsibilities:
- Design and develop shared infrastructure to simplify the creation, discovery, and consumption of data products across teams.
- Implement and manage systems for access control, monitoring, and cost allocation to ensure secure and efficient platform operations.
- Develop and maintain tools for data cataloguing, quality assurance, and metadata management.
- Create templates, guidelines, and tools to streamline adoption of and migration to the new platform for analytical stakeholders.
- Collaborate with stakeholders to promote self-service analytics, enabling teams to derive insights independently.
- Drive innovation and operational efficiency by building scalable tools and infrastructure that leverage data and AI.
- Contribute to strategic initiatives that enhance business performance and create revenue-generating opportunities through data analytics.
- Work collaboratively within the music publishing industry to empower data-driven decision-making and innovation.

Requirements:
- Deep expertise in Python as the primary programming language; experience with other OOP languages such as Java or Scala is a plus.
- Solid experience writing and optimizing SQL queries.
- Experience designing, building, and optimizing ETL pipelines.
- Strong background in cloud computing, particularly in ETL workflows, orchestration, and permissions management.
- Hands-on experience managing and maintaining CI/CD pipelines.
- Advanced skills in writing automated tests to ensure code reliability and robustness.
- Expertise in monitoring services and implementing automated alerting systems.
- Experience implementing Infrastructure as Code using tools like Terraform or CloudFormation.
- AWS cloud experience is preferred.

If this role interests you and you would like to find out more, please apply here or contact us via ****** (feel free to include a CV for review).