Client: Elwood Roberts
Location: Dublin, Ireland
Job Category: Other
EU work permit required: Yes
Job Reference: 161e7e63e6e6
Job Views: 44
Posted: 21.01.2025
Expiry Date: 07.03.2025
Job Description:
Job: Data Platform Engineer (Snowflake and Databricks)
Location: North Dublin
Working model: Hybrid (1-2 days on site)
Type: Contract
Duration: 12 months+
We are working with an industry leader. Alongside a talented Engineering team, we are looking to add a Data Platform Engineer to the team in Dublin.
This role offers the chance to shape and support the team's data architecture, working with cutting-edge cloud technologies and driving the success of data-driven projects. You should have a strong background in Databricks, Snowflake, and AWS, and be proficient in MLOps to support the seamless deployment and scaling of machine learning models. You'll play a critical role in the mission to enhance data accessibility, streamline data-sourcing pipelines, and optimise performance for large-scale data solutions.
Responsibilities:
* Architect and Implement Cloud-Native Data Solutions: Design and develop scalable data platforms, focusing on a cloud-native approach, data mesh architectures, and seamless integration across multiple data sources.
* MLOps Pipeline Development: Build and maintain MLOps pipelines using tools like MLflow, ensuring efficient and reliable deployment of machine learning models to production environments.
* Data Governance and Quality Management: Create and enforce data governance standards, ensuring robust data quality and compliance through tools such as Databricks Unity Catalog.
* Data Integration & Migration: Lead migration projects from legacy data platforms to modern cloud solutions, optimizing cost and operational efficiency.
* Performance Tuning and Optimization: Leverage tools such as Snowflake and Delta Lake to improve data accessibility, reliability, and performance, delivering high-quality data products that adhere to best practices.
* Data Mesh Architecture: Design and deploy data mesh frameworks to streamline data integration and scalability across business domains.
* MLOps Pipelines: Prototype and operationalize MLOps pipelines to enhance the efficiency of machine learning workflows.
* Data Migration & Cost Optimisation: Migrate large-scale datasets to Azure and AWS platforms, focusing on business-critical data sources and significant cost reductions.
* Data Governance Applications: Develop applications to enforce data governance, data quality, and enterprise standards, supporting a robust production environment.
Required Experience:
* Experience in Data Platform Engineering: Proven track record in architecting and delivering large-scale cloud-native data solutions.
* Proficiency in Databricks and Snowflake: Strong skills in data warehousing and lakehouse technologies with hands-on experience in Databricks, Spark, PySpark, and Delta Lake.
* MLOps Expertise: Experience with MLOps practices, ideally with MLflow for model management and deployment.
* Cloud Platforms: Knowledge of AWS, with additional experience in Azure beneficial for multi-cloud environments.
* Programming Languages: Strong coding skills in Python, SQL, and Scala.
* Tooling Knowledge: Experience with version control (GitHub), CI/CD pipelines (Azure DevOps, GitHub Actions), data orchestration tools (Airflow, Jenkins), and dashboarding tools (Tableau, Alteryx).
* Data Governance: Familiarity with data governance tools and best practices would be ideal.
This is an exciting opportunity to take on a secure contract with a leading business undertaking a major digital transformation. You will play a critical part in its technical advancement and work alongside a skilled and friendly group of Engineers. If you are interested, please submit your CV via the link provided for immediate consideration.