At Ancient Gaming, we lead the way in innovation within the iGaming software industry. As pioneers, we are constantly pushing the boundaries of what's possible, setting new standards with engaging software and transformative experiences. Our products are designed to captivate a high-spending, forward-thinking generation eager to reshape the future of gaming.

Founded in Malta in 2018, Ancient Gaming has rapidly grown into a global company, with a team of over 100 skilled professionals spanning more than 30 countries. Embracing a remote-first model, we continue to drive advancements in the gaming landscape.

⚡ Your Mission as Data Engineer:

You will play a critical role in designing, managing, and optimising our data infrastructure on the Google Cloud Platform (GCP). Leveraging your expertise in cloud-based solutions, orchestration, ETL pipelines, and data analytics, you'll ensure that our data systems are robust, efficient, and scalable.

You will be working on:

- CSGORoll — the world's #1 skin gaming site: a community social gaming platform designed for CS2 players, featuring unique in-house custom-built games such as Roulette, Crash, Unboxing, and PVP.
- HypeDrop — a leading gamified shopping platform where over 500,000 users experience the excitement of real-time mystery box openings, box battles, and customisable deals. Since 2018, HypeDrop has sold over 150 million boxes, revolutionising the way people shop and win.

You will:

- Develop, deploy, and maintain DAGs using Apache Airflow (Google Cloud Composer); a minimal sketch of this kind of workflow appears at the end of this posting.
- Build efficient Extract, Transform, Load (ETL) processes to ensure seamless data flow from various sources into our BigQuery data warehouse.
- Design and optimise data models using dbt Cloud for data transformation and version control.
- Collaborate with cross-functional teams to integrate data from different systems, providing unified and comprehensive insights.
- Work with data scientists and analysts to integrate analytics solutions seamlessly.
- Establish and enforce data quality standards, ensuring compliance with industry regulations and best practices.
- Monitor and optimise data pipelines and queries for performance and efficiency.

You will excel in this role if you possess:

- Proficiency with cloud services, ideally Google Cloud Platform: BigQuery, Cloud Storage, and Composer.
- Strong experience with Airflow DAG development for automated data workflows.
- Expertise in dbt for data transformation and versioning.
- Extensive knowledge of SQL and performance optimisation on large datasets (see the query sketch below).
- Proficiency in Python.
- Familiarity with CI/CD pipelines and data governance best practices.

It would be advantageous if you possess:

- Experience with data warehousing, medallion architecture, and working in an agile environment.
- Strong problem-solving skills and a focus on efficient cloud resource usage.
- Experience with data observability services.
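For a flavour of the day-to-day work, here is a minimal sketch of the kind of Airflow DAG this role involves: a daily load of raw files from Cloud Storage into BigQuery. The DAG id, bucket, dataset, and table names are hypothetical placeholders, not our actual pipelines.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

# Hypothetical daily load: raw JSON events from Cloud Storage into BigQuery.
with DAG(
    dag_id="example_gcs_to_bigquery",  # illustrative name only
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-raw-bucket",  # hypothetical bucket
        source_objects=["events/{{ ds }}/*.json"],  # templated by run date
        destination_project_dataset_table="example_project.raw.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",  # idempotent daily reload
        autodetect=True,  # infer schema; a fixed schema is safer in production
    )

In Cloud Composer, a file like this placed in the environment's dags/ folder is picked up automatically; the `schedule` argument assumes Airflow 2.4 or later (older versions use `schedule_interval`).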
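And because SQL performance on large datasets is central to the role, here is an equally hedged sketch of a cost-aware BigQuery query from Python. It assumes the hypothetical events table above is partitioned by an event_date column; filtering on the partition column is what keeps scanned bytes, and therefore cost and latency, down.

import datetime

from google.cloud import bigquery

client = bigquery.Client(project="example_project")  # hypothetical project

# Filtering on the partition column lets BigQuery prune partitions, which is
# the main lever for both cost and speed on large tables.
query = """
    SELECT user_id, COUNT(*) AS events
    FROM `example_project.raw.events`
    WHERE event_date = @run_date
    GROUP BY user_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "run_date", "DATE", datetime.date(2024, 1, 1)
        ),
    ]
)
for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.events)

Parameterised queries like this also keep dashboards and scheduled jobs safe from SQL injection, which matters once pipelines are shared across teams.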