Our mission is to help organizations understand the offline world. This is a full-time opportunity based in Dublin, Ireland.
What do we do?
We provide high-quality data that powers our customers' products. The data is sourced from industries such as advertising, smart cities, real estate, market research and financial services.
We aggregate, refine and use this data to build products that help our clients gain insights into how people move and behave in the real world.
Who are we searching for?
A Data Engineer who will help migrate our existing infrastructure to the cloud and build reliable, efficient and scalable data processing systems.
To be successful as a Data Engineer, you should have prior software development and data processing experience, be able to collaborate with team members, be a strategic problem-solver and be willing to learn new data technologies.
The opportunity
You will primarily work on the AWS cloud platform with a variety of technologies including Python, Spark (PySpark), EMR, S3, EC2 and Lambda, as well as orchestration and scheduling tools and modern deployment and management systems. You will also participate in data modeling and in the design of data flows, through to their implementation and support in production.
This is a unique opportunity to join a successful, market-leading start-up at an early stage and help build and grow it.
Your role
* Develop new features in collaboration with the other engineers and the CTO.
* Collect and process a wide variety of data sets.
* Design datasets for external-facing consumers, optimizing for speed, consistency, cost, and efficiency.
* Write complex SQL/PySpark operations to transform raw data.
* Implement workflows and schedule them.
* Manage data observability, scalability, transparency, and accuracy.
* Ensure data is clean, consistent, and available.
* Perform data quality checks, and build monitoring systems.
* Investigate, test, and implement new tools, processes, and technologies on an ongoing basis.
* Work closely with the sales team to provide data to our customers.
* Develop tools that make data extraction efficient, flexible and predictable in runtime.
What we’re looking for
* Fluent English speaker (French is a plus).
* At least 3 years of experience with data technologies such as PySpark.
* At least 3 years of experience developing and debugging in Python.
* Comfortable working with AWS products such as EC2, S3 and EMR.
* Experience with data flow, data processing and streaming technologies such as Kafka, Spark, EMR, Athena, or RDS is a plus.
* Strong experience with Agile Methods.
* Prior knowledge of GitHub CI/CD and technologies such as Terraform.
* Irish resident with the ability to work in Ireland without an employment permit.
Recruitment process
* A first interview with Boubou, our Tech Talent Acquisition Specialist - 45 minutes
* A technical test through CodinGame (Python, SQL and AWS assessment) - 50 minutes
* An in-depth technical interview with Filip, Principal Data Engineer - 45 minutes
* A problem-solving interview with Filip and Camille - 45 minutes
* A cultural-fit interview in Paris to get to know our French entity better