Optum is a global organisation that delivers care, aided by technology, to help millions of people live healthier lives.
About the Role
We want to make healthcare work better for everyone. This depends on hiring the best and brightest. With a thriving ecosystem of investment and innovation, our business in Ireland is constantly growing to support the healthcare needs of the future.
Our Mission
We believe everyone – of every race, gender, sexuality, age, location and income – deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of colour, historically marginalised groups and those with lower incomes.
Key Responsibilities
* Design, build and implement automation for the production release of data engineering, machine learning and business intelligence processes.
* Write code and leverage tools to transform data and incorporate business logic defined in conjunction with OptumInsight business and technology partners.
* Design, code, test, document, and maintain high-quality and scalable Big Data solutions.
* Analyse raw data sources, data transformations and structural requirements for new software and applications.
* Migrate data from legacy systems to new solutions.
* Design conceptual and logical data models and flowcharts.
* Define security and backup strategies for data solutions.
* Design, build and implement real-time / near-real-time data pipelines.
* Design and prototype data monitoring models for pipelines.
* Write technical documentation.
Requirements
* Experience with data processing and SQL databases.
* Advanced SQL experience.
* Previous experience with relational (RDBMS) and non-relational databases.
* Previous experience implementing ETL applications, and with data warehousing/data modelling principles and architecture in large environments.
* Working knowledge of public clouds such as Azure, AWS and Google Cloud, and of developing applications in Linux environments.
* Experience with Hadoop/Snowflake/Vertica/Redshift/Synapse.
* Experience with Spark/Databricks for data processing and machine learning workloads.
* Experience with Airflow for data pipeline orchestration, including Airflow operators and hooks.
* Experience with Kubernetes and Docker for designing, implementing and running data applications.
* Hands-on experience with Python (Java and Scala are also useful).
* Experience with agile/scrum methodologies.