Optum is a global organisation that delivers care, aided by technology, to help millions of people live healthier lives.
The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities.
Come make an impact on the communities we serve as you help us advance health equity on a global scale.
About the role:
We want to make healthcare work better for everyone. This depends on hiring the best and brightest. With a thriving ecosystem of investment and innovation, our business in Ireland is constantly growing to support the healthcare needs of the future.
Our teams are at the forefront of building and adapting the latest technologies to propel healthcare forward in a way that better serves everyone. We use the most advanced development tools, AI, data science and innovative approaches to make the healthcare system work better for everyone.
Today's revenue cycle management (RCM) is inherently complex due to opaque, payer-specific adjudication rules, which incur significant waste and inefficiency. Optum Insight is committed to delivering new technology and processes that improve the RCM workflow using advances in generative AI and other state-of-the-art technologies, ultimately benefiting all stakeholders in the healthcare ecosystem: payers, providers, and patients.
Main responsibilities:
* Work collaboratively with business partners, SMEs, and developers to ensure a shared understanding of business and technical requirements.
* Design and build data pipelines to process terabytes of data.
* Develop and recommend best practices for data ingestion, processing, cleaning and standardisation (typically on Azure).
* Create and deploy Docker images for various applications.
* Design and build tests.
* Troubleshoot production issues.
* Analyze existing data solutions and recommend automation and efficiency improvements.
* Work on proofs of concept for Big Data and data science initiatives.
* Demonstrate strong communication and presentation skills, simplifying complex data insights for non-technical audiences.
You'll be rewarded and recognized:
You will be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as provide development for other roles you may be interested in.
Required qualifications:
* Developing processes in Spark/Databricks.
* Writing complex SQL queries.
* Building ETL/data pipelines.
* Experience with cloud-based technologies, data migration/automation and data engineering practices.
* Exposure to Kubernetes or Docker.
Preferred qualifications:
* Cloud technologies certification, e.g. Microsoft Azure or Amazon Web Services (AWS).
* Azure Data Factory/Airflow.
* Terraform.
* Jenkins / Travis.
* Microsoft Synapse.
* Previous experience with relational (RDBMS) and non-relational databases.
* Analytical and problem-solving experience applied to a Big Data environment and Distributed Processing.
* Experience working in projects with agile/scrum methodologies.
* Exposure to DevOps methodology.
* Data warehousing principles (e.g. Kimball, Inmon, Data Vault), architecture and their implementation in large environments.