Job Description:
We are a global organisation that delivers care, aided by technology, to help millions of people live healthier lives. Our work directly improves health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best.
We have a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. You will make an impact on the communities we serve as you help us advance health equity on a global scale.
About the Role
At UnitedHealth Group and Optum, we want to make healthcare work better for everyone. This depends on hiring the best and brightest. With a thriving ecosystem of investment and innovation, our business in Ireland is constantly growing to support the healthcare needs of the future.
Our teams are at the forefront of building and adapting the latest technologies to propel healthcare forward in a way that better serves everyone, using the most advanced development tools, AI, data science, and innovative approaches.
The Challenge
Today's Revenue Cycle Management (RCM) is inherently complex: opaque, payer-specific adjudication rules create significant waste and inefficiency. Optum Insight is committed to delivering new technology and processes that improve the RCM workflow using advances in generative AI and other state-of-the-art technology, ultimately benefiting all stakeholders in the healthcare ecosystem: payers, providers, and patients.
Your Role
As a Principal Data Engineer at Optum, you will work with key business and technical partners to develop industry-leading data solutions whose insights and analytics drive efficiency and value for clients.
Primary Responsibilities
* Work collaboratively with business partners, SMEs, and developers to ensure a shared understanding of business and technical requirements.
* Design and build data pipelines to process terabytes of data (a minimal sketch follows this list).
* Develop and recommend best practices for ingesting, processing, cleaning, and standardizing data (typically on Azure).
* Create Docker images for various applications and deploy them.
* Design and build tests.
* Troubleshoot production issues.
* Analyze existing data solutions and recommend options for automation and improved efficiency.
* Work on proofs of concept for big data and data science.
* Demonstrate superior communication and presentation skills, simplifying complex data insights for audiences without a technical background.
* Serve as a leader/mentor.
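To give a flavour of the pipeline work described above, here is a minimal, illustrative PySpark sketch. It assumes a Databricks-style environment on Azure; the storage paths, table, and column names are placeholders, not details of this role.

```python
# Minimal, illustrative PySpark sketch of an ingest/clean/standardize pipeline.
# Storage account, container, and column names are placeholder assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-ingest").getOrCreate()

# Read raw CSV files from a hypothetical Azure Data Lake landing zone
raw = spark.read.option("header", True).csv(
    "abfss://landing@examplestorage.dfs.core.windows.net/claims/"
)

# Basic cleaning and standardization
curated = (
    raw.dropDuplicates(["claim_id"])
       .withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
       .filter(F.col("claim_amount").isNotNull())
)

# Persist as a partitioned Delta table for downstream analytics
(curated.write.format("delta")
        .mode("overwrite")
        .partitionBy("service_date")
        .save("abfss://curated@examplestorage.dfs.core.windows.net/claims"))
```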
Required Qualifications
* Experience developing processes in Spark/Databricks.
* Experience writing complex SQL queries (an illustrative example follows this list).
* Experience building ETL/data pipelines.
* Experience with cloud-based technologies, data migration/automation, and data engineering practices.
* Exposure to Kubernetes or Docker.
* Experience with related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux).
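As a purely illustrative example of the kind of complex SQL referenced in this list, the sketch below runs a windowed aggregation through Spark SQL against a hypothetical curated claims table; the table, columns, and status values are assumptions, not specifics of the role.

```python
# Illustrative only: a windowed aggregation in Spark SQL against a
# hypothetical curated.claims table (names and values are assumptions).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims-analytics").getOrCreate()

top_denials = spark.sql("""
    WITH denials AS (
        SELECT payer_id, denial_reason, COUNT(*) AS denial_count
        FROM curated.claims
        WHERE status = 'DENIED'
        GROUP BY payer_id, denial_reason
    ),
    ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY payer_id
                   ORDER BY denial_count DESC
               ) AS rn
        FROM denials
    )
    -- Top three denial reasons per payer
    SELECT payer_id, denial_reason, denial_count
    FROM ranked
    WHERE rn <= 3
""")
top_denials.show()
```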
Preferred Qualifications
* Cloud technologies certification (e.g., Azure or AWS).
* Azure Data Factory/Airflow (an illustrative Airflow sketch follows this list).
* Terraform.
* Jenkins or Travis CI.
* Microsoft Synapse.
* Previous experience with relational (RDBMS) and non-relational databases.
* Analytical and problem-solving experience applied to big data and distributed processing environments.
* Experience working on projects using agile/scrum methodologies.
* Exposure to DevOps methodology.
* Data warehousing principles (e.g., Kimball, Inmon, Data Vault), architecture, and their implementation in large environments.
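For the orchestration tools listed above, the following is a minimal, hypothetical Airflow sketch rather than a description of any actual Optum pipeline: a daily DAG with an ingest task followed by a validation task, where the task bodies, IDs, and schedule are placeholders.

```python
# Minimal, hypothetical Airflow DAG: daily ingest followed by validation.
# Task logic, IDs, and schedule are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_claims():
    # Placeholder: trigger the Spark/Databricks ingest job here
    print("ingesting claims data")


def validate_claims():
    # Placeholder: run data quality checks on the curated table here
    print("validating curated claims")


with DAG(
    dag_id="claims_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_claims", python_callable=ingest_claims)
    validate = PythonOperator(task_id="validate_claims", python_callable=validate_claims)
    ingest >> validate
```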
Please note that you must currently be eligible to work, and to remain indefinitely without any restrictions, in the country to which you are applying. Proof will be required to support your application.
All telecommuters will be required to adhere to UnitedHealth Group's Telecommuter Policy.