Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
About the Role:
As a Senior Data Engineer, you will develop and maintain data pipelines that extract, transform, and load (ETL) data from various sources into a centralized data storage system, such as a data warehouse or data lake. You will ensure the smooth flow of data from source systems to destination systems while adhering to data quality and integrity standards. In addition to making an impact on a great team, you'll also discover the career opportunities you'd expect from an industry leader.
Working Schedule:
This is a full-time position with standard business hours, Monday to Friday. Careers with Optum offer flexible work arrangements, and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny office and telecommuting from a home-based office in a hybrid work model.
Primary Responsibilities of the Senior Data Engineer:
1. Data Integration: Integrate data from multiple sources and systems, including databases, APIs, log files, streaming platforms, and external data providers. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting.
2. Data Transformation and Processing: Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, resolve missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks.
3. Contribute to common frameworks and best practices in code development, deployment, and automation/orchestration of data pipelines.
4. Implement data governance in line with company standards.
5. Partner with Data Analytics and Product leaders to design best practices and standards for developing and productionizing analytic pipelines.
6. Partner with Infrastructure leaders on architecture approaches to advance the data and analytics platform, including exploring new tools and techniques that leverage the cloud environment (Azure, Snowflake, others).
7. Monitoring and Support: Monitor data pipelines and data systems to detect and resolve issues promptly. Develop monitoring tools, alerts, and automated error handling mechanisms to ensure data integrity and system reliability.
Required Qualifications of the Senior Data Engineer:
1. Extensive experience designing data solutions, including data modeling. Data architecture experience is a plus.
2. Extensive hands-on experience developing data processing jobs (PySpark / SQL) that demonstrate a strong understanding of software engineering principles.
3. Experience orchestrating data pipelines using technologies such as Azure Data Factory (ADF), Airflow, etc.
4. Experience working with Hive / HBase / Presto.
5. Fluent in SQL (any flavor), with experience using window functions and more advanced features.
6. Experience with DevOps tools, Git workflows, and building CI/CD pipelines.
7. Experience applying data governance controls within a highly regulated environment.
8. The ability to communicate effectively with users, including understanding and clarifying newly required changes.
Preferred Qualifications of the Senior Data Engineer:
1. Bachelor's degree or higher in Database Management, Information Technology, Computer Science, or a similar field.
2. Proven Data Engineering experience.
3. A motivated self-starter who excels at managing their own tasks and takes ownership.
4. Experience working on projects using Agile/Scrum methodologies.
5. Experience with Azure Databricks and Snowflake.
6. Well versed in Python for manipulating, cleaning, transforming, and analyzing structured data to support our data-driven initiatives.
7. Familiarity with production quality ML and/or AI model development and deployment.
Please note you must currently be eligible to work and remain indefinitely without any restrictions in the country to which you are making an application. Proof will be required to support your application.
All telecommuters will be required to adhere to the UnitedHealth Group’s Telecommuter Policy.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.