Role Summary
Pfizer's purpose is to deliver breakthroughs that change patients' lives. Research and Development is at the heart of fulfilling Pfizer's purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most.
Key Responsibilities
* Develop, implement, and optimize artificial intelligence models and algorithms to drive innovation and efficiency in Data Analytics and Supply Chain solutions.
* Create test plans and test scripts, and perform data validation.
* Conceive, design, and implement Cloud Data Lake, Data Warehouse, Data Marts, and Data APIs.
* Develop complex data products that benefit PGS and enable reuse across the enterprise.
* Collaborate with contractors to deliver technical enhancements.
* Develop automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment.
* Develop internal APIs and data solutions to enhance application functionality and facilitate connectivity.
* Collaborate with backend engineering teams to analyze data and improve its quality and consistency.
* Conduct root cause analysis and address production data issues.
* Design, develop, and implement AI models and algorithms that advance sophisticated data analytics and supply chain initiatives.
Basic Qualifications
* A bachelor's or master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related discipline.
* Over 2 years of experience as a Data Engineer or Data Architect, or in Data Warehousing, Data Modeling, and Data Transformation.
* Over 1 year of experience in AI, machine learning, and large language model (LLM) development and deployment.
* A proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred.
Preferred Qualifications
* Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
* ETL Tools: Knowledge of ETL tools like Apache NiFi, Talend, or Informatica.
* Cloud Platforms: Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
* Containerization: Understanding of Docker and Kubernetes for containerization and orchestration.
* AI and Automation: Knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect.