ROLE SUMMARY
Pfizer's purpose is to deliver breakthroughs that change patients' lives. Research and Development is at the heart of fulfilling Pfizer's purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most. Whether you are in the discovery sciences, ensuring drug safety and efficacy, or supporting clinical trials, you will apply cutting-edge design and process development capabilities to accelerate and bring best-in-class medicines to patients around the world.
Pfizer is seeking a highly skilled and motivated AI Engineer to join our advanced technology team. You will be responsible for developing, implementing, and optimizing artificial intelligence models and algorithms to drive innovation and efficiency in our Data Analytics and Supply Chain solutions. This role demands a collaborative mindset, a passion for cutting-edge technology, and a commitment to improving patient outcomes.
ROLE RESPONSIBILITIES
* Perform data modeling and engineering within the advanced data platforms teams to achieve digital outcomes. Create test plans and test scripts, and perform data validation.
* Conceive, design, and implement cloud data lakes, data warehouses, data marts, and data APIs.
* Develop complex data products that benefit Pfizer Global Supply and allow for reuse across the enterprise.
* Collaborate with contractors to deliver technical enhancements.
* Develop automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment.
* Develop internal APIs and data solutions to enhance application functionality and facilitate connectivity.
* Collaborate with backend engineering teams to analyze data, enhancing its quality and consistency.
* Conduct root cause analysis and address production data issues.
* Design, develop, and implement AI models and algorithms to support sophisticated data analytics and supply chain initiatives.
* Stay abreast of the latest advancements in AI and machine learning technologies and apply them to Pfizer's projects.
* Provide technical expertise and guidance to team members and stakeholders on AI-related initiatives.
* Document and present findings, methodologies, and project outcomes to various stakeholders.
* Collaborate and integrate with technical teams across Digital to drive overall implementation and delivery.
* Work with large and complex datasets, including data cleaning, preprocessing, and feature selection.
BASIC QUALIFICATIONS
* A bachelor's or master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related discipline.
* Over 3 years of experience as a Data Engineer or Data Architect, or in data warehousing, data modeling, and data transformation.
* Over 1 year of experience in AI, machine learning, and large language model (LLM) development and deployment.
* A proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred.
* Strong understanding of data structures, algorithms, and software design principles.
* Programming Languages: Experience with Python and SQL, and familiarity with Java or Scala.
* Big Data Technologies: Familiarity with Hadoop, Spark, and Kafka for big data processing.
PREFERRED QUALIFICATIONS
* Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
* ETL Tools: Knowledge of ETL tools like Apache NiFi, Talend, or Informatica.
* Cloud Platforms: Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
* Containerization: Understanding of Docker and Kubernetes for containerization and orchestration.
* AI and Automation: Knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect, and the ability to use GenAI or agents to augment data engineering practices.
* Data Integration: Skills in integrating data from various sources, including APIs, databases, and external files.
* Data Modeling: Understanding of data modeling and database design principles, including graph technologies like Neo4j or Amazon Neptune.
* Structured Data: Proficiency in handling structured data from relational databases, data warehouses, and spreadsheets.
* Unstructured Data: Experience with unstructured data sources such as text, images, and log files, and tools like Apache Solr or Elasticsearch.
* Data Excellence: Familiarity with data excellence concepts, including data governance, data quality management, and data stewardship.
PHYSICAL/MENTAL REQUIREMENTS
Not applicable.
NON-STANDARD WORK SCHEDULE, TRAVEL OR ENVIRONMENT REQUIREMENTS
Occasional travel may be required.
WORK LOCATION ASSIGNMENT: HYBRID