Pathways Language Model (PaLM): scaling to 540 billion parameters

Recently, large language models (LLMs) have shown that impressive results can be achieved without large-scale task-specific data collection or model parameter updates.

Image Credits: Google AI Blog

To further our understanding of the capabilities that emerge with few-shot learning, Google Research has proposed Pathways, an approach aimed at a single model that can generalize across domains and tasks while being highly efficient.

A recent paper introduces the Pathways Language Model (PaLM), a step toward this goal. PaLM is a 540-billion-parameter, dense decoder-only Transformer model trained with the Pathways system.

Image Credits: Google AI Blog

The model demonstrates strong language understanding and generation capabilities across diverse domains, including reasoning and code-related tasks. For example, in natural language tasks it can distinguish cause and effect and even guess a movie from emoji.
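The few-shot setup behind results like the emoji example can be sketched as follows: the model's weights are never updated; instead, a handful of worked examples are placed directly in the input text, and the model completes the final unanswered query. The task format and examples here are illustrative assumptions, not the exact prompts used in the PaLM paper.

```python
def build_few_shot_prompt(examples, query):
    """Concatenate labeled examples and a final unanswered query
    into a single few-shot (in-context learning) prompt."""
    lines = []
    for question, answer in examples:
        lines.append(f"Q: {question}")
        lines.append(f"A: {answer}")
    lines.append(f"Q: {query}")
    lines.append("A:")  # the model generates its answer from here
    return "\n".join(lines)

# Hypothetical demonstrations for the "guess the movie from emoji" task.
examples = [
    ("Guess the movie from the emoji: shark + wave", "Jaws"),
    ("Guess the movie from the emoji: crown + lion", "The Lion King"),
]
prompt = build_few_shot_prompt(
    examples, "Guess the movie from the emoji: robot + heart"
)
print(prompt)
```

The key point is that the "learning" happens entirely at inference time: the same frozen model can be steered to a new task simply by changing the demonstrations in the prompt.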

Source Link: https://ai.googleblog.com/2022/04/pathways-language-model-palm-scaling-to.html