PaLM 2
text · paid
PaLM 2 (2023) is Google’s 340B-parameter language model, trained on extensive multilingual and code datasets (~780B tokens).
Version: 2.0
Released: 05/10/2023
Architecture
- parameters: 340B (dense Transformer)
- context_length: 8192 tokens (approx.)
- training_data: Pretrained on ~780B tokens of web, books, code, and more
- inference: autoregressive decoding (decoder-only Transformer)
Capabilities
- Strong few-shot learning across reasoning, coding, and translation tasks (a prompt sketch follows this list)
- Advanced reasoning
- Code generation
- Translation and multilingual support covering 100+ languages
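Few-shot prompting needs no fine-tuning: a handful of worked input/output pairs placed in the prompt is enough to steer the model toward a task such as translation. The sketch below only builds such a prompt; the helper name and example pairs are illustrative, not part of the model card.

```python
# Minimal few-shot prompt builder for English-to-French translation.
# The example pairs are illustrative only; any worked pairs would do.
def build_few_shot_prompt(source_sentence: str) -> str:
    examples = [
        ("The weather is nice today.", "Il fait beau aujourd'hui."),
        ("Where is the train station?", "Où est la gare ?"),
    ]
    shots = "\n".join(f"English: {en}\nFrench: {fr}\n" for en, fr in examples)
    return (
        "Translate English to French.\n\n"
        f"{shots}English: {source_sentence}\nFrench:"
    )

print(build_few_shot_prompt("I would like a cup of coffee."))
```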
Benchmarks
- Performance: state-of-the-art results on multiple reasoning and translation benchmarks at release, improving on the original PaLM
Safety
- Developed following Google’s Responsible AI practices
- Includes content filters aligned to Google policies
Deployment
- regions: global
- hosting: managed API (PaLM API / Vertex AI on Google Cloud; see the call sketch below)
- integrations: Google Cloud products
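The hosted API is typically reached through Google Cloud's Vertex AI SDK. Below is a minimal sketch assuming the google-cloud-aiplatform package, a placeholder project ID, and the "text-bison" PaLM 2 text model; exact model names and regions depend on the deployment.

```python
# Sketch: calling a PaLM 2 text model via the Vertex AI Python SDK
# (pip install google-cloud-aiplatform). Project ID and region are placeholders.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-gcp-project", location="us-central1")

# "text-bison" is the PaLM 2 text model family exposed on Vertex AI.
model = TextGenerationModel.from_pretrained("text-bison")
response = model.predict(
    "List three strengths of multilingual language models.",
    temperature=0.2,
    max_output_tokens=256,
)
print(response.text)
```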
Tags
LLM · multilingual · reasoning · Google