Falcon 180B

TII's Falcon 180B (2023) is an open-weight 180B-parameter LLM trained on 3.5T tokens; at release it ranked among the strongest publicly available models.
Version: 1.0
Released: 09/06/2023
Pricing:
  • details: free
Repository: Hugging Face
License: Apache 2.0
Weights Available: Yes

Architecture

  • family: Falcon
  • parameters: 180B
  • training_data: RefinedWeb (3.5T tokens)
  • context_length: 2048
  • inference_type: local or cloud

Capabilities

  • chat
  • text-generation
  • code generation

Languages Supported

en

Benchmarks

  • MMLU: 67.9

Safety

  • content filtering: false
  • alignment: none (foundation)

Deployment

  • regions: global
  • hosting: local, Hugging Face Spaces
  • integrations: Hugging Face Transformers
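Since the card lists Hugging Face Transformers as the integration path, here is a minimal sketch of local inference with the standard `AutoModelForCausalLM` API. The prompt, generation settings, and sharding strategy are assumptions for illustration; running this for real requires hardware large enough for the weights (see the memory notes above).

```python
MODEL_ID = "tiiuae/falcon-180B"  # repo id on the Hugging Face Hub


def generate(prompt, max_new_tokens=64):
    # Heavy imports kept inside the function so the sketch can be read or
    # imported without pulling in torch/transformers.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # halves memory vs. fp32
        device_map="auto",           # shard weights across available GPUs
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Falcon 180B is"))
```

Keep prompts within the 2048-token context window noted under Architecture; longer inputs must be truncated before generation.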

Tags

open-weight, research
