Mistral 7B

Modality: text
Openness: open-source
An open-weight 7B-parameter model optimized for fast inference and cost efficiency.
Version: 1.0
Released: 09/27/2023
Pricing: free
Repository: Hugging Face
License: Apache 2.0
Weights Available: Yes

Architecture

  • Family: Transformer (dense)
  • Parameters: 7B
  • Training data: filtered web text and code
  • Context length: 8,192 tokens
  • Inference: local or cloud
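The headline "7B" figure can be recovered from the model's published hyperparameters. The sketch below is a back-of-the-envelope parameter count, assuming the values from the public Mistral 7B config (32 layers, hidden size 4096, 32 query heads with 8 grouped-query K/V heads, SwiGLU intermediate size 14336, vocabulary 32000, untied embeddings); these specific numbers are not stated in this card.

```python
# Hypothetical hyperparameters, assumed from the public Mistral 7B config.
vocab_size = 32000
hidden = 4096
layers = 32
heads = 32
kv_heads = 8                  # grouped-query attention: fewer K/V heads
head_dim = hidden // heads    # 128
ffn = 14336                   # SwiGLU intermediate size

# Attention: Q and output projections are full-width; K and V are
# shrunk by grouped-query attention.
attn = 2 * hidden * hidden + 2 * hidden * (kv_heads * head_dim)
# SwiGLU MLP: gate, up, and down projections.
mlp = 3 * hidden * ffn
# Two RMSNorm weight vectors per layer.
norms = 2 * hidden

per_layer = attn + mlp + norms
# Add input embeddings, the (untied) output head, and the final norm.
total = layers * per_layer + 2 * vocab_size * hidden + hidden
print(f"{total / 1e9:.2f}B parameters")  # → 7.24B
```

The count lands on roughly 7.24 billion parameters, matching the advertised 7B size.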

Capabilities

  • chat
  • text-generation
  • code
  • inference

Languages Supported

English (en), French (fr), German (de), Spanish (es)

Benchmarks

  • MMLU: 72.4
  • GSM8K: 58.2
  • HumanEval: 54.6

Safety

  • No built-in content filters
  • As an open model, it may reproduce biases present in its training data
  • May generate harmful content depending on prompts

Deployment

  • Regions: self-hosted
  • Hosting: local, Hugging Face Spaces
  • Integrations: Text Generation WebUI, Ollama
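Since Ollama is listed as an integration, a minimal local-deployment sketch is shown below. It assumes Ollama is installed and uses its `mistral` model tag (which maps to Mistral 7B); the prompt text is illustrative. This is a usage fragment requiring a running local Ollama service, not something that runs standalone.

```shell
# Pull the Mistral 7B weights through Ollama
# (assumes Ollama is installed; "mistral" is Ollama's tag for this model).
ollama pull mistral

# One-off prompt from the command line.
ollama run mistral "Summarize the Apache 2.0 license in one sentence."

# Ollama also exposes a local HTTP API on port 11434.
curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Hello",
  "stream": false
}'
```

The HTTP API makes the self-hosted model a drop-in local backend for tools such as Text Generation WebUI.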

Tags

open-weight, lightweight, cost-efficient
