DeepSeek LLM

Open-source 67B-parameter LLM with strong bilingual (English and Chinese) capabilities

Released on 2024.01.05

Overview

DeepSeek LLM is an open-source large language model with 67 billion parameters, trained from scratch on a 2-trillion-token English and Chinese corpus. It delivers strong results on reasoning, coding, and math benchmarks and handles both English and Chinese.

Key Features

  • 67 billion parameters
  • Strong Chinese and English performance
  • Open-source with permissive license
  • Competitive with GPT-3.5

Specifications

  • Parameters: 67B
  • Architecture: Transformer Decoder
  • Context Length: 4K tokens
  • Training Tokens: 2T
  • License: DeepSeek License
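
A minimal sketch of how the 67B base model could be loaded for inference with the Hugging Face transformers library. The checkpoint name (deepseek-ai/deepseek-llm-67b-base), dtype, and device settings are illustrative assumptions, not details taken from this card; a 67B model in bf16 requires multiple GPUs or weight offloading.

  # Illustrative only: assumes the deepseek-ai/deepseek-llm-67b-base checkpoint
  # on the Hugging Face Hub and sufficient GPU memory for a 67B model.
  import torch
  from transformers import AutoModelForCausalLM, AutoTokenizer

  model_id = "deepseek-ai/deepseek-llm-67b-base"  # assumed repo id

  tokenizer = AutoTokenizer.from_pretrained(model_id)
  model = AutoModelForCausalLM.from_pretrained(
      model_id,
      torch_dtype=torch.bfloat16,  # half-precision weights to reduce memory
      device_map="auto",           # spread layers across available devices
  )

  prompt = "The capital of France is"
  inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
  # Keep prompt plus generated tokens within the 4K-token context window.
  outputs = model.generate(**inputs, max_new_tokens=32)
  print(tokenizer.decode(outputs[0], skip_special_tokens=True))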

Resources