DeepSeek Timeline | From Founding to DeepSeek-V4

Explore DeepSeek's complete development history, from its founding through DeepSeek-V3 and DeepSeek-R1 to the upcoming DeepSeek-V4.

Founding

DeepSeek Founded (July 2023)

DeepSeek was founded by Liang Wenfeng and backed by the quantitative hedge fund High-Flyer. The company set out to build world-class AI models.

Product

DeepSeek LLM Released (November 2023)

Released DeepSeek LLM, the company's first open-source large language model at 67B parameters, marking its entry into the AI model space.

Product

DeepSeek Coder Released (November 2023)

Launched DeepSeek Coder, which achieved state-of-the-art performance among open-source code models at the time.

Product

DeepSeek-V2 Released (May 2024)

Released DeepSeek-V2, introducing a Mixture-of-Experts (MoE) architecture with Multi-head Latent Attention (MLA) that dramatically reduced inference costs.

Product

DeepSeek-Coder-V2 Released (June 2024)

Released DeepSeek-Coder-V2, reaching GPT-4 Turbo-level performance on coding tasks.

Product

DeepSeek-R1-Lite-Preview (November 2024)

Released DeepSeek-R1-Lite-Preview, an early showcase of the company's chain-of-thought reasoning capabilities.

Product

DeepSeek-V3 Released (December 2024)

Released DeepSeek-V3, a 671B-parameter MoE model (37B activated per token) trained for a reported $5.58M in compute, with performance rivaling GPT-4o.

Product

DeepSeek-R1 Released (January 2025)

Released DeepSeek-R1, matching OpenAI o1-level performance on reasoning tasks through large-scale reinforcement learning. Open-sourced under the MIT license.

Milestone

Global AI Industry Impact (January 2025)

DeepSeek's efficient training methods and open-source approach sparked global discussions about AI development costs and accessibility.