DeepSeek

Open Large Language Model

As of 2025-03-05, DeepSeek R1 is still the ONLY open LLM in the top 10 of most LLM rankings.


source: llmarena


source: artificialanalysis

Distilled Models

Many DeepSeek R1 models (including the distilled models) are now available in quantised format on Ollama, for example (pull commands are sketched after the table):

Ollama Model     Parameters      Size
Qwen-7B          7 billion       4.7 Gbyte
Llama-8B         8 billion       4.9 Gbyte
Qwen-32B         32 billion      20 Gbyte
Llama-70B        70 billion      43 Gbyte
Standard         671 billion     404 Gbyte
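
As a sketch (the tag names below follow Ollama's deepseek-r1 library naming and are our assumption; check the Ollama model library before pulling):

# pull a quantised distilled model, then chat with it interactively
ollama pull deepseek-r1:32b
ollama run deepseek-r1:32b

# the full 671B model is the 671b tag
ollama pull deepseek-r1:671b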

Running the standard deepseek-r1 model (even in quantised format) needs more than 500 Gbyte of fast memory, which is beyond most individuals. We did try it for fun, CPU only, on one of our upcycled Campus Station 2025c machines; the result wasn't pretty.

Even the larger distilled models need more than 16 Gbyte of fast memory (a 32B model at 4-bit quantisation already takes roughly 32 × 0.5 = 16 Gbyte for the weights alone, before any context cache), putting them beyond the reach of most households.

Recommended Size

The DeepSeek-R1-Distill-Llama-70B gives more correct answers than the other distilled models, but it is slow and needs a lot of VRAM. The DeepSeek-R1-Distill-Qwen-32B is suitable for most use cases.

Performance


source: towardsai

Hardware


source: reddit

DeepSeek Deployment

There are MANY ways of deploying DeepSeek R1; some good references (a minimal serving sketch follows the list):

  1. https://snowkylin.github.io/blogs/a-note-on-deepseek-r1.html
  2. https://unsloth.ai/blog/deepseekr1-dynamic
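
Beyond those references, one minimal self-hosted route (our own sketch, not taken from either link) is to let Ollama serve a distilled tag and query its local REST API on port 11434:

# start the Ollama server if it is not already running as a service
ollama serve &

# pull a distilled model and query it over the local REST API
ollama pull deepseek-r1:8b
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:8b",
  "prompt": "write a short poem",
  "stream": false
}'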

DeepSeek Stories

DeepSeek-R1 is ranked amongst the top reasoning AI models by many different ranking lists; having something that powerful under an open MIT license is amazing.

More interesting than the actual paper (deepseek-r1.pdf) is what people worldwide are doing with it.

1. Aha Moments

DeepSeek's "aha moment", which demonstrated the power of reinforcement learning, has been reproduced at ever lower cost:

2. Self Improvement

DeepSeek-R1 has been used to improve its OWN ecosystem:

Dynamic Quantization

https://unsloth.ai/blog/deepseekr1-dynamic
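
The dynamic quants are published as multi-file GGUF shards on Hugging Face and are normally run with llama.cpp rather than Ollama. The sketch below is our rough reading of that blog post; the repository and quant names are assumptions to verify against the post, and the shard filename is a placeholder:

# download one dynamic quant (still well over 100 Gbyte on disk)
pip install huggingface_hub
huggingface-cli download unsloth/DeepSeek-R1-GGUF \
  --include "*UD-IQ1_S*" --local-dir ./deepseek-r1-gguf

# run with a recent llama.cpp build (binary name assumed to be llama-cli);
# given the first shard, llama.cpp should pick up the remaining shards itself
./llama-cli -m ./deepseek-r1-gguf/<first-shard>.gguf -p "write a short poem"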

DeepSeek CPU-Only Test

It is possible to run the Ollama deepseek-r1:671b model at low cost (and low performance) with just a CPU.

Ask DeepSeek to "write a short poem" and check the tokens per second.
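
One way to reproduce this (assuming the model has already been pulled, which itself needs roughly 400 Gbyte of disk) is to start ollama run with the --verbose flag, which prints the same timing summary shown in the two samples below:

ollama run deepseek-r1:671b --verbose
>>> write a short poem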

ollama show deepseek-r1:671b

  Model
    architecture        deepseek2
    parameters          671.0B
    context length      163840
    embedding length    7168
    quantization        Q4_K_M

  Parameters
    stop    "<|begin▁of▁sentence|>"
    stop    "<|end▁of▁sentence|>"
    stop    "<|User|>"
    stop    "<|Assistant|>"

The two tests below show a similar token generation rate (between 1 and 2 tokens/second) on slow old and fast new hardware with the SAME prompt; the eval rate reported by Ollama is simply the eval count divided by the eval duration (e.g. 356 tokens / 203.6 s ≈ 1.75 tokens/s). Although others report higher performance, there are few details about how those numbers were achieved.

1.1. New Fast CPU Sample (CPU launched Nov 2022)

CPU: Dual EPYC 9654
RAM: 768 GB (24 channels DDR5)

>>> write a short poem

<think>
Okay, the user wants me to write a short poem. Let's start by thinking about themes they might like. Maybe something universal and uplifting? Nature often works well for poetry because it's relatable and vivid.

I should consider the structure—probably a simple rhyme scheme so it flows nicely. AABB or ABAB maybe. Need to keep it concise, just four stanzas perhaps. Let me brainstorm some imagery: dawn, forests, rivers... these elements can symbolize beginnings and persistence.

Wait, adding contrast might add depth—like shadows and light, storms and calm. That creates a dynamic feel. Also, including elements like whispers, roots, tides can give it a sense of movement and growth.

Now check the rhythm. Each line should have a similar meter. Let me read through to see if it's smooth. Avoid forced rhymes; make sure each couplet makes sense together. The ending should tie everything up with a hopeful message about time and journeys. Yeah, that works.
</think>

Echoes of Dawn

Beneath the veil of waking skies,
a blush of light begins to rise—
the forest hums a whispered hymn,
as shadows shrink from morning’s rim.

A river carves its ancient creed
through stones that hold forgotten seed.
The wind, both architect and thief,
unfurls the map of every leaf.

We trace the paths where storms have been,
where roots embrace what lies unseen—
and in the pulse of tides,
we find the fragile threads that bind mankind.

For time is but a breath undone,
a fleeting dance with moon and sun.
Yet here we stand, both lost and found,
our echoes woven into ground.

total duration:       3m24.6020313s
load duration:        15.0067ms
prompt eval count:    7 token(s)
prompt eval duration: 999ms
prompt eval rate:     7.01 tokens/s
eval count:           356 token(s)
eval duration:        3m23.587s
eval rate:            1.75 tokens/s 

source: reddit

1.2. Old Slow CPU Sample (CPU launched Sep 2014)

CPU: Dual Xeon E5-2630 v3
RAM: 512 GB (8 channels DDR4)

>>> write a short poem

<think>
Okay, the user asked me to write a short poem. Let's start by thinking about what they might need. A short poem can vary in structure—maybe a haiku, quatrain, or free verse? The user didn't specify a theme, so I should choose something universal and relatable.

Nature themes are usually safe and evoke vivid imagery. Seasons often inspire emotions. Autumn is rich with sensory details: colors, cool air, falling leaves. That could work well. Let me brainstorm some images: golden leaves, crisp air, harvest time, migrating birds.

Now, structure. A quatrain with rhyme scheme ABAB might be nice. Keep each line concise. Start setting the scene—maybe a forest in autumn. Use personification for elements like wind or trees to add depth.

First line: "The amber woods breathe crisp and low," introduces color and atmosphere. Second line could mention a journey, maybe of a leaf or a creature. "A whispered thread through branches sown"—weaves in movement and sound.

Third line connects the season's change to human experience: "Each fallen leaf a harvest hymn." Links nature to emotion. Last line ties it together with a universal feeling: "The earth holds roots where all truths begin."

Check syllable count for rhythm. Each line around 8-10 syllables. Read aloud to test flow. Adjust any clunky parts. Ensure the imagery is cohesive and the message resonates—maybe about change and finding truth in nature.

Final read-through. Does it capture a moment and evoke feeling? I think so. Let me present this as the poem, then offer feedback if needed.
</think>

Autumn's Whisper

The amber woods breathe crisp and low,
A whispered thread through branches sown—
Each fallen leaf a harvest hymn,
The earth holds roots where all truths begin.

total duration:       5m4.320628196s
load duration:        61.279626ms
prompt eval count:    7 token(s)
prompt eval duration: 10.424s
prompt eval rate:     0.67 tokens/s
eval count:           387 token(s)
eval duration:        4m53.821s
eval rate:            1.32 tokens/s

source: ourselves using Campus Station 2025c