DuckLLM By Duck Inc.

Experience AI
With Privacy

DuckLLM is a powerful, privacy-focused model paired with a seamless GUI experience for your desktop!

DuckLLM is based on the Qwen 2.5 Vision model, customized to perfection. With these changes, DuckLLM is capable of great efficiency and file handling, all while running entirely locally on your machine with no cloud dependencies.

Requirements

Python 3.12

Compiler Required

Ollama

Windows Only Requirement
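As an illustrative sketch only, a pre-flight check along these lines can confirm the tools above are present (the probed tool names are assumptions; DuckLLM performs its own checks):

```python
# Hypothetical pre-flight check for the requirements listed above.
import shutil
import sys

def check_requirements():
    """Return a dict mapping each requirement to True/False."""
    return {
        # DuckLLM lists Python 3.12 as a requirement
        "python_3_12": sys.version_info[:2] >= (3, 12),
        # Any common C/C++ compiler should satisfy the "Compiler Required" line
        "compiler": any(shutil.which(c) for c in ("cc", "gcc", "clang", "cl")),
        # Ollama must be on PATH for the desktop app to reach it
        "ollama": shutil.which("ollama") is not None,
    }

if __name__ == "__main__":
    for name, ok in check_requirements().items():
        print(f"{name}: {'OK' if ok else 'missing'}")
```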

Latest Desktop Release

v4.0.0 Release Log

This update brings a better experience for all users, but especially for developers!

Added

Full Screen Mode

Full screen mode with full sync is now available for users who want to maximize their experience!

Better Coding UI

Developers can now easily copy and download their code with the new UI!

Improved Performance

With a refined API, you now get full native Ollama performance, with responses in mere seconds!
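As a sketch of what talking to that backend looks like, here is a minimal Python client for Ollama's standard local REST endpoint (the endpoint and model tag below are stock Ollama defaults, not DuckLLM specifics; DuckLLM's internal API may differ):

```python
# Minimal client for a locally running Ollama server (assumed defaults).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="qwen2.5"):
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt):
    """Send the prompt to the local Ollama server and return the reply text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Calling `ask("Hello!")` requires `ollama serve` to be running locally; `build_request` alone works offline.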

Fixed

Detection Area Scale Fixed For Windows

You no longer have to react at lightning speed just to add a file (sorry, my bad for the bug!)

Removed

None!

Nothing has been removed!

Latest Mobile Release

v1.0.0 Release Log

Welcome to the age of native AI, directly on your mobile phone!

DuckLLM Mobile uses Ollama and Wllama to run local AI models natively and easily on your device!

Added

Duck Memos

Duck Memos, also known as DuckLLM Memories, now let you easily manage workflows by adding both text instructions and files!

Heavy Performance Boosts (Wllama)

Tested and confirmed to be 2x faster than the closed beta version!

Refined UI

Experience a smooth feel across the entire interface, with beautiful animations, transitions, gradients, and fades!

Dual Mode

Advanced users looking for speed and control can turn on "Ollama" mode for faster responses, using Ollama running in Termux.

Unfiltered Mode

Unfiltered mode mimics the unfiltered responses of Grok AI, directly on your device!

Real-Time Web Search

Easily toggle on web search for info about any topic you want!

Dedicated Settings Menu

Now with a dedicated settings menu for easy customization!

Download Center

Easily download the official DuckLLM Mobile model directly in-app.

Fixed

None!

No issues!

Removed

None!

Nothing has been removed!

Screenshots

A look at DuckLLM in action.

[Screenshot 1 of 6: DuckLLM Desktop]

Supported Distributions

Windows
Ubuntu LTS
Debian
Android
Arch Linux
All Linux distributions

System Specifications

Windows x86/ARM

Minimum Build

Processor: i5 7th Gen / Ryzen 1600
Memory: 8GB-12GB DDR3
Graphics: Integrated Graphics

Recommended Build

Processor: i5 12th Gen / Ryzen 3600X
Memory: 12GB-16GB DDR4
Graphics: GTX 1660 Ti / Radeon 5000 (8GB+)

Linux x86/ARM

Minimum Build

Processor: i5 5200U / Ryzen 1000
Memory: 8GB DDR3
Graphics: Intel UHD 4000

Recommended Build

Processor: i5 10th Gen / Ryzen 3500X
Memory: 8GB-12GB DDR4
Graphics: Integrated Graphics

Android (ARM Only, Beta)

Minimum Specs

Processor: Snapdragon 7 Gen 1 / MediaTek Dimensity 1200 / Exynos 982x
Memory: 6GB
Android Version: Android 12

Recommended Specs

Processor: Snapdragon 8 Gen 1 / MediaTek Dimensity 9000 / Exynos 2200
Memory: 12GB (Future Proof)
Android Version: Latest Android

DuckLLM Vision (Calculated Without CPU & RAM Bottlenecks)

Minimum Build

CPU: Intel Core i5 11th Gen / AMD Ryzen 5700
GPU: AMD Radeon 5700 XT / Nvidia RTX 2080 / Intel Arc A770
VRAM: 8GB

Recommended Build

CPU: Intel Core i7 12th Gen / AMD Ryzen 5800
GPU: AMD Radeon 6800 XT / Nvidia RTX 3080
VRAM: 12GB

Community & Contact

You can find support via the official DuckLLM Discord or the DuckLLM Guide.

Join The Discord Community

View DuckLLM Guide / Docs

Why DuckLLM?

Here are examples of where DuckLLM excels as a better alternative! (S = Situation, A = Answer)

S: I'm tired of paying $20/month for cloud services that still limit me.

A: DuckLLM is completely free, with no monthly fees or cloud services.

S: I want to have actual control over my data.

A: DuckLLM offers a secure and private alternative for you!

S: I'm tired of needing internet just to get help.

A: DuckLLM uses a local model, so you can use it offline!

S: I don't want to be forced into guidelines that make the responses worse.

A: DuckLLM offers an Unfiltered Mode for those who prefer unrestricted responses (or customize the instructions in the Modelfile).
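As hinted at above, Ollama models are customized through a Modelfile. A minimal illustrative sketch (the base tag and system prompt here are assumptions, not DuckLLM's actual Modelfile):

```
# Illustrative Modelfile: start from a Qwen 2.5 base and override behavior
FROM qwen2.5
PARAMETER temperature 0.7
SYSTEM "You are DuckLLM, a helpful local assistant. Answer directly."
```

Build it with `ollama create my-duckllm -f Modelfile` and chat with it via `ollama run my-duckllm`.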

S: I'm too scared of using local models because of the complexity of the setup.

A: DuckLLM provides a simple installation, making local models accessible to everyone!