DuckLLM is a powerful, privacy-focused model built to provide a seamless GUI experience on your desktop!
DuckLLM is based on the Qwen 2.5 vision model, customized for great efficiency and file handling,
all while running entirely locally on your machine with no cloud dependencies.
Requirements:
- Python 3.12
- Compiler required
- Ollama
- Windows only
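Before installing, you can sanity-check the requirements above. A minimal sketch (the function name and the exact checks are illustrative, not part of DuckLLM):

```python
import platform
import shutil
import sys

def check_prereqs(min_python=(3, 12)):
    """Report whether the DuckLLM-style prerequisites look satisfied."""
    return {
        # Python 3.12 or newer
        "python_ok": sys.version_info[:2] >= min_python,
        # Ollama CLI must be on PATH
        "ollama_found": shutil.which("ollama") is not None,
        # DuckLLM targets Windows
        "windows": platform.system() == "Windows",
    }

if __name__ == "__main__":
    print(check_prereqs())
```

Any `False` entry points at a missing prerequisite from the list above.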
v4.0.0 Release Log
- Full-screen mode with full sync is now available for users who want to maximize their experience!
- Developers can now easily copy and download their code with the new UI!
- With a refined API you get full native Ollama performance, with responses in near seconds!
- Fixed a bug where you had to react at lightning speed just to add a file (sorry about that!)
- Nothing has been removed!
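The "native Ollama performance" above presumably comes from Ollama's local REST API. A minimal sketch of calling it directly (the endpoint, fields, and `response` key are Ollama defaults; the model name is an assumption, not confirmed by this README):

```python
import json
import urllib.request

# Ollama's default local endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model, prompt):
    # stream=False asks Ollama for one complete JSON response
    # instead of a stream of partial chunks
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt):
    """Send a prompt to a locally running Ollama server and return its reply."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # "qwen2.5vl" is a hypothetical model tag; use whatever `ollama list` shows
    print(generate("qwen2.5vl", "Say hello in one sentence."))
```

Because everything goes to `localhost`, no prompt ever leaves the machine, which matches the no-cloud claim above.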
v1.0.0 Release Log (DuckLLM Mobile)
- DuckLLM Mobile uses Ollama and Wllama for easy, native local AI models on device!
- Added Duck Memos, also known as DuckLLM Memories, which let you easily manage workflows by adding both text instructions and files!
- Tested and confirmed to be 2x faster than the closed beta version!
- Experience a smooth feel across the entire interface with beautiful animations, transitions, gradients, and fades!
- Advanced users looking for speed and control can turn on "Ollama" mode for faster responses, utilizing Ollama in Termux.
- Unfiltered mode mimics the unfiltered responses of Grok AI directly on your device!
- Easily toggle on web search for info about any topic you want!
- A dedicated settings menu for easy customization!
- Easily download the official DuckLLM Mobile model directly in-app.
- No known issues.
- Nothing removed.
A look at DuckLLM in action.
You can find support via the official DuckLLM Discord or the DuckLLM guide/docs.
Here are examples of situations where DuckLLM excels as a better alternative (situation/alternative):
A: DuckLLM is completely free, with no monthly fees or cloud services.
A: DuckLLM offers a secure and private alternative for you!
A: DuckLLM uses a local model, so you can use it offline!
A: DuckLLM offers an unfiltered mode for those who prefer unrestricted responses (or you can customize the instructions in the Modelfile).
A: DuckLLM provides simple installation, making local models accessible to everyone!
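Customizing the instructions in the Modelfile, as mentioned above, uses Ollama's standard Modelfile format. A hypothetical sketch (the base model tag and system prompt are illustrative; `FROM`, `SYSTEM`, and `PARAMETER` are real Modelfile directives):

```
# Hypothetical Modelfile — base model tag and prompt are assumptions
FROM qwen2.5vl
SYSTEM "You are DuckLLM, a helpful local assistant."
PARAMETER temperature 0.8
```

You would then build a custom model from it with `ollama create my-duckllm -f Modelfile` and select that model in the app.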