4 Sources
[1]
Clippy resurrected as AI assistant -- project turns infamous Microsoft mascot into LLM interface
Everyone's favorite old paperclip is back in action -- in a manner of speaking. If you're old enough to remember when 8MB thumb drives hit the scene, you'll probably remember Clippy, Microsoft's digital writing assistant. Clippy lived in the bottom corner of Microsoft Office from 1996 to 2003, but now he can return to your desktop with a new life as a mouthpiece for AI, thanks to a new project from software engineer Felix Rieseberg.

This odd couple of '90s UI design and the modern-day AI craze lets users hook up any locally installed LLM and use Clippy as its mouthpiece. Many of the most popular publicly available LLMs work with Clippy, with one-click installation supported for the newest models from Google, Meta, Microsoft, and Qwen. Clippy's original art and animations are joined by a Windows 98-styled chat and settings window. On installation, Clippy silently cycles through animations while the program automatically downloads Google's Gemma3-1B model.

Once paired with an LLM, the Clippy-bot approximates the original Clippy's tone thanks to a lengthy prompt instruction that seeks to disguise the model in use. Users can edit or replace this starting prompt to get their most Clippy-esque experience (or to give their Clippy more of his highly memed snark).

Rieseberg, creator of the new Clippy, refers to the program as "a love letter and homage to the late, great Clippy," as well as to Microsoft's '90s visual aesthetic. He calls the app a piece of "software art," or, if you don't like it, "software satire." There is certainly something to be said about the relationship between an artist and programmer designing a quirky writing-aide character, and a chatbot later told to emulate that work -- though the list of people waxing philosophical about Clippy likely doesn't extend far beyond this author.

We've seen Clippy replacements before, but this new-and-improved paperclip doesn't require access to a paid tier of ChatGPT, nor does it seek to modernize Clippy's look to match Windows' newer design sensibilities. This is the Clippy you know and love (?): a '90s-looking blend of old and new computing sensibilities.

The Clippy Desktop Assistant is available for download for Windows, Mac, and Linux via the project's website, with a deeper look behind the curtain available on GitHub. While it's not likely to revolutionize any desktop workflows, this new Clippy stands at the ready for those happy few with one foot in the vector graphics of the '90s... and the other in the AI present.
[2]
Clippy back as local LLM interface, but not from Microsoft
Clippy is back - and this time, its arrival on your desktop as a front-end for locally run LLMs has nothing to do with Microsoft.

In what appears to be a first for the 90s icon, Clippy has finally been made useful, ish, in the form of a small application that allows users to chat with a variety of AI models running locally, with Gemma 3, Qwen3, Phi-4 Mini and Llama 3.2 serving as built-in, ready-to-download options. Clippy can also be configured to run any other local LLM from a GGUF file.

Developed by San Francisco-based dev Felix Rieseberg - who we've mentioned several times on The Register before for his work maintaining the cross-platform development framework Electron and his passion for early Windows nostalgia - the app was written as a "love letter" to Clippy, he wrote on the unofficial app's GitHub page. "Consider it software art," Rieseberg said. "If you don't like it, consider it software satire." He added on the app's About page that he doesn't mean high art: "I mean 'art' in the sense that I've made it like other people do watercolors or pottery - I made it because building it was fun for me."

Rieseberg is one of the maintainers of Electron, which uses a Chromium engine and Node.js to let apps written in web technologies like HTML, CSS, and JavaScript run as desktop applications regardless of the underlying platform - and that's what this latest variant of Clippy is all about demonstrating. This Nu-Clippy is meant to be a reference implementation of Electron's LLM module, Rieseberg wrote in the GitHub documentation, noting he's "hoping to help other developers of Electron apps make use of local language models." And what better way than with a bit of '90s tech nostalgia?

This unofficial AI iteration of Clippy (Clippy 2.0? 3.0?) may be more capable than its predecessor(s), but that's not to say it's loaded with features. Compared to a platform like LM Studio, which lets users chat with local LLMs and has nigh countless options for tweaking and modifying models, Clippy is just a chat interface that lets a user talk to a local LLM as they would to one that lives in a datacenter. In that sense, it's definitely a privacy improvement over ChatGPT, Gemini, and their relatives, which are invariably trained on user data. Clippy doesn't go online for practically anything, Rieseberg said in its documentation. "The only network request Clippy makes is to check for updates (which you can disable)," Rieseberg noted.

AI Clippy is dead simple to run, too. In this vulture's test on his MacBook Pro, it was a snap to download the package file for an Apple Silicon chip, unzip it, let it download its default model (Gemma 3 with 1 billion parameters), and start asking questions. When Clippy's Windows 95-themed chat window is closed, the paperclip remains on the desktop, and a click opens the window back up for a new round of queries.

As for what AI Clippy could do if Rieseberg had the time, he told The Register that node-llama-cpp, the Node.js bindings for llama.cpp, could give Clippy access to all the typical llama.cpp inference features one could use with other locally run AIs. Aside from temperature, top-k, and system prompting, those options aren't currently exposed. "That's just a matter of me being lazy, though. The code to expose all those options is there," Rieseberg added. He's unlikely to get a chance to do so anytime soon, as he's scheduled to join Anthropic to work on Claude next week, meaning he'll be busy with more serious AI projects.
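To make that last point concrete, here is a minimal sketch of loading a GGUF file and passing the three options Rieseberg mentions (temperature, top-k, system prompt) through node-llama-cpp's v3-style API. The model path and system prompt are placeholders for illustration, not Clippy's actual defaults, and exact option names may vary between library versions.

```typescript
// Minimal sketch, not Clippy's code: chat with a local GGUF model via node-llama-cpp.
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();                       // selects Metal/CUDA/Vulkan/CPU automatically
const model = await llama.loadModel({
    modelPath: "./models/gemma-3-1b-it.Q4_K_M.gguf"   // placeholder path to any local GGUF file
});
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
    systemPrompt: "You are Clippy, a relentlessly cheerful 1990s office assistant." // hypothetical prompt
});

// temperature and top-k are among the few sampling knobs the app currently exposes
const reply = await session.prompt("Help me write a resignation letter.", {
    temperature: 0.7,
    topK: 40
});
console.log(reply);
```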
Rieseberg also said that he's not particularly worried about Microsoft coming after him for co-opting their contentious desktop companion, but added that if they did, he wouldn't fight. "The moment they tell me to stop, shut it down, and hand over all my code, I will," Rieseberg told us. But he doesn't think it would make sense for Microsoft to do anything with Clippy and AI itself. "Building a fun stupid toy like I have is an entirely different ballgame from building something really solid for the market," he said. "With Cortana and Copilot they have probably much better characters available."

The new Clippy is available for Windows, macOS, and Linux, which makes perfect sense given the developer's cross-platform background.®
[3]
The dreaded Clippy assistant is back from the dead after this fan project attaches an LLM to him
Summary
- Fan-made Clippy app uses AI to revive the pesky paperclip, creating a modern chatbot experience with familiar flair.
- Features include a simple chat interface, multiple AI models, easy setup, the ability to load custom models, and offline use.
- The project is a nostalgic homage to '90s Microsoft, not a groundbreaking chatbot, capturing the essence of Clippy's era.

The name "Clippy" invokes one of two moods. Either you never met him and you're ambivalent towards his name, or you went through the ordeal of using him as your Office assistant back in 1997. If you're in the former camp, Clippy was a little animated "friend" in the form of a paperclip that tried to be as helpful as possible. He was, essentially, the personification of every annoying Windows pop-up you've ever had, constantly interjecting with ideas or thoughts until you turned him off. Back then, he was very simple for an AI; he could only detect specific patterns or issues and recommend solutions for them. However, a fan has brought him into this side of the 21st century. By attaching an LLM to him, he can now perform the same functions as a modern-day AI, whether you like him or not.

This fan-made Clippy app lets you revive an ancient evil, now with an LLM

As spotted by Tom's Hardware, this fan-made Clippy app aims to recreate the titular assistant in an environment more suited to him today. You can set him up to use one of four AI models: Google's Gemma3, Meta's Llama 3.2, Microsoft's Phi-4, and Qwen's Qwen3. Once you install his brains, you can talk to him like you would a normal chatbot. Here's the full list of features:

- Simple, familiar, and classic chat interface. Send messages to your models, get a response.
- Batteries included: No complicated setup. Just open the app and chat away. Thanks to llama.cpp and node-llama-cpp, the app will automatically discover the most efficient way to run your models (Metal, CUDA, Vulkan, etc), as sketched in the snippet after this excerpt.
- Custom models, prompts, and parameters: Load your own downloaded models and play with the settings.
- Offline, local, free: Everything runs on your computer. The only network request Clippy makes is to check for updates (which you can disable).

If you want to pore over his source code, you can check out the GitHub page for Clippy. Don't expect something that will revolutionise the world of artificial intelligence, though; as the developer puts it, "This project isn't trying to be your best chatbot." It's simply a cool homage to Microsoft in the '90s using technology from 2025, and that's all it needs to be.

Did you manage to avoid Clippy? If so, you can read more about his reign in our piece on six Microsoft flops that users hated.
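The backend auto-discovery called out in that feature list comes from node-llama-cpp itself rather than anything Clippy-specific. As a purely illustrative check, assuming the library's v3-style API, you can ask it which backend it settled on:

```typescript
// Illustrative only: asking node-llama-cpp which compute backend it picked.
// (Not Clippy's actual code; option and property names per node-llama-cpp v3.)
import {getLlama} from "node-llama-cpp";

const llama = await getLlama({gpu: "auto"}); // tries Metal, CUDA, Vulkan, then falls back to CPU
console.log(llama.gpu);                      // e.g. "metal", "cuda", "vulkan", or false for CPU-only
```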
[4]
Clippy AI Assistant: Transform Microsoft's Mascot into a Local LLM Interface
Clippy is back, but this time he's your desktop AI buddy. You don't need to send anything up to the cloud; the whole thing runs on your own machine. The setup is built with Electron, so it works on Windows, macOS and Linux. When you fire it up, Clippy quietly downloads a default LLM -- Google's Gemma3-1B -- and then waits in the corner, just like old times. The interface even looks like Windows 98, complete with the same paperclip animations you remember. You can swap in other models from Meta, Microsoft or Qwen with just one click, and if you really want to cut the cord, you can disable its only network request (the update check) so everything stays local.

Under the hood, a Node.js process handles model downloads and configuration, and the models themselves run locally through llama.cpp via the node-llama-cpp bindings. Everything's configured with a JSON file that lays out which model to load, the prompt you want Clippy to start with, and display settings. By default, there's a built-in prompt that makes Clippy sound like he's from the '90s -- snarky and concise -- but you can edit that prompt or replace it entirely if you'd rather have a more serious assistant. The Electron wrapper makes sure all of Clippy's art assets and animations load correctly, and it keeps everything up to date without you having to lift a finger.

Once you start typing, the chat window sends your text off to the model via an inter-process link. The reply comes back and pops up in a speech bubble exactly like the one you used to dismiss in Word. On typical consumer hardware, the default Gemma3-1B model (1 billion parameters) answers in about one to two seconds, especially if you've got a decent GPU. If you're the kind of power user who likes to push limits, you can point the JSON config at bigger models -- Llama, Qwen, or larger Gemma 3 variants -- and see how Clippy performs. It's all under the MIT license, and the source, build instructions and issue tracker are on GitHub, so you can fork, tweak or troubleshoot as you see fit.

This isn't just a nostalgia trip; it's a framework you can extend. Want Clippy to fetch weather or query your local database? You can write Node.js plugins or scripts that hook into the input/output stream. The inter-process interface gives you raw access to the messages, so you can log interactions or pipe responses into other tools. And since it's all offline, you keep full control over your data.

Downloading and running the Clippy Desktop Assistant takes five minutes -- longer if you choose a larger model. After that, you've got an AI chat interface that looks and feels like 1996, but thinks with a 2025 brain. Source: Tom's Hardware
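The "inter-process link" described above is ordinary Electron plumbing. Below is a rough sketch, not the project's actual code, of how a chat window can reach a local model through the main process: the "clippy:chat" channel name and promptModel() helper are hypothetical, while ipcMain.handle, contextBridge, and ipcRenderer.invoke are standard Electron APIs.

```typescript
// main.ts -- main process: answer chat requests with the local model
import {ipcMain} from "electron";
import {promptModel} from "./llm"; // hypothetical wrapper around node-llama-cpp

ipcMain.handle("clippy:chat", async (_event, userText: string) => {
  // Runs inference locally and returns the reply to the renderer.
  return promptModel(userText);
});

// preload.ts -- expose a narrow, safe API to the chat window
import {contextBridge, ipcRenderer} from "electron";

contextBridge.exposeInMainWorld("clippy", {
  chat: (text: string) => ipcRenderer.invoke("clippy:chat", text),
});

// renderer: the speech bubble simply awaits window.clippy.chat("...")
```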
A software engineer has revived Microsoft's infamous Clippy as an AI assistant, transforming the 90s icon into a local large language model (LLM) interface. This nostalgic project combines retro aesthetics with modern AI capabilities, offering users a unique and privacy-focused chatbot experience.
Software engineer Felix Rieseberg has breathed new life into Microsoft's iconic Office assistant, Clippy, by transforming it into an AI-powered chatbot interface that pairs the charm of '90s user interface design with modern, locally run language models 1.
The Clippy Desktop Assistant allows users to interact with locally installed large language models (LLMs) through a familiar, retro interface. Key features include:
- A simple, classic chat interface: send a message to the model, get a response
- One-click setup for Gemma 3, Llama 3.2, Phi-4 Mini, and Qwen3, plus support for any other local model supplied as a GGUF file
- Custom prompts and adjustable parameters such as temperature and top-k
- Clippy's original art and animations alongside a Windows 98-styled chat and settings window
- Fully offline, local operation, with the only network request being an optional update check
The application is built with Electron, a framework that lets apps written in web technologies run as cross-platform desktop applications. This implementation serves as a reference for Electron's LLM module, potentially helping other developers integrate local language models into their applications 2.
Under the hood, the app utilizes:
- Electron for the cross-platform desktop shell on Windows, macOS, and Linux
- llama.cpp, via the node-llama-cpp bindings, to run models locally and automatically pick the most efficient backend available (Metal, CUDA, Vulkan, etc.)
One of the key advantages of this new Clippy is its focus on privacy and local processing. The application runs entirely on the user's machine, with no data sent to external servers. The only network request made is to check for updates, which can be disabled 2.
Rieseberg describes the project as a "love letter and homage to the late, great Clippy" and Microsoft's 90s visual aesthetic. He positions it as "software art" or "software satire," emphasizing its role as a fun and nostalgic creation rather than a serious productivity tool 1.
While the project currently focuses on providing a simple chat interface, its open-source nature and modular design leave room for expansion. Developers can extend its functionality by writing Node.js plugins or scripts that hook into the chat's input/output stream to add features like weather fetching or database querying 4.
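As a sketch of what such an extension might look like (reusing the hypothetical promptModel() wrapper from the earlier sketch; none of these names come from the project), the snippet below tees every exchange into a local JSONL log before returning the reply, in keeping with the app's offline design.

```typescript
// Hypothetical extension: log each exchange locally, then return the reply unchanged.
import {appendFile} from "node:fs/promises";
import {promptModel} from "./llm"; // hypothetical wrapper around the local model

export async function chatWithLogging(userText: string): Promise<string> {
  const reply = await promptModel(userText);
  await appendFile(
    "clippy-log.jsonl",
    JSON.stringify({ts: Date.now(), user: userText, clippy: reply}) + "\n"
  );
  return reply;
}
```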