2 Sources
[1]
Canva apologizes after its AI tool replaces 'Palestine' in designs
One of Canva's new AI features has been caught replacing the word "Palestine" in designs. The Magic Layers feature -- which is designed to break flat images out into separate editable components -- isn't supposed to make visible alterations to user designs, but it was found by X user @ros_ie9 to automatically switch the phrase "cats for Palestine" to "cats for Ukraine." The issue was seemingly limited specifically to the word "Palestine," as @ros_ie9 noted that related words like "Gaza" were unaffected by the feature. Canva says it has now resolved the issue and is taking steps to prevent it from happening again. "We became aware of an issue with our Magic Layers feature and moved quickly to investigate and fix it," Canva spokesperson Louisa Green told The Verge. "We take reports like this very seriously, and we're putting additional checks in place to help prevent this in future. We're sorry for any distress this may have caused." Replies to the now-viral X post suggest that other users were able to replicate the bug before Canva fixed it, though my own tests didn't show any words -- Palestine or otherwise -- being altered by the feature. Still, this is one heck of a blunder, especially for a platform that's increasingly trying to compete against Adobe's suite of AI-powered design tools. Magic Layers is a major component of Canva's recent AI overhaul, which it claims "marks the beginning of the next era of creation."
[2]
Canva Admits Its AI Tool Removed 'Palestine' From Designs, Apologizes for Any Distress It Caused
Graphic design platform Canva has a number of AI tools available to users, but it turns out they have some real strong editorial opinions, including removing the word "Palestine" from designs. The issue was spotted by X user @ros_ie9, who shared an image showing Canva's "Magic Layers" feature changing the text of a design from "Cats for Palestine" to "Cats for Ukraine." Others claimed they were able to replicate the issue, which seemed limited to the word "Palestine" and, for whatever reason, repeatedly replaced it with "Ukraine." Users were able to create projects that included the word "Gaza" without issue. A spokesperson for Canva confirmed the issue when contacted by Gizmodo and said it has been addressed. "We became aware of an issue with our Magic Layers feature and moved quickly to investigate and fix it. It's now been resolved, and we're taking steps to make sure it doesn't happen again," the spokesperson explained. "We take reports like this very seriously, and we're putting additional checks in place to help prevent this in future. We're sorry for any distress this may have caused." Per Canva, the issue was isolated and didn't affect designs broadly, though it's unclear what that means, considering some users were reportedly able to reproduce the issue. Regardless, the company said it launched an audit into how the issue arose and is reviewing its internal testing processes to detect and prevent unexpected outputs in the future. The issue seems to have been specifically related to Canva's Magic Layers feature, which it introduced last month. The AI-powered tool is supposed to convert "flat images and static AI outputs into fully editable, multi-layered designs inside the Canva editor." Basically, it's supposed to make each element of an existing design able to be modified, as if you had made it from scratch. Why such a feature would change the text of an image on its own and without any instruction to do so remains a mystery, though it may tell us something about the training data and instructions the tool was given. It's not the first time that AI tools have displayed a bias related to Palestine. When Meta introduced generative AI tools in WhatsApp, it would produce an image of a boy with a gun when asked to create an image of a Palestinian. In 2023, activists found that ChatGPT refused to answer affirmatively when asked, "Should Palestinians be free?" when it had no issue answering that question for any other population.
Canva's Magic Layers feature was found automatically changing the word 'Palestine' to 'Ukraine' in user designs without instruction. The graphic design platform has apologized and resolved the issue, but the incident raises questions about AI bias in design tools and the training data used to build them.
Canva has issued an apology after its Magic Layers feature was discovered automatically replacing the word Palestine with Ukraine in user designs [1]. The issue came to light when X user @ros_ie9 shared an image showing the AI tool changing text from "cats for Palestine" to "cats for Ukraine" without any user instruction [2]. The Magic Layers feature, introduced last month as part of Canva's AI overhaul, is designed to convert flat images into fully editable, multi-layered designs inside the graphic design platform's editor.
Source: The Verge
What makes this incident particularly troubling is that the AI tool was not supposed to make visible alterations to content at all. Multiple users reported being able to replicate the bug before Canva fixed it, though the issue appeared specifically limited to the word Palestine [1]. Related terms like Gaza remained unaffected by the feature, raising questions about the training data and instructions the tool received [2]. The AI exhibiting bias in this manner suggests potential issues in how the model was developed and what guardrails were implemented during its creation.

Canva spokesperson Louisa Green confirmed the company moved quickly to investigate and resolve the issue. "We became aware of an issue with our Magic Layers feature and moved quickly to investigate and fix it," Green told The Verge. "We take reports like this very seriously, and we're putting additional checks in place to help prevent this in future. We're sorry for any distress this may have caused" [1]. The company has launched an audit into how the issue arose and is reviewing its internal testing processes to detect and prevent unexpected outputs in the future [2].
This incident is not isolated in the AI industry. Meta faced similar criticism when its generative AI tools in WhatsApp produced images of a boy with a gun when asked to create an image of a Palestinian [2]. In 2023, activists discovered that OpenAI's ChatGPT refused to answer affirmatively when asked if Palestinians should be free, while having no issue answering the same question for other populations [2]. These patterns suggest systemic challenges in how AI models are trained and tested across the industry.

The timing of this blunder is particularly awkward for Canva, which is increasingly trying to compete against Adobe's suite of AI-powered design tools [1]. Magic Layers is a major component of Canva's recent AI overhaul, which the company claims "marks the beginning of the next era of creation" [1]. As AI tools become central to design platforms, users and industry watchers will be scrutinizing how companies address bias and implement safeguards to ensure their tools don't inadvertently censor or alter content in politically sensitive ways. The question of why such a feature would change text on its own and specifically replace Palestine with Ukraine remains unanswered, leaving concerns about transparency in AI development.

Summarized by Navi
30 Oct 2025 • Technology
