3 Sources
[1]
YouTube denies AI was involved with odd removals of tech tutorials
This week, tech content creators began to suspect that AI was making it harder to share some of the most highly sought-after tech tutorials on YouTube, but now YouTube is denying that the odd removals were due to automation.
Creators grew alarmed when educational videos that YouTube had allowed for years were suddenly being flagged as "dangerous" or "harmful," with seemingly no way to trigger human review to overturn removals. AI appeared to be running the show, with appeals denied faster than a human could possibly review them.
Late Friday, a YouTube spokesperson confirmed that videos flagged by Ars have been reinstated, promising that YouTube will take steps to ensure that similar content isn't removed in the future. But, to creators, it remains unclear why the videos got taken down, as YouTube claimed that neither the initial enforcement decisions nor the decisions on appeals were the result of an automation issue.
Shocked creators were stuck speculating
Rich White, a computer technician who runs an account called CyberCPU Tech, had two videos removed that demonstrated workarounds to install Windows 11 on unsupported hardware. These videos are popular, White told Ars, with people looking to bypass Microsoft account requirements each time a new build is released. For tech content creators like White, "these are bread and butter videos," dependably yielding "extremely high views," he said.
Because there's such high demand, many tech content creators' channels are filled with these kinds of videos. White's account has "countless" examples, he said, and in the past, YouTube even featured his most popular video in the genre on a trending list.
To White and others, it's unclear exactly what changed on YouTube to trigger removals of this type of content. YouTube only seemed to be removing recently posted content, White told Ars. However, if the takedowns ever impacted older content, entire channels documenting years of tech tutorials risked disappearing in "the blink of an eye," another YouTuber behind a tech tips account called Britec09 warned after one of his videos was removed.
The stakes appeared high for everyone, White warned in a video titled "YouTube Tech Channels in Danger!" White had already censored content that he planned to post on his channel, fearing it wouldn't be worth the risk of potentially losing his account, which began in 2020 as a side hustle but has since become his primary source of income. If he continues to change the content he posts to avoid YouTube penalties, it could hurt his account's reach and monetization. Britec told Ars that he paused a sponsorship due to the uncertainty, which he said has already hurt his channel and caused a "great loss of income."
YouTube's policies are strict, with the platform known to swiftly remove accounts that receive three strikes for violating community guidelines within 90 days. But, curiously, White had not received any strikes following his content removals. Although Britec reported that his account had received a strike following his video's removal, White told Ars that YouTube had so far only given him two warnings, so his account is not yet at risk of a ban.
Creators weren't sure why YouTube might deem this content harmful, so they tossed around some theories. It seemed possible, White suggested in his video, that AI was detecting this content as "piracy," but that shouldn't be the case, he claimed, since his guides require users to have a valid license to install Windows 11.
He also thinks it's unlikely that Microsoft prompted the takedowns, suggesting tech content creators have a "love-hate relationship" with the tech company. "They don't like what we're doing, but I don't think they're going to get rid of it," White told Ars, suggesting that Microsoft "could stop us in our tracks" if it were motivated to end workarounds.
But Microsoft doesn't do that, White said, perhaps because it benefits from popular tutorials that attract swarms of Windows 11 users who otherwise may not use "their flagship operating system" if they can't bypass Microsoft account requirements. Those users could become loyal to Microsoft, White said. And eventually, some users may even "get tired of bypassing the Microsoft account requirements, or Microsoft will add a new feature that they'll happily get the account for, and they'll relent and start using a Microsoft account," White suggested in his video. "At least some people will, not me."
Microsoft declined Ars' request to comment.
To White, it seemed possible that YouTube was leaning on AI to catch more violations but perhaps recognized the risk of over-moderation and, therefore, wasn't allowing AI to issue strikes on his account. But that was just a "theory" that he and other creators came up with and couldn't confirm, since YouTube's chatbot for creator support seemed to also be "suspiciously AI-driven," apparently auto-responding even when a "supervisor" is connected, White said in his video.
Absent more clarity from YouTube, creators who post tutorials, tech tips, and computer repair videos were spooked. Their biggest fear was that unexpected changes to automated content moderation could knock them off YouTube for posting videos that seem ordinary and commonplace in tech circles, White and Britec said. "We are not even sure what we can make videos on," White said. "Everything's a theory right now because we don't have anything solid from YouTube."
YouTube recommends making the content it's removing
White's channel gained popularity after YouTube highlighted an early trending video that he made, showing a workaround to install Windows 11 on unsupported hardware. Following that video, his channel's views spiked, and he gradually built up his subscriber base to around 330,000.
In the past, White's videos in that category had been flagged as violative, but human review got them quickly reinstated. "They were striked for the same reason, but at that time, I guess the AI revolution hadn't taken over," White said. "So it was relatively easy to talk to a real person. And by talking to a real person, they were like, 'Yeah, this is stupid.' And they brought the videos back." Now, YouTube suggests that human review is behind the removals, which likely doesn't completely ease creators' fears about arbitrary takedowns.
Britec's video was also flagged as dangerous or harmful. He has managed his account, which currently has nearly 900,000 subscribers, since 2009, and he's worried he risked losing "years of hard work," he said in his video. Britec told Ars that "it's very confusing" for panicked tech content creators trying to understand what content is permissible.
It's particularly frustrating, he noted in his video, that YouTube's creator tool for inspiring post "ideas" seemed to contradict the mods' content warnings and continued to recommend that creators make content on specific topics like workarounds to install Windows 11 on unsupported hardware. "This tool was to give you ideas for your next video," Britec said. "And you can see right here, it's telling you to create content on these topics. And if you did this, I can guarantee you your channel will get a strike."
From there, creators hit what White described as a "brick wall," with one of his appeals denied within one minute, which felt like it must be an automated decision. As Britec explained, "You will appeal, and your appeal will be rejected instantly. You will not be speaking to a human being. You'll be speaking to a bot or AI. The bot will be giving you automated responses." YouTube insisted that the decisions weren't automated, even when an appeal was denied within one minute.
White told Ars that it's easy for creators to get discouraged and censor their channels rather than fight with the AI. After wasting "an hour and a half trying to reason with an AI about why I didn't violate the community guidelines" once his first appeal was quickly denied, he "didn't even bother using the chat function" after the second appeal was denied even faster, White confirmed in his video. "I simply wasn't going to do that again," White said.
All week, the panic spread, reaching fans who follow tech content creators. On Reddit, people recommended saving tutorials lest YouTube take them down. "I've had people come out and say, 'This can't be true. I rely on this every time,'" White told Ars.
[2]
YouTube's AI moderator pulls Windows 11 workaround videos
Creators baffled as videos on local accounts, unsupported PCs vanish under 'harmful acts' rule
Is installing Windows 11 with a local account or on unsupported hardware harmful or dangerous? YouTube's AI moderation system seems to think so, as it has started pulling videos that show users how to sidestep Microsoft's setup restrictions.
Tech YouTuber Rich White, aka CyberCPU Tech, was the first to go public about the issue on October 26, when he posted a video reporting the removal of a how-to he published on installing Windows 11 25H2 with a local account instead of a Microsoft account. In the video, White expressed concern that YouTube's automated flagging process may be the root of the problem, as he found it hard to believe that "creating a local account in Windows 11 could lead to serious harm or even death," as YouTube reportedly alleged when it removed the video.
When he appealed, White said that YouTube denied the request within 10 to 20 minutes, early on a Sunday morning, which led him to speculate that there wasn't a human in the loop when the request was shut down.
That wasn't his only video removed, either. The next day, White uploaded his video for this week on installing Windows 11 25H2 on unsupported hardware, which was removed hours after being posted. YouTube justified the removal on similar grounds. As was the case with the first removal, White appealed. "The appeal was denied at 11:55, a full one minute after submitting it," White explained in a video following up on both removals. "If this was reviewed by a real human like YouTube claims they are, they watched a 17-minute video and denied the appeal in one minute."
At least two other YouTubers - Britec09 and Hrutkay Mods - have released videos alleging much of the same: They published content about Windows workarounds, YouTube removed them, and they had an impossible time getting to a human for any help. YouTube's AI, all three contend, blocked their videos, despite the fact that bypassing Windows 11 locks is neither illegal nor dangerous for people who follow the instructions. Sure, you might muck up your Windows machine, but no one is going to lose a finger.
"I don't believe I've spoken with a single human being from Google or YouTube yet since this started," White told us in an email. "It's been all automated."
White speculated in one of his videos about the takedowns that Microsoft may have been trying to exert pressure on Google to have the videos taken down. Neither Microsoft nor Google has responded to our questions on the matter, so while that prospect remains speculative, there is a timing element that's quite coincidental. Microsoft just closed the local account loophole for Windows 11 setup in its most recent Insider build.
That's not the only loophole Microsoft has been trying to quiet down, either. In February, Redmond took down its own advice on how to install Windows 11 on unsupported hardware without Trusted Platform Module 2.0 in a bid to get people to stop doing that and just buy new hardware. Coincidentally or not, that's the topic of White's and others' removed videos.
Odd timing aside, White doesn't believe Microsoft malfeasance was part of the takedowns, despite his publicly speculating on that in one of his videos. "I mentioned in one of the videos I made regarding this issue the possibility of Microsoft's involvement," White told us. "That was spoken more out of frustration and confusion over the whole issue and I just don't think Microsoft had anything to do with it."
This problem, he explained, is more about AI inappropriately flagging content and YouTube not having the manpower to deal with appeals.
It could go further than just a few removed videos, though. All three channels we identified expressed far more concern about the chilling effect AI moderation without a human in the loop can have on free expression on YouTube. "The number of creators currently being affected reaches far further than the ones that have had videos taken down like I have because creators have voiced fear in what we are allowed to publish," White explained.
The CyberCPU vlogger explained that YouTube didn't provide any actual explanation about what his videos did to violate the site's content policy, leaving him and other tech creators in a position where anything they publish could result in a strike against their channel and eventual removal. "My fear is this could lead to many creators fearing covering more moderate to advanced tutorials," White told us, adding that such self-censorship would inevitably lead to less engagement. "Another creator shared that sentiment with me because he had been posting more 'safe' videos since this ordeal started and his views have suffered from it."
Ultimately, the tech YouTubers waging this campaign seem to simply want an explanation and clarity. "We would just like YouTube to tell us what the issue is," White said. "If it's just a mistake then fine, restore our videos and we'll move on. If it's a new policy on YouTube, then tell us where the line is and hopefully we can move forward.
"Operating blind isn't going to work," White concluded in his message to us.
Welcome to the age of AI moderation - here's hoping you don't trip the system with no way to appeal. ®
[3]
YouTube's AI moderator pulls Windows 11 workaround videos, calls them dangerous - General Chat
Is installing Windows 11 with a local account or on unsupported hardware harmful or dangerous? YouTube's AI moderation system seems to think so, as it has started pulling videos that show users how to sidestep Microsoft's setup restrictions.
Tech YouTuber Rich White, aka CyberCPU Tech, was the first to go public about the issue on October 26, when he posted a video reporting the removal of a how-to he published on installing Windows 11 25H2 with a local account instead of a Microsoft account. In the video, White expressed concern that YouTube's automated flagging process may be the root of the problem, as he found it hard to believe that "creating a local account in Windows 11 could lead to serious harm or even death," as YouTube reportedly alleged when it removed the video.
When he appealed, White said that YouTube denied the request within 10 to 20 minutes, early on a Sunday morning, which led him to speculate that there wasn't a human in the loop when the request was shut down. That wasn't his only video removed, either.
At least two other YouTubers - Britec09 and Hrutkay Mods - have released videos alleging much of the same: They published content about Windows workarounds, YouTube removed them, and they had an impossible time getting to a human for any help. YouTube's AI, all three contend, blocked their videos, despite the fact that bypassing Windows 11 locks is neither illegal nor dangerous for people who follow the instructions.
YouTube's automated moderation system has been removing tech tutorial videos showing Windows 11 workarounds, sparking concerns about AI over-moderation and its impact on educational content creators.
YouTube's automated moderation system has sparked controversy by removing educational videos that demonstrate Windows 11 workarounds, with creators alleging that artificial intelligence is incorrectly flagging legitimate tech tutorials as "dangerous" or "harmful" content [1].
Tech YouTuber Rich White, who operates the CyberCPU Tech channel, became the first to publicly report the issue on October 26, when his video showing how to install Windows 11 25H2 with a local account was removed [2]. White expressed disbelief that "creating a local account in Windows 11 could lead to serious harm or even death," as YouTube allegedly claimed when justifying the removal.
The speed of YouTube's appeal denials has raised significant concerns about the platform's review process. When White appealed his first video removal, YouTube denied the request within 10 to 20 minutes on a Sunday morning, leading him to speculate that no human was involved in the decision [2]. The situation became even more suspicious when his second video, demonstrating Windows 11 installation on unsupported hardware, was removed hours after posting, with the appeal denied just one minute after submission.
"If this was reviewed by a real human like YouTube claims they are, they watched a 17-minute video and denied the appeal in one minute," White explained in a follow-up video
2
.The removals have impacted several prominent tech content creators beyond White's channel. Britec09 and Hrutkay Mods have also reported similar experiences, with their Windows workaround videos being removed and appeals denied without apparent human review
2
3
.These videos typically show users how to bypass Microsoft's setup restrictions, such as creating local accounts instead of Microsoft accounts or installing Windows 11 on hardware that doesn't meet official requirements. Such content has historically been popular and profitable for tech creators, with White describing these tutorials as "bread and butter videos" that consistently generate "extremely high views"
1
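For context on what these tutorials actually contain: the "unsupported hardware" guides generally come down to setting a few registry values that tell Windows Setup to skip its hardware checks. The sketch below is purely illustrative and is not taken from any of the removed videos; it assumes the widely documented LabConfig bypass values and the MoSetup upgrade value that Microsoft itself once documented, and it uses Python's standard winreg module only to show which keys are involved (in real tutorials, creators typically set these by hand in regedit during setup, with administrator rights, and the values may not work on current builds).

```python
# Illustrative sketch only: it shows the registry values that "install Windows 11
# on unsupported hardware" tutorials commonly discuss. It is NOT the method used
# in any of the removed videos. Running it requires Windows, administrator rights,
# and an awareness that these community-documented values may not apply to
# current builds.
import winreg

# Values read by Windows Setup during a clean install (community-documented).
LABCONFIG_VALUES = {
    "BypassTPMCheck": 1,         # skip the TPM 2.0 requirement check
    "BypassSecureBootCheck": 1,  # skip the Secure Boot check
    "BypassRAMCheck": 1,         # skip the minimum-RAM check
}

# Value Microsoft itself once documented for in-place upgrades on unsupported PCs.
MOSETUP_VALUES = {"AllowUpgradesWithUnsupportedTPMOrCPU": 1}


def set_dword_values(subkey: str, values: dict) -> None:
    """Create (or open) HKLM\\<subkey> and write each entry as a REG_DWORD."""
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, subkey, 0,
                            winreg.KEY_SET_VALUE) as key:
        for name, data in values.items():
            winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, data)


if __name__ == "__main__":
    set_dword_values(r"SYSTEM\Setup\LabConfig", LABCONFIG_VALUES)
    set_dword_values(r"SYSTEM\Setup\MoSetup", MOSETUP_VALUES)
    print("Bypass values written (illustrative only).")
```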
Late Friday, YouTube confirmed that the flagged videos had been reinstated and promised to prevent similar content from being removed in the future [1]. However, the platform denied that the removals were the result of automation issues, claiming that neither the initial enforcement decisions nor the appeal reviews involved AI.
Despite YouTube's denials, creators remain skeptical of the platform's explanation, particularly given the implausibly fast appeal denial times and the pattern of removals affecting recently posted content.
The uncertainty surrounding YouTube's content policies has created significant economic pressure on affected creators. White, whose channel began as a side hustle in 2020 but has since become his primary income source, has already begun censoring content he planned to post, fearing potential account penalties [1]. Britec reported pausing a sponsorship due to the uncertainty, describing it as a "great loss of income" [1]. The fear extends beyond individual creators, with White noting that "creators have voiced fear in what we are allowed to publish," leading to self-censorship that could result in reduced engagement and monetization [2].
Summarized by Navi