2 Sources
[1]
Backlash builds over NHS plan to hide source code from AI hacking risk
NHS England is pulling its open-source software from the internet because of fears around computer-hacking AI models like Mythos. Opposition is growing among those who say the move is bad for transparency and efficiency, and will also do nothing to improve security.

A decision by NHS England to withdraw open-source code created with UK taxpayer funds because of the risk posed by computer-hacking AI models is attracting a growing backlash. Last month, Mythos, an AI created by technology firm Anthropic, was widely reported to be capable of discovering flaws in virtually any software, potentially allowing hackers to break into systems running it. NHS England has now told staff that existing and future software must be pulled from public view and kept behind closed doors by 11 May because of this risk.

The decision goes against the NHS service standard, which requires that staff make any software they produce open-source so that tools can be built upon, improved and used without the need for duplicated effort. And experts say that withdrawing code from public sight will do nothing to improve security.

Now, an open letter calling on NHS England to reverse its decision is attracting hundreds of signatures. At the time of writing, 682 people have signed the letter, including author and digital rights campaigner Cory Doctorow and former UK health secretary Matt Hancock, who, when contacted for comment by New Scientist, pointed to a post on LinkedIn in which he called the policy a "huge mistake".

"One of the smartest things the NHS has done in recent years is open-source its code. Taxpayers paid for it, so taxpayers should benefit from it," wrote Hancock. "But the practical case is just as strong: open source code is more rigorously tested, more secure, and allows the best minds anywhere in the world to build on top of it."

Vlad-Stefan Harbuz at the University of Edinburgh, UK, is a co-author of the open letter.
He has access to Mythos and was part of a group that recently used it to scan open-source NHS code for vulnerabilities. They found "a few relatively severe vulnerabilities" that were responsibly disclosed to the NHS prior to the decision to pull open-source projects.
[2]
NHS to close-source GitHub repos over AI, security concerns
Healthcare giant's maintainers handed May deadline to enact the change

The UK's National Health Service (NHS) is ordering all of its technology leaders to temporarily wall off the organization's open source projects over concerns relating to advanced AI and Anthropic's Mythos. According to guidance shared internally within the organization and seen by The Register, GitHub repositories must be set from public to private by May 11.

The guidance reads: "Public repositories materially increase the risk of unintended disclosure of source code, architectural decisions, configuration detail, and contextual information that may be exploited - particularly given rapid advancements in AI models capable of large-scale code ingestion, inference, and reasoning (e.g. developments such as the Mythos model)." It also states GitHub repos should not be public "unless there is an explicit and exceptional need." The decision was approved by the NHS's Engineering Board.

An NHS England spokesperson told The Register this was merely a temporary measure enacted while the organization shores up its cybersecurity posture. "We are temporarily restricting access to some NHS England source code to further strengthen cybersecurity while we assess the impact of rapid developments in AI models," they said. "We will continue to publish source code where there is a clear need."

NHS sources told us very few of the hundreds of NHS open source repositories contain anything remotely sensitive. Examples of open repos include those dedicated to documentation, architecture diagrams, and codebases for internal tools, such as web apps for managing clinic times. While there are bugs that a frontier AI model such as Mythos could unearth, there is thought to be very little risk to healthcare services.

The NHS's decision to pull a curtain over its code does, however, mark a significant, albeit temporary, U-turn in its longstanding policy of favoring open source.
Reflecting the policy of the wider British government, the organization's service manual states that all new source code should be made open source and shareable under an appropriate license. Its reasoning lies in how it is funded. "Public services are built with public money," the manual states. "So unless there's a good reason not to, the code they're based [on] should be made available for other people to reuse and build on.

"Open source code can save teams duplicating effort and help them build better services faster. And publishing source code under an open license means that you're less likely to get locked in to working with a single supplier."

Reports on the NHS deleting web pages devoted to communicating its approach to open source circulated late last year, suggesting it could be wavering. However, the healthcare org responded by saying this was part of a routine cleanup job related to NHSX and NHS Digital being folded into NHS England.

NHS England did not give an estimate for when this temporary closed-sourcing will end, nor did it answer questions about what it deems the most significant threats advanced AI models pose to its open source repos.

Reg readers have no doubt caught the ghost stories swirling around Anthropic's latest AI model, Mythos. It is touted by Anthropic as a model capable of rapidly finding vulnerabilities that skilled human teams would miss. Others see it as over-hyped.

National authorities, including the UK's AI Safety Institute and National Cyber Security Centre, have somewhat validated Anthropic's claims of Mythos representing an advancement beyond the forecasted AI development cycle. However, others are more sceptical about the purported bug-hunting power. Anthropic has still not revealed the number of false positives the model throws up when running vulnerability scans, which is a common issue with AI thus far.
Tests comparing Mythos with open source models have also revealed the proficiency gap is narrower than Anthropic implies. For now, Mythos is locked behind Project Glasswing, available only to select organizations. But Forrester analysts warn that once powerful models reach the public - and attackers - open source software faces a genuine threat, one that Anthropic's $4 million donation to Project Glasswing is unlikely to meaningfully address.

Former head of open technology at NHSX, Terence Eden, argued that shifting open source repos from public to private will not provide a meaningful defense against advanced AI capabilities. "[People's open source code] was all ingested for 'training purposes' years ago," he writes in a recent blog. "If it was moderately interesting, then it was backed up by a digital hoarder. It has been archived by various digital libraries. Anyone who wants to do research on your code base can. Closing now doesn't meaningfully protect you."

Many of the serious vulnerabilities facing an organization are not necessarily in their respective codebases, he added, but in their software supply chains - their operating systems and libraries, and so on. "The bigger risk comes not from subtle logic bugs but from phishers, poor password hygiene, and insider threats. Securing your existing systems provides more protection than rushing to close-source your code." ®
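For maintainers who have to meet the deadline, the visibility flip itself is a single call to GitHub's documented "Update a repository" REST endpoint (PATCH /repos/{owner}/{repo} with a body of {"private": true}). A minimal sketch follows; the organization, repository name, and token are placeholders, not real NHS repos, and the request is only constructed here, never sent:

```python
import json
import urllib.request

def make_privacy_request(owner: str, repo: str, token: str) -> urllib.request.Request:
    """Build (but do not send) the GitHub 'Update a repository' call
    that flips a repository from public to private:
    PATCH /repos/{owner}/{repo} with body {"private": true}."""
    url = f"https://api.github.com/repos/{owner}/{repo}"
    body = json.dumps({"private": True}).encode()
    req = urllib.request.Request(url, data=body, method="PATCH")
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Accept", "application/vnd.github+json")
    return req

# Placeholder names and token; sending the request with
# urllib.request.urlopen(req) would require admin rights on the repo.
req = make_privacy_request("example-org", "example-repo", "<token>")
print(req.get_method(), req.full_url)
```

Actually executing this against hundreds of repositories, as the NHS guidance requires, would just mean iterating the same call over a list of repository names.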
NHS England has ordered all technology leaders to hide hundreds of open-source repositories from public view by May 11, citing fears around Anthropic's Mythos AI model. The decision reverses longstanding NHS open-source policy and has attracted an open letter with 682 signatures at the time of writing, whose signatories include former UK health secretary Matt Hancock, who called it a "huge mistake."
NHS England has instructed staff to withdraw all existing and future open-source code from public platforms like GitHub by May 11, marking a significant departure from the organization's longstanding open-source policy [1]. The decision to close-source GitHub repos stems from cybersecurity concerns related to advanced AI models, particularly Anthropic's Mythos, which has been reported as capable of discovering vulnerabilities in virtually any software [2]. According to internal guidance seen by The Register, the healthcare giant believes that public repositories "materially increase the risk of unintended disclosure of source code, architectural decisions, configuration detail, and contextual information that may be exploited" [2].
Source: The Register
The move has triggered substantial backlash from technology experts, digital rights advocates, and even political figures. An open letter calling on NHS England to reverse its decision has attracted 682 signatures, including digital rights campaigner Cory Doctorow and former UK health secretary Matt Hancock [1]. In a LinkedIn post, Hancock stated: "One of the smartest things the NHS has done in recent years is open-source its code. Taxpayers paid for it, so taxpayers should benefit from it. But the practical case is just as strong: open source code is more rigorously tested, more secure, and allows the best minds anywhere in the world to build on top of it" [1].

The decision directly contradicts the NHS service standard, which requires staff to make software produced with public money open-source so tools can be built upon, improved, and used without duplicated effort [1]. The organization's service manual explicitly states that "public services are built with public money" and that, unless there is a good reason not to, code should be made available for reuse [2]. NHS sources indicated that very few of the hundreds of repositories contain anything remotely sensitive, with examples including documentation, architecture diagrams, and codebases for internal tools such as web apps for managing clinic times [2].
Source: New Scientist
Security experts argue that withdrawing code from public view will do nothing to reduce the hacking risk posed by advanced AI models [1]. Former head of open technology at NHSX, Terence Eden, stated that the code "was all ingested for 'training purposes' years ago" and that "closing now doesn't meaningfully protect you" [2]. He emphasized that many serious vulnerabilities facing organizations exist not in their codebases but in their software supply chains [2].
Vlad-Stefan Harbuz at the University of Edinburgh, a co-author of the open letter, has access to Mythos and was part of a group that recently used it to scan open-source NHS code for vulnerabilities [1]. They found "a few relatively severe vulnerabilities" that were responsibly disclosed to the NHS prior to the decision [1]. While national authorities including the UK's AI Safety Institute and National Cyber Security Centre have validated Anthropic's claims of Mythos representing an advancement beyond forecasted AI development cycles, skepticism remains about its purported bug-hunting power, particularly as Anthropic has not revealed the number of false positives the model generates [2].

An NHS England spokesperson characterized this as a temporary measure enacted while the organization strengthens its cybersecurity posture, stating they "will continue to publish source code where there is a clear need" [2]. However, NHS England did not provide an estimate for when this temporary closed-sourcing will end or specify what it deems the most significant threats advanced AI models pose to its repositories [2]. The decision, approved by the NHS Engineering Board, requires GitHub repositories to be set from public to private unless there is "an explicit and exceptional need" [2]. This shift raises questions about the future of transparency in government-funded technology projects and whether the perceived security benefits outweigh the loss of collaborative development and public accountability that open-source code provides.

Summarized by Navi