2 Sources
[1]
US bank discloses security lapse after sharing customer data with AI app | TechCrunch
Community Bank, which operates in Pennsylvania, Ohio, and West Virginia, disclosed a cybersecurity incident that exposed customers' names, dates of birth, and Social Security numbers. In an 8-K filing dated May 7 with the U.S. Securities and Exchange Commission, the bank said it detected an exposure of customers' personal data due to the use of "an unauthorized artificial intelligence-based software application." The bank said it disclosed the incident "due to the volume and sensitive nature of the non-public information at issue." It's unclear exactly what happened, but based on the language in the filing, it appears someone working for Community Bank may have uploaded customer data to an online AI chatbot, potentially exposing that information to the chatbot maker. While Community Bank did not disclose how many customers were affected by the incident, nor what AI application was involved, the company said it is "evaluating the customer data that was affected" and is sending notifications in accordance with relevant laws. Community Bank's chief executive John Montgomery did not immediately respond to TechCrunch's request for comment.
[2]
US bank reports itself after slinging customer data at 'unauthorized AI app'
A US commercial bank just tattled on itself to the Securities and Exchange Commission (SEC) for plugging a bunch of customer data into an unauthorized AI application. Community Bank, which operates in southwestern Pennsylvania, Ohio, and West Virginia, filed an 8-K with the regulator on Monday, saying it launched an investigation into the internal cockup, which remains ongoing. It felt compelled to submit the filing "due to the volume and sensitive nature of the non-public information." This included customer names, dates of birth, and Social Security numbers, but the filing provided no further detail about the incident. Community Bank did not specify what this "unauthorized AI-based software application" was or how it was used. However, the disclosure of data such as SSNs, which in the US are generally categorized among the most sensitive types of data that organizations can store on behalf of customers, is protected under several federal and state laws. One possibility is that the data was entered into a generative AI tool outside the bank's approved systems. If so, that could raise questions about whether the information was transmitted to a third-party provider and how it may have been retained or processed. The Register asked Community Bank for more details and will update this story if it responds. The bank confirmed that it suffered no operational impact and customers were not prevented from accessing their accounts or payment services as a result. "The company is evaluating the customer data that was affected and is conducting notifications as required by applicable federal and state laws and regulatory guidance," Community Bank stated in its cybersecurity disclosure. "The company has been, and continues to be, in communication with relevant banking and financial regulators regarding the incident." 
It also promised to continue its remediation efforts, take action to prevent future failures, and gave the "we're committed to protecting customers' data" line that always goes down so well. ®
Community Bank disclosed a cybersecurity breach involving an unauthorized AI application that exposed customer names, dates of birth, and Social Security numbers. The Pennsylvania-based bank filed an 8-K with the SEC, citing the volume and sensitive nature of the exposed data. While the exact AI tool remains unidentified, the filing's language suggests an employee uploaded customer information to an external AI chatbot.
Community Bank, operating across Pennsylvania, Ohio, and West Virginia, has reported a significant data breach after sensitive customer data was shared with an unauthorized AI application. In an 8-K filing dated May 7 with the SEC, the bank revealed that customers' names, dates of birth, and Social Security numbers were exposed due to what appears to be an internal security lapse [1]. The bank stated it felt compelled to disclose the incident "due to the volume and sensitive nature of the non-public information at issue" [2].
While Community Bank has not specified which AI application was involved or how many customers were affected, the language in the filing suggests that someone working for the bank may have uploaded customer data to an online AI chatbot, potentially exposing that information to the chatbot maker [1]. This exposure raises critical questions about internal controls and employee training regarding the use of generative AI tools.

The disclosure of Social Security numbers is a particularly serious aspect of this breach: SSNs are categorized among the most sensitive types of data that organizations can store on behalf of customers and are protected under several federal and state laws [2]. One possibility is that the data was entered into a generative AI tool outside the bank's approved systems, which would raise questions about whether the information was transmitted to a third-party provider and how it may have been retained or processed.
Community Bank confirmed that it suffered no operational impact and that customers were not prevented from accessing their accounts or payment services as a result of the incident [2]. The bank is currently evaluating the customer data that was affected and conducting notifications as required by applicable federal and state laws and regulatory guidance. Chief executive John Montgomery has not yet commented publicly on the incident [1].
Community Bank has been in communication with relevant banking and financial regulators regarding the incident and has promised to continue its remediation efforts and to take action to prevent future failures [2]. The bank's decision to self-report through an 8-K filing demonstrates the seriousness with which it is treating this lapse, though questions remain about how such sensitive customer data could be shared with an unauthorized AI app in the first place.

This incident highlights a growing concern across the financial services industry as employees increasingly turn to AI chatbots and other generative AI tools for help with their work. Without proper safeguards and clear policies about which AI applications are approved for use with sensitive customer data, organizations face significant risk of inadvertent data exposure. The Community Bank case serves as a cautionary tale for other financial institutions about the importance of implementing robust controls around AI tool usage and ensuring employees understand the risks of uploading confidential information to external platforms. As the investigation continues, industry observers will be watching to see which AI application was involved and whether additional security measures will be mandated across the banking sector to prevent similar incidents.
Summarized by Navi
27 Apr 2026 • Technology

