2 Sources
[1]
Prosecutors to Consider Companies' Use of AI When Assessing Compliance Programs | PYMNTS.com
Companies' use of artificial intelligence (AI) is one of the things prosecutors will look at when assessing their compliance programs during investigations of criminal offenses like bribery or fraud. Nicole Argentieri, principal deputy assistant attorney general for the criminal division of the Department of Justice (DOJ), said this on Monday (Sept. 23) while outlining changes to the DOJ's Evaluation of Corporate Compliance Programs (ECCP), the Wall Street Journal (WSJ) reported Tuesday (Sept. 24).

This DOJ guidance is important to lawyers and compliance officers because compliance programs that adhere to its guidelines are eligible for more lenient treatment when a compliance breakdown occurs, according to the report.

In prepared remarks for a speech delivered Monday, Argentieri said that under the updated ECCP, prosecutors will consider AI and other technology a company uses to conduct business, whether the company has conducted a risk assessment of the use of that technology, and whether it has taken steps to mitigate that risk.

"For example, prosecutors will consider whether the company is vulnerable to criminal schemes enabled by new technology, such as false approvals and documentation generated by AI," Argentieri said. "If so, we will consider whether compliance controls and tools are in place to identify and mitigate those risks, such as tools to confirm the accuracy or reliability of data used in the business. We also want to know whether the company is monitoring and testing its technology to evaluate if it is functioning as intended and consistent with the company's code of conduct."

In the second of three major changes to the ECCP outlined in her speech, Argentieri said prosecutors will consider companies' commitment to whistleblower protection and treatment of employees who report misconduct. The third update outlined in the speech covers a compliance program's access to data and technology, and whether the company is putting the same resources into compliance as it puts into other parts of the business.

The ECCP was first issued in 2017 to lay out a series of factors for prosecutors to consider when assessing the effectiveness of corporate compliance programs as part of making charging decisions and negotiating resolutions.
[2]
If your AI does the crime, you'll do the time: DoJ
If juggling the extreme cost and hazy ROI of AI weren't enough of a headache, the United States Department of Justice (DoJ) now expects enterprise compliance officers to start weighing the tech's potential for harm - or risk stiff fines if it breaks the law.

Nicole Argentieri, the principal deputy assistant attorney general for the DoJ's criminal division, discussed the changes made to the Evaluation of Corporate Compliance Programs (ECCP) guidelines [PDF] in an address to the Society of Corporate Compliance and Ethics earlier this week. The guidelines detail how DoJ prosecutors should approach criminal investigations and evaluate service providers' effectiveness at preventing criminal behavior. As such, the ECCP effectively functions as a guide for compliance officers looking to avoid the DoJ's ire.

After a pilot program, these rules have officially been extended to include the use of AI. The tech is increasingly being deployed by businesses and could therefore conceivably be used to make decisions or facilitate actions that are less than legal.

The ECCP guidelines include a list of questions the DoJ thinks compliance officers should ask themselves about the use of AI systems, because those are exactly the questions prosecutors will be asking in the event of an investigation. The full list of AI-related compliance questions appears on page four of the ECCP document.

According to Argentieri, per a transcript, "prosecutors will consider whether the company is vulnerable to criminal schemes enabled by new technology such as false approvals and documentation generated by AI. If so, we will consider whether compliance controls and tools are in place to identify and mitigate those risks." Additionally, the DoJ will take into consideration whether the company is actively monitoring and testing AI applications to ensure they're functioning as intended.

In other words, it doesn't matter whether it's the AI that broke the law - the company will be held accountable. Executives should therefore take steps to identify and address these risks before the DoJ comes knocking.

"Because we prosecute corporate crime, we ask not just what happened, but why it happened and what the company has done to prevent misconduct from recurring," Argentieri explained, adding: "We expect corporations to continuously review and update their compliance programs to account for emerging risk factors."

In addition to guidelines governing AI compliance, the ECCP updates also include guidance related to whistleblowers under a program designed to incentivize workers to report illegal activities.
The U.S. Department of Justice (DOJ) has announced a significant update to its guidance on evaluating corporate compliance programs, now including the use of artificial intelligence (AI) as a key consideration [1]. This move underscores the growing importance of AI in business operations and risk management strategies.
Under the new guidance, prosecutors will assess not only how companies leverage AI to enhance their compliance efforts but also the AI they use to conduct business. This includes evaluating whether a company has assessed and mitigated the risks those systems pose, and whether the systems are properly monitored and tested to confirm they function as intended [2].
The DOJ's updated guidance emphasizes several critical aspects that companies should address: whether they have conducted a risk assessment of the AI and other technology they use and taken steps to mitigate those risks, how they protect whistleblowers and treat employees who report misconduct, and whether the compliance function has access to data, technology, and resources comparable to other parts of the business [1].
This update signals a shift in how the DOJ views technology's role in corporate governance. Companies will need to carefully consider their AI strategies and ensure they align with regulatory expectations [1].
While the integration of AI into compliance programs presents challenges, it also offers opportunities for more efficient and effective risk management. Companies that successfully implement AI-driven compliance solutions may gain a competitive advantage [2].
The DOJ's focus on AI in compliance programs reflects a wider trend of increased scrutiny on the use of advanced technologies in business operations. This move may influence other regulatory bodies to adopt similar approaches in evaluating corporate practices [1][2].