Curated by THEOUTPOST
On Wed, 14 Aug, 8:04 AM UTC
9 Sources
Wyoming reporter caught using artificial intelligence to create fake quotes and stories
HELENA, Mont. (AP) -- Quotes from Wyoming's governor and a local prosecutor were the first things that seemed slightly off to Powell Tribune reporter CJ Baker. Then it was some of the phrases in the stories that struck him as nearly robotic.

The dead giveaway, though, that a reporter from a competing news outlet was using generative artificial intelligence to help write his stories came in a June 26 article about the comedian Larry the Cable Guy being chosen as the grand marshal of the Cody Stampede Parade. It concluded with an explanation of the inverted pyramid, the basic approach to writing a breaking news story.

"The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy's most beloved figures," the Cody Enterprise reported. "This structure ensures that the most critical information is presented first, making it easier for readers to grasp the main points quickly."

After doing some digging, Baker, who has been a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old who was new to journalism and who Baker says admitted that he had used AI in his stories before he resigned from the Enterprise.

The publisher and editor of the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have since apologized and vowed to take steps to ensure it never happens again. In an editorial published Monday, Enterprise Editor Chris Bacon said he "failed to catch" the AI copy and false quotes. "It matters not that the false quotes were the apparent error of a hurried rookie reporter that trusted AI. It was my job," Bacon wrote. He apologized that "AI was allowed to put words that were never spoken into stories."

Journalists have derailed their careers by making up quotes or facts in stories long before AI came about. But this latest scandal illustrates the potential pitfalls and dangers that AI poses to many industries, including journalism, as chatbots can spit out spurious if somewhat plausible articles with only a few prompts.

AI has found a role in journalism, including in the automation of certain tasks.
Some newsrooms, including The Associated Press, use AI to free up reporters for more impactful work, but most AP staff are not allowed to use generative AI to create publishable content. The AP has been using technology to assist in articles about financial earnings reports since 2014, and more recently for some sports stories. It is also experimenting with an AI tool to translate some stories from English to Spanish. At the end of each such story is a note that explains technology's role in its production.

Being upfront about how and when AI is used has proven important. Sports Illustrated was criticized last year for publishing AI-generated online product reviews that were presented as having been written by reporters who didn't actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the once-powerful publication's reputation.

In his Powell Tribune story breaking the news about Pelczar's use of AI in articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, "Obviously I've never intentionally tried to misquote anybody" and promised to "correct them and issue apologies and say they are misstatements," Baker wrote, noting that Pelczar insisted his mistakes shouldn't reflect on his Cody Enterprise editors.

After the meeting, the Enterprise launched a full review of all the stories Pelczar had written in the two months he worked at the paper. It has discovered seven stories that included AI-generated quotes from six people, Bacon said Tuesday, and he is still reviewing other stories.

"They're very believable quotes," Bacon said, noting that the people he spoke to during his review of Pelczar's articles said the quotes sounded like something they'd say, but that they had never actually talked to Pelczar.

Baker reported that seven people told him they had been quoted in Pelczar's stories but hadn't spoken to him. Pelczar did not respond to an AP phone message, left at a number listed as his, asking to discuss what happened.
Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that reached out.

Baker, who regularly reads the Enterprise because it's a competitor, told the AP that a combination of phrases and quotes in Pelczar's stories aroused his suspicions. Pelczar's story about a shooting in Yellowstone National Park included the sentence: "This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene settings." Baker said the line sounded like the summaries of his stories that a certain chatbot seems to generate, in that it tacks some kind of "life lesson" onto the end.

Another story, about a poaching sentencing, included quotes from a wildlife official and a prosecutor that sounded like they came from a news release, Baker said. However, there was no news release, and the agencies involved didn't know where the quotes had come from, he said.

Two of the questioned stories included fake quotes from Wyoming Gov. Mark Gordon that his staff only learned about when Baker called them. "In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the Governor that was entirely fabricated," Michael Pearlman, a spokesperson for the governor, said in an email. "In a second case, he appeared to fabricate a portion of a quote, and then combined it with a portion of a quote that was included in a news release announcing the new director of our Wyoming Game and Fish Department."

It's not difficult to create AI-generated stories. Users could feed a criminal affidavit into an AI program and ask it to write an article about the case, including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the preeminent journalism think tank. "These generative AI chatbots are programmed to give you an answer, no matter whether that answer is complete garbage or not," Mahadevan said.

Megan Barton, the Cody Enterprise's publisher, wrote an editorial calling AI "the new, advanced form of plagiarism and in the field of media and writing, plagiarism is something every media outlet has had to correct at some point or another. It's the ugly part of the job. But, a company willing to right (or quite literally write) these wrongs is a reputable one."
Barton wrote that the newspaper has learned its lesson, now has a system in place to recognize AI-generated stories, and will "have longer conversations about how AI-generated stories are not acceptable."

In his editorial, Bacon invoked the legacy of former professional baseball player Robin Ventura, who famously got the worst of a scuffle with Hall of Fame pitcher Nolan Ryan after charging the mound during a 1993 game, in a scene that is reposted on social media to this day. The editor compared his own shame to the ridicule that the then-Chicago White Sox infielder (Bacon misidentified him as a member of the Chicago Cubs) still endures. "I always suspected that the thought of being taken to the woodshed in front of millions in a fight Ventura started hurt more than his face. Now, your editor having been taken to the woodshed in the Wyoming press, I am sure of it."

The Enterprise didn't have an AI policy, in part because it seemed obvious that journalists shouldn't use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policy. Bacon, a military veteran and former air ambulance pilot who was named editor in May after a few months working as a reporter, plans to have a policy in place by the end of the week. "This will be a pre-employment topic of discussion," he said.

Hanson reported from Helena, Montana.
[2]
Wyoming reporter caught using artificial intelligence to create fake quotes and stories
HELENA, Mont. (AP) -- A quote from Wyoming's governor and a local prosecutor were the first things that seemed slightly off to Powell Tribune reporter CJ Baker. Then, it was some of the phrases in the stories that struck him as nearly robotic. The dead giveaway, though, that a reporter from a competing news outlet was using generative artificial intelligence to help write his stories came in a June 26 article about the comedian Larry the Cable Guy being chosen as the grand marshal of the Cody Stampede Parade. "The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy's most beloved figures," the Cody Enterprise reported. "This structure ensures that the most critical information is presented first, making it easier for readers to grasp the main points quickly." After doing some digging, Baker, who has been a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old who was new to journalism and who Baker says admitted that he had used AI in his stories before he resigned from the Enterprise. The publisher and editor at the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have since apologized and vowed to take steps to ensure it never happens again. In an editorial published Monday, Enterprise Editor Chris Bacon said he "failed to catch" the AI copy and false quotes. "It matters not that the false quotes were the apparent error of a hurried rookie reporter that trusted AI. It was my job," Bacon wrote. He apologized that "AI was allowed to put words that were never spoken into stories." Journalists have derailed their careers by making up quotes or facts in stories long before AI came about. But this latest scandal illustrates the potential pitfalls and dangers that AI poses to many industries, including journalism, as chatbots can spit out spurious if somewhat plausible articles with only a few prompts. 
AI has found a role in journalism, including in the automation of certain tasks. Some newsrooms, including The Associated Press, use AI to free up reporters for more impactful work, but most AP staff are not allowed to use generative AI to create publishable content. The AP has been using technology to assist in articles about financial earnings reports since 2014, and more recently for some sports stories. It is also experimenting with an AI tool to translate some stories from English to Spanish. At the end of each such story is a note that explains technology's role in its production. Being upfront about how and when AI is used has proven important. Sports Illustrated was criticized last year for publishing AI-generated online product reviews that were presented as having been written by reporters who didn't actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the once-powerful publication's reputation. In his Powell Tribune story breaking the news about Pelczar's use of AI in articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, "Obviously I've never intentionally tried to misquote anybody" and promised to "correct them and issue apologies and say they are misstatements," Baker wrote, noting that Pelczar insisted his mistakes shouldn't reflect on his Cody Enterprise editors. After the meeting, the Enterprise launched a full review of all of the stories Pelczar had written for the paper in the two months he had worked there. They have discovered seven stories that included AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other stories. "They're very believable quotes," Bacon said, noting that the people he spoke to during his review of Pelczar's articles said the quotes sounded like something they'd say, but that they never actually talked to Pelczar. 
Baker reported that seven people told him that they had been quoted in stories written by Pelczar, but had not spoken to him. Pelczar did not respond to an AP phone message left at a number listed as his asking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that had reached out. Baker, who regularly reads the Enterprise because it's a competitor, told the AP that a combination of phrases and quotes in Pelczar's stories aroused his suspicions. Pelczar's story about a shooting in Yellowstone National Park included the sentence: "This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene settings." Baker said the line sounded like the summaries of his stories that a certain chatbot seems to generate, in that it tacks on some kind of a "life lesson" at the end. Another story -- about a poaching sentencing -- included quotes from a wildlife official and a prosecutor that sounded like they came from a news release, Baker said. However, there wasn't a news release and the agencies involved didn't know where the quotes had come from, he said. Two of the questioned stories included fake quotes from Wyoming Gov. Mark Gordon that his staff only learned about when Baker called them. "In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the Governor that was entirely fabricated," Michael Pearlman, a spokesperson for the governor, said in an email. "In a second case, he appeared to fabricate a portion of a quote, and then combined it with a portion of a quote that was included in a news release announcing the new director of our Wyoming Game and Fish Department." The most obvious AI-generated copy appeared in the story about Larry the Cable Guy that ended with the explanation of the inverted pyramid, the basic approach to writing a breaking news story. It's not difficult to create AI stories. 
Users could put a criminal affidavit into an AI program and ask it to write an article about the case including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the preeminent journalism think tank. "These generative AI chatbots are programmed to give you an answer, no matter whether that answer is complete garbage or not," Mahadevan said. Megan Barton, the Cody Enterprise's publisher, wrote an editorial calling AI "the new, advanced form of plagiarism and in the field of media and writing, plagiarism is something every media outlet has had to correct at some point or another. It's the ugly part of the job. But, a company willing to right (or quite literally write) these wrongs is a reputable one." Barton wrote that the newspaper has learned its lesson, has a system in place to recognize AI-generated stories and will "have longer conversations about how AI-generated stories are not acceptable." The Enterprise didn't have an AI policy, in part because it seemed obvious that journalists shouldn't use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policy. Bacon plans to have one in place by the end of the week. "This will be a pre-employment topic of discussion," he said.
[3]
Wyoming reporter caught using artificial intelligence to create fake quotes and stories
HELENA, Mont. -- A quote from Wyoming's governor and a local prosecutor were the first things that seemed slightly off to Powell Tribune reporter CJ Baker. Then, it was some of the phrases in the stories that struck him as nearly robotic. The dead giveaway, though, that a reporter from a competing news outlet was using generative artificial intelligence to help write his stories came in a June 26 article about the comedian Larry the Cable Guy being chosen as the grand marshal of the Cody Stampede Parade. "The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy's most beloved figures," the Cody Enterprise reported. "This structure ensures that the most critical information is presented first, making it easier for readers to grasp the main points quickly." After doing some digging, Baker, who has been a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old who was new to journalism and who Baker says admitted that he had used AI in his stories before he resigned from the Enterprise. The publisher and editor at the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have since apologized and vowed to take steps to ensure it never happens again. In an editorial published Monday, Enterprise Editor Chris Bacon said he "failed to catch" the AI copy and false quotes. "It matters not that the false quotes were the apparent error of a hurried rookie reporter that trusted AI. It was my job," Bacon wrote. He apologized that "AI was allowed to put words that were never spoken into stories." Journalists have derailed their careers by making up quotes or facts in stories long before AI came about. But this latest scandal illustrates the potential pitfalls and dangers that AI poses to many industries, including journalism, as chatbots can spit out spurious if somewhat plausible articles with only a few prompts. 
AI has found a role in journalism, including in the automation of certain tasks. Some newsrooms, including The Associated Press, use AI to free up reporters for more impactful work, but most AP staff are not allowed to use generative AI to create publishable content. The AP has been using technology to assist in articles about financial earnings reports since 2014, and more recently for some sports stories. It is also experimenting with an AI tool to translate some stories from English to Spanish. At the end of each such story is a note that explains technology's role in its production. Being upfront about how and when AI is used has proven important. Sports Illustrated was criticized last year for publishing AI-generated online product reviews that were presented as having been written by reporters who didn't actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the once-powerful publication's reputation. In his Powell Tribune story breaking the news about Pelczar's use of AI in articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, "Obviously I've never intentionally tried to misquote anybody" and promised to "correct them and issue apologies and say they are misstatements," Baker wrote, noting that Pelczar insisted his mistakes shouldn't reflect on his Cody Enterprise editors. After the meeting, the Enterprise launched a full review of all of the stories Pelczar had written for the paper in the two months he had worked there. They have discovered seven stories that included AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other stories. "They're very believable quotes," Bacon said, noting that the people he spoke to during his review of Pelczar's articles said the quotes sounded like something they'd say, but that they never actually talked to Pelczar. 
Baker reported that seven people told him that they had been quoted in stories written by Pelczar, but had not spoken to him. Pelczar did not respond to an AP phone message left at a number listed as his asking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that had reached out. Baker, who regularly reads the Enterprise because it's a competitor, told the AP that a combination of phrases and quotes in Pelczar's stories aroused his suspicions. Pelczar's story about a shooting in Yellowstone National Park included the sentence: "This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene settings." Baker said the line sounded like the summaries of his stories that a certain chatbot seems to generate, in that it tacks on some kind of a "life lesson" at the end. Another story -- about a poaching sentencing -- included quotes from a wildlife official and a prosecutor that sounded like they came from a news release, Baker said. However, there wasn't a news release and the agencies involved didn't know where the quotes had come from, he said. Two of the questioned stories included fake quotes from Wyoming Gov. Mark Gordon that his staff only learned about when Baker called them. "In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the Governor that was entirely fabricated," Michael Pearlman, a spokesperson for the governor, said in an email. "In a second case, he appeared to fabricate a portion of a quote, and then combined it with a portion of a quote that was included in a news release announcing the new director of our Wyoming Game and Fish Department." The most obvious AI-generated copy appeared in the story about Larry the Cable Guy that ended with the explanation of the inverted pyramid, the basic approach to writing a breaking news story. It's not difficult to create AI stories. 
Users could put a criminal affidavit into an AI program and ask it to write an article about the case including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the preeminent journalism think tank. "These generative AI chatbots are programmed to give you an answer, no matter whether that answer is complete garbage or not," Mahadevan said. Megan Barton, the Cody Enterprise's publisher, wrote an editorial calling AI "the new, advanced form of plagiarism and in the field of media and writing, plagiarism is something every media outlet has had to correct at some point or another. It's the ugly part of the job. But, a company willing to right (or quite literally write) these wrongs is a reputable one." Barton wrote that the newspaper has learned its lesson, has a system in place to recognize AI-generated stories and will "have longer conversations about how AI-generated stories are not acceptable." The Enterprise didn't have an AI policy, in part because it seemed obvious that journalists shouldn't use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policy. Bacon plans to have one in place by the end of the week. "This will be a pre-employment topic of discussion," he said.
[4]
Wyoming reporter caught using artificial intelligence to create fake...
A quote from Wyoming's governor and a local prosecutor were the first things that seemed slightly off to Powell Tribune reporter CJ Baker. Then, it was some of the phrases in the stories that struck him as nearly robotic. The dead giveaway, though, that a reporter from a competing news outlet was using generative artificial intelligence to help write his stories came in a June 26 article about the comedian Larry the Cable Guy being chosen as the grand marshal of the Cody Stampede Parade. "The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy's most beloved figures," the Cody Enterprise reported. "This structure ensures that the most critical information is presented first, making it easier for readers to grasp the main points quickly." After doing some digging, Baker, who has been a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old who was new to journalism and who Baker says admitted that he had used AI in his stories before he resigned from the Enterprise. The publisher and editor at the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have since apologized and vowed to take steps to ensure it never happens again. In an editorial published Monday, Enterprise Editor Chris Bacon said he "failed to catch" the AI copy and false quotes. "It matters not that the false quotes were the apparent error of a hurried rookie reporter that trusted AI. It was my job," Bacon wrote. He apologized that "AI was allowed to put words that were never spoken into stories." Journalists have derailed their careers by making up quotes or facts in stories long before AI came about. But this latest scandal illustrates the potential pitfalls and dangers that AI poses to many industries, including journalism, as chatbots can spit out spurious if somewhat plausible articles with only a few prompts. AI has found a role in journalism, including in the automation of certain tasks. 
Some newsrooms, including The Associated Press, use AI to free up reporters for more impactful work, but most AP staff are not allowed to use generative AI to create publishable content. The AP has been using technology to assist in articles about financial earnings reports since 2014, and more recently for some sports stories. It is also experimenting with an AI tool to translate some stories from English to Spanish. At the end of each such story is a note that explains technology's role in its production. Being upfront about how and when AI is used has proven important. Sports Illustrated was criticized last year for publishing AI-generated online product reviews that were presented as having been written by reporters who didn't actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the once-powerful publication's reputation. In his Powell Tribune story breaking the news about Pelczar's use of AI in articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, "Obviously I've never intentionally tried to misquote anybody" and promised to "correct them and issue apologies and say they are misstatements," Baker wrote, noting that Pelczar insisted his mistakes shouldn't reflect on his Cody Enterprise editors. After the meeting, the Enterprise launched a full review of all of the stories Pelczar had written for the paper in the two months he had worked there. They have discovered seven stories that included AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other stories. "They're very believable quotes," Bacon said, noting that the people he spoke to during his review of Pelczar's articles said the quotes sounded like something they'd say, but that they never actually talked to Pelczar. 
Baker reported that seven people told him that they had been quoted in stories written by Pelczar, but had not spoken to him. Pelczar did not respond to an AP phone message left at a number listed as his asking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that had reached out. Baker, who regularly reads the Enterprise because it's a competitor, told the AP that a combination of phrases and quotes in Pelczar's stories aroused his suspicions. Pelczar's story about a shooting in Yellowstone National Park included the sentence: "This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene settings." Baker said the line sounded like the summaries of his stories that a certain chatbot seems to generate, in that it tacks on some kind of a "life lesson" at the end. Another story -- about a poaching sentencing -- included quotes from a wildlife official and a prosecutor that sounded like they came from a news release, Baker said. However, there wasn't a news release and the agencies involved didn't know where the quotes had come from, he said. Two of the questioned stories included fake quotes from Wyoming Gov. Mark Gordon that his staff only learned about when Baker called them. "In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the Governor that was entirely fabricated," Michael Pearlman, a spokesperson for the governor, said in an email. "In a second case, he appeared to fabricate a portion of a quote, and then combined it with a portion of a quote that was included in a news release announcing the new director of our Wyoming Game and Fish Department." The most obvious AI-generated copy appeared in the story about Larry the Cable Guy that ended with the explanation of the inverted pyramid, the basic approach to writing a breaking news story. It's not difficult to create AI stories. 
Users could put a criminal affidavit into an AI program and ask it to write an article about the case including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the preeminent journalism think tank. "These generative AI chatbots are programmed to give you an answer, no matter whether that answer is complete garbage or not," Mahadevan said. Megan Barton, the Cody Enterprise's publisher, wrote an editorial calling AI "the new, advanced form of plagiarism and in the field of media and writing, plagiarism is something every media outlet has had to correct at some point or another. It's the ugly part of the job. But, a company willing to right (or quite literally write) these wrongs is a reputable one." Barton wrote that the newspaper has learned its lesson, has a system in place to recognize AI-generated stories and will "have longer conversations about how AI-generated stories are not acceptable." The Enterprise didn't have an AI policy, in part because it seemed obvious that journalists shouldn't use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policy. Bacon plans to have one in place by the end of the week. "This will be a pre-employment topic of discussion," he said.
[5]
Wyoming reporter caught using artificial intelligence to create fake quotes and stories
HELENA, Mont. -- A quote from Wyoming's governor and a local prosecutor were the first things that seemed slightly off to Powell Tribune reporter CJ Baker. Then, it was some of the phrases in the stories that struck him as nearly robotic. The dead giveaway, though, that a reporter from a competing news outlet was using generative artificial intelligence to help write his stories came in a June 26 article about the comedian Larry the Cable Guy being chosen as the grand marshal of the Cody Stampede Parade. "The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy's most beloved figures," the Cody Enterprise reported. "This structure ensures that the most critical information is presented first, making it easier for readers to grasp the main points quickly." After doing some digging, Baker, who has been a reporter for more than 15 years, met with Aaron Pelczar, a 40-year-old who was new to journalism and who Baker says admitted that he had used AI in his stories before he resigned from the Enterprise. The publisher and editor at the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have since apologized and vowed to take steps to ensure it never happens again. In an editorial published Monday, Enterprise Editor Chris Bacon said he "failed to catch" the AI copy and false quotes. "It matters not that the false quotes were the apparent error of a hurried rookie reporter that trusted AI. It was my job," Bacon wrote. He apologized that "AI was allowed to put words that were never spoken into stories." Journalists have derailed their careers by making up quotes or facts in stories long before AI came about. But this latest scandal illustrates the potential pitfalls and dangers that AI poses to many industries, including journalism, as chatbots can spit out spurious if somewhat plausible articles with only a few prompts. 
AI has found a role in journalism, including in the automation of certain tasks. Some newsrooms, including The Associated Press, use AI to free up reporters for more impactful work, but most AP staff are not allowed to use generative AI to create publishable content. The AP has been using technology to assist in articles about financial earnings reports since 2014, and more recently for some sports stories. It is also experimenting with an AI tool to translate some stories from English to Spanish. At the end of each such story is a note that explains technology's role in its production. Being upfront about how and when AI is used has proven important. Sports Illustrated was criticized last year for publishing AI-generated online product reviews that were presented as having been written by reporters who didn't actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the once-powerful publication's reputation. In his Powell Tribune story breaking the news about Pelczar's use of AI in articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, "Obviously I've never intentionally tried to misquote anybody" and promised to "correct them and issue apologies and say they are misstatements," Baker wrote, noting that Pelczar insisted his mistakes shouldn't reflect on his Cody Enterprise editors. After the meeting, the Enterprise launched a full review of all of the stories Pelczar had written for the paper in the two months he had worked there. They have discovered seven stories that included AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other stories. "They're very believable quotes," Bacon said, noting that the people he spoke to during his review of Pelczar's articles said the quotes sounded like something they'd say, but that they never actually talked to Pelczar. 
Baker reported that seven people told him they had been quoted in stories written by Pelczar but had not spoken to him. Pelczar did not respond to an AP phone message, left at a number listed as his, seeking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that had reached out.

Baker, who regularly reads the Enterprise because it's a competitor, told the AP that a combination of phrases and quotes in Pelczar's stories aroused his suspicions. Pelczar's story about a shooting in Yellowstone National Park included the sentence: "This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene settings." Baker said the line sounded like the summaries of his stories that a certain chatbot seems to generate, in that it tacks on some kind of "life lesson" at the end.

Another story -- about a poaching sentencing -- included quotes from a wildlife official and a prosecutor that sounded like they came from a news release, Baker said. But there was no news release, and the agencies involved didn't know where the quotes had come from, he said.

Two of the questioned stories included fake quotes from Wyoming Gov. Mark Gordon that his staff only learned about when Baker called them. "In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the Governor that was entirely fabricated," Michael Pearlman, a spokesperson for the governor, said in an email. "In a second case, he appeared to fabricate a portion of a quote, and then combined it with a portion of a quote that was included in a news release announcing the new director of our Wyoming Game and Fish Department."

The most obvious AI-generated copy appeared in the story about Larry the Cable Guy that ended with the explanation of the inverted pyramid, the basic approach to writing a breaking news story. It's not difficult to create AI stories.
Users could put a criminal affidavit into an AI program and ask it to write an article about the case, including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the preeminent journalism think tank. "These generative AI chatbots are programmed to give you an answer, no matter whether that answer is complete garbage or not," Mahadevan said.

Megan Barton, the Cody Enterprise's publisher, wrote an editorial calling AI "the new, advanced form of plagiarism and in the field of media and writing, plagiarism is something every media outlet has had to correct at some point or another. It's the ugly part of the job. But, a company willing to right (or quite literally write) these wrongs is a reputable one." Barton wrote that the newspaper has learned its lesson, has a system in place to recognize AI-generated stories and will "have longer conversations about how AI-generated stories are not acceptable."

The Enterprise didn't have an AI policy, in part because it seemed obvious that journalists shouldn't use it to write stories, Bacon said. Poynter has a template from which news outlets can build their own AI policy. Bacon plans to have one in place by the end of the week. "This will be a pre-employment topic of discussion," he said.
[8]
Reporter admits using artificial intelligence to create fake quotes and stories before resigning, editor says
In a separate editorial, publisher Megan Barton wrote that the newspaper now has a system set in place to detect AI-generated stories. "We take extreme pride in the content that we put out to our community and we trust that the individuals hired to accurately write these stories are honest in gathering their information," Barton wrote. "So, you can imagine our surprise when we learned otherwise."

It remains to be seen how much of an impact AI will have on the overall job market. Last year, a firm asked the "generative AI" tool ChatGPT how many workers it expects to replace. The answer: 4.8 million American jobs.
[9]
Wyoming reporter caught fabricating quotes and stories with AI was forced to resign - Times of India
A Wyoming journalist has resigned after a competitor revealed that he used artificial intelligence to create fake quotes and stories, including quotes attributed to the state's governor. The incident involved Aaron Pelczar, a 40-year-old rookie reporter at the Cody Enterprise, who admitted to using AI in his articles, according to a report by ABC News. Powell Tribune reporter CJ Baker grew suspicious of Aaron Pelczar's work when quotes from Wyoming Governor Mark Gordon and a local prosecutor felt "slightly off" and "nearly robotic." His concerns peaked with a June 26 article about Larry the Cable Guy leading the Cody Stampede Parade, which included odd lines like, "The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence" and, "This structure ensures that the most critical information is presented first." After noticing these inconsistencies, Baker met with Pelczar, who admitted to using AI to generate stories and quotes. Pelczar said, "Obviously, I've never intentionally tried to misquote anybody," and promised to correct the errors. He resigned shortly after. Cody Enterprise editor Chris Bacon took responsibility for the oversight, admitting, "It matters not that the false quotes were the apparent error of a hurried rookie reporter that trusted AI. It was my job." Bacon apologised, stating, "I failed to catch the AI copy and false quotes." The newspaper, founded in 1899 by Buffalo Bill Cody, has since launched a full review of Aaron Pelczar's work, discovering that seven of his articles contained AI-generated quotes from six different people. Some of the fabricated quotes were attributed to Governor Gordon. Michael Pearlman, a spokesperson for the governor, told ABC News, "In one case, Pelczar wrote a story about a new OSHA rule that included a quote from the governor that was entirely fabricated." Pearlman added that another story combined a real quote from a news release with a fabricated one. 
Bacon observed that the quotes were "very believable," but none of the individuals quoted had actually spoken to Pelczar. To prevent future incidents, Cody Enterprise is implementing an AI policy, with plans to have it in place by the end of the week. "This will be a pre-employment topic of discussion," Bacon stated.
A reporter for the Cody Enterprise in Wyoming has resigned after being caught using artificial intelligence to create fake quotes and stories. The incident has raised concerns about the use of AI in journalism and its potential to undermine trust in news reporting.
In a revelation that sent ripples through the journalism community, a reporter for the Cody Enterprise in Wyoming has resigned after being caught using artificial intelligence to generate fake quotes and stories [1]. The fabrications came to light when CJ Baker, a reporter at the rival Powell Tribune, noticed discrepancies in the stories [2].
The reporter, identified as Aaron Pelczar, was new to journalism and had worked at the Cody Enterprise for about two months [4]. Pelczar admitted to using AI in his stories, and the extent of the fabrication is still being investigated, with the newspaper reviewing his past work to determine the full scope of the deception; the review has so far found seven stories containing AI-generated quotes from six people [3].
Upon discovery of the misconduct, Pelczar resigned from his position at the Cody Enterprise. The newspaper has since issued an apology to its readers and has committed to reviewing and correcting any inaccuracies in previously published articles [5].
This incident has sparked a broader conversation about the role of AI in journalism and the potential risks it poses to the integrity of news reporting. While AI tools can be valuable for tasks such as data analysis and research, their use in content creation raises serious ethical concerns [2].
The revelation has prompted discussions within the journalism community about the need for clear guidelines and ethical standards regarding the use of AI in reporting. Many news organizations are now reevaluating their policies and considering implementing stricter oversight measures to prevent similar incidents in the future [5].
The incident has raised concerns about the potential erosion of public trust in journalism. As AI technology becomes more sophisticated and accessible, there are fears that such deceptive practices could become more prevalent, making it increasingly difficult for readers to distinguish between authentic reporting and AI-generated content [1].
As the journalism industry grapples with the implications of this incident, there is a growing consensus on the need for transparency, accountability, and ethical guidelines in the use of AI tools in newsrooms. The Cody Enterprise case serves as a cautionary tale, highlighting the importance of maintaining journalistic integrity in an era of rapidly advancing technology [3].
© 2025 TheOutpost.AI All rights reserved