
Pulling the plug on AI

These days, it’s getting harder to tell what’s real and what’s not, both in words and images.

Thanks to technological advances coming so fast they make one's head spin, software can now convincingly duplicate voices, falsely enhance photographs, conjure realistic-looking landscapes that never existed, peopled by figures who were never there, and even produce "live" video of actors, politicians and others speaking or moving that is completely fabricated yet very hard to detect as deception.

These are called deepfakes. As the name implies, they are misleading, disingenuous and dangerous.

However, the misdirection doesn’t stop there. Thanks to generative artificial intelligence, or generative AI, the same can be done to produce news stories and press releases.

Generative AI refers to models or algorithms that create new output, including articles, other copy, pictures and videos. They generate that content by sampling, or referring back to, the original data or source material an algorithm has been programmed to repurpose.

In short, some folks think a computer program can take the place of flesh-and-blood reporters and editors and churn out news stories. Recent research has taught us this is a practice fraught with pitfalls.

Software that corrects spelling mistakes or flags an overused adverb is one thing, but information filtered through an AI system to create articles or other news content risks missing the mark on accuracy and impartiality.

That’s why we have trained, professional journalists covering events and writing articles, not machines. Only people, not algorithms, understand the full range of human emotion, human frailty and human subtlety.

Using generative AI to spin sweeping cinematic stories about superheroes and spaceships might be fine on the big screen, though it's also one of the reasons actors and writers recently went on strike.

Using the same process to relate stories about real people and real life does not work, at least for us. Other media companies may be doing it, telling readers it’s about better data collection, better grammar and better curation, but in many cases it just seems like an excuse to cut costs and trim personnel.

This newspaper and our parent company, Granite Media Partners Inc., are banning the sweeping use of generative AI to help with or create any news content or images.

We’re making a pledge to our readers that your stories will be told by real people — folks who live, shop, gather and worship in your communities.

In other words, our staffers are your neighbors.

No robots need apply. Across all of our parent company’s publications, publishers and ranking editors have told their staffs: “The use of generative AI to create or edit content at Granite Media Partners Inc. and its properties is expressly forbidden. At this time, the company feels the use of such a tool to generate text or images in any of our news products or other material is shortchanging our audience, which expects accurate and balanced reporting through the talents, observations and nuances of a trained and competent journalist.”

The same goes for using AI to edit stories or moderate content.

Over time, this newspaper will continue to review the technology, its applications and whether it can ever play a role in our newsgathering.

For now, we’re going to keep our very human link to the communities we cover without help from artificial intelligence.

Edwards is the vice president of content for Granite Media Partners Inc., which owns this newspaper. He can be reached at [email protected].

