Artificial intelligence companies are facing copyright and plagiarism lawsuits as their technology repackages and repurposes existing content.
Authors, artists, content creators, and internet publishers may increasingly seek a cut for the use of their work by AI tools as the technology ramps up, The Wall Street Journal reported Sunday.
"This will not end well for creatives," bestselling author James Patterson told the Journal.
Patterson is lashing out at the "frightening" reality of his work being used without permission or compensation to train AI tools, putting his income-earning writing to use for free while the code-writers and their companies potentially get paid for it.
Elon Musk has already sniffed it out, moving to limit the number of tweets a user can view to keep AI bots from scraping Twitter, now called X, for free content.
Earlier this month, thousands of authors, including Patterson and Margaret Atwood, signed an open letter urging AI companies to obtain permission and provide compensation for the use of their work.
Comedian Sarah Silverman is a party to a lawsuit against OpenAI and Meta for allegedly using her work pulled from "shadow libraries" on the internet.
The Associated Press has signed a deal with OpenAI to use its news archive for its AI tools, while News Corp., The New Yorker, Rolling Stone, and Politico are among the companies seeking compensation, sources told the Journal.
OpenAI, which launched ChatGPT, and Google parent Alphabet say they use "publicly available" content for their AI tools, but these are still the Wild West days of AI, experts told the Journal.
"The cases are new and dealing with questions of a scale that we haven't seen before," Yale Law School's Information Society Project's Mehtab Khan told the Journal. "The question becomes about feasibility. How will they reach out to every single author?"
Tech companies are seeking shelter under the "fair use" legal doctrine, and AI advocates are pleading to keep ideas flowing freely into AI development.
"If a person can freely access and learn from information on the Internet, I'd like to see AI systems allowed to do the same, and I believe this will benefit society,'" Stanford University's Andrew Ng, an AI investor and researcher, told the Journal.
But AI is unfair to the originators, according to Silverman's lawyer Matthew Butterick, because the systems depend "entirely on having a data set of quality work, made by humans, and if they collapse that market, their systems are going to collapse too.
"They can't bankrupt artists without bankrupting themselves," he told the Journal.
Palantir CEO Alex Karp also warned in The New York Times last week that the technology's limitless pathways could be dangerous, potentially revealing, if not producing, a direct threat to humanity.
"It is not at all clear — not even to the scientists and programmers who build them — how or why the generative language and image models work," Karp warned in the Times.
Eric Mack
Eric Mack has been a writer and editor at Newsmax since 2016. He is a 1998 Syracuse University journalism graduate and a New York Press Association award-winning writer.