
highplainsdem

(53,118 posts)
12. It can be clear, before a legal resolution, when companies and individuals are in the wrong.
Thu Apr 11, 2024, 10:05 AM
Apr 2024

And your comparison of training an LLM to human learning is also wrong. No human has the time to absorb all the data being crammed into LLMs.

There have also been plenty of copyright cases that humans have lost.

The AI companies are trying to argue that they should be exempt from the laws protecting the artists and others they stole from, while at the same time insisting on their own rights.

AI proponents are already complaining about this bill Adam Schiff introduced. See reply 2 here:

https://www.democraticunderground.com/10143222977

Their basic arguments are that

1) it would have slowed down AI development too much if they'd had to get permission to use all that intellectual property

2) it's too much trouble now to divulge everything they stole; revealing what's in the datasets would take company time, which costs money and reduces profits

3) it will reduce their profits too much if they have to compensate the intellectual property owners

4) revealing what data they stole will cost them a competitive advantage, what a lawyer called their "secret sauce"

5) they'll get sued by at least some of those whose IP they stole

6) so if we want to have AI, we have to let the AI companies have all the data they want, for free.

Not the most ethical arguments to make.

Which is why they're trying to keep datasets secret. Why they're putting clauses in their TOS making users assume full responsibility for copyright violations from a dataset whose content the users aren't allowed to know. Why they're lobbying governments around the world to exempt AI from most standard regulation, especially copyright laws.

And why they initially made so many AI tools free, in the hope that more and more users would be so enchanted with them that they'd

1) give the AI companies lots of free advertising (students using ChatGPT to cheat created most of the early hype), and

2) take the side of the AI companies despite the grand theft and all the harms done, and put pressure on courts and lawmakers to let them get away with it.
